From: N8TM AT aol DOT com
Subject: Re: Re: long long vs long
Date: 24 Jul 1998 22:44:41 -0700
Message-ID: <7d7bcb1d.35b889c6.cygnus.gnu-win32@aol.com>
Mime-Version: 1.0
Content-Type: text/plain; charset=US-ASCII
Content-Transfer-Encoding: 7bit
To: smorris AT xionics DOT com, owner-gnu-win32 AT cygnus DOT com, gnu-win32 AT cygnus DOT com

In a message dated 7/24/98 1:24:02 AM, smorris AT xionics DOT com wrote:

>Octal makes sense in an environment with
>word sizes of multiples of 3 bits. Hex is only useful with multiples of 4
>bits.

I found hex useful on 36-bit word machines like the GE-600/Honeywell 6000, where the floating-point sub-fields were aligned on 4-bit boundaries, but of course characters were aligned on either 6-bit or 9-bit boundaries. It wasn't difficult to have hex display software recognize patterns that made more sense as characters or in octal. So why did octal persist so long on 16/32-bit machines?

>The IBM 360 was 36 bits.

The 704 and 7094 were 36 bits, as were some would-be competitors of the 360. These competitors apparently believed, mistakenly, that IBM customers would switch brands rather than switch word lengths, and that the superiority of 36-bit binary floating point over the 360's 32-bit hexadecimal floating point would carry the day. As it turned out, of course, prices of 32-bit memory dropped so fast that 64-bit double precision became more affordable than 36-bit single precision with extended registers.

-
For help on using this list (especially unsubscribing), send a message to
"gnu-win32-request AT cygnus DOT com" with one line of text: "help".
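
As an aside, here is a minimal C sketch of the alignment point above, assuming 6-bit characters packed into a 36-bit word held in a 64-bit integer (an illustration only, not anything from the machines discussed): each 6-bit character lines up with exactly two octal digits, while the 4-bit hex digits straddle the character boundaries.

    #include <stdio.h>
    #include <stdint.h>

    /* Pack six 6-bit characters into a 36-bit word held in a uint64_t. */
    static uint64_t pack36(const unsigned char c[6])
    {
        uint64_t w = 0;
        for (int i = 0; i < 6; i++)
            w = (w << 6) | (c[i] & 077);  /* each character is 6 bits = two octal digits */
        return w;
    }

    int main(void)
    {
        unsigned char chars[6] = { 001, 002, 003, 004, 005, 006 };
        uint64_t w = pack36(chars);

        /* Octal: 36 bits = 12 digits of 3 bits; every character is a clean 2-digit group. */
        printf("octal: %012llo\n", (unsigned long long)w);

        /* Hex: 36 bits = 9 digits of 4 bits; the 6-bit characters straddle digit boundaries. */
        printf("hex:   %09llx\n", (unsigned long long)w);
        return 0;
    }

With the sample values 1 through 6, the octal line reads 010203040506 (one character per two-digit group), while the hex line reads 0420c4146, where no digit boundary coincides with a character boundary.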