Date: Tue, 18 Mar 1997 17:16:29 -0500 (EST)
From: "art s. kagel IFMX x2697"
To: Erik Max Francis
Cc: djgpp AT delorie DOT com
Subject: Re: Weird problem
In-Reply-To: <33297822.4DD773FA@alcyone.com>
Message-Id:
Mime-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII

You're right, my apologies.  However, my point still holds.  Use short or
long if you care about storage size or value range, and use int only when
the range is irrelevant (almost any word size will do) and/or performance
is paramount.  Any use of int is fraught with portability dangers,
especially since most 64-bit compilers actually have long as the machine
word size and int as a 32-bit integer.  (A short sizeof() test program is
sketched at the end of this message as an illustration.)

Art S. Kagel, kagel AT ts1 DOT bloomberg DOT com

On Fri, 14 Mar 1997, Erik Max Francis wrote:

> art s. kagel IFMX x2697 wrote:
>
> > In general, it is good coding practice to explicitly use long and short
> > rather than int, since short is guaranteed to be 'at least 16 bits' and
> > long is guaranteed to be 'at least 32 bits', while int is only defined as
> > 'the efficient word size', which varies from one machine and compiler to
> > another.
>
> Since when?  All you're guaranteed is that sizeof(short) <= sizeof(int) and
> sizeof(int) <= sizeof(long), and, as you say, an int "has the natural size
> suggested by the architecture of the execution environment" (ANSI C
> 6.1.2.5).  None of the values are grounded in absolute sizes.  In fact, a
> char isn't even guaranteed to be 8 bits!
>
> --
> Erik Max Francis, &tSftDotIotE / email: max AT alcyone DOT com
> Alcyone Systems / web: http://www.alcyone.com/max/
> San Jose, California, United States / icbm: 37 20 07 N 121 53 38 W
>  \
>   "I am become death, / destroyer of worlds."
>    / J. Robert Oppenheimer (quoting legend)
>
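
P.S.  A minimal sketch of the point about word sizes: the program below just
prints sizeof for the three standard integer types.  The exact numbers it
reports are not guaranteed by anything in this thread; as an assumption, a
typical 64-bit Unix compiler (LP64) would print short=2, int=4, long=8,
while a 16-bit DOS compiler might print short=2, int=2, long=4.

    #include <stdio.h>

    /* Print the storage sizes of the standard integer types.
     * The standard only fixes minimum ranges and the ordering
     * sizeof(short) <= sizeof(int) <= sizeof(long); the actual
     * values vary from one machine and compiler to another.
     */
    int main(void)
    {
        printf("sizeof(short) = %lu\n", (unsigned long) sizeof(short));
        printf("sizeof(int)   = %lu\n", (unsigned long) sizeof(int));
        printf("sizeof(long)  = %lu\n", (unsigned long) sizeof(long));
        return 0;
    }

Running this on each target you care about makes the portability point
concrete: code that silently assumes int is 32 bits (or 16 bits) will behave
differently across these compilers.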