From: Erik Max Francis
Newsgroups: comp.os.msdos.djgpp
Subject: Re: Weird problem
Date: Fri, 14 Mar 1997 08:09:06 -0800
Organization: Alcyone Systems
Lines: 21
Message-ID: <33297822.4DD773FA@alcyone.com>
References:
NNTP-Posting-Host: newton.alcyone.com
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

art s. kagel IFMX x2697 wrote:

> In general, it is good coding practice to explicitly use long and short
> rather than int, since short is guaranteed to be 'at least 16 bits' and
> long is guaranteed to be 'at least 32 bits', while int is only defined
> as 'the efficient word size', which varies from one machine and
> compiler to another.

Since when?  All you're guaranteed is that sizeof(short) <= sizeof(int)
and sizeof(int) <= sizeof(long), and, as you say, an int "has the
natural size suggested by the architecture of the execution environment"
(ANSI C 6.1.2.5).  None of the values are grounded in absolute sizes.
In fact, a char isn't even guaranteed to be 8 bits!

-- 
Erik Max Francis, &tSftDotIotE / email: max AT alcyone DOT com
Alcyone Systems / web: http://www.alcyone.com/max/
San Jose, California, United States / icbm: 37 20 07 N 121 53 38 W
 \ "I am become death, / destroyer of worlds." / J. Robert Oppenheimer
   (quoting legend)