Mail Archives: djgpp/1997/07/10/17:36:37
In article <33C3E804 DOT 6526 AT cs DOT com>, "John M. Aldrich" <fighteer AT cs DOT com>
writes
>ANSI defines int as the native word size of the compiler, and rules that
>sizeof(short) <= sizeof(int) <= sizeof(long). It says that short must
>be at least 16 bits and long must be at least 32 bits. It allows the
>implementation to define the actual size of these types. Every Unix
>system uses 32-bit integers; 16-bit ints are a DOS peculiarity that
>DJGPP fortunately does not share. ;) If you write a program that
>depends on having 16 or 32 bit numbers, you should use short and long
>explicitly.
16-bit ints are a natural consequence of real mode, where the default
operand size is 16 bits: there, 32-bit values run slower and produce
longer code. This is why int is defined as the native word size; it
lets you assume that an int will generate the best code on most
compilers.
If you need to depend on *exactly* 16- or 32-bit integers,
conditionally #define INT16 and INT32 to the correct native type. That
way your new 64-bit Alpha next year will run the program correctly ;)
---
Visit www.dukepsx.com: see what I do all day.
Paul Shirley: my email address is 'obvious'ly anti-spammed