Mail Archives: djgpp/1997/07/18/09:21:35
Chirayu Krishnappa wrote:
> I wanted to write a types.h file #defining int2, int4 & int8
> to maintain portability.
Something most of us do at some time...
> However, gcc chokes on the #if sizeof(int)==2 line. Isn't
> sizeof() a compile time operator?
Sigh. Yes, sizeof is a compile-time operator in that it doesn't
generate calculations at run-time. However, it's not a
pre-processing operator.
The reason for this is that preprocessing (as the prefix 'pre'
implies) is something which is done (at least logically) before
the actual compilation. A compiler is in fact free to implement
it as a separate step (many older compilers did so), producing
an intermediate file (often with extension .i) which is fed into
the compiler proper.
Because of this, it cannot be assumed that the preprocessor knows
anything about the compiler, and that includes the sizes of types.
It also (and this is one which annoys me frequently) can't know
anything about typedefs.
Some early compilers where the preprocessor was "tightly-bound"
to the compiler did allow sizeof in preprocessor directives. This
was one of the areas where the ANSI committee restricted the
language because of portability (I know, the very reason you want
to use it!).
In summary, you can't do it.
However, all is not lost. There is a very useful header file
defined in the ANSI standard which contains things about the sizes
of variables. It's called limits.h. Among the things it defines
are the minimum and maximum numbers which can be held by the
standard types. This enables you to write a portable header file
which depends not on the size (in bytes) of an 'int' etc. but on
the values it can hold (somewhat more portable; some machines
have had 'bytes' with 9 bits in them). As it happens, I have
one of these files lying around:
#include <limits.h>

#if INT_MIN < -2147483647 && INT_MAX >= 2147483647
typedef int S32;
typedef unsigned int U32;
#define INT_MIN_S32 INT_MIN
#define INT_MAX_S32 INT_MAX
#else
typedef long S32;
typedef unsigned long U32;
#define INT_MIN_S32 LONG_MIN
#define INT_MAX_S32 LONG_MAX
#endif

#if SHRT_MIN < -32767 && SHRT_MAX >= 32767
typedef short S16;
typedef unsigned short U16;
#define INT_MIN_S16 SHRT_MIN
#define INT_MAX_S16 SHRT_MAX
#else
typedef int S16;
typedef unsigned int U16;
#define INT_MIN_S16 INT_MIN
#define INT_MAX_S16 INT_MAX
#endif

typedef signed char S8;
typedef unsigned char U8;
Note that I use the number of bits in the name (16, 32 etc.)
instead of the number of bytes. There are arguments for and
against both naming conventions, so feel free to rename them.
It's a matter for Holy Wars which you use.
Note also that the C preprocessor is defined to use long
arithmetic throughout, and that a 'long' in C is defined to
hold at least 32 bits, so the above will work on any ANSI
C compiler (and on all C++ ones I've tried).
Also, for portability, you need to specify 'signed' and
'unsigned' char - one of the sillier ANSI backwards
compatibilities is that a 'char' can be either signed or
unsigned at the whim of the compiler writer, and can change
depending on compiler options.
(Note that several Unix systems already have a "types.h"
file; it might be an idea to call yours something different.)
The above code is public domain as far as I'm concerned -
feel free to use, modify, spindle and mutilate to your heart's
content, I acknowledge no liability for it doing anything
useful, useless or dangerous. Specifically, if it starts
a nuclear exchange I wasn't here, OK? If, on the other hand,
you make pots of money out of it I'd appreciate you mentioning
me in your will (and so will Da Boyz with the machine guns in
violin cases)...
Chris C
(BTW, this really should have been in a C newsgroup, it's not
specific to DJGPP. Something like it is probably in a FAQ
somewhere as well, but I can't direct you to it because I don't
know where either...)