Mail Archives: djgpp/2000/01/25/22:31:20
Hans-Bernhard Broeker (broeker AT acp3bf DOT physik DOT rwth-aachen DOT de) wrote:
: Alain Magloire <alain AT qnx DOT com> wrote:
: > Hans-Bernhard Broeker (broeker AT acp3bf DOT physik DOT rwth-aachen DOT de) wrote:
: > : In short: char, like all C data types, is implementation-defined.
: > : You should never assume it's 8 bits, if you can help it, as that
: > : renders your program unportable.
: > Yes. But if I recall correctly, ISO C requires that the size/range
: > of the char type be defined in <limits.h>. Although 8 bits is widely
: > used, I've heard of implementations that used 9 bits.
: Exactly my point: no-one should silently assume char is 8 bits, but
: instead look at CHAR_BIT, as defined by the implementation's
: <limits.h>, to *check* whether it really is 8 bits. Or, even better,
: look up that value and use it to scale your arrays, if you really
: feel you can't afford to waste the bits beyond the 8th that are
: present in a char.
Yes, and I agree with you. But I don't think any compiler vendor in
their right mind would redefine char to be anything other than 8 bits
nowadays. I have code that does image manipulation from 8 bits to 32 etc ...
and I admit using (char) and (int) to do this ... not good.
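
Something along these lines would make the check explicit. This is
only a minimal sketch of the CHAR_BIT test suggested above; the
negative-array-size trick and the typedef name are my own
illustration, not anything from the standard:

    #include <limits.h>
    #include <stdio.h>

    /* Fail at compile time if char is not 8 bits wide: a negative
       array size is a constraint violation the compiler must
       diagnose. The typedef name is purely illustrative. */
    typedef char assert_char_is_8_bits[(CHAR_BIT == 8) ? 1 : -1];

    int main(void)
    {
        /* Or simply report the actual width at run time. */
        printf("char is %d bits wide\n", CHAR_BIT);
        return 0;
    }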
Essentially, I agree with everything you say, even if the bad coder
in me is screaming "bloody murder" ;-). One should use things like
<inttypes.h>, where 8-, 16-, 32- and 64-bit types are defined
correctly. Thanks for this reminder, as I was making the same mistake
in my present work.
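
For instance, redoing the 8-to-32-bit pixel widening with exact-width
types would look roughly like this. A sketch only, assuming a C99-ish
compiler; the function grey8_to_rgba32 and the RGBA layout are made up
for illustration, and only uint8_t, uint32_t and PRIX32 actually come
from <inttypes.h>:

    #include <inttypes.h>
    #include <stdio.h>

    /* Widen an 8-bit grey value into a 32-bit RGBA word, replicating
       the grey into R, G and B and setting alpha to fully opaque. */
    static uint32_t grey8_to_rgba32(uint8_t g)
    {
        return ((uint32_t)g << 24) | ((uint32_t)g << 16)
             | ((uint32_t)g << 8)  | 0xFFu;
    }

    int main(void)
    {
        printf("0x%08" PRIX32 "\n", grey8_to_rgba32(128));
        return 0;
    }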
--
au revoir, alain
----
However high one may be seated, one is still only sitting on one's backside!!!