Mail Archives: djgpp/2000/01/22/12:31:30
David Cleaver wrote:
>
> I was wondering if there were any plans on changing the number of bits
> in char from 8 to 16 or something like that?
No. I don't think any compiler in the world does, or is about to do, that.
Programs that need more than 8 bits in a character should use the wide
character (wchar_t) data type provided by the ANSI standard, and the
multibyte character representation that goes with it. (Unfortunately, the
DJGPP library doesn't support this type very well; the associated functions
only work in the "C" locale, i.e. for single-byte characters. Volunteers are
welcome to work on this.)
> Ok, well, the real problem is that I'm writing a program that is very
> dependent on the char data type being 8-bits.
I'm not sure I understand the context (so please feel free to explain more),
but the usual way to handle these situations is to define a data type (with
`typedef') that is as wide as you need, and then use it for those variables
that need to be of that width.
> Which brings me to my second (completely unrelated) point... Why can't
> the primitive data type names reflect how many bits are being used in
> their storage?
I think the next C Standard (C9X, as it is called) addresses this to some
extent. Can someone who has the draft handy please confirm?
> I think this would help out
> alot so that programs could be written more with storage in mind.
Keep in mind that many programs don't care about storage in many of these
cases. For example, when you code a simple for loop that will be executed a
few dozen times, you want the fastest data type, no matter what the storage
is.