Message-ID:
From: Shawn Hargreaves
To: djgpp AT delorie DOT com
Subject: Re: Length of Chars...
Date: Mon, 24 Jan 2000 12:12:11 -0000
MIME-Version: 1.0
X-Mailer: Internet Mail Service (5.5.2650.21)
Content-Type: text/plain; charset="iso-8859-1"
Reply-To: djgpp AT delorie DOT com

Eli Zaretskii writes:
>> I was wondering if there were any plans on changing the number
>> of bits in char from 8 to 16 or something like that?
>
> No. I don't think any compiler in the world does, or is about to
> do, that.

There are a few obscure processors where all data types are 16 or 32
bits, including chars. But certainly no PC compiler (or indeed anything
for a mainstream computer architecture) is likely to do that.

> Keep in mind that many programs don't care much about the storage
> in many cases.

In my experience, even when people do think they care about it, they
are usually wrong, or at least could be wrong if they made a few minor
changes that would improve their code anyway. It is common to require
at least a certain range from your data types, but it almost never
matters if you get a variable larger than you were expecting.

The usual times this can make a difference are:

- If you rely on the specific point at which a variable will wrap.
  The simple solution is just not to rely on that, and if you do need
  to know the max/min values for anything, use the defines from
  limits.h.

- If you hardcode the size of structs when allocating memory. But
  that is bad for other reasons as well, and should be done using
  sizeof().

- If you fwrite() and fread() large structures directly to and from
  disk. But doing that is a bad idea quite apart from variable sizes,
  because it causes problems with compiler padding and endianness,
  which prevent your program from being portable. It's better to have
  functions that write out larger values as a series of individual
  bytes (using shift operations to split and recombine the contents),
  and once you do that, it doesn't matter what size of variable you
  are reading into.

- Some OS services are passed structures which must be in a particular
  layout. This is indeed a problem if the compiler gives a layout
  other than the one you are expecting, but most programs never need
  to call the OS directly, and for those that do, the method of doing
  so tends to be different for each platform in any case, so this is
  rarely a major portability problem in practice.

So, don't worry about this. If you are developing 32 bit code (which
is a valid assumption to make for anything other than very ancient
systems), you can safely assume that ints are at least 32 bits in
size, and that chars have a range of at least 0-255. It really
doesn't matter if someday a compiler gives you variables bigger than
this: a well written program will go on working regardless.

Shawn Hargreaves.
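[Editor's note: a minimal sketch of the byte-at-a-time serialization
approach described above. The helper names put32/get32 are my own
invention, not from any particular library; the sketch assumes
little-endian file order and uses only standard C I/O.]

```c
/* Sketch: write a 32-bit value to disk as four individual bytes
   (little-endian order here), and recombine them with shifts when
   reading back. Because each byte is written explicitly, the result
   is the same regardless of the host's int size, struct padding, or
   endianness. put32/get32 are hypothetical names for illustration. */
#include <stdio.h>

void put32(FILE *f, unsigned long v)
{
   fputc((int)(v & 0xFF), f);          /* lowest byte first */
   fputc((int)((v >> 8) & 0xFF), f);
   fputc((int)((v >> 16) & 0xFF), f);
   fputc((int)((v >> 24) & 0xFF), f);  /* highest byte last */
}

unsigned long get32(FILE *f)
{
   unsigned long v;

   v  = (unsigned long)fgetc(f);        /* recombine with shifts */
   v |= (unsigned long)fgetc(f) << 8;
   v |= (unsigned long)fgetc(f) << 16;
   v |= (unsigned long)fgetc(f) << 24;
   return v;
}
```

Reading into an unsigned long works even if the compiler's long is
64 bits rather than 32: the value is reconstructed from the same four
bytes either way, which is exactly why the variable's exact size
stops mattering.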