Date: Tue, 10 Feb 1998 16:49:12 +0100 (MET)
From: Hans-Bernhard Broeker
To: Vik Heyndrickx
cc: DJ Delorie, djgpp-workers AT delorie DOT com
Subject: Re: char != unsigned char... sometimes, sigh (long)
In-Reply-To: <34E00C53.175A@rug.ac.be>
Message-ID:
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
Precedence: bulk

On Tue, 10 Feb 1998, Vik Heyndrickx wrote:

> - EOF is an element of the 'signed char' range which means that no
> matter what trickery is applied, only 256 distinct values can be
> represented of which EOF is one. This has as a consequence that locales
> that have a real character defined for value (char)255 (i.e. EOF) cannot
> be supported by ANY is* macros, no matter how smartly implemented.

I may sound repetitive, but that's not the full truth on this: they
*can* support it. The only thing they can't do is magically fix up
non-portable programs that blindly call isalpha(c) on a 'char' value,
instead of the correct isalpha((unsigned char)c). As I consider such
programs to be buggy, I fail to see a problem with not supporting them.

I fully agree with Eli: we shouldn't change such a rather fundamental
design decision just to make broken user programs turn un-broken.

> - many users do not expect that 'α' (the Greek letter alpha with EASCII
> value 224 in DOS CP 437 or 850) is not equal to 224. IMHO, this is
> strongly counterintuitive. This triggers unexpected (and never wanted)
> outputs in printf("%d", ...) and printf("%u", ...) statements.

I don't think we should pay that much attention to what users
_expect_, even more so if they expect the wrong things... E.g., I
can't see any good reason why anyone should printf("%d", 'α'). If you
want a char to represent a *number*, then why (other than for the
IOCCC) would you want to assign a value in '' notation to it? Users
who mix up the use of 'char' for actual characters, and for 'small
integers', might as well get what they asked for.
> - All DOS compilers that I know about (not many) use 'unsigned char' by
> default.

SGI uses 'unsigned char' ;) But I think all of them can be told to use
'signed char' instead. Even way-old Turbo C 2 could.

> - A char can be used as an array subscript, especially in translation
> tables. Most of the time (99%) the user does not expect that this value
> can be negative.

I doubt you have enough statistical data to justify such a claim :-)

Hans-Bernhard Broeker (broeker AT physik DOT rwth-aachen DOT de)
Even if all the snow were burnt, ashes would remain.