Date: Thu, 18 Dec 1997 08:39:09 -0500 (EST)
Message-Id: <199712181339.IAA11366@delorie.com>
From: DJ Delorie
To: Vik DOT Heyndrickx AT rug DOT ac DOT be
CC: molnarl AT cdata DOT tvnet DOT hu, djgpp-workers AT delorie DOT com
In-reply-to: <3498D668.3B38@rug.ac.be> (message from Vik Heyndrickx on Thu,
	18 Dec 1997 08:53:12 +0100)
Subject: Re: char != unsigned char... sometimes, sigh
Precedence: bulk

> > > Since getc is only supposed to be used for text files, I think we should
> > > change it to return chars in the range [-128..127], so that comparisons
> > > work.
> >
> > I think this is a bad idea, because most programs assume that EOF is -1.
>
> I meant 'return ints in the range [-128..127] or -1'. Do I always have
> to be *this* exact? I think it was obvious what I meant to say.

Changing getc to return char instead of int makes DJGPP different from
other compilers, which return int even when chars are unsigned. Given
how long compilers have been doing this, I think it would be a mistake
to make this change, no matter how well-intentioned it may be.

Returning char instead of int also makes it *impossible* to reliably
detect EOF when reading binary files (without using feof()). Programs
expect -1 to mean EOF, and forcing them to use feof() would mean extra
work to port most programs - work which would not be needed if we just
leave getc() alone.