Mail Archives: djgpp-workers/1997/12/18/08:40:45
> > > Since getc is only supposed to be used for text files, I think we should
> > > change it to return chars in the range [-128..127], so that comparisons
> > > work.
> >
> > I think this is a bad idea, because most programs assume that EOF is -1.
>
> I meant 'return ints in the range [-128..127] or -1'. Do I always have
> to be *this* exact? I think it was obvious what I meant to say.
Changing getc() to return char instead of int would make DJGPP
different from other compilers, which return int even on systems where
chars are unsigned. Given how long compilers have been doing this, I
think it would be a mistake to make this change, no matter how
well-intentioned it may be.
Returning char instead of int also makes it *impossible* to reliably
detect EOF when reading binary files (without using feof()), because a
byte with the value 0xFF would come back as -1 and be indistinguishable
from EOF. Programs expect -1 to mean EOF, and forcing them to use
feof() instead would mean extra porting work for most programs - work
which would not be needed if we just left getc() alone.