From: Eli Zaretskii
Newsgroups: comp.os.msdos.djgpp
Subject: Re: Using cgets (a !FAQ)
Date: Thu, 27 Apr 2000 18:08:50 +0200
Organization: NetVision Israel
Message-ID: <39086612.15C4E78A@is.elta.co.il>
References: <01bfafc6$387a7c80$da06017e AT gr-356146>
To: djgpp AT delorie DOT com
Reply-To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

Joel Hunsberger wrote:
>
> The problem was that this code extensively corrupted the stack
> every time.  It turns out that stating 255 as the buffer length
> was causing an implicit conversion overflow when converted for
> use in the string, and subsequently for use by cputs.  Alas,
> it came out as a buffer length of -1, which caused manifest
> stack corruption (for reasons I can only imagine).
>
> When I reduce the (largely arbitrary) requirement to 127
> for console line input... things are fine!
>
> No hints in the info documentation for cgets, unfortunately.

I don't understand: the library docs explicitly say that the first
character in the buffer is used as the buffer size.  So what is
missing?  Do you mean to say that you did not know that the char data
type is signed, and that therefore 255 is actually -1?