Message-ID: <32A02DD1.1157@pobox.oleane.com>
Date: Sat, 30 Nov 1996 13:51:29 +0100
From: Francois Charton
Organization: CCMSA
MIME-Version: 1.0
To: Morten Welinder
CC: djgpp AT delorie DOT com
Subject: Re: Problems with DJGPP V2.01 - atof() function
References: <329e68a5 DOT 10316617 AT news DOT ua DOT pt> <57mtq1$4mo AT vidar DOT diku DOT dk>
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit

Morten Welinder wrote:
>
> afonso AT inesca DOT inesca DOT pt writes:
>
> > char string[]="1.13";
> > int result;
> > ...
> > result = (int)(atof(string)*100);
> > ...
>
> > I've got result = 112!!! not 113 as I wished, because
> > the function atof() returns 1.1299999... not 1.13 (and I only have
> > an old i386).
>
> Getting 112 is well within the C standard. If your program does
> not work in this situation then you have a bug.

Sorry to disagree, but this *is* a bug: to be sure, try the following
program:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
  char ch[8] = "1.13";
  int result, otherresult;
  float f;

  result = (int)(atof(ch) * 100.0);
  f = atof(ch) * 100.0;
  otherresult = (int)f;
  printf("result: %d otherresult: %d\n", result, otherresult);
  return 0;
}

On my machine I get result: 112 and otherresult: 113...

Francois
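[Editor's sketch, not part of the original thread: the behavior both posters observe can be reproduced directly. The double nearest to 1.13 is slightly below 1.13, so atof("1.13")*100.0 is about 112.99999999999999, which the (int) cast truncates toward zero to 112. Assigning that double to a float rounds it to the nearest float, 113.0f, so truncating the float yields 113. Rounding explicitly with floor(x + 0.5) avoids the surprise.]

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int main(void)
{
    double d = atof("1.13") * 100.0;  /* about 112.99999999999999 */
    float  f = (float)d;              /* rounds to the nearest float, 113.0f */

    printf("truncated double: %d\n", (int)d);              /* prints 112 */
    printf("truncated float:  %d\n", (int)f);              /* prints 113 */
    printf("rounded double:   %d\n", (int)floor(d + 0.5)); /* prints 113 */
    return 0;
}

On an IEEE 754 machine this shows that neither result is a bug in atof(): the (int) cast truncates, while the double-to-float conversion rounds to nearest, and the two operations land on opposite sides of 113.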