Mail Archives: djgpp/1992/02/15/18:29:57

Date: Sat, 15 Feb 92 17:42:51 -0500
From: pstephan AT mcs DOT kent DOT edu
To: djgpp AT sun DOT soe DOT clarkson DOT edu
Status: O

I have found a problem that only occurs when I use the gcc -O flag
to optimize my code.  I have distilled the problem down to the following
example.  I have also attached the output that results when this program is
executed with "go32 test.x".  The output I expect is
    y:   2.00000, int y: 2, int 2.0: 2.
However, the value being output for i (int y) is "1" rather than "2".
 
I compiled this program with "gcc -O -o test.x test.c -lm".
 
Note that when I do NOT use the -O flag, I get the expected result.
 
Has anyone else seen this?  Is there a patch, fix, or work-around available?
(Right now, as a work-around, I am simply not optimizing.)
 
--------------------------------------------------------
 
#include <math.h>
#include <stdio.h>
 
 
int main (argc, argv)
int argc;
char *argv[];
{
   double y;
   int i;

   /* log(4)/log(2) should be exactly 2.0 */
   y = log ((double) 4.0) / log (2.0);

   /* truncate to int; with -O this yields 1 instead of 2 */
   i = (int) y;

   printf ("y: %9.5f, int y: %d, int 2.0: %d.\n", y, i, (int) 2.0);
   return 0;
}
 
---------------------- Output --------------------------
 
y:   2.00000, int y: 1, int 2.0: 2.
 
--------------------------------------------------------
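A possible work-around (only a sketch; it assumes the optimized value of y
lands just below 2.0 because of extra precision kept in floating-point
registers) would be to round to the nearest integer before truncating,
instead of casting directly.  The helper name round_to_int below is only
illustrative and is not part of the original test program:

   #include <math.h>

   /* Round a non-negative double to the nearest integer before
      truncating, so a value just under 2.0 still yields 2. */
   static int round_to_int (y)
   double y;
   {
      return (int) floor (y + 0.5);
   }

   /* in main: i = round_to_int (y); instead of i = (int) y; */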
 
Paul H. Stephan
pstephan AT mcs DOT kent DOT edu
