Mail Archives: djgpp/2001/12/27/17:32:53
Hi All,
I am having a problem with the way #defines are being handled. I
have some sample code to demonstrate the problem, and I have
commented the source to explain it.
Basically, when I do some math with the #defines, the compiler looks
like it is selecting the wrong data type.
I have upgraded from the 2.954 binaries to the v3.03 binaries, but this
didn't fix the problem.
If anyone has a suggestion, I would appreciate it.
Regards,
Matt
(matt AT mattshouseofpain DOT com)
/**START**/
#include <stdio.h>
#define BASEVAL1 233
#define BASEVAL2 BASEVAL1 + 29
#define BASEVAL3 233
#define BASEVAL4 BASEVAL1 + 57
int main(void)
{
printf("baseval1 [233] = %d\n", BASEVAL1);
printf("baseval2 [262] = %d\n\n", BASEVAL2);
printf("baseval3 [233] = %d\n", BASEVAL3);
printf("baseval4 [290] = %d\n\n", BASEVAL4);
/* up to this point the define's are ok but when the math is done
on
the next line it seems to be treating the result as an unsigned
char
data type. */
printf("baseval4 - baseval2 [28] = %d\n\n", BASEVAL4 - BASEVAL2);
/* here is a sanity check */
printf("290 - 262 [28] = %d\n", 290 - 262);
return(0);
}
/**END**/
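For reference, here is a minimal sketch (not part of the original message) of the same arithmetic with parenthesized macro bodies. It relies only on the standard fact that the C preprocessor substitutes macro text verbatim, so in the code above BASEVAL4 - BASEVAL2 expands to 233 + 57 - 233 + 29, which evaluates to 86 rather than 28; whether that matches the output the poster saw is not confirmed here.

/* Sketch only: parenthesized macro bodies keep each expansion grouped
   as a single value, so the subtraction behaves as expected. */
#include <stdio.h>

#define BASEVAL1 233
#define BASEVAL2 (BASEVAL1 + 29)   /* expands to (233 + 29) */
#define BASEVAL4 (BASEVAL1 + 57)   /* expands to (233 + 57) */

int main(void)
{
    /* Prints 28, because the expansion is (233 + 57) - (233 + 29). */
    printf("baseval4 - baseval2 [28] = %d\n", BASEVAL4 - BASEVAL2);
    return(0);
}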