Mail Archives: djgpp/1996/05/06/17:54:34
nmarrone AT smartlink DOT net (Nicholas Marrone):
(<4mikhp$pa5 AT frodo DOT smartlink DOT net>)
>Hullo,
>[...] The
>program compiles and runs without error, but it gives me #s far larger
>than integers when I print out some results. I'm using DJGPP v2.
>Here's some of the code...
[...]
> int *origp;
> if ( !(origp = (int *)malloc(NUM * 2)) ||
> !(onesp = (int *)malloc(NUM * 2)) ||
> !(zerop = (int *)malloc(NUM * 2)) )
> {
> printf("Not enough memory to run this program!!!");
> }
[...]
> for (ctr = 0; ctr < 20; ctr++)
> printf("%d\n", origp[ctr]);
[...]
(Please excuse my poor English)
Try running the following sample program:
------------------------SNIP-----------------------
#include <stdio.h>

int main(void)
{
    printf("%d\n", (int)sizeof(int));  /* cast: sizeof yields a size_t */
    return 0;
}
------------------------SNIP-----------------------
This should give you the hint you were looking for: int defaults to
32 bits on most machines, at least with gcc and its companions. So
the results you get are perfectly normal; try "short int" if you
want 16-bit integers.
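If you want to see the sizes side by side, here is a slightly
extended version of the same check (the values in the comment are
what DJGPP/gcc typically reports; the language itself does not
guarantee them):
------------------------SNIP-----------------------
#include <stdio.h>

int main(void)
{
    /* Typical DJGPP/gcc sizes: short = 2, int = 4, long = 4 */
    printf("short: %d\n", (int)sizeof(short));
    printf("int:   %d\n", (int)sizeof(int));
    printf("long:  %d\n", (int)sizeof(long));
    return 0;
}
------------------------SNIP-----------------------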
One thing that isn't normal, and will eventually break your code, is
allocating memory based on a fixed assumption about the size of your
basic data type ("malloc(NUM * 2)" assumes 2-byte ints). You'd better
write something like "malloc(sizeof(datatype) * NUM)". This still
yields a constant expression, but it is much more portable.
The code you posted will overwrite data beyond the memory allocated
for origp: malloc(NUM * 2) holds only NUM/2 four-byte ints, so every
access with NUM/2 <= ctr < NUM lands past the end of the buffer.
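Here is a minimal sketch of the corrected allocation (the NUM value
and comments are mine; exiting on failure is also my addition, since
the posted code printed a warning but kept running with NULL
pointers):
------------------------SNIP-----------------------
#include <stdio.h>
#include <stdlib.h>

#define NUM 100  /* placeholder value */

int main(void)
{
    /* sizeof(int) adapts to the compiler, so this reserves room
       for NUM ints whether int is 2 or 4 bytes wide */
    int *origp = (int *)malloc(NUM * sizeof(int));
    if (!origp) {
        printf("Not enough memory to run this program!!!\n");
        return 1;  /* stop instead of dereferencing NULL */
    }

    /* ... fill and use origp[0] through origp[NUM-1] ... */

    free(origp);
    return 0;
}
------------------------SNIP-----------------------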
Regards,
Jo