Mail Archives: djgpp/1995/10/17/23:10:20
In article <resta-1310951051530001 AT mac-resta DOT iei DOT pi DOT cnr DOT it>
resta AT iei DOT pi DOT cnr DOT it "Giovanni Resta" writes:
> I would like suggestions on how to solve this problem.
> I made a program that makes use of an external binary file of, say, 200k-300k.
> That is, the program starts, reads this file into memory
> (it is always the same; it does not change between executions),
> and then proceeds.
> I was wondering if I can use just one file, that is, somehow putting the
> data inside the executable.
>
> I remember that some years ago I tried to do something similar with another
> compiler. I tried to statically initialize a big array of chars.
> In practice, I wrote a program that, given a file, would create a *.c source
> that statically initializes an array with the data of the file.
> However, this approach failed on larger datasets, due to some limitation of
> the compiler.
> I wonder if djgpp, with its lovely flat model, has limitations of this kind.
>
> Another approach that I'm considering is to physically append the data file at
> the end of the executable and then find a way to access it directly,
> but I do not know whether in that case the data will be automatically loaded
> somewhere in memory (where!?) or whether I must read it from the disk,
> seeking toward the end of the "extended" executable file.
>
> All suggestions are welcome. I'm quite a good C programmer, but also quite new
> to djgpp and absolutely ignorant about 586 asm.
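On the second idea in the quoted text: as far as I know, bytes appended after the
end of the executable are not loaded into memory for you, so you would have to
read them back from disk yourself. Below is a rough, untested sketch of that
approach; the layout is entirely my own assumption (the data is appended followed
by its 4-byte little-endian length), and it relies on argv[0] naming the
executable, which djgpp fills in with the full path.

/* appended.c -- sketch only: read data that was appended to the end
 * of the running executable.  Assumes the data was appended followed
 * by a 4-byte little-endian length, and that argv[0] names the
 * executable (djgpp puts the full path there). */
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    FILE *f = fopen(argv[0], "rb");
    unsigned char len[4];
    long size;
    char *data;

    if (f == NULL) {
        perror(argv[0]);
        return 1;
    }
    fseek(f, -4L, SEEK_END);           /* last 4 bytes hold the size */
    fread(len, 1, 4, f);
    size = (long) len[0] | ((long) len[1] << 8)
         | ((long) len[2] << 16) | ((long) len[3] << 24);

    data = malloc(size);
    if (data == NULL) {
        fprintf(stderr, "out of memory\n");
        return 1;
    }
    fseek(f, -(size + 4L), SEEK_END);  /* data sits just before the size */
    fread(data, 1, (size_t) size, f);
    fclose(f);

    /* ... use data[0 .. size-1] here ... */
    free(data);
    return 0;
}

A simpler route, though, is to build the data into the program itself.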
You could define a large char array called x in assembler like this:
.globl _x
.data
_x:
.byte 1,2,3,4
.byte 5,6,7,8
etc.
(you can put more bytes per line if you want)
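For a 200k-300k file you would of course not type those .byte lines by hand; a
small C program can generate the .s file for you. Here is a rough sketch (the
name bin2s and its usage are made up for illustration):

/* bin2s.c -- sketch: write a GNU as source that defines the symbol
 * _x initialized with the contents of a binary file.
 * Usage (hypothetical): bin2s data.bin > z.s */
#include <stdio.h>

int main(int argc, char *argv[])
{
    FILE *in;
    int c;
    long n = 0;

    if (argc != 2) {
        fprintf(stderr, "usage: %s file\n", argv[0]);
        return 1;
    }
    in = fopen(argv[1], "rb");
    if (in == NULL) {
        perror(argv[1]);
        return 1;
    }
    printf(".globl _x\n.data\n_x:\n");
    while ((c = getc(in)) != EOF) {
        if (n % 16 == 0)                      /* 16 bytes per .byte line */
            printf("%s.byte %d", n ? "\n" : "", c);
        else
            printf(",%d", c);
        n++;
    }
    if (n)
        printf("\n");
    fclose(in);
    return 0;
}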
This is then assembled with a command line like
as z.s -o z.o
Your C program would then have a declaration like
extern char x[];
and you would link the file z.o with your C.
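Here is a minimal sketch of the C side. DATA_SIZE is made up; use the real size
of your file, since the extern declaration does not carry one. Note that the
assembler symbol _x is just x in C, because C identifiers get a leading
underscore at the object-file level under djgpp:

#include <stdio.h>

#define DATA_SIZE 300000L       /* replace with the real size of your data */

extern char x[];                /* the array defined in z.s / z.o */

int main(void)
{
    long i, sum = 0;

    /* the data is already in memory when the program starts;
       no fopen()/fread() needed */
    for (i = 0; i < DATA_SIZE; i++)
        sum += (unsigned char) x[i];
    printf("checksum of the embedded data: %ld\n", sum);
    return 0;
}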
I just tried this and produced a 300k array very easily.
--
Martin