Mail Archives: djgpp/1999/05/11/09:55:43
vcarlos35 AT juno DOT com writes:
> I have a series of rather large data files (~2mb) each that I would
> prefer to be embedded within the main executable for the purposes
> of simplicity and such.
[...]
> The problem is that gcc takes a rather long time to compile these
> files. Is there an easier way?
I'm actually surprised that you can compile these files at all: when
I tried such things, I got a compiler stack overflow after the first
meg or so of initialised data :-)
The solution is to output assembler sources (a .s file) instead of C,
which will build far more efficiently.
Alternatively you could write a utility to convert a binary directly
into a COFF object file, but that is rather more complex and requires
you to know the details of the object file format. This method used
to be very common in the 16 bit DOS compiler world, but I haven't seen
anyone doing it with djgpp.
A third, slightly different option is to append the data onto the end
of your .exe file after you link it, rather than including it during
the compile. This way still requires you to open the file and read
the data yourself (eg. fopen(argv[0], "rb")), but you only need to
distribute a single executable, and it is more flexible if you ever
want to modify the appended data without having to recompile the
program.
Plug: if you use Allegro, the exedat utility can append data onto
your program in this way, with optional compression, and you can then
read this back by using the Allegro function
pack_fopen("#", F_READ_PACKED). Also, Allegro has a dat2s utility
that converts datafile format archives into .s source files, if you
want to link them directly into your program, and a dat utility for
building a datafile archive out of any number of binary files.
Shawn Hargreaves.