Mail Archives: djgpp/1997/06/25/01:48:30
Robert Humphris wrote:
> Trouble is, that the 10 odd Megs that need to be down
> loaded, takes the same connection time if it comes in
> one file, or 20... in fact 1 file takes less time, as
> you have no latency between files.
However, not everyone has a net connection on the same
machine as the one they want to run DJGPP on. At the moment,
with one exception, the files are small enough to fit on a
single disk each (several to one disk in some cases).
I'm on a DEC Alpha at work, on the net, so I have to take
everything home on floppies. One 10Mb file would have
been a real pain.
Also, having them in separate files makes it a lot
easier to upgrade to new versions. I don't want to have
to d/l 10Mb just to get a new version of (for instance)
a single package.
It's also easier for people with limited free disk
space. They can load only those parts which are
necessary, and don't have to have 30Mb of free space
to hold the ZIP file while they extract the files
(10Mb for the ZIP file and 20Mb for the extracted
files).
> Is it that the problem lays with the fact that this
> is GNU stuff and thus has to be distributed the way
> that they decree?
The FSF and the GPL don't say anything about how the source
is distributed, only that it must be available on request,
either from the same place you got the binaries or
from a place you can determine (if necessary by
contacting the author or the FSF directly). If someone
wants to get all the DJGPP stuff, put it into one
big ZIP file (or ARJ, TGZ, etc.), and stick that on an
FTP site or Web page, they are welcome to do so.