X-Authentication-Warning: acp3bf.physik.rwth-aachen.de: broeker owned process doing -bs
Date: Wed, 2 May 2001 10:19:53 +0200 (MET DST)
From: Hans-Bernhard Broeker
X-Sender: broeker AT acp3bf
To: djgpp-workers AT delorie DOT com
Subject: Re: sbrk() storing the size of memory blocks
In-Reply-To: <3.0.1.32.20010502150222.0069637c@wingate>
Message-ID:
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
Reply-To: djgpp-workers AT delorie DOT com
Errors-To: nobody AT delorie DOT com
X-Mailing-List: djgpp-workers AT delorie DOT com
X-Unsubscribes-To: listserv AT delorie DOT com
Precedence: bulk

On Wed, 2 May 2001, Nimrod A. Abing wrote:

> At 02:49 PM 04/30/2001 +0200, you wrote:
> >OTOH, I suspect DJ may have done this on purpose, to keep the bandwidth
> >requirements reasonably low. [...]
>
> Very true. But it would be better if there were some other way of getting
> prerelease sources besides using CVS. This would enable others to test it
> more fully.
>
> >(Even Sourceforge doesn't provide automated snapshot tarballs, by default.
> >You have to do that yourself, as a project admin, if you want it).
>
> Is there even a remote possibility that nightly/weekly/monthly DJGPP CVS
> archives would be put in SourceForge with all concern to decreasing
> bandwidth demands on DJ's server?

There's always the option of registering a project for the sole purpose
of using the file space SourceForge provides.  That is, someone with the
kind of net connection needed for it could run a weekly cron job to
'cvs -z3 export' the sources off DJ's public server, package them up into
a zip file, and upload that to SourceForge.  All automatically (a sketch
of such a script appears below).  If DJ did that himself, it might even
save some bandwidth.

-- 
Hans-Bernhard Broeker (broeker AT physik DOT rwth-aachen DOT de)
Even if all the snow were burnt, ashes would remain.
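
For illustration, a minimal sketch of such a weekly snapshot job might
look like the following.  The CVS root, module name, scratch directory,
and schedule are all assumptions (the mail above leaves them
unspecified), and the upload step depends entirely on how the
hypothetical SourceForge project would be set up.

    #!/bin/sh
    # Weekly DJGPP CVS snapshot -- illustrative sketch only.
    # The CVS root, module name, and paths below are assumptions, not
    # taken from the mail, and would need adjusting for the real server.
    # Assumes 'cvs login' has already been run once for this pserver root.
    set -e

    CVSROOT=":pserver:cvs@cvs.delorie.com:/cvs/djgpp"   # assumed root
    MODULE=djgpp                                        # assumed module name
    WORKDIR=/var/tmp/djgpp-snapshot                     # scratch space
    STAMP=`date +%Y%m%d`

    rm -rf "$WORKDIR"
    mkdir -p "$WORKDIR"
    cd "$WORKDIR"

    # Export a clean tree (no CVS/ bookkeeping directories), compressed
    # with -z3 to keep the load on DJ's server small.
    cvs -z3 -d "$CVSROOT" export -D now "$MODULE"

    # Package the tree into a dated zip file.
    zip -q -r djgpp-cvs-$STAMP.zip "$MODULE"

    # The upload step is site-specific (e.g. FTP into the SourceForge
    # incoming area plus a file release, or scp to some other host), so
    # it is left out of this sketch.

Installed from cron with a line such as
'0 4 * * 0 /usr/local/bin/djgpp-snapshot', it would run early every
Sunday morning; the schedule is, of course, arbitrary.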