Mail Archives: djgpp/1997/08/17/03:45:50

Newsgroups: comp.os.msdos.djgpp
From: Peter Berdeklis <peter AT atmosp DOT physics DOT utoronto DOT ca>
Subject: Re: Precompiled Headers?
Message-ID: <Pine.SGI.3.91.970815105627.27969A-100000@atmosp.physics.utoronto.ca>
Nntp-Posting-Host: chinook.physics.utoronto.ca
Sender: news AT info DOT physics DOT utoronto DOT ca (System Administrator)
Mime-Version: 1.0
Organization: University of Toronto - Dept. of Physics
In-Reply-To: <01IMH39G40YI8WVYJS@mail>
Date: Fri, 15 Aug 1997 15:38:15 GMT
References: <01IMH39G40YI8WVYJS AT mail>
Lines: 68
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

On Fri, 15 Aug 1997, Hans-Bernhard Broeker wrote:
> How much more time *does* it take for a rather small program, using,
> say the STL string classes compared to the same program using C string
> functions? Of course 8M is really not too much memory for gcc
> compiling C++ with some largish headers (I've seen it take 8M and more
> on Linux, using C++ X libraries and similar stuff), so you might want
> to just shell out the money to upgrade those boxes to 16 MB (or dump
> the -33 one completely and put its SIMMs into the -100 box :-)


The following file took 16 seconds on average to compile (from the 
command line, with gxx) with USE_CPP defined, and 10 seconds on average 
without.  The file was compiled to an object file, so link time is not 
a factor (the times were the same anyway).

#ifdef USE_CPP
#include <iostream.h>
#else
#include <stdio.h>
#endif

int main()
{
 #ifdef USE_CPP
 cout << "This is a test of compile times.";
 #else
 printf( "This is a test of compile times." );
 #endif

 return 0;
}


The file is so absurdly small that the 6 second difference can only be
attributed to the size of the header file.  While 6 seconds isn't a lot,
large template headers like algo.h and vector.h take much longer.  That
time is spent for each compiled file that includes algo.h or vector.h,
rather than just once, so it adds up.  (I didn't try a test with algo.h
and vector.h, because there's nothing in the C library that, alone, would
give the same functionality.  That's why STL is so nice.) 

Furthermore, when the actual code file is more complicated, even though 
it might only be 200 lines, the caches on my small machines with the 
slow hard drives start to thrash, and that 6 seconds multiplies into 
waits of more than a minute extra per file. 

By the way, the 486-33 is the computer at work and the 486-100 is at 
home, so dumping the 33's RAM into the 100 isn't an option.  As for 
shelling out some money, I wish I had some to shell out. :)


> I don't think I understand your reasoning here: why do you think you
> need debugging *libraries* to use '-g' for gcc? I mean: you're not
> planning to routinely debug the standard C/C++ libraries themselves,
> are you? What prevents you from linking .o files compiled with '-g' to
> the stock libraries?

I don't think I explained this right, because I don't really understand 
it myself.  When I pass the -g option to gcc on the SGI, it tells me 
something like '-g not available in this configuration'.  I didn't 
realize that -g support was a configurable option.  Anyway, I said 
debugging libraries because I didn't know how else to explain it.  
Either way, I can't debug on the SGI, so I have to do it on the PC.

---------------
Peter Berdeklis
Dept. of Physics, Univ. of Toronto
