Mail Archives: pgcc/1998/07/07/16:31:37
On Mon, 6 Jul 1998 23:41:03 +0200 Marc Lehmann wrote:
Another idea: we could add a project-local database where gcc can store
information about compiled objects. We could store "really static functions"
tags for functions we compile, so gcc is able to optimize better on the next
run without seperating tree from rtl). The disadvantage is that we would,
again, have two passes over each file.
This is an incredibly good idea, even when the monumental implementation
effort is considered.
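To make the proposal concrete, here is a minimal sketch of such a project-local database. Everything here is my own illustration, not anything gcc actually provides: I assume the compiler can report, per object file, which functions it defines and which symbols it references, and the JSON file and all names are hypothetical.

```python
import json

# Entry points assumed reachable from outside the project, so they can
# never be treated as "really static" (assumption for this sketch).
EXPORTED = {"main"}

def update_static_db(objects, db_path="optdb.json"):
    """Record which functions are 'really static': defined in one object
    file and never referenced from any other object in the project.
    `objects` maps object-file name -> {"defines": [...], "references": [...]}."""
    defined = {}           # function name -> object file that defines it
    external_refs = set()  # symbols referenced outside their defining object
    for obj, info in objects.items():
        for fn in info["defines"]:
            defined[fn] = obj
    for obj, info in objects.items():
        for sym in info["references"]:
            if defined.get(sym) != obj:
                external_refs.add(sym)
    really_static = sorted(
        fn for fn in defined
        if fn not in external_refs and fn not in EXPORTED
    )
    # Persist the tags so the *next* compiler run can use them.
    with open(db_path, "w") as f:
        json.dump({"really_static": really_static}, f)
    return really_static

# Example project: 'helper' is only called inside a.o, so on the next run
# the compiler could treat it as static and, e.g., inline or clone it freely.
objects = {
    "a.o": {"defines": ["main", "helper"], "references": ["helper", "shared"]},
    "b.o": {"defines": ["shared"], "references": []},
}
print(update_static_db(objects))  # ['helper']
```

This also shows where the second pass comes from: the database is only complete after every object in the project has been seen once.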
I'm an application-type geek who spends a lot of time on the "Bleeding
Edge". This has made me very sensitive to system-level issues. Mapping
application logical requirements onto platform computing topologies has
delivered performance levels beyond what many customers/users/associates
believed possible.
If increased optimization can be made available without eliminating or
reducing the benefits of current implementations, then the extra passes it
might take and the addition of optimization databases are a small price.
Your approach may in fact become mandatory for "mainstream" development
with the transition to Merced and other parallel architectures.
I hope the compiler gods hear your musings and take up the challenge. The
scope, quantity and QUALITY of the software I've seen produced by the
GNU/Linux generation is truly awesome. From my perspective this has in part
been due to a more egalitarian ethic leading to a more efficient and
effective marketplace of ideas among participants who have been empowered
to create through their collaboration.
A fundamental expansion or redefinition of how software is compiled could
bring increased recognition and the opportunity to shape the future.
POWER to the WEB or NET or something like that.....
David Ross
Toad Technologies
davidr AT toadtech DOT com
"I'll be good. I will...I will"