From: firstname.lastname@example.org (Preston Briggs)
Organization: Rice University, Houston
References: <1991Aug10.email@example.com> <20167@helios.TAMU.EDU> 91-08-041
Date: 11 Aug 91 21:13:39 GMT
firstname.lastname@example.org (Piercarlo Grandi) writes:
>Many "optimizing" compilers have impossibly expensive optimization
>phases; for example the SPARC and MIPS compilers have a global
>optimization algorithm with a time complexity of O(n^3), and the GNU CC
>compiler has a space complexity of O(n^2). These are well past any
>reasonable cost/benefit ratio... Clearly anything well done compares
>well against these monsters.
I just love "optimizing" compilers. Why not drop the horror-quotes, and
_optimizing_ for that matter, and simply say _compilers_? If you want to talk
about compilers without optimization, say _dumb_ perhaps, or _naive_.
The complexities you mention aren't really unusual. Just doing global data
flow analysis (no optimization, just collecting information) is at least O(n)
bit vector steps, where "n" is the number of basic blocks. Since the length
of the bit vectors and the number of basic blocks are probably proportional to
the procedure length, we're already up to O(n^2) space and time.
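To make the bit-vector arithmetic concrete, here is a minimal sketch of
iterative live-variable analysis, one classic global data flow problem. The
block names, GEN/KILL sets, and variable numbering are invented for
illustration; Python's arbitrary-precision ints stand in for the bit vectors.
Each pass does O(blocks) bit-vector steps, and each step touches O(vars)
bits, which is where the O(n^2) figure above comes from.

```python
# Live-variable analysis: a variable is live at a point if some path from
# that point uses it before redefining it. Bit i of each vector = variable i.

def liveness(blocks, succ, gen, kill):
    """blocks: list of block names in layout order;
    succ: name -> successor names;
    gen/kill: name -> bit vector (int) of variables used/defined."""
    live_in = {b: 0 for b in blocks}
    live_out = {b: 0 for b in blocks}
    changed = True
    while changed:                        # iterate to a fixed point
        changed = False
        for b in reversed(blocks):        # reverse order converges faster
            out = 0
            for s in succ.get(b, ()):     # OUT[b] = union of IN[s]
                out |= live_in[s]
            inn = gen[b] | (out & ~kill[b])   # IN[b] = GEN | (OUT - KILL)
            if inn != live_in[b] or out != live_out[b]:
                live_in[b], live_out[b] = inn, out
                changed = True
    return live_in, live_out

# Hypothetical two-block example: variables a=bit0, b=bit1, c=bit2.
gen  = {"B1": 0b001, "B2": 0b010}   # B1 uses a; B2 uses b
kill = {"B1": 0b010, "B2": 0b100}   # B1 defines b; B2 defines c
live_in, live_out = liveness(["B1", "B2"], {"B1": ["B2"]}, gen, kill)
# a is live entering B1; b is live across the B1 -> B2 edge.
```

The dense representation is the point: a whole set-union is one machine
word-parallel OR, which is why real compilers get through this so quickly.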
Of course, bit vectors are pretty dense and working over basic blocks is
pretty quick, so most compilers are able to accomplish data flow analysis very
quickly, notwithstanding the time complexity.
It's perhaps harder to discuss the cost/benefit ratio. I will note that I
spend much more time writing and using programs than compiling them.
>If you want to read something really interesting on the "small is
>beautiful" technology front, do a literature search for TWIG, the
>Princeton code generator tool, and its successors.
^^^^^^^^^ Bell Labs
Aho, Ganapathi, and Tjiang, "Code Generation Using Tree Matching and
Dynamic Programming," ACM TOPLAS, October 1989.
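The idea behind twig can be sketched in a few lines: rules map tree patterns
to nonterminals with costs, and bottom-up dynamic programming picks the
cheapest cover of every subtree. The rules, nonterminals, and costs below
are invented toy examples, not twig's actual specification language.

```python
# Toy tree-matching instruction selector: for each node, compute the
# cheapest way to derive each nonterminal ("reg", "imm"), reusing the
# children's tables -- the dynamic programming step of the TOPLAS paper.

class Node:
    def __init__(self, op, *kids):
        self.op, self.kids = op, kids

def select(node):
    """Return {nonterminal: (cost, rule_name)} of cheapest derivations."""
    kids = [select(k) for k in node.kids]
    best = {}
    def record(nt, cost, rule):
        if cost < best.get(nt, (float("inf"), None))[0]:
            best[nt] = (cost, rule)
    if node.op == "CONST":
        record("reg", 1, "loadi")        # load immediate into a register
        record("imm", 0, "const")        # or use it directly as an operand
    elif node.op == "ADD":
        left, right = kids
        if "reg" in left and "reg" in right:   # ADD(reg, reg)
            record("reg", left["reg"][0] + right["reg"][0] + 1, "add")
        if "reg" in left and "imm" in right:   # ADD(reg, imm): add-immediate
            record("reg", left["reg"][0] + right["imm"][0] + 1, "addi")
    return best

# (a + b) + c with constant leaves: two add-immediates beat three loads.
tree = Node("ADD", Node("ADD", Node("CONST"), Node("CONST")), Node("CONST"))
cost, rule = select(tree)["reg"]
```

One linear pass over the tree, small constant work per node: exactly the
"small is beautiful" property being praised above.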