Questions about partial redundancy elimination  baxter@Austin.slcs.slb.com (1993-04-30)
Re: Questions about partial redundancy elimination  email@example.com (1993-05-01)
Re: Questions about partial redundancy elimination  firstname.lastname@example.org (1993-05-03)
Re: Questions about partial redundancy elimination  email@example.com (1993-05-04)
Re: Questions about partial redundancy elimination  firstname.lastname@example.org (1993-05-05)
Questions about partial redundancy elimination  email@example.com (1993-05-17)
Re: Questions about partial redundancy elimination  firstname.lastname@example.org (1993-05-18)
From: email@example.com (Max Hailperin)
Organization: Gustavus Adolphus College, St. Peter, MN
Date: Tue, 4 May 1993 17:46:50 GMT
baxter@Austin.slcs.slb.com (Ira Baxter) writes: ...
>The paper contains the remark: "It is assumed that commands computing
>expressions deliver their results to pseudo-registers, and that all
>computations of the same expression always use the same pseudo-register as
>the result register."...
firstname.lastname@example.org (Preston Briggs) writes:
Here are our rules for intermediate code generation to support PRE:
Whenever generating an expression (say r10 + r11), see if that
expression has been generated before. If so, make the new
expression assign to the same result register as the old one.
Thus, whenever we see r10 + r11, it will always be assigned to
the same register. This is easily accomplished by maintaining
a hash table during code generation. ...
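The hash-table scheme Briggs describes can be sketched as follows. This is a minimal illustration, not the actual Rice (or PL.8) implementation; the class and method names are invented:

```python
# Sketch of the rule: whenever an expression (op, r1, r2) is generated,
# look it up in a hash table; a repeat of the same expression is assigned
# to the same result pseudo-register as the first occurrence.

class CodeGen:
    def __init__(self):
        self.next_reg = 0     # pseudo-registers are assumed unlimited
        self.expr_reg = {}    # (op, r_left, r_right) -> result register
        self.code = []        # emitted three-address instructions

    def new_reg(self):
        r = self.next_reg
        self.next_reg += 1
        return r

    def gen_expr(self, op, r1, r2):
        key = (op, r1, r2)
        if key not in self.expr_reg:       # first time we see r1 op r2
            self.expr_reg[key] = self.new_reg()
        r = self.expr_reg[key]             # same expression, same register
        self.code.append((r, op, r1, r2))  # emit: r = r1 op r2
        return r
```

So if r10 + r11 is generated twice, both occurrences assign to the same target register, which is exactly the property the PRE paper assumes.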
Right. The earliest use I know of for this idea is the PL.8 compiler.
(This compiler is described in [1].)
Back to PRE for a brief parting shot. The recent papers that try
to minimize register pressure by placing common subexpressions as
late in the code as possible don't work. ... [by computing
earlier] We lengthen the lifetime of [the result], but we shorten
the lifetimes of [the operands], winning in the balance. Of
course, if [the operands] have other uses, then my version may be a
little worse than theirs. How to choose between them? Certainly
an NP-hard problem. I don't believe anyone has attacked it. ...
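The tradeoff being argued here can be made concrete with a toy straight-line example (the instruction numbers and register names below are invented for illustration; each pseudo-register is taken to be live from its definition to its last use):

```python
# Compare live-range lengths for two placements of t = a + b, where a is
# defined at instruction 0, b at 1, and t's only use is at instruction 9.

def live_lengths(defs, last_uses):
    """Length of each register's live range: definition to last use."""
    return {r: last_uses[r] - defs[r] for r in defs}

# Late placement: t = a + b computed at 8, just before the use of t at 9.
# a and b must stay live all the way to instruction 8.
late = live_lengths({'a': 0, 'b': 1, 't': 8}, {'a': 8, 'b': 8, 't': 9})

# Early placement: t = a + b computed at 2.  t is live from 2 to 9, but
# (assuming no other uses) a and b die at 2.
early = live_lengths({'a': 0, 'b': 1, 't': 2}, {'a': 2, 'b': 2, 't': 9})
```

Here the early placement lengthens t's lifetime but shortens a's and b's by more, "winning in the balance" as claimed; if a and b had other later uses, the shortening would not happen and the late placement could win, which is the choice said to be hard.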
This roughly echoes the comments of Rosen, Wegman, and Zadeck [2]:
"When we move a computation upwards we may shorten the live range of
its operands, but we may lengthen the live range of its result. We
have no statistical information on whether these changes in live
ranges are generally helpful or harmful. A topic for future
research is to find an algorithm which moves computations in order
to aid register allocation by decreasing the live ranges."
I'm not sure about the NP-hardness if you try doing it simultaneously for
all expressions and/or pick a tough optimality standard, but I think I see
how to do a good heuristic job in a rank-ordered [2], slotwise [3]
algorithm. (I had just started work on exactly this when this posting
appeared.)
Speaking of rank ordering and slotwise: you never did explain the
following (which looks like a start on rank ordering):
Make the result of an expression have a higher register number
than its operands (so, we get r12 = r10 + r11, never r1 = r2 + r3).
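As an aside, the quoted property falls out naturally if pseudo-registers come from a counter that only counts up and an expression is numbered only after its operands have been. A hypothetical sketch (none of these names are from the post):

```python
# If expressions are numbered bottom-up from a monotonically increasing
# counter, the result register automatically outranks its operands.

def number_expr(expr, counter):
    """expr is a leaf register name (str) or a tuple (op, left, right).
    counter is a one-element list holding the next free register number.
    Returns the pseudo-register assigned to expr."""
    if isinstance(expr, str):              # leaf: next free register
        reg = counter[0]
        counter[0] += 1
        return reg
    op, left, right = expr
    r_left = number_expr(left, counter)    # operands numbered first...
    r_right = number_expr(right, counter)
    reg = counter[0]                       # ...so the result comes out higher
    counter[0] += 1
    assert reg > r_left and reg > r_right
    return reg
```

For example, numbering ('+', 'a', 'b') starting at register 10 gives 'a' r10, 'b' r11, and the sum r12, i.e. the "r12 = r10 + r11, never r1 = r2 + r3" shape.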
[1] M. Auslander and M. Hopkins, An overview of the PL.8 compiler,
pp. 22-31 of the SIGPLAN '82 proceedings (volume 17, number 6 of
SIGPLAN Notices).
[2] Barry K. Rosen, Mark N. Wegman, and F. Kenneth Zadeck, Global
Value Numbers and Redundant Computations, pp. 12-27, POPL '88.
[3] Dhananjay M. Dhamdhere, Barry K. Rosen, and F. Kenneth Zadeck, How
to analyze large programs efficiently and informatively, pp.
212-223 of the SIGPLAN '92 proceedings (volume 27, number 7 of
SIGPLAN Notices).