Java compiler optimizations? firstname.lastname@example.org (suganya) (2001-01-04)
Re: Java compiler optimizations? email@example.com (2001-01-05)
Re: Java compiler optimizations? firstname.lastname@example.org (Zhiyong Wang) (2001-01-09)
Re: Java compiler optimizations? C.vanReeuwijk@twi.tudelft.nl (Kees van Reeuwijk) (2001-01-09)
Re: Java compiler optimizations? email@example.com (Eliot Miranda) (2001-01-09)
Re: Java compiler optimizations? firstname.lastname@example.org (2001-01-11)
Re: Java compiler optimizations? email@example.com (Eliot Miranda) (2001-01-18)
Re: Java compiler optimizations firstname.lastname@example.org (David Chase) (2001-01-19)
Re: Java compiler optimizations email@example.com (Allyn Dimock) (2001-01-26)
From: Eliot Miranda <firstname.lastname@example.org>
Date: 18 Jan 2001 01:07:38 -0500
Organization: SBC Internet Services
References: 01-01-012 01-01-018 01-01-050 01-01-066
Posted-Date: 18 Jan 2001 01:07:38 EST
"Ian L. Kaplan" wrote:
> >"Ian L. Kaplan" wrote:
> >> For a given class of compiler, I suspect that there is no difference
> >> between Java optimization and C++ optimization, for example.
> Eliot Miranda <email@example.com> wrote, in reply:
> >IMO, this is very far from the truth. But it is not the source to
> >bytecode compiler that typically does "exotic" optimizations.
> >Instead, there are Just-In-Time Java, Self and Smalltalk bytecode to
> >native code "compilers" that perform optimizations not seen in static
> >C++ compilers. These are virtual machines that include optimizing
> >compilers used at run-time. The basic strategy is for the JIT to
> >quickly generate unoptimized code that includes profiling facilities.
> >Based on profiles the JIT then aggressively optimizes code that is
> >found to be long running via profiling. Visit the Self compiler pages
> >at Sun (http://www.sun.com/research/self/). See proceedings of
> >SIGPLAN PLDI over the past 5 years, especially on Java virtual
> >machines. See the Proceedings of the ACM SIGPLAN Workshop on Dynamic
> >and Adaptive Compilation and Optimization (Dynamo '00). January 18,
> >2000, Boston, Massachusetts, a.k.a. SIGPLAN vol. 35, no 7.
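The profile-then-recompile strategy described above can be caricatured in a few lines of Java. This is only a hedged sketch, not any real VM's code: the names (TieredJitSketch, HOT_THRESHOLD, baselineSquare) and the counter threshold are all invented for illustration.

```java
// Sketch of counter-based adaptive compilation: run cheap, self-profiling
// baseline code first, then swap in an "optimized" version once the
// method proves hot. All names and thresholds here are illustrative.
import java.util.HashMap;
import java.util.Map;
import java.util.function.IntUnaryOperator;

public class TieredJitSketch {
    static final int HOT_THRESHOLD = 10_000;
    static final Map<String, Integer> counts = new HashMap<>();
    static boolean optimized = false;

    // Baseline code: quick to generate, slow to run, profiles itself.
    static int baselineSquare(int x) {
        counts.merge("square", 1, Integer::sum);
        return x * x;
    }

    static IntUnaryOperator currentSquare = TieredJitSketch::baselineSquare;

    // Call-site dispatcher: rebinds to "optimized" code once the method is hot.
    static int square(int x) {
        int r = currentSquare.applyAsInt(x);
        if (!optimized && counts.getOrDefault("square", 0) >= HOT_THRESHOLD) {
            optimized = true;
            // A real JIT would now run its optimizing compiler, guided by
            // the collected profile; a plain lambda stands in for the
            // freshly compiled native code.
            currentSquare = y -> y * y;
        }
        return r;
    }

    public static void main(String[] args) {
        long sum = 0;
        for (int i = 0; i < 20_000; i++) sum += square(i % 100);
        System.out.println(sum + " (optimized: " + optimized + ")");
    }
}
```

In a real VM the rebinding happens by patching compiled call sites, not by swapping a field, but the control structure is the same.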
> The point another poster made about optimizing range checks and null
> pointer references is a valid difference between C++ optimization
> and Java optimization. The issue of optimization in the presence of
> threads is another important difference and a big challenge in
> Java compilation.
> But trace or profile driven optimization is hardly new. In fact
> this is an active area for IA64 optimization as well. Many of the
> techniques that were used on systems like the Multiflow
> supercomputer (almost twenty years ago) are now being applied in IA64
I didn't claim that it was. APL compilers have been doing these
things for longer than 20 years. I think APL dynamic compilation and
optimization is at least 30 years old. And there are many, many more
examples, from graphics to COBOL to operating systems.
> JIT optimization is probably different from normal compiler
> optimization since the JIT has to be designed with performance in
> mind. Some algorithms may be considered too time consuming. I
> think that you'll find that the "aggressive optimization" that JIT
> native compilers apply is classical optimization.
Well, one standard dynamic optimization in Self, Smalltalk, Cecil and
Java JITs is essentially inlining of polymorphic method invocation.
Type inference is done by in-line caches (both single and polymorphic
inline caches) that collect type information as a side-effect of
optimizing method lookup. See the Self papers at
http://www.sun.com/research/self/.
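A monomorphic inline cache of the kind those papers describe can be sketched at the source level in Java. This is purely illustrative (real VMs patch machine-code call sites; the Shape hierarchy and CallSite type here are invented for the example):

```java
// Illustrative sketch -- not any real VM's internals -- of a monomorphic
// inline cache: a call site remembers the class of the last receiver and
// reuses the cached lookup result, falling back to a full lookup on a
// mismatch. The cache doubles as the type profile an optimizer can read.
import java.util.function.ToDoubleFunction;

public class InlineCacheSketch {
    interface Shape { double area(); }
    static class Circle implements Shape {
        final double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }
    static class Box implements Shape {
        final double s;
        Box(double s) { this.s = s; }
        public double area() { return s * s; }
    }

    // One cache per call site, as in the Self and Smalltalk VMs cited above.
    static class CallSite {
        Class<?> cachedClass;            // last receiver class seen here
        ToDoubleFunction<Shape> target;  // cached "fast path"
        int misses;                      // profile datum for the optimizer

        double invoke(Shape receiver) {
            if (receiver.getClass() != cachedClass) {
                misses++;                // slow path: full method lookup
                cachedClass = receiver.getClass();
                target = Shape::area;    // rebind the cache to the result
            }
            return target.applyAsDouble(receiver);  // fast path
        }
    }

    public static void main(String[] args) {
        CallSite site = new CallSite();
        site.invoke(new Box(2));     // miss: first use of this call site
        site.invoke(new Box(3));     // hit: same class, cached target reused
        site.invoke(new Circle(1));  // miss: site becomes polymorphic
        System.out.println("misses: " + site.misses);
    }
}
```

A polymorphic inline cache generalizes this by keeping a small table of (class, target) pairs instead of a single entry.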
> I consider the applications that can benefit from JIT compilation a
> fraction of those that could benefit from Java. Not all
> applications are long running servers.
Adaptive optimizing JIT compilation is used in scripting and
interactive applications, not just long-running servers. As such JIT
has very broad applicability. The essential difference that a JIT
provides is the ability to do dynamic optimization. You can boil this
down to parameter-based strength reduction, where a function is
applied repeatedly to the same variable (e.g. method lookup applied to
an instance of a class, or a boolean function derived from a
combination rule in bitblt/rasterop). If the function is applied to
the variable sufficiently often, then one can improve performance by
compiling a special version of the code sequence which assumes the
variable is in fact a constant.
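That parameter-based strength reduction can be illustrated with a hedged Java sketch built around a rasterop-style combination rule. The rule numbering and method names are invented, and a real JIT would emit specialized machine code rather than return a closure:

```java
// Sketch of runtime specialization on a "held constant" parameter: the
// generic routine dispatches on the combination rule at every pixel,
// while the specialized version folds the dispatch away once the rule
// is known. Rule numbers and names are illustrative only.
import java.util.function.IntBinaryOperator;

public class SpecializeSketch {
    // Generic interpreter: re-examines the rule on every application.
    static int combineGeneric(int rule, int src, int dst) {
        switch (rule) {
            case 0: return src;        // copy
            case 1: return src | dst;  // or
            case 2: return src & dst;  // and
            case 3: return src ^ dst;  // xor
            default: throw new IllegalArgumentException("rule " + rule);
        }
    }

    // "Specializer": once a rule is known to be hot, return code with the
    // dispatch constant-folded -- the runtime analogue of treating the
    // repeated parameter as a compile-time constant.
    static IntBinaryOperator specialize(int rule) {
        switch (rule) {
            case 0: return (s, d) -> s;
            case 1: return (s, d) -> s | d;
            case 2: return (s, d) -> s & d;
            case 3: return (s, d) -> s ^ d;
            default: throw new IllegalArgumentException("rule " + rule);
        }
    }

    public static void main(String[] args) {
        IntBinaryOperator xor = specialize(3);  // rule fixed once, up front
        int acc = 0;
        for (int p = 0; p < 8; p++) acc = xor.applyAsInt(acc, p);
        System.out.println(acc);  // xor-fold of 0..7, prints 0
    }
}
```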
Since static optimizations can still be applied in a JIT context, it's
reasonable to assume that JIT optimization is more widely applicable.
That this is not the case currently merely reflects the immaturity of
the field, its lack of familiarity (not much taught, etc) and lack of
good architectural support (cheap user-level icache flushing, etc).
> The issue of JIT compilation
> to native code is a side issue in this discussion. This is not what
> I understood the topic to be. The issue that I was trying to
> address was Java optimization, especially in the context of static
> compilation.
It's not clear that the original poster had a focus on static
optimization. Java, unlike C++, is a dynamic language.
> Optimization in the presence of exceptions (e.g., how should control
> flow graphs be built in the presence of exceptions) is an important
> issue shared by both Java and C++. This, and the issue mentioned
> above with threads, is a complex problem. Unlike classical
> optimization which is well covered in books like Robert Morgan's
> "Building an Optimizing Compiler", optimization in the presence of
> exceptions and threads is harder to find material on. Most compiler
> groups seem to solve these problems anew every time. Since they
> are working on commercial products, there is less material in print.
Agreed. Recent PLDIs have good papers on Java exceptions,
synchronized methods and thread-safety. But as I recall these are all
in JIT contexts.
> Java is a complex language compared to C, but it is far simpler than
> C++. I have been a heavy user of C++ for many years and I still
> find the complexity of the language staggering. The compiler
> support needed for C++'s object model and features like operator
> overloading is subtle. Then there are templates, etc...
> Java provides an interesting model for modern optimization since it
> is simpler than C++ but still preserves much of C++'s power.
The dynamic facilities of Java (class loading) and its reflective
facilities put it in a different class, alongside dynamic languages
like Smalltalk. There, JITs provide a sound basis for optimization, since
JITs can handle changes in the code while a program is running. This
is something C++ doesn't have to address. For me a dynamic language
is more powerful than a static language.
Eliot Miranda Smalltalk - Scene not herd