Re: Thompson's Plan 9 C compiler
Organization: Computer Science & Engineering, U. of Washington, Seattle
Date: Tue, 13 Aug 91 18:02:09 -0700
Excuse me, I'm going to step away from my normally pseudo-scientific approach
and get up on my bandwagon, made of a carefully constructed soap box.
>firstname.lastname@example.org (Byron Rakitzis) writes:
>>[current Unix vendors' compilers are a crock.]
>[Optimizing compilers don't make as much improvement on well-written
> code as on student first attempts; common subexpressions have been
> done in the source, invariants are out of loops, ...]
Right, I gave up using macros and adt's, too. But what about
everybody else who thinks they're still a good idea ?-)
>[Current compilers serve the needs of the programmer and of the
> employer who buys the compiler for the buyer's code.]
On the one hand, there is the argument that current optimizers are lousy; on
the other, the argument that good code will be manually ``flattened'' where
necessary to get good execution speed, so the optimizers are good enough.
I will make a different argument: the case for optimization is overstated.
Sure, number crunchers need every bit, er, MFLOP they can get. But they're
not the entire market. We Joe coders want programs that work the first time;
a factor of 2 is nice, but not essential.
``A program that produces incorrect results
twice as fast is infinitely slower.'' -- John Ousterhout
I can't speak for the average programmer, but I can speak for myself: I am
willing to give up substantial speed of execution (a factor of 2, to pick a
number), even in the final product, in order to gain productivity. I'm not
worrying about compiler bugs; I'm saying that things which are inherently
inefficient are often OK with me if I can get my program up and running --
correctly -- in a decent amount of time.
It can even be argued (in fact I think I'll do it here:) that the better the
programmer's productivity, the more brain capacity they have left over to
worry about optimization.
Soap box off.
Speaking of Thompson's compilers, check out:
%A Ken Thompson
%T Regular Expression Search Algorithm
%J Communications of the Association for Computing Machinery (CACM)
%D June 1968
%X Describes a regular expression search generator that generates IBM
7094 assembly code.
(pg 419): ``The compiling phase of the implementation does not
detract from the overall speed since any search routine must translate
the input regular expression into some sort of machine accessible [...]''
* Includes the whole compiler.
* Described via examples in 7094 assy (yuck!).
* Uses fine-grained code generation.
Runtime generates a threaded interpreter for the regular expression
dfa. Generated once per dfa. Thus, it is static over a given
dfa/search. Keeps a state list with one state for each pending
(partial) match; never backtracks.
The list of pending matches is the threading of the interpreter.
Rather than keep a list of code addresses, the thread is actually
machine instructions. Thus, the threaded interpreter also does RTCG!
;-D on ( Thompson's .plan compiler ) Pardo