Re: parser performance, was Popularity of compiler tools, was LRgen

Related articles
Re: Seeking recommendations for a Visual Parser to replace Visual Pars mjfs1@cam.ac.uk (Marcel Satchell) (2008-03-28)
Re: LRgen, was Seeking recommendations for a Visual Parser to replace paul@paulbmann.com (Paul B Mann) (2008-03-31)
Popularity of compiler tools, was LRgen anton@mips.complang.tuwien.ac.at (2008-04-06)
Re: Popularity of compiler tools, was LRgen wclodius@los-alamos.net (2008-04-11)
Re: parser performance, was Popularity of compiler tools, was LRgen ian@airs.com (Ian Lance Taylor) (2008-04-12)
Re: parser performance, was Popularity of compiler tools, was LRgen derek@knosof.co.uk (Derek M. Jones) (2008-04-12)

From: Ian Lance Taylor <ian@airs.com>
Newsgroups: comp.compilers
Date: 12 Apr 2008 08:28:56 -0700
Organization: Compilers Central
References: 08-03-107 08-03-119 08-04-024 08-04-046
Keywords: parse
Posted-Date: 12 Apr 2008 11:51:17 EDT

> [My understanding is that GCC switched to a hand-written parser
> because of the difficulty of parsing the awful C++ grammar with
> anything other than hand-written hacks. The new parser may be a
> little faster but that wasn't a big issue, since parse time is never a
> bottleneck in a compiler. -John]
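
To make the note's point concrete (an illustrative aside, with made-up
names): in C++ the parser cannot tell a declaration from an expression
without knowing which names are types, which is hard to express in a
conventional LALR grammar and is one reason a hand-written parser with
tentative parsing is attractive. For example:

    // Whether "x * y;" is a declaration or an expression depends on
    // what the first name refers to -- the token shape alone is ambiguous.
    struct a {};                 // "a" names a type here, so ...
    void f() { a * b; }          // ... this declares b as a pointer to a

    int c = 1, d = 2;
    void g() { c * d; }          // same shape, but a multiplication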


I want to disagree with our esteemed moderator a little bit. Parsing
time is not a bottleneck when optimizing. But compiler speed matters
more when not optimizing, and in that case the parser can indeed be a
bottleneck: when compiling C++ with gcc and a lot of header files,
parsing can account for up to 50% of the total compilation time.
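
(One way to see this breakdown for a particular file is gcc's
-ftime-report option, which prints the time spent in each compiler
phase, including the parser; the file name below is just a placeholder:

    g++ -c -ftime-report some_big_header_heavy_file.cc

The exact labels in the report vary between gcc releases.)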


Ian
[Are you including tokenizing in the 50%? Lexers often do take a lot
of time, since they have to do something to each character. But once
the lexer has shrunk the input from a stream of characters to a stream
of tokens, the parser rarely takes an appreciable amount of time.
Opinions vary about the relative performance of DFA lexers vs. ad-hoc
hand-written ones, which I think means that the implementation is more
important than the technique. -John]
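
As a rough sketch of why the lexer has to touch every character (a
made-up, minimal example, not gcc's actual lexer):

    // Ad-hoc hand-written scanner for identifiers and numbers.  Every
    // input character is examined at least once, which is why lexing
    // dominates the raw per-character work of a compiler front end.
    #include <cctype>
    #include <string>

    enum Token { TOK_IDENT, TOK_NUMBER, TOK_OTHER, TOK_EOF };

    struct Lexer {
        const char *p;        // current position in a NUL-terminated buffer
        std::string text;     // spelling of the most recent token

        Token next() {
            while (std::isspace((unsigned char)*p)) ++p;   // skip whitespace
            if (*p == '\0') return TOK_EOF;
            text.clear();
            if (std::isalpha((unsigned char)*p) || *p == '_') {
                while (std::isalnum((unsigned char)*p) || *p == '_')
                    text += *p++;                          // one test per character
                return TOK_IDENT;
            }
            if (std::isdigit((unsigned char)*p)) {
                while (std::isdigit((unsigned char)*p))
                    text += *p++;
                return TOK_NUMBER;
            }
            text += *p++;      // punctuation and anything else
            return TOK_OTHER;
        }
    };

A table-driven DFA lexer does the same per-character work through a
state-transition table, so which style wins tends to come down to
buffering and other implementation details rather than the technique
itself, which matches the observation above.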


