
From: Martin Bravenboer <>
Newsgroups: comp.compilers
Date: 28 Nov 2004 23:21:03 -0500
Organization: Dept of Computer Science, Utrecht University, The Netherlands
References: 04-11-089
Keywords: syntax
Posted-Date: 28 Nov 2004 23:21:03 EST

VBDis wrote:
> I would like to hear opinions about the differences
> between formal lexer and parser grammars.

Usually, the difference lies in the expressiveness of the grammars: the
lexical syntax is a set of regular expressions, and the 'parser grammar'
is a context-free grammar over the kinds of tokens defined by the
lexical syntax. This separation is motivated by performance: by
splitting the input into tokens first, the number of elements the parser
has to consider is reduced.
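As a minimal sketch of that traditional split (my own illustration, not
from the post; the token names and the toy grammar are invented), the
lexical syntax is a handful of regular expressions producing tokens, and
the parser is a context-free rule over those tokens, never over raw
characters:

```python
import re

# Lexical syntax: a set of regular expressions, one per token kind.
TOKEN_SPEC = [
    ("NUM",   r"\d+"),
    ("PLUS",  r"\+"),
    ("TIMES", r"\*"),
    ("SKIP",  r"\s+"),   # whitespace, discarded by the scanner
]
TOKEN_RE = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def tokenize(text):
    """Scanner: turn characters into (kind, text) tokens."""
    return [(m.lastgroup, m.group()) for m in TOKEN_RE.finditer(text)
            if m.lastgroup != "SKIP"]

def parse_expr(tokens, i=0):
    """Parser: expr := NUM ((PLUS|TIMES) NUM)*
    A context-free rule over tokens, evaluated left to right
    (no operator precedence, to keep the sketch small)."""
    value = int(tokens[i][1]); i += 1
    while i < len(tokens):
        op, _ = tokens[i]
        n = int(tokens[i + 1][1])
        value = value + n if op == "PLUS" else value * n
        i += 2
    return value

print(parse_expr(tokenize("1 + 2 * 3")))  # 9 (left to right, no precedence)
```

Note that once the scanner has committed to a tokenization, the parser
has no way to revisit it.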

In scannerless parsing there is no separate lexical analysis phase:
every input character is a token. I think this is conceptually even
*cleaner* than using a separate scanner. The problem with a separate
scanner is that it cannot take the context of a token in the input into
account.
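To illustrate that context problem (my own hedged example, not from the
post), consider the well-known `>>` issue in languages with nested
generics such as `List<List<int>>`: a scanner using longest match lexes
`>>` as a single shift-right token because it cannot see that the parser
is inside two open generic brackets, whereas a scannerless,
character-level grammar matches each `>` exactly where an argument list
closes:

```python
import re

# A context-free scanner with longest match: ">>" is tried before ">",
# so the two closing brackets are swallowed as one SHR token.
SCANNER = re.compile(r"(?P<SHR>>>)|(?P<GT>>)|(?P<LT><)|(?P<ID>\w+)")

def scan(text):
    return [m.lastgroup for m in SCANNER.finditer(text)]

print(scan("List<List<int>>"))  # SHR swallows both closing brackets

# A scannerless view: every character is a token, so the grammar itself
# decides where each '>' belongs.
def parse_generic(chars, i=0):
    """generic := ID ('<' generic '>')?  -- matched over single characters.
    Returns the index just past the parsed generic type."""
    while i < len(chars) and chars[i].isalnum():
        i += 1
    if i < len(chars) and chars[i] == "<":
        i = parse_generic(chars, i + 1)
        assert i < len(chars) and chars[i] == ">", "expected '>'"
        i += 1
    return i

# The character-level grammar consumes the whole input, closing both
# generic argument lists with the two '>' characters.
assert parse_generic("List<List<int>>") == len("List<List<int>>")
```

Real compilers for such languages work around this with ad-hoc feedback
between parser and scanner; in a scannerless grammar the problem simply
does not arise.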

VBDis wrote:
> IMO it's not a good idea to
> use the same meta language for both kinds of grammars

Why isn't this a good idea?

> even if it were
> possible to construct such a super language?

It is possible: SDF integrates the definition of the 'lexer grammar'
and the 'parser grammar' in a single formalism. SDF is implemented by
scannerless generalized LR parsing.

Martin Bravenboer
