Re: Yacc parsers: Cleaning up the wreckage!

Markus Armbruster <>
20 Aug 1997 23:52:28 -0400

          From comp.compilers

Related articles
Yacc parsers: Cleaning up the wreckage! (1997-08-16)
Re: Yacc parsers: Cleaning up the wreckage! (Dwight VandenBerghe) (1997-08-19)
Re: Yacc parsers: Cleaning up the wreckage! (Markus Armbruster) (1997-08-20)
Re: Yacc parsers: Cleaning up the wreckage! (1997-08-24)

From: Markus Armbruster <>
Newsgroups: comp.compilers
Date: 20 Aug 1997 23:52:28 -0400
Organization: Some lucky fish from Karlsruhe, Germany
References: 97-08-047
Keywords: parse, yacc, errors

(Martin Harvey) writes:

> However, the parser is bottom up, and as a result of the failure of
> rule e: f then c, f and S will not have been created and the parser
> will abort. This now means that I have a whole load of allocated
> objects on the parser stack, and no syntax tree at all :-(
> How on earth do I go about clearing up?? I suspect I need to comb
> through the parser stack finding all the parts of the syntax tree and
> deallocating them. However, this presents problems for me:
> What methods are there for picking up the pieces in situations like
> this??
> [Yacc isn't very helpful there, is it? I usually use my own malloc()
> wrapper that chains together the space it allocates so I can run down the
> chain and free everything after the parse completes one way or the other.
> -John]
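The moderator's wrapper can be sketched in a few lines of C. This is only an illustration of the idea, not John's actual code; the names `xmalloc`, `free_all`, and `struct chain` are made up here. Each allocation is prefixed with a link field, so everything handed out during the parse can be freed in one sweep afterwards, whether the parse succeeded or aborted.

```c
#include <stdlib.h>

/* Hypothetical chaining malloc() wrapper: every allocation carries a
   hidden link so all of them can be released after the parse. */
struct chain { struct chain *next; };

static struct chain *head = NULL;

void *xmalloc(size_t n)
{
    /* Allocate the link header plus the caller's n bytes in one block. */
    struct chain *c = malloc(sizeof *c + n);
    if (!c)
        return NULL;
    c->next = head;
    head = c;
    return c + 1;   /* user memory starts right after the link */
}

void free_all(void)
{
    /* Walk the chain and free every block, succeeded parse or not. */
    while (head) {
        struct chain *next = head->next;
        free(head);
        head = next;
    }
}
```

In the grammar actions you would call `xmalloc()` instead of `malloc()`, and call `free_all()` once after `yyparse()` returns, so abandoned objects left on the parser stack are reclaimed too.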

Alternatively, have a look at Obstacks from GNU libc or at

    author = {David R. Hanson},
    title = {Fast allocation and deallocation of memory based on object lifetimes},
    journal = {Software---Practice and Experience},
    year = 1990,
    volume = 20,
    number = 1,
    month = Jan,
    pages = {5--12}

The basic idea is to group objects with a common lifetime together and
free them all at once. This simplifies programming and can be much
faster than calling malloc() and free() for every object.
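The lifetime-grouping idea can be sketched as a tiny arena allocator in C. This is a simplified illustration in the spirit of Hanson's paper and GNU Obstacks, not either library's real API; the names `arena`, `arena_alloc`, and `arena_free` are invented here, and the sketch assumes no single object is larger than one block.

```c
#include <stdlib.h>

/* Hypothetical arena: objects are carved out of large blocks, and the
   whole arena is freed in one call when its lifetime ends. */
typedef struct block {
    struct block *next;
    size_t used;
    char data[4096];
} block;

typedef struct { block *blocks; } arena;

void *arena_alloc(arena *a, size_t n)
{
    /* Round n up to pointer alignment. Assumes n <= sizeof b->data. */
    n = (n + sizeof(void *) - 1) & ~(sizeof(void *) - 1);
    block *b = a->blocks;
    if (!b || b->used + n > sizeof b->data) {
        b = malloc(sizeof *b);          /* start a fresh block */
        if (!b)
            return NULL;
        b->used = 0;
        b->next = a->blocks;
        a->blocks = b;
    }
    void *p = b->data + b->used;
    b->used += n;
    return p;
}

void arena_free(arena *a)
{
    /* One sweep frees every object allocated from this arena. */
    block *b = a->blocks;
    while (b) {
        block *next = b->next;
        free(b);
        b = next;
    }
    a->blocks = NULL;
}
```

A parser would allocate all tree nodes from one arena and call `arena_free()` after the parse, so nothing is leaked even when an error aborts the parse mid-tree. Allocation is just a pointer bump in the common case, which is where the speed advantage over per-object malloc()/free() comes from.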


