[QUERY] Incremental dependencies firstname.lastname@example.org (1997-01-09)
Re: [QUERY] Incremental dependencies email@example.com (David L Moore) (1997-01-12)
Re: [QUERY] Incremental dependencies firstname.lastname@example.org (Jan Gray) (1997-01-14)
Re: [QUERY] Incremental dependencies email@example.com (Lassi Tuura) (1997-01-15)

From: Lassi Tuura <firstname.lastname@example.org>
Date: 15 Jan 1997 11:31:02 -0500
Organization: CERN, European Laboratory for Particle Physics
John Lilley <email@example.com> asked about incremental compilation,
to which "Jan Gray" <firstname.lastname@example.org> responds:
> I designed or codesigned several such VC++ features. These include:
> 1. precompiled headers (C/C++7.0)
> 2. program database and incremental debug info update (VC++ 1.0)
> 3. incremental linking (VC++ 2.0)
> 4. incremental recompilation (VC++ 4.0)
> 5. "minimal rebuild" (VC++ 4.0)
The only other compiler I have heard of that implements similar
features is the Silicon Graphics (or is it MIPS?) Delta/C++ compiler
(DCC). It performs an analysis much like the one Jan Gray outlines.
Here is an excerpt from the manual pages:
>> In [the Smart Build/Delta-C++] mode, when the compiler sees that a
>> header file has been modified since it was last pre-compiled, the
>> compiler (as above) generates a new pre-compiled header file. In
>> addition, it also computes the differences between the two versions
>> of this header file, and computes whether any source files
>> dependent on this header file will need to be re-compiled to
>> conform to this change.
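The decision the manual excerpt describes can be sketched roughly as
follows (a minimal C++ sketch with invented names; the excerpt does
not document DCC's actual data structures): compare each header's old
and new interface fingerprints, and mark for recompilation only the
dependents of headers whose interface genuinely changed.

```cpp
#include <map>
#include <set>
#include <string>

// Hypothetical sketch, not DCC code: decide which sources must be
// recompiled after some headers were re-precompiled. A "fingerprint"
// stands in for whatever summary of a header's interface the compiler
// keeps with its precompiled form.
std::set<std::string> sources_to_recompile(
    const std::map<std::string, std::string>& old_fp,  // header -> fingerprint
    const std::map<std::string, std::string>& new_fp,
    const std::map<std::string, std::set<std::string>>& dependents)
{
    std::set<std::string> dirty;
    for (const auto& [header, fingerprint] : new_fp) {
        auto it = old_fp.find(header);
        if (it != old_fp.end() && it->second == fingerprint)
            continue;  // interface unchanged: clients need no recompile
        auto d = dependents.find(header);
        if (d != dependents.end())
            dirty.insert(d->second.begin(), d->second.end());
    }
    return dirty;
}
```

Here a header whose fingerprint is unchanged contributes nothing, so
touching a header without altering its interface dirties no sources.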
In addition, DCC has some magic so that when a class/struct changes,
the dependent sources need not be recompiled; it does a lot more work
at link time to sort things out. If I have figured it out right, the
compiler first scans through the classes and determines whether they
have changed in a way that requires recompiling clients. If not, it
quits right there. Offsets for class member accesses are fixed up at
link time, when the class definition becomes available from the
"implementation" file.
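To see why member-access offsets are the crux here, consider a minimal
illustration (ordinary C++, not DCC code, with invented struct names):
inserting a data member shifts the offsets of the members declared
after it. A conventional compiler bakes those offsets into every
client's object code, forcing a recompile; deferring the offsets to
link time, as described above, avoids that.

```cpp
#include <cstddef>

// Two versions of the same struct: V2 inserts a member before `price`.
struct WidgetV1 { int id;          int price; };
struct WidgetV2 { int id; int tag; int price; };

// True when the new layout moved `price`, i.e. when client code
// compiled against WidgetV1's offset would now read the wrong bytes.
bool price_offset_shifted() {
    return offsetof(WidgetV2, price) > offsetof(WidgetV1, price);
}
```

On any common ABI the inserted `tag` pushes `price` to a larger
offset, which is exactly the value a link-time fixup scheme would
patch instead of recompiling every client.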
I have never used the compiler myself, but a colleague of mine did
(and still does, I believe). According to him the compile times were
short virtually all the time, since the compiler mostly skimmed
through the files -- even if the classes in the headers had changed.
The class implementation is, of course, recompiled.
I am not sure whether the way DCC handles class modification at link
time is entirely standard conforming. I think there were some
extensions, but they may not have been to the language itself, instead
requiring a different way of linking. But then, I could be
misremembering.
> Also, when building incremental *anything*, it is imperative that you
> choose the appropriate granularity of change to handle incrementally.
> For C/C++, I believe that file granularity is too coarse and that
> statement level is too fine grained. Too many systems "in the labs"
> did not scale up in the real world because they were too clever and
> did too much work keeping too much state, trying too hard to avoid
> relatively inexpensive operations such as reparsing a single function
> or class declaration.
I believe DCC chooses the class as that granularity. It sounds like a
reasonable choice to me, assuming that one writes well-structured
classes.
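Class-level granularity can be pictured as fingerprinting each class
definition separately, so that editing one class in a shared header
does not dirty the clients of the other classes in that header. A
small hypothetical sketch (invented names, not taken from DCC or
VC++):

```cpp
#include <cstddef>
#include <functional>
#include <map>
#include <set>
#include <string>

// Map each class name to a hash of its definition text. A real
// compiler would hash a normalized form of the declaration, not the
// raw source, but the idea is the same.
using ClassFp = std::map<std::string, std::size_t>;

ClassFp fingerprint_classes(const std::map<std::string, std::string>& class_bodies) {
    ClassFp fp;
    for (const auto& [name, body] : class_bodies)
        fp[name] = std::hash<std::string>{}(body);
    return fp;
}

// Classes whose fingerprint changed (or that are new) since `before`:
// only sources that use these classes need recompiling.
std::set<std::string> changed_classes(const ClassFp& before, const ClassFp& after) {
    std::set<std::string> changed;
    for (const auto& [name, hash] : after) {
        auto it = before.find(name);
        if (it == before.end() || it->second != hash)
            changed.insert(name);
    }
    return changed;
}
```

With this scheme, a header containing ten classes where only one
changed dirties only that one class's clients, which is the middle
ground between file and statement granularity argued for above.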
Lassi.Tuura@cern.ch There's no sunrise without a night