Re: Machine code parsers (entropy of machine code)


Related articles
Machine code parsers (entropy of machine code) andrey@ix.netcom.com (Andrey I. Savov) (1996-11-24)
Re: Machine code parsers (entropy of machine code) derek@knosof.co.uk (1996-11-26)
Re: Machine code parsers (entropy of machine code) andrey@ix.netcom.com (Andrey I. Savov) (1996-12-01)
Re: Machine code parsers (entropy of machine code) derek@knosof.co.uk (1996-12-03)
Re: Machine code parsers (entropy of machine code) jacob@jacob.remcomp.fr (1996-12-07)

From: jacob@jacob.remcomp.fr (Jacob Navia)
Newsgroups: comp.compilers
Date: 7 Dec 1996 23:10:34 -0500
Organization: Compilers Central
References: 96-11-147 96-11-155 96-12-021
Keywords: code, theory

> Derek M Jones <derek@knosof.co.uk> wrote
>
> > I once did a little experiment. I measured how well gzip compressed
> > executable programs for a variety of machines. There did seem to
> > be some correlation between compression ratios for the same programs,
> > compiled for different CPUs.
>
> > Do you think optimised code will have a higher or lower entropy?
>
> It's hard to say. I'd expect higher than non-optimized, because
> some redundancies are usually removed during optimization.
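
That experiment is easy to redo. A rough sketch in C, assuming zlib is
available (link with -lz); deflate is the same algorithm gzip uses, and
the file name comes from the command line:

    #include <stdio.h>
    #include <stdlib.h>
    #include <zlib.h>

    /* Compress a file in memory and report the ratio. The lower
       the ratio, the more redundancy deflate found in the code. */
    int main(int argc, char **argv)
    {
        FILE *f;
        long n;
        unsigned char *src, *dst;
        uLongf dstlen;

        if (argc != 2 || (f = fopen(argv[1], "rb")) == NULL)
            return 1;
        fseek(f, 0, SEEK_END);
        n = ftell(f);
        rewind(f);
        src = malloc((size_t)n);
        dstlen = compressBound((uLong)n);
        dst = malloc(dstlen);
        if (!src || !dst || fread(src, 1, (size_t)n, f) != (size_t)n)
            return 1;
        fclose(f);
        if (compress(dst, &dstlen, src, (uLong)n) != Z_OK)
            return 1;
        printf("%s: %ld -> %lu bytes (%.1f%% of original)\n",
               argv[1], n, (unsigned long)dstlen, 100.0 * dstlen / n);
        return 0;
    }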


Optimized for what?


If the code was optimized for speed, we would expect the compiler to
perform loop unrolling (in-line loop expansion), for instance, when it
can determine that a loop will be executed at most n times. This would
increase the redundancy of the generated code, since the loop body
would be replicated.
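
A minimal source-level sketch (the function and array names are made
up; the real transformation happens on the intermediate or machine
code):

    /* Before: the trip count is known to be 4. */
    void add4(int a[4], const int b[4], const int c[4])
    {
        int i;
        for (i = 0; i < 4; i++)
            a[i] = b[i] + c[i];
    }

    /* After unrolling: the body is replicated, so the generated
       machine code contains four nearly identical instruction
       sequences, i.e. more redundancy. */
    void add4_unrolled(int a[4], const int b[4], const int c[4])
    {
        a[0] = b[0] + c[0];
        a[1] = b[1] + c[1];
        a[2] = b[2] + c[2];
        a[3] = b[3] + c[3];
    }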


If the code was optimized for space, tail merging would decrease the
size of the code precisely by eliminating repeated instruction
sequences, so the redundancy would be lower.
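
Again a source-level sketch (f and g are hypothetical; compilers do
this on the generated code, where the duplicated tails are identical
instruction sequences):

    extern int f(int), g(int);

    /* Before: both branches end with the same two operations. */
    int before(int x)
    {
        int y;
        if (x > 0) {
            y = f(x);
            y += 1;          /* common tail, copy 1 */
            return y * 2;
        } else {
            y = g(x);
            y += 1;          /* common tail, copy 2 */
            return y * 2;
        }
    }

    /* After tail merging: one shared copy of the tail. The code is
       smaller, but there is one less repeated sequence for a
       compressor to find. */
    int after(int x)
    {
        int y = (x > 0) ? f(x) : g(x);
        y += 1;
        return y * 2;
    }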


This discussion would be more interesting if we defined the terms more
precisely: optimization is not ONE kind of thing. There are many types,
some with completely opposed effects. And, as has already been pointed
out, the redundancy of RISC machine code is lower than that of CISC
machine code.


Besides, "entropy" here is understood as redundancy level /
information density. As our old friend pkzip says when compressing:

freezing 'somefile.txt'

Compressing LOWERS the entropy, because the degree of disorder
(entropy) decreases with compression... So we should say that
optimizing for size decreases the entropy (disorder), and optimizing
for speed could increase it. Right?
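
To make "information density" concrete, here is a zeroth-order sketch
that just counts byte frequencies (compile with -lm). 8.0 bits/byte
would mean no byte-level redundancy at all; note that it ignores the
sequence-level redundancy that gzip and pkzip actually exploit:

    #include <stdio.h>
    #include <math.h>

    /* Zeroth-order byte entropy of a file, in bits per byte. */
    int main(int argc, char **argv)
    {
        unsigned long count[256] = {0}, total = 0;
        int c;
        double h = 0.0;
        FILE *f;

        if (argc != 2 || (f = fopen(argv[1], "rb")) == NULL)
            return 1;
        while ((c = getc(f)) != EOF) {
            count[c]++;
            total++;
        }
        fclose(f);
        for (c = 0; c < 256; c++)
            if (count[c] > 0) {
                double p = (double)count[c] / (double)total;
                h -= p * log(p) / log(2.0);   /* log base 2 */
            }
        printf("%s: %.3f bits/byte\n", argv[1], h);
        return 0;
    }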


P.S. Sorry, but my last lesson in physics was quite a while ago...


--
Jacob Navia Logiciels/Informatique
41 rue Maurice Ravel Tel (1) 48.23.51.44
93430 Villetaneuse Fax (1) 48.23.95.39
France



