Re: User definable operators



From: hrubin@stat.purdue.edu (Herman Rubin)
Newsgroups: comp.compilers
Date: 2 Jan 1997 23:15:26 -0500
Organization: Purdue University Statistics Department
References: 96-12-088 96-12-163 96-12-181 96-12-185
Keywords: design

Wilbur Streett <WStreett@shell.monmouth.com> wrote:
>>>Notation has to be overloaded to be of reasonable length.


>So what is more important is that the notation is of "reasonable"
>length than it follows generally accepted and defined abstractions?


Whose generally accepted and defined abstractions?  The computer
language people showed no reluctance to take quite standard
mathematical symbols, some of them already overloaded in mathematics,
and use them with totally different meanings, so that the mathematical
meanings were made unusable.  I can come up with more than a dozen
such cases.  At least the originators of Fortran apologized for their
restrictions, and for the use of * and **, which were forced on them
by the capabilities of their hardware.
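
A few illustrations, written here in Python only for concreteness (any
of a dozen languages would do); the point is that quite ordinary
mathematical symbols were given unrelated meanings:

    print(3 * 4)        # '*' as multiplication ...
    print("ab" * 3)     # ... and the same '*' as string repetition
    print(2 ^ 3)        # '^' is bitwise exclusive-or (result 1), not a power
    print(2 ** 3)       # exponentiation is spelled '**', as in Fortran
    x = 5               # '=' is assignment, not mathematical equality
    print(x == 5)       # equality has to be spelled '=='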


Also, the mathematical conventions evolved. Anyone could put any
notation in his papers, WITH EXPLANATION, and some of them stuck. As
for his own use, why should anyone care? To introduce them in a
program would require telling the compiler what they mean, and I doubt
that the compiler would care.
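
As a sketch of what "telling the compiler what they mean" amounts to
(Python notation again; the class below is invented purely for
illustration): once the definition is given, the symbol simply behaves
as declared.

    class Fn:
        # wrap a function so that '*' can be declared to mean composition
        def __init__(self, f):
            self.f = f
        def __mul__(self, other):    # this definition says what 'f * g' means
            return Fn(lambda x: self.f(other.f(x)))
        def __call__(self, x):
            return self.f(x)

    double = Fn(lambda x: 2 * x)
    succ = Fn(lambda x: x + 1)
    print((double * succ)(3))        # prints 8, i.e. double(succ(3))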


>Suppose for a minute that I did the same with the English language?
>For the sake of demonstration I decided to change what each of the
>words in the previous sentence mean. Then you have to resort to the
>more extended reference to determine what the previous sentence means,
>because you have to check to be sure if the notation is the expected
>notation or the new extended notation. That means that the notation
>is NOT of reasonable length, but that you have to be sure to
>understand all of the supporting notation (which is not in front of
>you) in order to be able to understand the notation in front of you.


If one starts a problem by saying "let x be ...", etc., this can be
referred to WHEN NEEDED, while the logical structure remains
understood.  The same holds for operators.  But in many cases, these
"new" operators are the standard ones in mathematics, which the user
already knows, or are such simple things as abbreviations for "pack"
and "unpack".
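
For instance, "pack" and "unpack" here might be no more than putting
two small integers into one machine word and taking them out again; a
minimal sketch (Python, with invented names):

    def pack(hi, lo):
        # put two 16-bit values into one 32-bit word
        return ((hi & 0xFFFF) << 16) | (lo & 0xFFFF)

    def unpack(word):
        # recover the two 16-bit halves
        return (word >> 16) & 0xFFFF, word & 0xFFFF

    assert unpack(pack(3, 5)) == (3, 5)

An operator spelling for these would be nothing but an abbreviation
for the same two definitions.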


The compiler has to convert things to machine language anyhow, and the
user should be able to get at that without having to use the assembler
language, which was designed with the idea of making it difficult for
a person to use.  I would have little problem doing that myself.  A
versatile macro expander, with the macro structure being up to the
user, and using weak typing, would go a long way here and elsewhere.
This would be a totally non-optimizing mini-compiler for a language
with few constants.
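
A minimal sketch of such a macro expander, assuming nothing more than
a user-supplied table of operator spellings and expansion templates
(the operators "<+>" and "<^>" below are invented examples):
occurrences of "x OP y" are rewritten textually, and the result is
handed to an ordinary compiler.

    import re

    # user-defined operator spellings and their expansions (hypothetical)
    MACROS = {
        "<+>": "(({0}) | ({1}))",       # say, a "bitwise merge" operator
        "<^>": "max(({0}), ({1}))",     # say, a "maximum" operator
    }

    def expand(line):
        # rewrite every "a OP b" according to the macro table; no
        # optimization, no real type checking -- just textual substitution
        for op, template in MACROS.items():
            pattern = re.compile(r"(\w+)\s*" + re.escape(op) + r"\s*(\w+)")
            while True:
                m = pattern.search(line)
                if m is None:
                    break
                line = (line[:m.start()]
                        + template.format(m.group(1), m.group(2))
                        + line[m.end():])
        return line

    print(expand("z = a <+> b"))    # z = ((a) | (b))
    print(expand("w = p <^> q"))    # w = max((p), (q))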


..................


>> By the time one reads the lengthy variable names which seem to
>> delight the computer people, the structure of the expression is
>> lost.


>So long words confuse you? I don't like them either. But it's not
>the length of the words that make a program structure readable or
>unreadable.


I suspect that most mathematicians would disagree strongly.  We start
teaching the use of short symbolic formulations early in algebra, so
this is not a problem.  We can then look at the structure of the
statements without worrying about the usually irrelevant meaning of
the variables involved.
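
The same statement written both ways makes the point (the names are
invented for illustration):

    coefficient_a, coefficient_b, coefficient_c = 1.0, -3.0, 2.0
    discriminant_of_quadratic_equation = (coefficient_b * coefficient_b
                                          - 4.0 * coefficient_a * coefficient_c)

    a, b, c = 1.0, -3.0, 2.0
    d = b*b - 4.0*a*c       # the algebraic structure is visible at a glance
    assert d == discriminant_of_quadratic_equation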


>> It is necessary to let the user invent notation, if necessary, and
>> for the language and compiler to help, not hinder.


>The user can invent notation in most computer languages. The question
>is not whether or not they can invent notation, but in what fashion
>they will be allowed to invent that notation and what safeguards there
>are in the language design to insure that they are clearly documented
>as being invented notations as opposed to intrinsic ones.


A few languages allow the introduction of new types, but do not allow
them to be called types.  There were older languages which did.  And
introducing functions is not the same as introducing operators.
Fortran compilers did optimizations at compile time for the power
operator which would become far more expensive if attempted at run
time; the compiler is highly branched already, since it has to parse.
Also, functions must be written in prefix form, while operators can be
placed in any order.
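
The power-operator point can be sketched briefly: when the exponent is
a small known constant, the compiler can expand the operation into
repeated multiplication, where a general runtime power routine would
be far more expensive.  (The rewrite below is only an illustration in
Python, not Fortran's actual rule.)

    def specialise_power(base_expr, exponent):
        # expand small constant powers inline; otherwise fall back
        # to a general runtime call
        if exponent == 0:
            return "1"
        if 0 < exponent <= 4:
            return "(" + "*".join([base_expr] * exponent) + ")"
        return "pow({0}, {1})".format(base_expr, exponent)

    print(specialise_power("x", 2))     # (x*x)
    print(specialise_power("x", 7))     # pow(x, 7)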
--
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette, IN 47907-1399
hrubin@stat.purdue.edu   Phone: (317)494-6054   FAX: (317)494-0558