keywords and identifiers.. firstname.lastname@example.org (1999-09-11)
Re: keywords and identifiers.. email@example.com (1999-09-16)
Re: keywords and identifiers.. firstname.lastname@example.org (Chris F Clark) (1999-09-16)
Re: keywords and identifiers.. email@example.com (Armel) (1999-09-16)
Re: keywords and identifiers.. firstname.lastname@example.org (Jerry Leichter) (1999-09-20)
Re: keywords and identifiers.. email@example.com (1999-09-24)
Re: keywords and identifiers.. firstname.lastname@example.org (Leif Leonhardy) (1999-09-27)
From: Jerry Leichter <email@example.com>
Date: 20 Sep 1999 12:08:40 -0400
Organization: System Management ARTS
> > Are there languages which allow a keyword to be
> > accepted as an identifier.
> Yes, most of the old (incomprehensible) languages did that (like PL/1
> i think but i'm not sure, i'm not very used to obsolete languages)....
Ah, so modern languages like C are "comprehensible"? Perhaps you should
spend some time looking at the submissions to the Obfuscated C contests.
Everyone loves to give PL/I examples like:
IF IF = THEN THEN THEN = ELSE ELSE ELSE = END END
[Actually IF IF = THEN THEN = ELSE; ELSE ELSE = END; END -John]
(which I think is valid PL/I - it's been years, but it's at least close)
but that simply proves that you *can* write incomprehensible code in
PL/I. So? Show me a language in which you *can't* write
incomprehensible code.
What gets forgotten in all this is the *reason* PL/I didn't reserve its
keywords. The PL/I designers knew they were designing what was, for its
time, a very large language, which was intended to be useful to very
different groups of programmers. History lesson: PL/I and the 360
series were developed at the same time. Originally, PL/I was intended
to be *the* programming language for the 360. Before the 360, IBM and
others in the business had two lines of machines, one for business (for
IBM, the 1400 series), one for scientific work (the 709x). Business
machines often did decimal arithmetic, had instructions for string
manipulation, BCD support, and such, and were programmed in languages
like COBOL. Scientific machines usually used binary arithmetic,
supported floating point, and were programmed in languages like FORTRAN
(US) and Algol (Europe). One fundamental idea in the development of
the 360 was to have one line of machines to serve both markets. PL/I
was similarly intended to replace both COBOL and FORTRAN. One of the great
comments of the era was that a committee of COBOL and FORTRAN experts
sat down to design PL/I - and came up with a language that looked more
like Algol than either.
Since PL/I was large and served disparate markets, it was likely that
most programmers would only be familiar with a subset of the language.
Experience with COBOL - which had a long list of keywords, all reserved
- showed that avoiding all the keywords was a headache. It would be
even more so if you had to avoid keywords that only applied to parts of
the language you didn't know. So the PL/I designers made the deliberate
decision to avoid reserved keywords. Rather than criticizing as
incomprehensible a language you know nothing about, take a look at it
and find places where the language was made ugly or artificial to avoid
reserving keywords. I'll bet you have a lot of trouble doing so.
Lest you think that long lists of reserved keywords are a feature only
of obsolete languages, I suggest you look at the list for C++. I
believe C++ reserves more keywords than PL/I had - and perhaps as many
as COBOL at its worst did. (Certainly, modern C++ is a larger language
than PL/I ever was.)
Over the years, another cost of reserved keywords has become apparent:
It becomes difficult to extend the language. You're faced with an
unhappy choice: add a new readable keyword, thus breaking existing
correct code; add an unreadable keyword that's unlikely to conflict
with existing code; or reuse some existing keyword for a new purpose.
There's often a bias toward the last of these - and you can see the
results in languages like C and C++, where keywords like static and
virtual have a variety of only very loosely related meanings.
I suspect the main reason people have generally reserved keywords is
two-fold: Language "style" has become fixed, and people want to use
lexers and parser generators. Language "style" is hard to pin down, but
it's clear that, whatever C "style" is, it's been a big winner - C and
Java both have C "style". However, C "style" isn't compatible with non-
reserved keywords; the language becomes ambiguous. (The reason that
IF IF = THEN THEN IF = THEN ...
isn't ambiguous in PL/I is that what follows the IF in an IF statement
is an expression; what follows the THEN is a statement; and an
assignment is a *statement*, not an expression.)
Making languages without reserved keywords work with compiler generation
tools - the most popular of which were designed by people who wanted to
build C compilers - takes additional effort: Grammars become larger,
more complex, and much less natural. Besides, even if you're willing to
make the changes, *how* to make them isn't well-known - just try to find
a discussion in any of the standard texts. Most people simply dismiss
non-reserved keywords as impossible.
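One workaround that does exist in yacc-style tools - the technique
several SQL grammars use - is to fold keyword tokens back into the
identifier nonterminal, so the grammar rather than the lexer decides.
A hypothetical sketch (token and rule names invented), which also
shows why such grammars grow larger:

```
/* Each keyword added back as an identifier is another production
   to write - and another chance for shift/reduce conflicts. */
%token WORD IF THEN

%%

stmt  : IF expr THEN stmt        /* if-statement */
      | ident '=' expr ';'       /* assignment */
      ;

ident : WORD
      | IF                       /* keywords usable as names again */
      | THEN
      ;

expr  : ident
      | ident '=' ident          /* '=' doubles as comparison */
      ;
```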
[The reserved word problem in Cobol is so severe that many programmers
only use identifiers that start with a digit or contain a hyphen, because
the Cobol standard promises no reserved word will. It makes for
programs that are ugly even by Cobol's standards. -John]