Re: Have we reached the asymptotic plateau of innovation in programming languages

glen herrmannsfeldt <gah@ugcs.caltech.edu>
Mon, 11 Jun 2012 22:13:23 +0000 (UTC)

          From comp.compilers


From: glen herrmannsfeldt <gah@ugcs.caltech.edu>
Newsgroups: comp.compilers
Date: Mon, 11 Jun 2012 22:13:23 +0000 (UTC)
Organization: Aioe.org NNTP Server
References: 12-03-012 12-03-014 12-06-008 12-06-032
Keywords: design, i18n
Posted-Date: 12 Jun 2012 00:51:11 EDT

Torben Ægidius Mogensen <torbenm@diku.dk> wrote:


(snip, someone wrote)
>>>>Personally, I'd say there's been precious little new in programming
>>>>languages since Simula gave us OOP in the late 1960s.


> I wouldn't say so. Advanced type systems (bounded polymorphism and
> linear types, to name a few) have entered the picture since.
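
(For readers who haven't met the term: bounded polymorphism is roughly
what Java generics call bounded type parameters, a type variable that
is constrained to a family of types. A minimal sketch, mine rather
than Torben's, with an illustrative max helper of my own naming:

    import java.util.List;

    class Bounded {
        // T may be any type, but only one that implements
        // Comparable<T>, so compareTo is guaranteed to exist.
        static <T extends Comparable<T>> T max(List<T> xs) {
            T best = xs.get(0);          // assumes a non-empty list
            for (T x : xs) {
                if (x.compareTo(best) > 0) best = x;
            }
            return best;
        }
    }

Linear types, values that must be consumed exactly once, have no
direct counterpart in Java.)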


(snip about ASCII)


> As John mentioned, APL has been around for ages and used a lot of
> non-ASCII symbols. Algol was originally designed to use several
> non-ASCII symbols that could be encoded in different ways depending on
> the local symbol set. ASCII was by no means a standard then --
> FIELDATA and EBCDIC were common alternatives, so the choice was either
> to limit the language to use the common subset (which was rather
> small) or to use an ideal set of symbols and allow these to be
> encoded.


I thought ALGOL was older than both ASCII and EBCDIC.


EBCDIC, and its punched card coding, came with S/360 and the 029
keypunch. Before that, IBM had BCDIC (a six bit code) and the 026.


I (just barely) remember multipunching the codes needed for B5500
ALGOL on the 026. They put big charts on the wall (so you could
read them from across the room) showing the multipunch codes.


Was going from six-bit codes to seven-bit ASCII a great awakening,
or was it a big mistake not to go directly to eight bits?


> ASCII certainly has the advantage of being easy to type using a
> standard keyboard, but with touch screens it is now not so difficult
> to have soft keyboard with various extensions.


It is certainly convenient when you have a keyboard coded for ASCII
characters!


There is the favorite quote, though I forget where it came from,
"The nice thing about standards is that we have so many to choose from."


> But if I were to go outside ASCII for a programming language, I
> would also use extended layout: subscripts, superscripts and more.
> Using a subset of HTML for program layout would work fine: Programs
> can be displayed in any browser and you can use HTML editors (and
> even ASCII editors) to edit programs if you don't have access to a
> dedicated IDE.


Java went to full Unicode, with \u escapes so you can enter the codes
and still edit in an ASCII editor. I don't know whether there are
Unicode editors yet, though.
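
(A tiny illustration of what that looks like; the class and variable
names are my own, not anything from the thread. Java translates \u
escapes before tokenizing, so they work even inside identifiers:

    public class UnicodeEscapeDemo {
        public static void main(String[] args) {
            // \u03C0 is the escape for the Greek letter pi; the
            // compiler rewrites it before lexing, so it can name a
            // variable as well as appear in a string literal.
            double \u03C0 = 3.14159265358979;
            System.out.println("pi = " + \u03C0);
        }
    }

An ASCII-only editor sees nothing but the escapes; a Unicode-aware
editor could show the same source with the real characters.)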


-- glen

