Re: Have we reached the asymptotic plateau of innovation in programming languages?



From: BGB <cr88192@hotmail.com>
Newsgroups: comp.compilers
Date: Tue, 13 Mar 2012 02:27:55 -0700
Organization: albasani.net
References: 12-03-012 12-03-014 12-03-022 12-03-027
Keywords: design, history
Posted-Date: 14 Mar 2012 00:28:32 EDT

On 3/12/2012 12:42 AM, glen herrmannsfeldt wrote:
> BGB<cr88192@hotmail.com> wrote:
>
> (snip)
>
>> but, I think the issue mostly is that both "innovation" and "pure
>> research" are often over-rated, and what is needed at this point may not
>> be the creation of fundamentally new (or even entirely consistent)
>> languages, but rather refinement, integration, and adaptation to new
>> domains.
>
> It seems to me that this is a big reason why we have the different
> languages that we do, and why we will never converge onto only one.
>
> Different needs are better met, in some cases, with different ways of
> expressing those needs.


granted, at least in the near term.


it does seem, however, that on average, languages are becoming gradually
more general-purpose, and the distance between "different" languages
seems to be gradually shrinking.


eventually, there may be some sort of convergence, but I think it is
more likely to be a sort of gradual and natural process, where all the
dominant languages become so close as to be virtually or practically
identical, rather than some sort of more dramatic "one language to rule
them all" event.


in such a scenario, language convergence will have been so widespread
that people may, by that point, have ceased to clarify which language
they are using, since most would have become largely copy-paste
compatible anyways.




>> so, better I think is trying to invest effort in creating "solid"
>> languages which can effectively integrate much of what exists and seems
>> to work well in-general, even at the cost of many of the more
>> academically inclined are liable to make accusations of "blub" at such
>> things (mostly due to things like syntactic and semantic similarity with
>> mainstream languages).
>
> PL/I, the original all-in-one language, is still used, but much less
> often than some others. Among its goals, was to replace Fortran.
>
> Now, with Fortran 2003 and Fortran 2008, a large fraction of the PL/I
> features have been included, and more.


yes, ok.




> (snip)
>> I also tend to see needless minimalism as, well, needless. simpler
>> syntax doesn't mean a simpler or easier to use language, and more so
>> doesn't mean a simpler implementation.
>
> That seems, to me, hard to say. Too many features make a language too
> hard to remember, requiring more reference to documentation while
> programming. But also, as you indicate, needless minimalism doesn't
> help. It can make it harder to do some simple operations.




well, I think it depends somewhat on how much simplicity or complexity
(or minimalism) is in question.


for example, JavaScript is fairly small/simple if compared with C, Java,
or C#.
OTOH, it is fairly large and complex if compared with Scheme or Self.




what about a language which is more complex than JavaScript, maybe
roughly on-par with C or Java, and generally simpler than C++ and C# ?
what about a VM where the bytecode has 100s of unique operations?
...




but, OTOH:
in a language like JavaScript you can type "a+b*c", and get the expected
precedence.




this is different than typing, say (in Scheme):
"(+ a (* b c))"
or (in Self):
"b * c + a" (noting that "a + b * c" will give a different answer).




as I see it, the sort of minimalism where one can't "afford" things
like operator precedence or conventional control-flow mechanisms is
needless minimalism.
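

(FWIW, operator precedence is also cheap to implement; below is a rough
sketch in C of a precedence-climbing evaluator. the names and the
single-digit tokenizing are made up for illustration, not taken from
any real parser.)

    #include <stdio.h>

    static const char *p;   /* cursor into the expression string */

    /* binding power of a binary operator, 0 if not an operator */
    static int prec(char op) {
        switch (op) {
        case '+': case '-': return 1;
        case '*': case '/': return 2;
        default:            return 0;
        }
    }

    static int apply(char op, int l, int r) {
        switch (op) {
        case '+': return l + r;
        case '-': return l - r;
        case '*': return l * r;
        default:  return l / r;
        }
    }

    /* single-digit operands keep the sketch short */
    static int parse_primary(void) { return *p++ - '0'; }

    /* precedence climbing: recurse with a higher minimum precedence
       for the right-hand side, which gives left associativity */
    static int parse_expr(int min_prec) {
        int lhs = parse_primary();
        while (prec(*p) >= min_prec) {
            char op = *p++;
            int rhs = parse_expr(prec(op) + 1);
            lhs = apply(op, lhs, rhs);
        }
        return lhs;
    }

    int main(void) {
        p = "1+2*3";
        printf("1+2*3 = %d\n", parse_expr(1));   /* prints 7, not 9 */
        return 0;
    }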


most real programmers have better things to do than sit around working
around awkward ways of expressing arithmetic, or figuring out how to
accomplish all of their control flow via recursion and if/else (and
heaven help you if you want something like sane file IO, or sockets, or
threads, ...).


(and, it is not necessarily a good sign when things like loops, file IO,
arrays, ... are supported by an implementation as... language
extensions...).




but, many people hate on C and C++ and so on, claiming that languages
"should" have such minimalist syntax and semantics. however, such
minimalist languages have generally failed to gain widespread acceptance.




likewise, although a person can make an interpreter with a small number
of total opcodes, this typically means a program will need a larger
number of them to complete a task, and will thus run slower.




for example, a person could make an interpreter with roughly 3 opcodes
which fairly directly implements lambda calculus... but it will perform
like crap.
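

(to illustrate: below is a bare-bones closure-based evaluator along
these lines, in C, using de Bruijn indices. the representation is my
own guess at a minimal setup, not any particular VM; note how every
application allocates and walks an environment chain, and nothing is
ever freed, which is roughly where the poor performance comes from.)

    #include <stdio.h>
    #include <stdlib.h>

    typedef enum { VAR, LAM, APP } Tag;   /* the three "opcodes" */

    typedef struct Term Term;
    typedef struct Env  Env;
    typedef struct Val  Val;

    struct Term { Tag tag; int idx; Term *a, *b; };  /* de Bruijn terms */
    struct Val  { Term *body; Env *env; };           /* a closure */
    struct Env  { Val *v; Env *next; };

    static Term *mk(Tag t, int i, Term *a, Term *b) {
        Term *n = malloc(sizeof *n);
        n->tag = t; n->idx = i; n->a = a; n->b = b;
        return n;
    }

    static Val *eval(Term *t, Env *e) {
        switch (t->tag) {
        case VAR: {                      /* walk the environment chain */
            int i = t->idx;
            while (i-- > 0) e = e->next;
            return e->v;
        }
        case LAM: {                      /* capture the environment */
            Val *v = malloc(sizeof *v);
            v->body = t->a; v->env = e;
            return v;
        }
        default: {                       /* APP: eval fn, arg, then body */
            Val *f = eval(t->a, e);
            Env *e2 = malloc(sizeof *e2);
            e2->v = eval(t->b, e);
            e2->next = f->env;
            return eval(f->body, e2);
        }
        }
    }

    int main(void) {
        /* ((\x.\y.x) id) id reduces to id */
        Term *id = mk(LAM, 0, mk(VAR, 0, NULL, NULL), NULL);
        Term *k  = mk(LAM, 0, mk(LAM, 0, mk(VAR, 1, NULL, NULL), NULL), NULL);
        Term *t  = mk(APP, 0, mk(APP, 0, k, id), id);
        Val *v = eval(t, NULL);
        printf("%s\n", v->body->tag == VAR ? "got id back" : "oops");
        return 0;
    }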


10 or 15 is a bit better, then one probably at least has "the basics".


with several hundred opcodes, arguably a lot of them are "redundant",
being easily expressible in terms of "simpler" opcodes, but at the same
time, a single opcode can express what would otherwise require a chain
of simpler opcodes.


like, "wow, there is this here 'lpostinc' opcode to load a value from a
variable and store an incremented version of the value back into the
variable". is this opcode justified vs, say: "load x; dup; push 1;
binary add; store x;"? I figure such cases are likely justified (they do
tend to show favorably in a benchmark).
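

(as a concrete toy example, in C: a made-up stack VM where the fused
'lpostinc' does in one dispatch what the five-opcode sequence does. the
opcode names here are invented for illustration and are not BGBScript's
actual bytecode.)

    #include <stdio.h>

    enum { OP_LOAD, OP_DUP, OP_PUSH1, OP_ADD, OP_STORE,
           OP_LPOSTINC, OP_HALT };

    static int run(const unsigned char *pc, int *vars) {
        int stack[16], *sp = stack;      /* sp points past top of stack */
        for (;;) {
            switch (*pc++) {
            case OP_LOAD:  *sp++ = vars[*pc++];    break;
            case OP_DUP:   sp[0] = sp[-1]; sp++;   break;
            case OP_PUSH1: *sp++ = 1;              break;
            case OP_ADD:   sp--; sp[-1] += sp[0];  break;
            case OP_STORE: vars[*pc++] = *--sp;    break;
            case OP_LPOSTINC:            /* load x; x = x + 1; fused */
                *sp++ = vars[*pc]; vars[*pc]++; pc++;
                break;
            case OP_HALT:  return sp[-1];
            }
        }
    }

    int main(void) {
        int vars[4] = { 41, 0, 0, 0 };

        /* "x++" the long way: load x; dup; push 1; add; store x */
        const unsigned char slow[] = { OP_LOAD, 0, OP_DUP, OP_PUSH1,
                                       OP_ADD, OP_STORE, 0, OP_HALT };
        /* and as one fused opcode */
        const unsigned char fast[] = { OP_LPOSTINC, 0, OP_HALT };

        printf("slow: %d (x now %d)\n", run(slow, vars), vars[0]);
        vars[0] = 41;
        printf("fast: %d (x now %d)\n", run(fast, vars), vars[0]);
        return 0;
    }

both yield 41 on the stack and leave x at 42, but the fused form pays
for one dispatch through the interpreter loop instead of five, which is
where the benchmark win comes from.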




OTOH, a person can go too far in the other direction as well.




>> some people also make accusations of "keeping every onion", but as I
>> see it, keeping common syntax and features by no means implies
>> that one slavishly follows every possible rule.
>
> PL/I included many features from Fortran, COBOL, and ALGOL, but
> overall kept a nice, consistent, usage. Very few of what seem to be
> arbitrary restrictions.
>
> Fortran, on the other hand, even as it has evolved has kept many
> restrictions that seem strange.




yep.


in my case, it was more about an ECMAScript-family language
incorporating some features from languages like ActionScript, Java, C#
and C++, among them: packages, classes, properties, pass-by-value
objects (structs and value-classes), ...


however, as far as I can tell, the language hasn't really inherited
their limitations.


for example, the language still supports loading from source, eval, ...
even despite incorporating features from languages which use static
compilation.


the implication seems to be that if one borrows features from a
traditionally statically-compiled language, one will also inherit its
limitations, somehow forcing static compilation and an inability to use
late binding or something, but this isn't really the case. it is as if
people believe there is some sort of mutual-exclusion thing going on,
where for every feature one adds, some other feature has to be
removed?...




likewise, despite adding support for type annotations and classes, both
dynamic types and ex-nihilo objects still continue to work.


and, at the same time, there are value classes, which may have things
like copy-constructors and destructors (though how they work internally
differs somewhat from the C++ analogue).




but, such things provoke negative commentary (value-classes are an
example of such an "onion").


but, I don't think it is too severe of an issue, or even that the
language complexity is necessarily all that bad. granted, the language
makes no claim of being "minimal" either, which is perhaps the real
issue.




> (snip)
>
>> but, many people apparently see a C-family syntax and automatically
>> judge it negatively as a result, whereas I happen to feel that the
>> syntax works fairly well and personally see no "obviously better"
>> solution (either functionally or aesthetically).
>
> Well, reserved words do make it hard to extend a language and
> stay compatible with older programs.




a lot depends on how much of a legacy codebase there is, how likely the
words are to clash, and how "severe" the code-breaking scenarios would be.


in my case, the vast majority of reserved words came from "sibling"
languages, with relatively few "original" reserved words, although there
are some (fun/quote/unquote/... being notable examples). some others,
like "value_class", are unlikely to be used as names.


many borrowed reserved words have similar meanings as in their source
languages, but some others have somewhat different meanings. for
example, the "delegate" modifier does something very different in my
language than it does in C# (C#'s meaning is instead handled by a piece
of syntax known as "typedef function ...", where ironically "typedef"
is borrowed from C but also behaves somewhat differently than it does
in C or C++).


but (in BGBScript):
"typedef function foo(x:int):void;"
does something sort of like (in C#):
"delegate void foo(int x);"
and sort of like (in C):
"typedef void (*foo)(int x);"


which is very likely "good enough", and the subtle differences are
about like asking "in which ways is void a value type or not a value
type?".
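

(for reference, the C form in use; this is plain standard C, nothing
language-specific:)

    #include <stdio.h>

    /* the C analogue from the comparison above: a typedef for a
       pointer to any function taking an int and returning void */
    typedef void (*foo)(int x);

    static void print_twice(int x) { printf("%d %d\n", x, x); }

    int main(void) {
        foo f = print_twice;   /* any void(int) function can be assigned */
        f(42);                 /* call through the typedef'd pointer */
        return 0;
    }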




at the same time, there is a relative lack of legacy code, and even less
that I really have to worry too much about breaking.




longer term, what will be the result of this? well, this will be an
issue for the future to deal with.


