
Archive for December, 2014

Occam, Oberon, Objective-C and OCaml

December 15th, 2014

O is for Occam, Oberon, Objective-C and OCaml.

Occam might be said to have been the assembly language for the Transputer. INMOS, the company that created the Transputer, did not want the world to know what the actual assembly language was, in case people used this information to write compilers, thus making it difficult for INMOS to change the instruction set in future revisions (because they would have to continue supporting the instructions that existing compilers were making use of). Your author knows this because he was at what today would be called a developer launch event (which we had to pay to attend; none of the freebies that are used to woo us today) and during the question-and-answer session asked why developers were restricted to using the INMOS-approved compilers (he was out and about promoting his company’s compiler expertise at the time and also asked “… what should companies who wanted to use a decent compiler do?”).

The Transputer had a great deal of potential and I don’t know how much the work of the software thought police at INMOS nobbled that potential, but they certainly had an impact (we have come up with this world changing approach to how things are done and you must jolly well use it; yes, they really did use phrases like jolly well). A big play was made of the formal proof of correctness of the floating-point unit; when several faults were found in the hardware the thought police were quick to report that the problems were with the specification, not the proof of correctness of the specification (I’m sure this news allayed any concerns that Transputer users might have had).

Oberon created a bit of a problem for those promoting Modula-2 as the successor to Pascal. It was created as the successor of Modula-2 by the person who created Modula-2 as the successor of Pascal (which he also created). Surely nobody would be cruel enough to use the existence of Oberon to mock the Modula-2 true believers 😉

Objective-C has survived within its ecosystem because of Apple’s patronage. A language rising to dominance in an ecosystem because of the patronage of a major vendor is nothing new and in fact might be regarded as the natural state of affairs. Does a vendor need to control the fate of the dominant language in its ecosystem? Maybe not, but the patronage is not that expensive and does provide peace of mind.

OCaml is in many ways the alter ego of Haskell, its fellow widely used (at least in their world) functional language. Haskell’s Wikipedia page looks so different that one could be forgiven for thinking the two languages had nothing in common. The major difference I have noticed is that OCaml gets used by developers creating larger programs that other people are expected to use, while Haskell gets used for the academic stuff, where authors have little interest in the code having an external life.


NATURAL, NewSpeak and NetLogo

December 14th, 2014

N is for NATURAL, NewSpeak and NetLogo.

NATURAL was the most unnatural name for a programming language I had encountered until Processing came along. Perhaps the name had a very different feel for the Germans who created it. I have some job advert counts from the early 1990s showing Natural in the top 20 of most frequently listed languages.

NewSpeak is a language generally associated with a book. The programming language I have heard about (whose memory appears to have almost faded away; it is a distant memory of those working in the lab that created it) is not the language listed on Wikipedia.

NetLogo looks like a fun agent-based simulation language. It has been used for some interesting research and is on my very short list of languages to use that I have not yet used in anger.


Modula-2, ML, M4, MUMPS and Miranda

December 13th, 2014

M is for Modula-2, ML, M4, MUMPS and Miranda.

Modula-2 was seen as Pascal’s successor in sections of the Pascal world; the reasoning was along the lines of: Niklaus Wirth designed Pascal, which was a successful language, and then designed Modula-2 as Pascal’s successor, so it is also going to be a success (brand loyalty makes people believe the strangest things). Some Modula-2 believers had been involved with the standardization of Pascal and knew how to work the system, and an ISO working group was soon formed to create a language standard. I was not a believer and would ask why people were going to the trouble of creating a standard for a language that hardly anybody used; replies were evenly distributed between “developers are waiting for the standard before switching” (which sounds like somebody had done some market research, but I think not) and “developers will use the language once there is a standard for it” (how do I get some of that medication?).

There seemed to be rather a lot of people writing operating systems in Modula-2, but I cannot recall anybody writing applications (unless you count the compiler and editor). Back in the day it was common for groups to create the complete stack of language/compiler/libraries/OS and sometimes even the hardware (e.g., Lisp, Pascal, Smalltalk), and I had a 68000-based system that could boot around 10 different OSes (of which only 4-5 were Unix based). The delivery of these Modula-2 OSes ran headlong into Microsoft Windows and Linux.

What did grab everybody’s attention was using VDM to write the standard (the goal was to alternate between English prose on one page and the VDM for that prose on the opposite page, giving the best of both worlds; but the effort required to make this happen was not available, only the VDM work had any funding, and in practice the two forms intermingled). It was hoped that source checking tools would be derived from the formal specification, but the only working tool produced was the one needed to format the document.

ML is the granddaddy of all functional languages.

M4 is one of those hidden in plain sight languages that nobody ever talks about.

MUMPS has been popping up, every now and again, in my life since university; fragments of overheard conversation, the word jumping out from a job advert in the computer press, and the 115 pages of BS ISO/IEC 11756:1999 Information technology — Programming languages — M dropping through my letterbox (yes, there were naming issues). It has been cleverly letting Cobol take all the heat for being used by business people, or perhaps developers don’t want to tempt fate by ridiculing a language whose primary base is in the healthcare industry.

Miranda was THE functional language for a short period; its designer’s use of combinator graph reduction as an implementation technique for functional languages had created a major buzz. However, trying to commercialize the language implementation quickly killed its market position (if you think that selling to commercial developers is hard, try selling to CS academics) and Haskell rose to fame.


Lisp, Logo, LaTeX and Lua

December 12th, 2014

L is for Lisp, Logo, LaTeX and Lua.

Lisp lovers are rarely non-clever people, but not all clever people are Lisp lovers (declaration of self-interest: I am not a Lisp lover). Many of the Lisp lovers I know really enjoy playing Dungeons and Dragons, and the common connection is surely getting joy from traversing labyrinthine structures. Lisp lovers must be the go-to subjects for any PhD project looking for neurological predispositions to programming.

All those brackets were never intended to be permanent; the early Lisp implementations just got left in a prerelease state and the world has been going cross-eyed ever since.

Lisp’s big selling point was being able to treat data and code interchangeably. Figuring out what to do and creating the code to do it is a surprisingly common requirement in a dynamically changing world and many Lisp applications involve real world interaction (just like Javascript does).
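The code-as-data idea is easier to see than to describe; here is a minimal sketch of it in Python rather than Lisp (all the names are illustrative): a program represented as nested lists can be built, inspected, and rewritten at run time before being evaluated.

```python
import operator

# A tiny Lisp-style expression is just nested lists; a program can
# construct such 'code' on the fly and then evaluate it.
OPS = {"+": operator.add, "*": operator.mul}

def evaluate(expr):
    """Evaluate a nested-list expression such as ["+", 1, ["*", 2, 3]]."""
    if isinstance(expr, list):
        op, *args = expr
        return OPS[op](*(evaluate(a) for a in args))
    return expr  # a number evaluates to itself

# Code built at run time, data one moment and a program the next.
generated = ["+", 1, ["*", 2, 3]]
print(evaluate(generated))  # 7
```

Lisp makes this uniform representation of code and data a core language feature, which is what lets programs decide what to do and then create the code to do it.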

Logo is the original Scratch, or whatever computer language people think children should be learning these days. It was created 20 years before computers were cheap enough to allow children to touch them, so its primary use was in media-related activities pontificating about what children would be doing in the future. Turtle graphics grew out of Logo and has proved to be a very intuitive way, for many people, of creating useful line drawings.
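What makes turtle graphics intuitive is that drawing is expressed as instructions to a moving cursor with a position and a heading. A minimal Python sketch of the idea (the standard library’s turtle module does this with an actual drawing window; this illustrative class just records the points the pen would visit):

```python
import math

# A cursor with a position and a heading, driven by 'forward' and
# 'turn' commands, as in Logo's turtle graphics.
class Turtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0            # degrees, 0 = facing right
        self.path = [(self.x, self.y)]

    def forward(self, distance):
        self.x += distance * math.cos(math.radians(self.heading))
        self.y += distance * math.sin(math.radians(self.heading))
        self.path.append((self.x, self.y))

    def turn(self, degrees):
        self.heading = (self.heading + degrees) % 360

# Drawing a square reads like instructions given to a person:
# go forward, turn right, four times over.
t = Turtle()
for _ in range(4):
    t.forward(100)
    t.turn(90)
print(len(t.path))  # 5 points: the origin plus one per side
```

The turtle ends up back at the origin (up to floating-point rounding), having traced out the four sides of the square.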

LaTeX is probably used by lots of developers without them ever thinking about the fact that it is a programming language (i.e., it is Turing complete). But then it is the kind of language that would make you want to run away screaming rather than use it to write ‘code’.

Lua might be the first language that received its starting impetus from a trade barrier (Brazil wanted to foster a home computer industry; I have no idea what has been going on with programming languages in China, but I have encountered lots of Soviet fans of Algol 68). Lua has carved out a niche as an embedded scripting language, and a major LaTeX system is now being migrated to a Lua framework.

Things to read

LISP 1.5 Programmer’s Manual by John McCarthy, Paul W. Abrahams, Daniel J. Edwards, Timothy P. Hart and Michael I. Levin.

The Computer Science of TeX and LaTeX by Victor Eijkhout.

I will probably get lots of emails unless the following gets a mention: Structure and Interpretation of Computer Programs by Harold Abelson, Gerald Jay Sussman and Julie Sussman. If you have not read it yet and love playing Dungeons and Dragons, then you have a treat in store. MIT don’t use it as THE undergraduate CS book anymore, where its purpose was once to train MIT students in extracting information from high noise environments.


KRL and the K semantic-framework

December 11th, 2014

K is for KRL and the K semantic-framework.

KRL (Knowledge Representation Language) is a mechanism to feed facts and relationships to dumb programs to make them smart. Lists of facts and relationships were published in the appendices of reports and people would excitedly rush around waving them about, asserting that with a bit more effort this data would bring forth an intelligent system that would render human involvement in x and y a thing of the past. Many years and millions of dollars later Cyc is still putting in that little bit more effort.

The promise of what could be done with knowledge representation, and other AI related ideas, helped create a huge bubble around building human-like computer software systems in the early 1980s. Fear of what Japan’s fifth generation computer project might achieve also played a big part in fueling this bubble, however without the visible success of Japanese home electronics/cameras this project could well have gone unnoticed in the West (we had the Alvey programme in the UK, which managed to spend lots of money without delivering anything remotely useful; reported to me by several people involved in its projects).

The K semantic-framework gets a mention because it was used to write C-semantics, the most impressive (by far) formal semantics of C I have seen (it is also the only other language beginning with K that I know anything about). Think of it as Coq without users who are into soap powder advertising.


Jovial, JCL, Java and Javascript

December 10th, 2014

J is for Jovial, JCL, Java and Javascript.

Jovial is rather like Cobol in that it is very widely used in one huge market (software for US DOD systems) and has almost non-existent usage outside that market. I have always heard it described as Fortran-like; however, it shares many keywords with Algol 60, and some of the letters from its acronym name come from the original name used for Algol (International Algorithmic Language). The design of Jovial was contemporary with both languages, and I imagine that there was lots of cross-fertilization between the design groups.

JCL (Job Control Language) is what the non-Unix world calls shell scripts.

Java is a child of the Internet. Yes, there were a lot of people unhappy with C++ and very willing to jump ship, but the Internet made the marketing slogan “write once, run anywhere” sound like it truly was the future (which it might well be for some language at some future date).

Java was a wake-up call to compiler vendors who had not been paying attention to the impact that open source was having on the market. Those who followed the time-honored tradition of bolting a new language onto the front of their existing product line soon found they could not sell against free; the sound of scales falling from eyes could be heard around the world.

Up until Java arrived, the two ways of making money from a language were selling compilers and training; neither were big enough money spinners to be of interest to a company as large as Sun Microsystems. Word was that Sun thought they could make money from Java; exactly how was never spelled out (but Oracle are certainly going for it). The Java Study Group was set up to investigate the possibility of creating an ISO standard for Java. Sun’s enthusiasm for the work of this group became crystal clear during what was supposed to be a two-day meeting at Sun’s offices in Cupertino; halfway through the first day we were told that no meeting room was available to host us on the second day. A flurry of phone calls resulted in a meeting room being found for the second day, down the road at Apple. The minutes thank both hosts, with no mention of the abrupt switch of venue.

Javascript would not have existed without the Internet and its ‘design’ must be a contender for the most costly software mistake ever made.


INTERCAL and Icon

December 9th, 2014

I is for INTERCAL and Icon.

INTERCAL is a parody of a programming language whose most widely known statement is COME FROM. With the storm in a teacup that once engulfed goto now remembered as ancient history, COME FROM feels like an in-joke for developers who write assembler.

Icon was created by one of the creators of Snobol 4 and so received more attention than it might otherwise have got. The language’s main claim to fame is that generators, known as iterators in more modern languages, are a fundamental building block of its design (the functionality will look very familiar to Python users); Icon is a late-period Cambrian explosion language (CLU supported iterators and started life a few years earlier, but I don’t know enough about that language to make a call on the question of fundamental building block status).
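The resemblance to Python is close enough to show directly. In Icon, an expression such as find can produce a sequence of results on demand; a minimal sketch of the same behaviour using a Python generator (the function name echoes Icon’s find, but this is Python, not Icon):

```python
def find(substring, text):
    """Generate every position at which substring occurs in text,
    one result at a time, on demand."""
    pos = text.find(substring)
    while pos != -1:
        yield pos
        pos = text.find(substring, pos + 1)

# The consumer drives the generator; results are produced lazily,
# and iteration stops as soon as the consumer has what it needs.
print(list(find("an", "banana")))  # [1, 3]
```

In Icon this on-demand production of alternative results is woven into expression evaluation itself (goal-directed evaluation), rather than being an explicit library feature as it is here.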

Things to read

Books about Icon.


Haskell, HTML and Hack

December 8th, 2014

H is for Haskell, HTML and Hack.

In the early 1980s every computer science department with any research pretensions had at least one member of staff who had invented and implemented their own functional language; in many cases PhD students were writing a thesis on some particular aspect of their supervisor’s language. One part of history that is universally ‘forgotten’ is the extent to which languages got redesigned to take advantage of neat implementation tricks discovered by these earnest students; yes, it really was implementation details that shaped the structure of so many of these languages. Eventually it dawned on the academics that perhaps the reason the world was not converting to using functional languages was because there was no obvious market leader, so they did what the commercial world had already tried to do several times, they created a language that was to be the market leader, i.e., Haskell (which has had as much success at attracting the rest of the world as its commercial counterparts). The number of new functional languages did drop significantly, but this was due to a change in fashion and nothing to do with the Haskell work.

HTML is the computer language that most non-programmers have heard of that many developers don’t consider to be a programming language (because it is not Turing complete).

Hack is a language I wrote about earlier in the year.


GPSS and Go

December 7th, 2014

G is for GPSS and Go.

GPSS (General Purpose Simulation System) is a domain specific language for discrete time simulation. This language gets a mention because it was created during the programming language Cambrian explosion that occurred during the late 60s/early 70s when people created all the programming language concepts we have today (pointers to new ideas that came later welcome); it also gets a mention because there are so few languages starting with G.

Go will be of interest to future software archeologists. Being created into a world where everything is recorded, its brief period of fame and decline into oblivion will be available for detailed scrutiny. How many young guns are creating, without management bothering to take an interest in technical details, the foundations of important corporate applications using Go? One common difference between corporate source and open source is that corporate people are often unwilling to give up what they see as a substantial investment in existing code (open source developers are no different, but in their environment there are always plenty of other developers, with no stake in the existing code, willing to jump in and write something ‘new’). Hopefully I will be around to see the results from analyzing the data from projects developed and maintained over Go’s history.


Fortran, Forth, Frink and Flow

December 6th, 2014

F is for Fortran, Forth, Frink and Flow.

Fortran is a member of the triumvirate of founding major languages. Its continuing success is due to it doing a good job addressing important application areas (i.e., big engineering and scientific number crunching problems) and the availability of high quality compilers (non-software engineers will pay for good tools, and big science problems attract lots of government funding). Computer science types may turn their noses up at Fortran, but they are in the fashion business (which is why Algol 60 died out) and not interested in solving big real-world problems. For many years Fortran compilers reliably generated faster code than C compilers, because lots of customers were willing to pay for high quality (C compilers assumed that optimization was the programmer’s responsibility) and because Fortran’s lack of pointer types made it much easier to figure out what code was doing. These days C compilers have been around long enough that high-end optimizations have trickled into them, and machines now have enough processing power and main memory for the compiler to figure out what those pointers are getting up to.

Forth and BASIC were the minimalist languages of their day (and Forth still is in some niches). Forth was the techie’s language (who else would be content to write code in reverse Polish notation) that allowed developers to get that bit closer to the bare metal and talk technical stuff about threaded code. Perhaps the reason Forth has lost the general appeal it once had is because its core users have embraced respectability (e.g., creating a language standard and boasting about the availability of ‘proper’ compilers); returning Forth to its roots of bare metal, threaded code, hell-raising might bring back the following it once had.

Frink supports dimensional analysis of source code, something I am a big fan of (this kind of analysis is a step up from traditional strong type checking). Other languages designed, in a bygone age, to support dimension checking have disappeared, as have proposals to extend popular languages.
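To see why this is a step up from ordinary type checking, here is a minimal sketch of run-time dimensional analysis in Python (the class and names are illustrative, not Frink): quantities carry their dimensions, addition demands matching dimensions, and multiplication combines them.

```python
# Quantities carry a value plus a mapping from base unit to exponent,
# e.g. {"m": 1, "s": -1} for a velocity.
class Quantity:
    def __init__(self, value, dims):
        self.value = value
        self.dims = dims

    def __add__(self, other):
        # Adding metres to seconds is a coding mistake, and it is the
        # mismatch of dimensions that detects it.
        if self.dims != other.dims:
            raise TypeError(f"cannot add {self.dims} to {other.dims}")
        return Quantity(self.value + other.value, self.dims)

    def __mul__(self, other):
        # Multiplication sums the exponents, dropping any that cancel.
        dims = dict(self.dims)
        for unit, power in other.dims.items():
            dims[unit] = dims.get(unit, 0) + power
        return Quantity(self.value * other.value,
                        {u: p for u, p in dims.items() if p != 0})

metres = Quantity(3.0, {"m": 1})
seconds = Quantity(2.0, {"s": 1})
print((metres * metres).dims)   # {'m': 2}: two lengths make an area
# metres + seconds raises TypeError: the dimensions do not match
```

Frink builds this bookkeeping into the language itself, so the mismatches are caught without the developer having to write (or remember to use) a quantity class.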

Flow adds gradual typing to Javascript; it is from Facebook, who seem to be making a habit of this sort of thing, and I hope they succeed. A consequence of OO’s success is that most developers now think of types purely in terms of a mechanism for matching calls to the appropriate method. The idea of intentionally creating type information that is intended to fail to match, as a means of detecting coding mistakes, has been lost to a generation of developers. I would like to predict that languages supporting strong static typing are going to make a big comeback (once word starts to spread about the time saving benefits of static type checking), but things did not turn out that way last time such languages were in fashion.
