TECO, Tcl/Tk and TeX

December 20th, 2014

T is for TECO, Tcl/Tk and TeX.

TECO was present at the creation and begat Emacs. Despite living on TECO’s home turf as an undergraduate, my lack of beard and disdain for Dungeons & Dragons stopped me from becoming a member of its inner sanctum.

Tcl/Tk is a practical combination of scripting language and GUI toolkit for adding a usable (if sometimes somewhat fragile) GUI wrapper around one or more command line programs. Tcl is one of those languages that gets the job done without anyone making a big fuss about it.
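
As a rough sketch of that wrapper pattern (written here in Python, whose tkinter module is itself a binding to Tk; the ‘ls -l’ command and the widget layout are just assumptions for illustration), a GUI front end to a command line program need be little more than a button and a text box:

    import subprocess
    import tkinter as tk

    # Minimal sketch: wrap a command line program (here 'ls -l', purely as
    # an example) behind a button and a text widget that shows its output.
    def run_command():
        result = subprocess.run(["ls", "-l"], capture_output=True, text=True)
        output.delete("1.0", tk.END)
        output.insert(tk.END, result.stdout or result.stderr)

    root = tk.Tk()
    root.title("Command wrapper")
    tk.Button(root, text="Run", command=run_command).pack()
    output = tk.Text(root, width=80, height=20)
    output.pack()
    root.mainloop()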

TeX is Donald Knuth’s original typesetting system; however, the language built on top of it, LaTeX, is probably better known these days. While creating TeX, Knuth proposed an approach to combining code and documentation and gave it the catchy name literate programming; unfortunately this approach was not strangled at birth, and because of Knuth’s reputation and the catchy name it keeps regrowing heads that need to be cut off. The literate programming approach requires that code, comments and formatting instructions be jumbled together in a horrendous mess, which is then fed through a process that can produce visually beautiful output (if enough effort is put into inserting the appropriate incantations into the horrendous mess). When writing something that will primarily be read at the beautiful output stage it is worth investing in the horrendous mess stage, but most code is used in its native, horrendous mess, form. Knuth has the mathematician’s habit of using single letter identifiers and a love of ‘neat’ tricks; he would be a disaster on any software engineering project where he could not be left alone in a room producing code that nobody ever needs to touch.

Snobol 4, Simula 67, Smalltalk, sed, SQL, SETL, Scratch and Spreadsheet

December 19th, 2014

S is for Snobol 4, Simula 67, Smalltalk, sed, SQL, SETL, Scratch and Spreadsheet.

Snobol 4 was the go-to language for text generation and pattern matching before Perl/Python became widespread in the late 1990s (if your Perl/Python code is too slow try Snobol; compilers generating machine code have been written for it since the 1970s). It is a boutique language that entrances many people who encounter the Green book (your author fell under its spell as an undergraduate).

Simula 67 was the first object oriented language. However, 30 years went by before object-oriented programming could be said to have entered general use (some people would claim that this has still not happened). Yes, lots of developers were using C++ compilers in the late 1980s, but only because they wanted to look trendy and be able to list the language on their CV. During the late 1980s and early 1990s I was heavily involved with the static analysis of C using an in-house tool and found that a lot of C++ source could be handled by adding a few extensions to the tool, such as handling new (rather than malloc) and allowing class to be used where struct should have appeared. It was not until the mid-90s that parsing C++ source with an extended C parser became problematic on a regular basis (and then often only because of the use of iostreams).

Back in the day it was said that only Scandinavians understood OO programming. I think the real reason was that only Scandinavians understood the English used in the book written by the language designers: “Simula BEGIN” by Birtwistle, Dahl, Myrhaug and Nygaard (there is a rave review on Amazon, I wonder if the reviewer is Scandinavian).

Smalltalk is invariably the language of choice for object-oriented purists. I know of several development groups in industry that chose to use Smalltalk because they wanted to really learn how to do things the OO ‘way’ (Smalltalk is all objects, its users cannot do anything in a non-OO way); I don’t know what those subsequently responsible for maintaining the code thought about this rationale.

The ‘launch’ of Smalltalk in the UK by Goldberg and crew happened before Apple’s Macintosh made Xerox PARC famous, and the talk was all about message passing being the future and everybody being able to modify any behavior of their operating environment (we were shown Smalltalk’s flexibility by a speaker who changed the definition of an operation that defined how the borders of windows were displayed; they agreed that this was potentially dangerous but that users soon learned by their mistakes). The launch of the Macintosh changed the whole PARC narrative.

Thinking back, the only thing that stopped the Smalltalk team inventing Open Source was the fact they thought it so obvious to ship the source of everything that it was not worth mentioning.

sed gets a mention as the most unlikely language used to implement a Turing machine.

SQL is invisible to developers who don’t use it because publishing SQL source is generally pointless unless the appropriate data is also available. There is a lot of SQL lurking (as text strings) within the source of programs written in other languages. If you want to see some SQL then check out the Javascript embedded in a lot of the web pages you read.
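
A hedged sketch of what that lurking looks like (Python and its bundled sqlite3 module are used here purely as an example host language and database; the table and query are invented): the SQL travels through the program as ordinary strings.

    import sqlite3

    # SQL embedded as text strings inside a program written in another
    # language; the table name and query are invented for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
    conn.executemany("INSERT INTO orders (total) VALUES (?)", [(9.99,), (42.0,)])

    query = "SELECT COUNT(*), SUM(total) FROM orders WHERE total > ?"
    count, total = conn.execute(query, (10.0,)).fetchone()
    print(count, total)

A grep for quoted SELECT or INSERT statements is often the quickest way to find this hidden SQL in a code base.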

SETL is a Cambrian explosion language which has had a small, but influential, user base.

Scratch suffers from being thought of as a language for children. If touch screens had been around when people first started inventing programming languages I’m sure things would be very different today; as it is, most developers think that real programs are written in lines of text.

Spreadsheet languages have long been a neglected topic, both as a target for fault-finding tools and in programming language research. Things are starting to change in industry and hopefully the work of keen academics will be appreciated in academia.

Things to read

The SNOBOL4 Programming Language by Ralph E. Griswold, J. F. Poage, and I. P. Polonsky.

Every language has its blind spot: the kind of code that should never be written in it. In the case of SQL it is tree data structures: “Joe Celko’s Trees and Hierarchies in SQL for Smarties” by Joe Celko.

Ratfor, R, RUNOFF, RPG and Ruby

December 18th, 2014

R is for Ratfor, R, RUNOFF, RPG and Ruby

Ratfor is a structured form of Fortran from the days when structured programming was the in-thing and Fortran did not have much of it (lots got added in later revisions). I think its success came from allowing users to claim a degree of respectability that Fortran did not have, and which Fortran did not appear to gain when structured constructs were added to it (but then all successful languages are treated with suspicion in some circles).

The maintainers of R provide a valuable lesson on issues that are not important for a language to be widely used, such as the following (which I’m sure many of those involved with languages incorrectly think are important):

  • Technical language details are not important (e.g., functional, imperative, declarative, object-oriented, etc); as far as I can tell the language has hardly changed over the years, yet users are not bothered. What is important is how easily the language can be used to solve its users’ problems. There are umpteen different ways a language can be designed around R’s very useful ability to operate on vectors as a single unit or to dynamically create data-frames (see the sketch after this list); it does not make much difference how things are done as long as they work.
  • Runtime efficiency is often not important; a look at the source of the R runtime system suggests that there are lots of savings to be had (maybe a factor of two). Users are usually a lot more willing to wait for a running program to complete than struggle with getting the program running in the first place; the R maintainers have concentrated on tuning the usability of the ecosystem (intentionally or not, I don’t know). Also, R appears to be like Cobol in that the libraries are the best place to invest in performance optimization. I say ‘appears’ because I have noticed a growing number of R libraries completely written in R, rather than being a wrapper to code in C or Fortran; perhaps the efficiency of the runtime system is becoming an important issue.

    Most programs don’t use a lot of cpu resources; this was true back when I was using 8-bit cpus running at 4MHz and is even more true today. I used to sell add-on tools to make code run faster and it was surprising how often developers had no idea how long their code took to run; they just felt it was not fast enough. I was happy to go along with these feelings (if the developers could recite timing data a sale was virtually guaranteed).
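
For readers who have not met the whole-vector style the first bullet refers to, here is a rough analogue sketched in Python with numpy (R itself expresses this more directly; the data values are invented):

    import numpy as np

    # Rough analogue of R's whole-vector arithmetic: no explicit loop,
    # the operation applies element-wise across the entire vector.
    prices = np.array([9.99, 42.0, 3.50])    # invented example data
    quantities = np.array([2, 1, 10])

    totals = prices * quantities             # element-wise multiply
    print(totals.sum(), totals.mean())

The point is that a whole collection is the basic unit of operation, which is a large part of what makes the language convenient for its users’ data analysis problems.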

plot is an unsung hero in R’s success, taking whatever is thrown at it and often producing something useful (if somewhat workman-like visually).

RUNOFF is the granddaddy of text processing systems such as *roff and LaTeX. RUNOFF will do what you tell it to do (groff is a modern descendant), while LaTeX will listen to what you have to say before deciding what to do. A child of RUNOFF shows that visual elegance can be achieved using simple means (maintainers of R’s plot function take note). Businesses used to buy computers and expensive printers so they could use this language, or one of its immediate descendants.

RPG must be the most widely used proprietary programming language ever.

Is Ruby’s current success all down to the success of one web application framework written in it? In its early years C’s claim to fame was as the language used to write Unix (I know people who gave this as the reason for choosing to use C). We will have to wait and see if Ruby has a life outside of one framework.

Things to read

“The R Book” by Michael J. Crawley.

QCL

December 17th, 2014

Q is for QCL.

Quantum computers are still so new that creators of programming languages for them are still including the word quantum in their name. I shouldn’t complain, because without them today’s post would be about existing languages with the word quick tacked on the front.

Conventional modern computers contain transistors and, while a knowledge of quantum mechanics is required for a detailed understanding of how these work, the layman’s explanation can be given using classical terms and does not involve any apparently weird behavior (tunnelling can be skipped as a technical detail). Quantum computers make direct use of superposition and entanglement, both of which are at the very weird end of quantum mechanical effects.

Conventional computers work by toggling the state of transistors between 0 and 1 (a bit); quantum computers work by performing unitary transforms on qubits.

Unitary transforms are reversible (by the definition of the transform). This means that feeding the output value from a quantum computer back through it will result in the value of the original input appearing at the input (a small sketch appears after the list below). Sounds interesting, until you realise that this places some major restrictions on the structure of a quantum program:

  • the number of input bits must equal the number of output bits (it’s OK for some of these bits to be don’t-care bits from the point of view of interpreting them to give an external meaning),
  • variables cannot be overwritten (this would destroy information and is not reversible),
  • variables cannot be copied (there is no unitary transform available to do this).
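
A minimal sketch of the reversibility point, using Python and numpy as a stand-in for real hardware (the Hadamard gate is chosen purely as a familiar example of a unitary transform):

    import numpy as np

    # The Hadamard gate, a simple example of a unitary transform on one qubit.
    H = np.array([[1, 1],
                  [1, -1]]) / np.sqrt(2)

    state = np.array([1.0, 0.0])    # qubit in state |0>
    transformed = H @ state         # apply the transform
    recovered = H @ transformed     # applying it again undoes it (H is its own inverse)

    print(np.allclose(recovered, state))   # True: the original input reappears

No information is thrown away at any step, which is exactly why overwriting or copying a variable has no place in such a program.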

Developers coding for parallel processors need to stop complaining; their life could be a lot harder. In fact, figuring out how to get a quantum computer to do anything useful is so hard that people have taken to using genetic algorithms to obtain useful practical results (it was at a search-based software engineering workshop that I first learned about the realities of programming a quantum computer).

These programming constraints actually make one feel relieved that the largest available quantum computer only contains 128 qubits that need to be used up.

It is possible to build a transistor-based reversible computer using Toffoli gates; it has been shown that such a computer would be capable of performing any computation that a Turing machine could perform. This is an important theoretical result, showing that quantum computers are at least as powerful as conventional computers.

Building a Quantum computer is very hard. The basic approach is to prepare a superposed state (perhaps using ion traps or microwaves), then apply unitary transformations in a way that takes advantage of quantum parallelism, followed by gathering up the result and making a measurement. Quantum computers are like ASICs (i.e., they need to be configured to perform one kind of calculation) and not general purpose programmable computers.

Quantum parallelism is what gives Quantum computers the potential for having a huge performance advantage over today’s computers. The poster child for this performance advantage is Shor’s algorithm (which finds the prime factors of very large numbers, such as those used in public key encryption).

Quantum computers are very delicate beasts that are destroyed by interaction with the world around them (e.g., being hit by a passing atom lurking within the near perfect vacuum in which they operate, or a stray photon bouncing around the cavity in which the computer is held).

I have a great idea for hackers wanting to stop a government agency cracking their communication keys using a quantum computer running Shor’s algorithm. What the hackers need to do is convince the agency that some very large prime numbers are being used as keys; the agency will feed these numbers to the quantum computer expecting a solution containing two smaller factors, and the failure to obtain this expected result will lead the control software to think that the quantum computer has developed a fault and shut it down (this idea has a few gaps that need filling in, not least generating the very large prime numbers); Hollywood, I am available as a script adviser.

The commercially available Quantum computer gets around the delicate nature problem by accepting that there will be noise in the output and targeting problems whose solution has some degree of immunity to noise.

What about quantum computer languages? A variety have been created (all work by using simulators running on commodity hardware, and include support for some operations that look very unlike unitary transforms): QCL is based on a C-like syntax and supports various quantum operations, and of course the functional folk know a good bandwagon when they see it.

Pascal, Prolog, Perl, Python and PL/1

December 16th, 2014

P is for Pascal, Prolog, Perl, Python and PL/1.

Pascal was the love of my life when young (what else would cause anybody to think that a C to Pascal translator was a good idea). The language built up a substantial user base, was widely taught in universities and was widely considered to be a good language for writing robust software, yet by the end of the 1980s its market share had crashed. What happened?

I think Pascal lost out to C because it generates a lot more upfront friction. Stronger type checking, runtime array bounds checking; it’s just unreasonable to expect developers to deal with this stuff and deliver on time. Discovering faults in code is what customers are for. Besides, any language that forces developers to use vendor extensions because the language does not officially support separate compilation, bitwise operations and other ‘useful’ stuff just does not deserve to be taken seriously.

Languages like Pascal will not become mainstream until managers stop allowing developers to put private short term interests before the longer term reliability and cost interests of groups.

Prolog looks so easy to use when reading examples in books. In practice the experience of using almost any existing language (i.e., those based on control flow) means a whole way of doing things needs to be unlearned; declarative languages don’t have control flow, they work by developers expressing goals + relationships and leaving it to the implementation runtime to figure out how to reach the appropriate goals.

Perl shares with Lisp the ability to attract beards and Dungeons & Dragons enthusiasts, but differs in not repelling non-clever people (which is a serious design flaw given the amount of Internet infrastructure based on it). At times it feels like you can reach out and touch the Perl community; why is this sense of community so strong in Perl? Perhaps mastering the language syntax results in ascending to an astral plane where it is only possible to communicate with other initiates. Here is a self-documenting group of people just waiting to be studied for a social anthropology PhD.

Perl 6 continues to provide a useful reminder that while developers hate language warts from early versions, they hate it even more when the warts are removed and the ability of existing code to continue to work becomes questionable.

Python is the language I use to work with the young generation of developers.

PL/1 was the first corporate language and because computer languages were still very new back then they could not adopt something a couple of kids had hacked together based on existing languages. Instead, there were committee meetings, academic work on a formal definition and magazine profiles by Very Important People.

In an attempt to be user friendly PL/1 was designed to try very hard to implicitly convert values to whatever type was required. Given its built-in support for Cobol business data-types and Fortran engineering data-types (in a bid to attract users from both those languages), plus other data-types that neither language had decided to include (e.g., bit strings), developers regularly encountered a level of surprising behavior that modern languages rarely achieve.

Things to read

Algorithms + Data Structures = Programs by Niklaus Wirth was a revelation in its day and is now basic introductory software engineering. Unless resources are very limited, developers use constructs that encapsulate the details illustrated in this book, such as iterators.

Occam, Oberon, Objective-C and OCaml

December 15th, 2014

O is for Occam, Oberon, Objective-C and OCaml.

Occam might be said to have been the assembly language for the Transputer. INMOS, the company that created the Transputer, did not want the world to know what the actual assembly language was, in case people used this information to write compilers, thus making it difficult for INMOS to change the instruction set in future revisions (because they would have to continue to support the existing instructions that the existing compilers were making use of). Your author knows this because he was at what today would be called a developer launch event (which we had to pay to attend; none of the freebies that are used to woo us today) and during the question-and-answer session asked why developers were restricted to using the INMOS approved compilers (he was out and about promoting his company’s compiler expertise at the time and also asked “… what should companies who wanted to use a decent compiler do?”).

The Transputer had a great deal of potential and I don’t know how much the work of the software thought police at INMOS nobbled that potential, but they certainly had an impact (we have come up with this world changing approach to how things are done and you must jolly well use it; yes, they really did use phrases like jolly well). A big play was made of the formal proof of correctness of the floating-point unit; when several faults were found in the hardware the thought police were quick to report that the problems were with the specification, not the proof of correctness of the specification (I’m sure this news allayed any concerns that Transputer users might have had).

Oberon created a bit of a problem for those promoting Modula-2 as the successor to Pascal. It was created as the successor of Modula-2 by the person who created Modula-2 as the successor of Pascal (which he also created). Surely nobody would be cruel enough to use the existence of Oberon to mock the Modula-2 true believers ;-)

Objective-C has survived within its ecosystem because of Apple’s patronage. A language rising to dominance in an ecosystem because of the patronage of a major vendor is nothing new and in fact might be regarded as the natural state of affairs. Does a vendor need to control the fate of the dominant language in its ecosystem? Maybe not, but the patronage is not that expensive and does provide peace of mind.

OCaml is in many ways the alter ego of Haskell, its fellow widely used (at least in their world) functional language. Haskell’s Wikipedia page looks so different that one could be forgiven for thinking the two languages had nothing in common. The major difference I have noticed is that OCaml gets used by developers creating larger programs that other people are expected to use, while Haskell gets used for the academic stuff where authors have little interest in the code having an external life.

NATURAL, NewSpeak and NetLogo

December 14th, 2014

N is for NATURAL, NewSpeak and NetLogo.

NATURAL was the most unnatural name for a programming language I had encountered until Processing came along. Perhaps the name had a very different feel for the Germans who created it. I have some job advert counts from the early 1990s showing Natural in the top 20 of most frequently listed languages.

NewSpeak is a language generally associated with a book. The programming language I have heard about (whose memory appears to have almost faded away; it is a distant memory of those working in the lab that created it) is not the language listed on Wikipedia.

NetLogo looks like a fun agent-based simulation language. It has been used for some interesting research and is on my very short list of languages that I have not yet used in anger but would like to.

Modula-2, ML, M4, MUMPS and Miranda

December 13th, 2014

M is for Modula-2, ML, M4, MUMPS and Miranda.

Modula-2 was seen as Pascal‘s successor in sections of the Pascal world; the reasoning was along the lines of: Niklaus Wirth designed Pascal, which was a successful language, and then designed Modula-2 as Pascal’s successor, so it is also going to be a success (brand loyalty makes people believe the strangest things). Some Modula-2 believers had been involved with the standardization of Pascal and knew how to work the system, and an ISO working group was soon formed to create a language standard. I was not a believer and would ask why people were going to the trouble of creating a standard for a language that hardly anybody used; replies were evenly distributed between “developers are waiting for the standard before switching” (which sounds like somebody had done some market research, but I think not) and “developers will use the language once there is a standard for it” (how do I get some of that medication?)

There seemed to be rather a lot of people writing operating systems using Modula-2, but I cannot recall anybody writing applications (unless you count the compiler and editor). Back in the day it was common for groups to create the complete stack of language/compiler/libraries/OS and sometimes even the hardware (e.g., Lisp, Pascal, Smalltalk), and I had a 68000-based system that could boot around 10 different OSes (of which only 4-5 were Unix based). The delivery of these Modula-2 OSes ran into Microsoft Windows and Linux.

What did grab everybody’s attention was using VDM to write the standard (the goal was to alternate between English prose on one page and the VDM for that prose on the opposite page, giving the best of both worlds; but the effort required to make this happen was not available, only the VDM work had any funding, and in practice the two forms intermingle). It was hoped that source checking tools would be derived from the formal specification, but the only working tool produced was the one needed to format the document.

ML is the granddaddy of all functional languages.

M4 is one of those hidden in plain sight languages that nobody ever talks about.

MUMPS has been popping up, every now and again, in my life since university; fragments of overheard conversation, the word jumping out from a job advert in the computer press and the 115 pages of BS ISO/IEC 11756:1999 Information technology — Programming languages — M dropping through my letterbox (yes, there were naming issues). It has been cleverly letting Cobol take all the heat for being used by business people, or perhaps developers don’t want to tempt fate by ridiculing a language whose primary base is in the healthcare industry.

Miranda was THE functional language for a short period; its designer’s use of combinator graph reduction as an implementation technique for functional languages had created a major buzz. However, trying to commercialize the language implementation quickly killed its market position (if you think that selling to commercial developers is hard, try selling to CS academics) and Haskell rose to fame.

Lisp, Logo, LaTeX and Lua

December 12th, 2014

L is for Lisp, Logo, LaTeX and Lua.

Lisp lovers are rarely non-clever people, but not all clever people are Lisp lovers (declaration of self-interest: I am not a Lisp lover). Many of the Lisp lovers I know really enjoy playing Dungeons and Dragons and the common connection is surely getting joy from traversing labyrinthine structures. Lisp lovers must be the go to subjects for any PhD project looking for neurological predispositions for programming.

All those brackets were never intended to be permanent; the early Lisp implementations just got left in a prerelease state and the world has been going cross-eyed ever since.

Lisp’s big selling point was being able to treat data and code interchangeably. Figuring out what to do and creating the code to do it is a surprisingly common requirement in a dynamically changing world and many Lisp applications involve real world interaction (just like Javascript does).

Logo is the original Scratch, or whatever computer language people think children should be learning these days. It was created 20 years before computers were cheap enough to allow children to touch them, so its primary use was in media-related activities pontificating about what children would be doing in the future. Turtle graphics grew out of Logo and has proved to be a very intuitive way, for many people, of creating useful line drawings.

LaTeX is probably used by lots of developers without them ever thinking about the fact that it is a programming language (i.e., it is Turing complete). But then it is the kind of language that would make you want to run away screaming rather than use it to write ‘code’.

Lua might be the first language that received its starting impetus from a trade barrier (Brazil wanted to foster a home computer industry; I have no idea what has been going on with programming languages in China, but I have encountered lots of Soviet fans of Algol 68). Lua has carved out a niche as an embedded scripting language and a major LaTeX system is now being migrated to a Lua framework.

Things to read

LISP 1.5 Programmer’s Manual by John McCarthy, Paul W. Abrahams, Daniel J. Edwards, Timothy P. Hart and Michael I. Levin.

The Computer Science of TeX and LaTeX by Victor Eijkhout.

I will probably get lots of emails unless the following gets a mention: Structure and Interpretation of Computer Programs by Harold Abelson, Gerald Jay Sussman and Julie Sussman. If you have not read it yet and love playing Dungeons and Dragons, then you have a treat in store. MIT doesn’t use it as THE undergraduate CS book anymore; its purpose was once to train MIT students in extracting information from high noise environments.

KRL and the K semantic-framework

December 11th, 2014

K is for KRL and the K semantic-framework.

KRL (Knowledge Representation Language) is a mechanism to feed facts and relationships to dumb programs to make them smart. Lists of facts and relationships were published in the appendices of reports and people would excitedly rush around waving them about, asserting that with a bit more effort this data would bring forth an intelligent system that would render human involvement in x and y a thing of the past. Many years and millions of dollars later Cyc is still putting in that little bit more effort.

The promise of what could be done with knowledge representation, and other AI related ideas, helped create a huge bubble around building human-like computer software systems in the early 1980s. Fear of what Japan’s fifth generation computer project might achieve also played a big part in fueling this bubble, however without the visible success of Japanese home electronics/cameras this project could well have gone unnoticed in the West (we had the Alvey programme in the UK, which managed to spend lots of money without delivering anything remotely useful; reported to me by several people involved in its projects).

The K semantic-framework gets a mention because it was used to write C-semantics, the most impressive (by far) formal semantics of C I have seen (it is also the only other language beginning with K that I know anything about). Think of it as Coq without users into soap powder advertising.
