zwischenzugs

‘Towards a National Computer Grid’ – Electronic Computers, 1965


Recently I picked up this book on my travels:

This is the second edition (1970) of a book originally published in 1965.

It’s a fascinating insight into the state of computing 50 years ago, and remarkably prescient.

Here are a few highlights that piqued my interest.

Getting programs right

This section gives a fascinating glimpse of how testing and debugging worked when hardware was a bigger part of the equation. Ever used a cathode ray tube to figure out what’s wrong with your program?

In the early years the usual procedure was to run the program in slow motion, one instruction at a time, and observe what happened. Computers were provided with a number of visual indicators – rows of lights or cathode ray tubes – which enabled the contents of some of the arithmetic, control or storage registers to be inspected. This practice – sometimes known as ‘peeping’ – was soon found to be intolerably slow. It can be speeded up by inserting stop instructions at suitable points, thus enabling the operator to restrict the slow motion to selected parts of the program. Those parts that are above suspicion can be run at full speed. Even with those improvements this procedure is far too prodigal of valuable machine time …

…or waited for an electric typewriter to tell you what the error was?

… most modern installations are provided with a variety of ingenious diagnostic aids … special diagnostic programs, some of which are usually held permanently in the computer store. Their function is to provide the programmer with information that is likely to help him detect, locate, and diagnose any errors in his program. Such information is usually printed as an error message on a line printer or electric typewriter …


Programming Languages Will Proliferate

The idea of a programming language was still a relatively new one at the time. ‘High level’ programming languages had only recently come into being – COBOL was about as old then as GoLang is now!

In spite of recent attempts to design a ‘universal’ language – for example the PL/1 language – the existing bifurcation into mathematical and commercial languages, typified by ALGOL and COBOL respectively, is likely to persist for some time yet; an economic combination of sports car and delivery van is rather unlikely.

This implied that languages would proliferate for different purposes, as indeed they have.


Towards a National Computer Grid

It’s interesting to hear someone edge towards the idea of the Internet in an age before DNS, TCP/IP or SNAT existed – indeed, before computer networking in general was an established idea.

The scheme envisages an hierarchical arrangement of subscriber terminals (‘remote stations’ in our terminology), multiplexor devices to concentrate and sort out incoming messages and so save data transmission costs, local area computers and regional computers, all linked together by a network of communication lines of varying data carrying capacity. It is proposed to make use of some of the long distance lines already provided for the telephone network …
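The hierarchy described there – terminals feeding multiplexors, which concentrate traffic before passing it up to local and regional computers – translates quite naturally into code. Here is a toy Python sketch of the concentration idea only; it is my own illustration, not anything from the book, and the class and names are invented:

# Toy sketch (not from the book): terminals feed a multiplexor, which
# batches ('concentrates') their messages before forwarding them to a
# local computer, saving line capacity versus one line per terminal.

class Multiplexor:
    def __init__(self, batch_size=4):
        self.batch_size = batch_size
        self.buffer = []

    def receive(self, terminal_id, message):
        # Collect messages; release a concentrated batch when full.
        self.buffer.append(f"{terminal_id}: {message}")
        if len(self.buffer) >= self.batch_size:
            batch, self.buffer = self.buffer, []
            return batch    # one transmission up towards the local computer
        return None         # still concentrating

mux = Multiplexor()
for i in range(8):
    batch = mux.receive(f"terminal-{i}", "job request")
    if batch:
        print("forward to local computer:", batch)

Eight terminals end up generating just two onward transmissions – the saving in data transmission costs the quote is after.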


Moore’s Law

A logarithmic graph in the book shows Moore’s Law, though it hadn’t yet been given that name – Gordon Moore’s 1965 paper was published in the same year as this book. Not only transistor counts but also magnetic core storage capacity was seen as growing at a similar rate.

Note that it was the bit, not the byte, kilobyte or megabyte that was the unit of choice at the time.
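To put numbers on that growth: the commonly quoted form of Moore’s Law is a doubling roughly every two years (Moore’s original 1965 observation was closer to a doubling every year). Here is a minimal Python sketch of that arithmetic, using an entirely hypothetical 100,000-bit core store in 1965 as the baseline – illustrative only, not a figure from the book:

# Assumes doubling every two years and an invented 1965 baseline.
# On a logarithmic axis this exponential growth plots as a straight
# line, which is why graphs of this kind are drawn log-scaled.

def projected_bits(start_bits, start_year, year, doubling_years=2.0):
    # N(t) = N0 * 2 ** ((t - t0) / doubling_period)
    return start_bits * 2 ** ((year - start_year) / doubling_years)

for year in range(1965, 1986, 5):
    print(year, f"~{projected_bits(100_000, 1965, year):,.0f} bits")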

Multi-Tenancy, 1960s Style

The idea of computer systems that could run multiple programs simultaneously was a novel one, as was BASIC.

In the ‘MIT’ system twenty programs can be ‘active’ simultaneously. … The user of course is not aware of this swapping although he may realise what it is that makes the computer work more slowly than if he had it entirely to himself.
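The ‘swapping’ described there is what we would now call time-sharing: each active program gets the processor for a short slice in turn, so with twenty users each sees roughly a twentieth of the machine – hence the slowdown the author mentions. Here is a toy round-robin sketch; it is my own illustration, and the real MIT system’s scheduler was of course more sophisticated:

# Toy illustration of time-slicing: 20 'active' programs share one
# processor in round-robin fashion, so each progresses at ~1/20 speed.

from collections import deque

programs = deque({"name": f"prog-{i}", "work_left": 3} for i in range(20))
slices_used = 0
while programs:
    prog = programs.popleft()
    prog["work_left"] -= 1      # give this program one time slice
    slices_used += 1
    if prog["work_left"] > 0:
        programs.append(prog)   # 'swap' it out until its next turn
    else:
        print(f"{prog['name']} finished after {slices_used} slices of machine time")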


Computer Programming and ‘Libraries’

It was dawning on practitioners at the time that ‘constructing a computer program is like building a house’:

It is clearly a great boon for the programmer to have at his disposal a collection of standard subroutines which have been thoroughly tested in advance and known to work correctly.

I wonder what the author would have made of NodeJS libraries?


Computers at Work

Software had begun to ‘eat the world’ even 50 years ago. I met my wife in the early 2000s through a more modern equivalent of the ‘marriage bureau’ (yes, that was a thing, and I remember them). It’s also interesting to consider that law and medicine arguably haven’t really been revolutionised by computer technology yet (leaving aside hardware innovations).

The table below estimates that there were 70,000 computers worldwide in 1968. Microsoft was founded seven years later with the vision of ‘a computer on every desk and in every home’, which seems tame today, when I have more computers on me than pockets in my clothes.

Here is a rough estimate of the global distribution of computers in the middle of 1968:

North America                        46000
United Kingdom                        3000
Western and Central Europe           11000
USSR, Eastern Europe and China        5000
Other areas                           5000
                                     _____
                           Total     70000

The author is currently working on the second edition of Docker in Practice.

Get 39% off with the code: 39miell2

