Last week, Manchester honoured the 50th birthday of Baby, the prototype of the University of Manchester's Mark 1 computer. It is indeed high time to celebrate the era when the British were ahead of, or at least level-pegging with, the Yanks on nearly every computing front. It is time to tell all you teenage nerds about famous men like Tom Kilburn and Freddie Williams, Baby's designers. I'll bet you had never heard of them before reading last week's PC Week. Nor will you have heard of Maurice Wilkes of Cambridge University, whose EDSAC started running in 1949, or John Pinkerton, whose LEO 1, derived from EDSAC, ran the first commercial program in 1951.
It didn't stop there. In 1962, Atlas, from the same Manchester University/Ferranti stable that produced Baby, introduced virtual memory to the world. It took IBM a decade to catch up. In the 1970s and 80s, Britain led the world in parallel processing with the transputer and ICL's Distributed Array Processor.
A Briton, Donald Davies, invented packet switching, which, in 1969, the Yanks put at the heart of Arpanet, the forerunner of the Internet.
Why did our designers do so well then, yet now, Tim Berners-Lee apart, seem rather pathetic, with successes only on the periphery of IT?
I see it this way. The teams that built the astounding array of computers of those days had learnt how to invent during the war, working on radar at Malvern or alongside Alan Turing at Bletchley Park. They had to get the product out of the door fast, ahead of the Germans, to win the war. When they went on to university and into industry, they remained driven men, driven to get the nuts and bolts together rather than to linger over theory.
Moreover, because they had all known each other at Malvern or Bletchley Park, the early designers of the Golden Age formed lifelong and close-knit mafias, swapping ideas, rather like today's denizens of Silicon Valley.
Two plus two made five.
Because they had all worked for government research organisations during the war, and because the fledgling electronics industry desperately needed them to develop products, they saw nothing strange in partnerships between industry, government and academia. It was the natural order of things.
However, this cooperation broke down in the 60s, when the universities invented "computer science" and the end-product became not a piece of usable hardware or software but a learned paper. The wartime urgency was replaced by the soothing tempo of academe. The "computer scientists" became snooty and suspicious of both industry and government. Except in places like Cambridge, where graduates are encouraged to set up their own companies, the drive to get products out of the door disappeared.
Paradoxically, the Golden Age of the 50s was killed off by the millions poured into computer science departments by Harold Wilson's "white heat of technology".
Clearly, we cannot remake the Golden Age as it was in the 50s. But we can put some fire back into the belly of the clever chaps at universities.
Rename the "Computer Science" departments "Information Engineering" for a start. Then scare the pants off the academics by threatening to bring in trainers from private-sector training companies to teach students the nitty-gritty of IT if the university lecturers and their curricula do not keep up with the pace of change. Conversely, a good university department should be encouraged to tout for training business in the private sector. And make all IT students spend a year in a software house, on real projects with short deadlines. Or threaten to close departments which fail to come up with ideas for marketable gizmos or for big long-term shifts in IT thinking. Or maybe just start another war.
Are you paying attention?