"If you want short term predictions, go to the engineers; if you want long term predictions, go to the scientists," advised computing science godfather and Microsoft researcher Gordon Bell yesterday. Bell was opening the Association of Computing Machinery's conference on 'IT in the Year 2047' in San Jose, California.
Bell - best known for inventing the Digital Equipment PDP and VAX architectures - was one of the scientists, visionaries and analysts who gathered in San Jose this week to deliver their predictions on the next 50 years of computing to an audience of 2,000 delegates from both the IT industry and the general public.
He began with a reminder of how easy it is to get predictions wrong, but added that this was no reason not to speculate. "The big errors come in timing...unless someone is foolish enough to predict some violation of a physical law," he said, adding the proviso: "In the past, some laws of physics that were known at a given time weren't adequate to explain how they were ultimately broken."
Bell's opening address proved to be a summation of some of the ideas that were to be articulated in greater detail by later speakers. These included computers that matched the processing power of the human brain; single chip computers that can be embedded in walls, light switches and telephones; and portable machines that can be worn by their users.
Before some of the more seemingly outlandish ideas were aired, several of the speakers reminded the audience of 'Moore's Law' - named after Intel co-founder Gordon Moore - which states that the processing power of computers doubles every 18 months.
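The compounding that an 18-month doubling implies over the conference's 50-year horizon is easy to underestimate. A short sketch of the arithmetic (my own illustration, not anything presented at the conference):

```python
# Moore's Law as reported here: processing power doubles every 18 months.
def doublings(years, period_years=1.5):
    """Number of doublings that fit in the given span of years."""
    return years / period_years

def growth_factor(years, period_years=1.5):
    """Multiplicative increase in processing power over the span."""
    return 2 ** doublings(years, period_years)

# Over the 50 years to 2047 that the conference looks ahead:
print(f"{doublings(50):.1f} doublings -> roughly {growth_factor(50):.2e}x")
```

Fifty years yields about 33 doublings, a factor on the order of ten billion - which is why speakers could treat brain-scale computing as a question of timing rather than possibility.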
Whether Moore's Law has a role to play in the next 50 years was a running theme underlying many of the speakers' presentations. Joel Birnbaum, head of research and development at Hewlett-Packard, thought not, citing recent announcements about cloning as proof. "There is a sheep bleating in Scotland which can show that the impossible isn't always so," he said, arguing that any vision of the future must allow for the emergence of "disruptive technologies".
Birnbaum extrapolated a vision of computing in which machines will cease to be dependent on traditional silicon chip technology - which will reach the limits of its capabilities within a matter of a few decades - and turn to more esoteric processing media.
These will include quantum computing, where calculations take place at a subatomic level; optical computing, which will use photons rather than electrons to produce machines that can process literally at the speed of light; and DNA-based computers, where a test tube of DNA strands is manipulated by molecular biologists to perform computing functions.
This last idea was picked up by Carver Mead, a professor at the California Institute of Technology. He specialises in researching the neurons of the human brain with a view to producing a 'neuron chip' that can make comparisons between stored images and those it receives from the real world, before drawing its own conclusions.
Taking the concept one step further, Nathan Myhrvold, Microsoft's head of research and development, envisaged downloading the contents of a human brain into a computer. "In 50 years, I hope I won't be talking about software," he said. "I hope I will be software."
Myhrvold also spoke of software that can 'breed' by developing its own replacement technology. "The brain has no Moore's Law. We're not getting smarter every year. Computers are," he said, adding that, in 2047, people will nurture rather than develop software. "Instead of tending their flocks, we will be tending programs using sophisticated simulations of evolution to create new software."
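The kind of 'simulation of evolution' Myhrvold describes resembles what is today called a genetic algorithm: a population of candidate programs is scored, the fittest survive, and offspring are produced by recombination and mutation. A minimal sketch of that loop (my own toy illustration on a bit-string, not anything demonstrated at the conference):

```python
import random

random.seed(0)

TARGET = [1] * 20                      # hypothetical goal: an all-ones bit string
POP, GENS, MUT = 30, 60, 0.05          # population size, generations, mutation rate

def fitness(ind):
    # Score a candidate by how many bits match the target.
    return sum(a == b for a, b in zip(ind, TARGET))

def crossover(a, b):
    # Single-point recombination of two parents.
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

def mutate(ind):
    # Flip each bit with a small probability.
    return [b ^ (random.random() < MUT) for b in ind]

pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]           # the fitter half survives unchanged
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best), "of", len(TARGET), "bits correct")
```

Nobody 'develops' the winning bit string; it is bred - which is the shift from engineering to husbandry that Myhrvold predicts for software at large.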
These ideas were not entirely to the liking of some of the speakers, who took a more philosophical look at the societal implications of such advances. These included Brenda Laurel, a researcher with Palo Alto-based Interval Research, which studies the convergence of consumers, popular culture and technology.
Laurel was not impressed by the notion of 'human' computers. "It can speak with a human voice or display a human face, but we know it is not human," she warned. "It is a brain in a box, without body, soul, intuition, passion or morality. It is a severed head - severed from the body of what it means to be human."
Science fiction author Bruce Sterling said that the human race had been lucky to avoid some of the potential horrors of technology to date. "Atomic Armageddon was vapourware hype," he noted. "It never launched. It never became a real product. Just consider that, ladies and gentlemen, when you consider what a pleasure it is to put the future behind."
The next 50 years will bring new dangers, he went on, but don't worry about demented supercomputers bent on world domination. The real danger comes from PCs that constantly display 'URL not found', he said, raising the spectre of a saturation of global networking bandwidth and the threatened collapse of the Internet.
Not a problem, claimed the so-called 'father of the Internet' Vinton Cerf, currently vice president of Internet architecture at telecomms giant MCI. His vision was that the Internet will find its way into every aspect of life by 2047. Users will access it through traditional vehicles, such as PCs and Web TVs, as well as futuristic wired spectacles, which will be manipulated using a built-in finger mouse.
A more pragmatic response to the issue of bandwidth shortages came from Reed Hundt, chairman of the Federal Communications Commission, which regulates US telephone and wireless communications networks, who endorsed proposals to skim $2.25 billion a year from the telecomms industry to pay for network expansion.
His vision for 2047 is the ability for end users to order up extra bandwidth as easily as they can order pizza today. But the creation of 'spare' bandwidth carries a high price tag that nobody is prepared to pay.
"Nobody wants to have the status quo shaken up so you can order bandwidth like pizza," he complained. "We have fine companies, but we don?t have big bandwidth networks. [The information superhighway] needs to be public, it needs to be widespread."
But, assuming the Internet manages to avoid meltdown, its expansion brings new problems, according to Nobel Prize-winning physicist Murray Gell-Mann. "It is important to realise, however, that most of what is disseminated is misinformation, badly organized information or irrelevant information," he warned.
His solution lies in the triumphs of the past. "In the long run, it is creative work in the sciences, the humanities, the arts and the professions that will help the most to extract knowledge and understanding from the immense sea of data that threatens to drown humanity," he insisted.
But ultimately the view of the three-day conference was that the best is yet to come: a brave new world in which users will live in homes run by 'thinking' computers that will cater for their needs, and access information from a global network that is seamlessly integrated into their living and working environments.
Science fiction? Maybe, but not according to Cerf. "I will not be alive in 2047," he said in conclusion. "But I will likely live to regret the timidity of all the predictions I've made."