There is a story, probably apocryphal, that Gordon Bell, the designer of the Digital VAX, was asked if he read the predictions of IT analysts. Bell said he did read them - 10 years after they were published, to see how accurate they were.
In the past 20 years, glowing technical assessments, together with predictions of massive market growth, have been made for any number of technologies and business processes. Virtual reality, CASE tools, the Ada language, touch-screen technology, voice recognition and voice response systems, asynchronous transfer mode (ATM), systems integration and a whole raft of other technologies have been identified as being of major importance to users within the space of a few years.
If the channel had taken some of these predictions seriously, many resellers would have bankrupted themselves tying up their capital in technology that was destined never to be taken up.
Linux: the one that got away
On the other hand, analysts often miss an important trend or product, but never acknowledge they have overlooked something. Their speculation about Unix versus proprietary operating systems, which raged throughout the 1980s and into the 1990s, concentrated largely on the established players. Would Sun Microsystems' Solaris, IBM's AIX or SCO's version of Unix dominate the market? But none of the pundits noticed the Unix derivative that everyone now talks about: Linux.
Unix was flavour of the month in the early 1990s. UK consultant Ovum published a report in 1990, Unix in Europe: Users and Markets, which predicted the operating system's future over five years. Ovum said software vendors would have their lives made easier by the "availability of standards, and the convergence of Unix into one or two versions will free valuable resources previously assigned to porting and maintaining their applications on different platforms".
The one or two versions of Unix, which referred to those espoused by pressure groups such as the Open Software Foundation and Unix International, failed to become the standard. It remains to be seen whether the same fate will befall Linux.
When the dust finally settles, long after the reports were first published, there is a discernible hump in the IT carpet. This hump is the detritus of the numerous reports whose predictions have proved false. Because analysts invariably give their predictions a five-year time scale, by the time they fall due few people remember what was originally written, or whether it was right or wrong.
Two categories of analyst
Predicting the future has never been easy, and with a technology as fast moving as IT, the difficulties are great. The analyst camp is divided into two distinct but sometimes overlapping categories: predictive analysts and product assessors.
Market analysts such as IDC, GartnerGroup and Frost & Sullivan, along with a number of US research companies, are essentially predictive analysts in both technology and market terms. IDC, for example, has recently predicted that commercial transactions over the internet will reach $3tn (£1.9tn) by 2003.
In an analysis that was taken seriously enough by the City to make the front cover of The Economist in August 1999, US financial analyst Bill Parrish claims that far from being profitable, Microsoft is running at a loss. The arguments revolve around the way quarterly and yearly accounts are calculated and how stock options are counted in the scheme of things. It's a minority view, but one that seems to have commanded some attention.
Most reports are aimed at the corporate purchasers of IT equipment, essentially the IT manager, his peers and superiors. But they are also likely to be read by resellers, who are as anxious as the manufacturers to know what the future trends and big-selling items are likely to be. The problem is that following the recommendations made in some reports could lead a reseller up the road to penury.
Phil Payne, director of Isham Research, says analysts' reports are not taken seriously by users in terms of shaping IT strategy. "Users treat analysts' reports as a cover-your-back measure. The ideal situation for the IT manager is to have reports from different analysts saying different things. If you have Gartner saying consolidate, and Forrester saying don't consolidate, and their managers come and ask them what is going wrong, they can pull the appropriate report off the shelf and say, 'If they can't get it right, how are we supposed to?'," he says.
PC vs NC: taking sides
In the past two years there has been a tremendous debate in the IT industry about the future of the PC, the total cost of ownership and the network computer (NC). Started by Larry Ellison, chief executive of Oracle, the debate about whether the NC would replace the PC as the standard business desktop tool was pounced on by the analyst community. Within a year of Ellison's 1995 announcement that the future of computing lay with the NC, and not in expensive PC-based systems, analysts were taking sides.
Bloor Research was a passionate advocate of NC architectures and their superiority in price/performance terms over the Microsoft/Intel-dominated, PC-based systems. Bloor went so far as to suggest that the combination of an inexpensive NC, coupled with the Java language, would ultimately sound the death knell for Microsoft. Bloor believed the high cost of multiple PC ownership by corporations would lead to a return of mainframe-style computing based around massive servers, with NCs as the clients.
In 1996, a Bloor Research report made these predictions: "The PC market will go into decline as it is gradually undermined by thin client [NC] hardware. The corporate PC on the desktop will be largely superseded by thin client hardware. The PC operating systems will be superseded by a browser supporting a Java-enabled environment."
Of course, it is not too late for these predictions to be realised, but there is no sign of a massive decline in popularity of the desktop PC in the corporate environment. Robin Bloor, chairman of Bloor Research and author of the 1996 report, admits the report was not wholly accurate. "We were not actually wrong, but we were not right in the time scales in which we said things would change - which makes the predictions pretty useless," he says.
Paying for product evaluations
The analysts' published reports are only the tip of the iceberg. By paying a subscription, companies have access to product evaluations. Peter Gottlieb, director of alliances with software developer IT Factory, says big corporations, vendors and users subscribe to analysts' services just to get this comparative service.
Gottlieb, who was formerly director of strategy at Tetra, now a part of Sage, recalls how important such analysis can be to a vendor. "We were selling to a large corporation in the US and the deal was almost complete. The client asked Gartner for an evaluation of our product and Gartner replied that it had never heard of us. So we paid Gartner to do an independent evaluation of our product which they found met the necessary criteria - Y2K compliance and product availability and so on - and the client was satisfied," he says.
"Luckily we did not have to make any changes to the product, but if Gartner had found any deficiencies we would have had to make changes or lose the deal."
Martin Finn, commercial director at Hamilton Rentals, a Compaq and IBM reseller, agrees with Payne and Gottlieb. "We cannot ignore analysts' reports, but we don't set a great deal by them. What you have to remember is that analysts and consultants are commercial companies that are quite often commissioned on behalf of a client involved in a technology to produce a report on the future of that technology," says Finn.
It is from these commissioned reports that the published volumes are drawn. If an analyst is asked to evaluate the products of a certain company, it will inevitably have to look at those of its client's competitors as well. The published report will therefore be a subset of a specific commission.
This in no way implies that the fully published report is biased towards an individual company, market or technology. But it does mean that it is a broad, sweeping introduction to the subject.
Finn says: "Eighteen months ago, the analysts were saying there are just two operating systems for the future: NT and Unix. Nobody mentioned Linux in particular."
Guidelines, not gospel
Finn believes Hamilton's customers take some notice of the trends indicated in analysts' reports, but are not reliant on them. "In the rental side of our business, we are in the same position as our customers in that we have to make buying decisions. One cannot afford to ignore the reports about trends and technologies as reported by the analysts, but you have to rely on experience as well."
There is no doubt that analysts' reports provide some guidelines to the future direction of a technology or product set. But it is easy to be misled. The US analyst community tends, not unnaturally, to think in North American terms. When US analysts examine fields outside their home market, Europe is treated as a single entity rather than as a number of disparate nations with different economic and social strata.
To paraphrase the old saying about the wisdom of following the rules, analysts' reports are for the guidance of wise men and the obedience of fools.