I bought my first PC - an Apricot twin-floppy machine with 256Kb RAM - in 1985 for £1,700. It had a word processor and spreadsheet that met most of my day-to-day needs. Within a couple of years, I'd purchased another Apricot, this time with a 30Mb hard disk, 1Mb RAM and a professional accounting package. Everything had a monochrome screen, but I got what my company needed to run the business.
By 1989, I was managing networks with essentially the same software.
Most workstations had no hard disk, but we only needed to keep our production systems on a 200Mb, 8Mb RAM server running Novell NetWare. Some workstations had their own hard disks, but these were for spreadsheet use only. In that timespan, we never had real problems. Everything just worked. I'd seen Windows 286 and an early version of PageMaker and could see the potential for a GUI, but nothing could prepare me for the shock of Windows 3.0.
All of a sudden, none of the machines had enough power to run Windows applications and, contrary to what we were led to believe, there was no way to get Excel or Word to run from the server without users complaining about performance. Today, for my £1,700, I'll get an extremely well-specified PC, complete with 32Mb RAM, a 15in colour monitor, at least 2Gb hard disk and a raft of what is quaintly known as "office productivity" software.
But will I be able to do any more than I did back in 1985? Apart from the Internet connection, the answer is no.
Now before the thought-police from Microsoft throw their hands up in horror, let's be clear about what I'm saying. Most end users do a restricted range of relatively simple tasks. That is, word processing, Email and a production task like customer enquiries, order entry or sales entry.
Power users will need spreadsheets with a reasonable amount of functionality and in some cases may process information sucked from business intelligence systems locally on their own PCs. Therefore, for the vast majority of people who use PCs in the workplace, the fundamentals have not changed because composing a letter on a PC in 1997 should be no different to what it was in 1985. But, in order to do the job, end users now need a humungous piece of hardware that 20 years ago would have been capable of driving the London Underground signalling control system. We have been brought to the position where we are faced with, and often grudgingly accept, what many industry observers believe is nothing more than bloatware.
We, as consumers and business people, have bought into the idea that as technology progresses, we'll benefit from the "newest wave" of applications.
Up until a couple of years ago that might have been acceptable, because until 1995 it was probably true to say that many of the desktop applications commonly in use lacked sufficient functionality. But software today is full of functionality most users would hardly ever touch.
Now the people who craft these applications, the software development community, work to the latest processor specification in order to build in new functionality. It doesn't require a leap of understanding to see that for the end user to run this new piece of software is likely to require a hardware upgrade. The PC is perfectly suited to this because it is the only asset any business will buy that's been designed to be ripped apart and retro-fitted with larger hard disks, additional memory or a heart transplant in the shape of a new motherboard.
Industry observers point the finger at Microsoft and Intel as the evil twins sucking hard earned cash out of users' pockets to keep their shareholders happy. But they are not the only culprits. None of the processor or memory manufacturers appear to be complaining. Neither are the hard disk manufacturers pulling sad faces.
In a situation where bloatware is the norm, everyone in the computing game - be it hardware or software - gains handsomely. But the PC architecture is bursting at the seams. Many companies are faced with periodic re-formatting of their PC stock, just to get performance out of applications that fling files all over the PC's hard disk, leave memory leaks and generally screw up the kit. But, as consumers we contribute to the problem because we buy into it, even if we get marketed to death in the process.
Now that's not much of a problem if you're Joe Bloggs down the street, because upgrading is not too dear on a once-every-five-years basis. In my case, a new board and disk plus a bit of memory means I got a "new" specification PC for about £550. But if your name is ICI or one of the large banks, then multiplying this figure by a factor of several thousand is a very big number indeed. Some companies, especially those with fewer than 25 users, have taken the view that it just isn't worth the upgrade effort because the software they rely on - perhaps a character-based application - is core to their success and there is no need to upgrade. Even some large companies are finding this to be the case.
Blue Circle Industries, with over 600 users, is only just making the move to 32-bit applications. But it will maintain its host-based system rather than face a wholesale change to other architectures, on the grounds that the applications it uses are stable and meet its business needs in most cases. What's more, the change will be gradual as Blue Circle gains experience of what does and does not work for it.
Yorkshire-based chartered accountant Shepherd Baker, which has around 25 PC users, is changing its hardware because the software needed to cope with self-assessment is Windows-only. The company's trusty 386-based PCs cannot cope.
It's easy to laugh at accountants as the gatekeepers of IT, but there are times when they bring a sense of pragmatism to the equation. "We made a policy decision in 1989 not to go down the constant upgrade path, restricting our IT spend to essential support," said Shepherd Baker's Adam Dutton.
"At the time our suppliers were pretty mad with us, but we saved a fortune that far outweighs the current investment." Dutton said that the company had neither the time nor the inclination to spend on unnecessary frills. "We're only upgrading because we have to." That, basically, is the nub of the argument for not upgrading.
However, businesses that adopt this stance are in a dilemma. Issues like Year 2000 compliance, the Internet and the fact that no software house develops character-based systems any longer mean many businesses now have to make fundamental decisions about their IT strategy. On the one hand, Microsoft would like everyone to use Windows NT plus Microsoft Office as their main productivity tool. The problem is that, inevitably, companies have to throw out much of their old hardware and start again.
However, for the vast majority of users the NT and Office approach will be very tempting because they can buy their applications and operating systems from one supplier. What's more, the allure of having products that are closely integrated with the operating system means that in theory at least, those same products will perform well.
The alternative is to go down the component technology route where users pick and mix the required parts of the chosen application. For instance, a user might want a basic word processor that does not provide the means to insert graphics or tables. On another day, that same user might want a spreadsheet that has pivot table features but not data analysis.
Opting for this strategy requires a re-think that will be uncomfortable to the generation of users who have grown up expecting to have feature laden applications. This is because in a component-based world, most applications will be held on the server. Features available to a particular user will be pre-defined as part of the user's profile. In a sense, this removes some of the control end users have over their desktop, but the potential cost savings are enormous. In the component world, users don't need powerful desktop kit to run their applications.
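As a minimal sketch of the profile idea described above (all class and feature names here are hypothetical, invented for illustration rather than taken from any real product), a server-held user profile that pre-defines which components a user may load might look like this in Java:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: a server-side profile gating which application
// features a given user's desktop is allowed to fetch. All names are
// invented for illustration; no real product API is implied.
public class UserProfile {
    private final String userName;
    private final Set<String> allowedFeatures = new HashSet<>();

    public UserProfile(String userName) {
        this.userName = userName;
    }

    // The administrator grants features centrally, on the server.
    public void grant(String feature) {
        allowedFeatures.add(feature);
    }

    // The client checks before downloading a component, so a basic
    // word-processor user never pulls the graphics or table modules.
    public boolean hasFeature(String feature) {
        return allowedFeatures.contains(feature);
    }

    public static void main(String[] args) {
        UserProfile clerk = new UserProfile("order-entry-clerk");
        clerk.grant("word-processor-basic");
        clerk.grant("email");

        System.out.println(clerk.hasFeature("email"));        // true
        System.out.println(clerk.hasFeature("pivot-tables")); // false
    }
}
```

The point of the sketch is that the feature list lives on the server, not the desktop: change the profile centrally and every client sees the new entitlement without a local upgrade.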
This is possible using a combination of techniques, but leading the way are Java, CORBA-compliant object request brokers and other Internet technologies.
At the moment, Java applets are very slow and, with a few exceptions, there is little evidence of successful Java applications. Even Java stalwart Corel has temporarily given up on Corel Office for Java, opting instead to use Citrix-like technology to get a thin-client product out. Oracle, an early developer of web-based applications, is not seeing real interest in its own application line.
However, that's not to say these new technologies don't work, just that they are in the early stages of development. Performance will improve dramatically over the next 18 months.
Many users, faced with the prospect of server-based systems are resisting change because it is seen as removing power at the desktop. This is fundamentally wrong. With the right management tools in place, there is no reason why users should lose essential functionality. What users need to do is compromise on the level of functionality. Whatever happens, the upgrade cycle can and will be broken provided enough users recognise the long term benefits of lower cost of ownership.
JAVA: THE BLOATWARE CONSPIRACY
What Java and Corba bring to the party

The applications software world is dominated by Microsoft, which controls around 80% of the office market. This is not an inherently bad thing, but in recent years it has led to massive product inflation, requiring much more powerful machines to run the latest and greatest. What's more, the pace of change means the hardware you buy today is worth a lot less tomorrow and is scrap in less than two years. Microsoft's products tend to be proprietary to their hardware platform, thus tying users to it.
Critics argue that the weight of features is producing diminishing returns because you can only add so much value with new ways to, for instance, underline a word.
In response, a loose consortium of Sun, Netscape, Oracle, Novell and IBM - the Gang of Five - has emerged. Their alternative is to offer a computing framework which starts from the premise that all software should run on any hardware, with the ability to be served and controlled world-wide. To achieve this, they advocate the adoption of open standards across the computing landscape and promote Intranet technologies as the fruit of these standards.
Products developed using the Gang of Five's technologies have the potential to cut ownership costs dramatically and perhaps lend a new lease of life to older 486 machines. This is because they are generally run through browsers which are thin, requiring little processing capability at the client.
However, to gain access to the open standards world, software developers need to work with Java, Corba ORBs (object request brokers) and Corba's IIOP specification for sharing information between ORBs. These are the fundamental building blocks for distributed, lightweight applications that can be managed centrally.
Java is relatively easy for programmers to understand as it is like a variation of C, but is fully object oriented. This means the developer can re-use components to achieve faster development times. But working with an ORB requires the skills of highly trained C++ programmers. It's real rocket science. The ORB provides the complex messaging glue between server objects and the desktop and while it is based upon industry standards, not all ORBs behave the same.
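The component re-use the paragraph above credits to Java's object orientation can be shown in a few lines. This is a hedged, self-contained sketch with invented names (no real ORB or product API is implied): a client programs against an interface, and small single-purpose components are composed rather than rewritten.

```java
// Hypothetical sketch of object-oriented component re-use.
// Names are illustrative only, not from any real product or ORB.
interface SpreadsheetComponent {
    double evaluate(double[] cells);
}

// One small, single-purpose component...
class SumComponent implements SpreadsheetComponent {
    public double evaluate(double[] cells) {
        double total = 0.0;
        for (double c : cells) total += c;
        return total;
    }
}

// ...re-used by composition rather than rewritten.
class AverageComponent implements SpreadsheetComponent {
    private final SpreadsheetComponent sum = new SumComponent();
    public double evaluate(double[] cells) {
        return cells.length == 0 ? 0.0 : sum.evaluate(cells) / cells.length;
    }
}

public class ComponentDemo {
    public static void main(String[] args) {
        double[] cells = {2.0, 4.0, 6.0};
        System.out.println(new SumComponent().evaluate(cells));     // 12.0
        System.out.println(new AverageComponent().evaluate(cells)); // 4.0
    }
}
```

In a CORBA setting, the interface would be defined in IDL and the components could live on a server behind the ORB; the client-side pattern of coding to an interface stays the same.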
Working with open-standards-based products provides users with the means to pretty well use whatever tools, development systems and applications that are appropriate to their working environment. In other words, the marriage of these technologies provides choice at a fraction of the cost of sticking with proprietary products that require both hardware and software upgrades. There is a little way to go before these technologies mature, but together they offer a tantalising promise developers cannot ignore.
SOFTWARE: THE BLOATWARE CONSPIRACY
Conspiracy theories can be interesting and entertaining, although in many cases they have only a casual relationship to the truth. A computer conspiracy theory that we have heard a number of times is that Intel and Microsoft actively plan for PC obsolescence. The theory goes like this: Intel makes a new chip, PC vendors provide more powerful machines and Microsoft writes software that consumes the available CPU and disk resource.
The net result is that you get to do very little extra with your new PC, and the extra MIPS and disk acreage are happily consumed. The whole act is further supported by Intel phasing out older chips and Microsoft ceasing to supply older, less hungry applications. Thus Intel and Microsoft happily milk the PC market, and they can continue to do this indefinitely while there is life left in it. The problem with this theory is that large organisations do not partner anything like as well as the theory suggests. It simply isn't practical. Neither Microsoft nor Intel could afford to organise its business in such a manner.
There are several better explanations of why software has got so fat.
It's simply a fact that programmers seek to occupy the space available.
If memory, CPU and disk are available, they will use them. Programmers normally have top-of-the-range PCs, and their programs usually assume that everyone else has one too. In any event, they assume performance is not a consideration because the next generation of chips will solve it. Getting product to market fast is the overriding priority, and there is no time to program for efficiency.
In reality, many software products contain areas of redundant code that never executes. When you load a new piece of PC software, you generally find all sorts of useless files have been loaded, running into many megabytes.
The applications themselves are overloaded with features, most of which are never used and generally useless; with each new version of an application, many new features are added that will rarely, if ever, be used.
Adding new features is a necessity in order to justify selling you a new version of the product. Microsoft has happily led the market in this area by upgrading its operating system regularly, and it is about to do so again. Changing the operating system gives all the other software vendors an excuse to upgrade their programs to take advantage, and, if you add significant features, you make such upgrades all but necessary.
But the operating system has to change because the chip changes, adding new features, often significant ones. And the chip has to change because Intel is in a competitive market. In other words, we are not dealing with a conspiracy. What we have described are simply the inevitable results of a PC-based, consumer-led market. As the NC supersedes the PC, the equation will change. There will be merit in software that is economic in its use of resources and it will not regularly need upgrading, because it will not depend on the chip or the operating system.