Things were looking pretty good for Unix in the enterprise until about three years ago when Windows NT emerged from the Microsoft labs.
For years Unix has been carving out a reputation as the operating system of choice for downsizing from mainframes onto open systems. Now, with Microsoft pushing the message that NT can deliver the performance, scalability and availability of open systems at a fraction of the cost, Unix appears to have a serious battle on its hands.
Since the late 1970s, Unix has been the enterprise operating system of choice for running mission-critical business applications. It was conceived almost by accident in the late 1960s, when Ken Thompson and Dennis Ritchie, working at AT&T's Bell Laboratories after the company withdrew from the Multics project, found a disused DEC PDP-7 and put it to good use. Thompson's B programming language evolved into Ritchie's C, and the operating system they built on that machine became Unix. AT&T adopted the operating system throughout the organisation and gave source-code licences to universities as part of an education initiative. This is how Berkeley Unix came into existence.
The elegance of Unix was that, so long as a C compiler was available on a particular machine, the operating system source could be ported to it. Universities were able to port the operating system to their own computer hardware. At the start, it was a true hacker's operating system. There were not many programs or utilities, so if it lacked some feature or other, someone would write it themselves, then hand it over to the Unix community at large, so that all could benefit. Academic user groups employed their expertise to enhance the operating system, and these enhancements were then put into the public domain.
From these lowly beginnings, Unix and the tools, utilities and applications surrounding it evolved into the industry we now know as open systems.
In the open systems philosophy, users can choose the software they run without having to worry whether it is available on their particular hardware platform. In ideological terms, this state of affairs is supposed to drive forward Darwinian-style survival of the best-of-breed applications.
Mike Lambert, chief technology officer at the Open Group, believes open systems is the only way the industry will progress. "The information industry is growing faster than any single supplier can deliver solutions. The only way to overcome this is by a multivendor approach." Explaining why the open systems approach is better for everyone, Lambert claims that with a proprietary approach the dominant supplier faces no competition, and so its motivation to keep innovating is diminished.
This concept is fine for users, but for suppliers it can be a recipe for commercial suicide. Over the years Unix has fragmented into many variants, to the point where the open systems message has been garbled and Unix applications are not available on every platform.
While Unix was growing and perpetuating the open systems message, Microsoft was busy developing the DOS operating system for the IBM PC, followed by a brush with OS/2 which later emerged as Windows 3.0. Only in 1993, with the help of Dave Cutler, the chief architect of Digital's VMS operating system, did Microsoft introduce Windows NT as a server operating system offering a viable alternative to Unix.
Microsoft is to many people what IBM was in the 1970s. It has utter domination of the desktop market with three strains of Windows. Its approach to openness is one of the dominant supplier setting standards which others follow.
In this respect, Microsoft has been extremely successful. For instance, by popular consensus, ODBC (Open Database Connectivity) is the agreed Windows standard by which applications talk to databases. Another is OCX, now superseded by ActiveX.
The lineage started when Microsoft's Visual Basic rapid application development tool was introduced in 1991 with VBX components. To guarantee the popularity of the new component architecture, Microsoft invited third-party developers to write their own VBX controls. VBX evolved into OCX and now into ActiveX. Today there are literally thousands of controls conforming to these de facto standards.
As a server operating system, NT certainly has a lot going for it. For a start, there is the cost of the server platform itself: on price/performance, NT on Intel-based PC architectures beats traditional Unix/Risc architectures hands down. Then there is the ease-of-use argument. Trading on the popularity of Windows 3.x, Microsoft positions NT as the enterprise operating system with the familiar Windows look and feel.
But rather than competing directly with Unix, Windows NT Server is more of a threat to Novell's NetWare dominance in the departmental and file and print server market, according to Robin Bloor, chairman and CEO of Bloor Research. "NT is winning a lot in the market for servers which cost less than $25,000. It sells very well as a small database, file and print server." Bloor believes that, while this may be causing problems for SCO Unix, NT is in no position to compete in the middle and top end of the market.
Bloor cites high availability and scalability as two attributes of Unix where NT lags behind. In terms of scalability, in Bloor's experience, "NT is not a large server operating system. Even with four processors, it's a bit iffy." He also points out that high availability is not at present an option on the operating system, although Microsoft is working on clustering technology for high-availability NT servers. Bloor adds that on NT, if a system administrator wishes to reconfigure the server, the whole system has to be taken off-line. "Imagine how much it will cost if a mission-critical server has to be shut down in order to add, say, more hard disk space."
Bloor believes NT will pose a serious threat to Unix only when it improves. Even then, he feels Microsoft is playing a catch-up game. "Microsoft has only just entered the 32-bit world. All variations of Unix are now 64-bit."
Sequent, a company often cited when the scalability card is played against Microsoft, positions NT as a mid-tier solution, according to David Chalmers, its regional technology group manager. NT will currently run on Sequent machines with up to 12 processors, claims Chalmers.
But he asks: "What about the applications? I haven't seen any that can take advantage of this extra processing potential." In his experience, storage on Windows NT servers is of the order of tens of gigabytes, whereas Unix storage runs to terabytes, which is mainframe territory.
As a cross-platform 4GL tools company, Uniface must provide tools which target both platforms. But last year the company decided to drop its Unix-hosted development environment. "It's a case of what the market wants," according to Ian Meakin, Uniface's product marketing manager. "Customers were asking less and less for the Unix development environment." But, in terms of where Uniface applications are actually being deployed, "the market for mission-critical Unix applications is growing." Meakin adds: "In our experience, the enterprise is choosing Unix. It is definitely the mainframe replacement."
But where Windows NT picks up Brownie points is in the ease with which applications can be developed. The very fact that Unix is available on many different platforms makes life difficult for some companies. Magic, a cross-platform 4GL tools company, admitted that porting to all the different flavours of Unix was "a real problem". Barry Zigner, director of technical services at Magic, is concerned about the difficulty of writing and porting for Unix: "Our QA department is larger than development. Porting (on Unix) is a real effort. We have to run the complete system test for all variations of the operating system we support." Zigner added that it takes Magic at least a month to port and test its development tool on each of the 12 flavours of Unix it supports. "Life is easier on Windows NT. There is only one API."
For Magic, Unix has an important role in its market because it increases the scalability of Magic applications. "NT has a way to go to offer a real alternative to Unix," explains Zigner, pointing to NT's very basic error-handling facilities compared with Unix. Even so, Magic believes the weakness of Unix is that there are no real standards: "It is easier and more economical to develop for NT."
Malcolm Etchells, SCO UK's channel sales and marketing director, believes a key driving force in the market today is thin clients: "People face huge costs in the Microsoft model." He adds that a key differentiator for Unix is that it can support multiple devices: "You can't plug a character terminal into an NT box."
SCO considers the question of whether Unix or NT will finally rule the IT world a redundant debate. The real issue is which new computing model will enable businesses to move into the next century, building on existing technologies while incorporating new ones.
Unix reigns in enterprise-wide, business critical applications, with over 80% of the market, according to market researcher Dataquest. It is allegedly a portable operating system and is based on proven technology.
But that alleged portability actually makes software developers' lives increasingly difficult. In practice, Unix is not portable at all: no two Unix variants are exactly alike, so an application must be ported and tested separately on each one.
The emergence of NT as a credible alternative has forced Unix's developers to realise they need to supply a coherent Unix. They finally got their act together in 1995 with the Single UNIX Specification from X/Open, now part of the Open Group, and Unix 95 branding.
But is this too little, too late? There is no black and white answer.
Operating systems are merely complex software devices which allow applications to run. In all likelihood, both Unix and NT will have to co-exist and compete in the IT world. Or to use Lambert's words: "NT will get better while there is effective competition."