In the run-up to the millennium, company mergers and acquisitions have increased dramatically. But as merger mania hots up, little thought is given to the problem of joining the combined companies' networks. This is a significant challenge for IT and network managers, who have to bring together disparate networks. It is almost impossible to prepare for, because a merger can happen at any time and vastly different systems will often have to be joined.

However, it is easier to integrate networks today than it was 10 years ago. The ubiquity of TCP/IP makes it less likely that companies will need complex bridges and gateways between proprietary architectures.

When two become one

Mark Taylor, Microsoft's product support services director, said that TCP/IP is available on most operating systems, including OS/390-based machines. "TCP/IP has driven standards into e-mail technology," Taylor said, pointing out that the rise of standards such as POP3 has made delivery of e-mail across different systems easier. "Acquisitions and mergers no longer get killed by technology as they have done in the past," he added.

Despite these standards, merging networks still means integrating different application data streams and formats, and reconciling different Lan and Wan technologies. IP may make this easier, but it is no panacea.

Companies merging their networks have two options: handle everything in-house, or outsource the process to a third party. Keith Colman, technology manager at IT and networking consultancy Keltec Progress, explained why outsourcing can be appropriate for some firms. "Unless companies have IT experience, they won't have the broad knowledge of the options available to them in terms of bringing themselves together," he said. They should at least consult an external partner, if not commission one to do the work, he advised.
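The point about POP3 is that a standard wire protocol lets a mail client retrieve messages regardless of what platform the server runs on. A minimal sketch using Python's standard poplib module illustrates this; the host name and credentials are placeholders, not real systems.

```python
# Minimal sketch: retrieving mail over POP3 (RFC 1939) using Python's
# standard poplib module. Because POP3 is platform-neutral, the same
# client code works whether the server runs on Unix, NT or OS/390.
# Host, user and password below are hypothetical placeholders.
import poplib

def fetch_message_count(host: str, user: str, password: str) -> int:
    """Connect to a POP3 server and return the number of waiting messages."""
    conn = poplib.POP3(host, port=110, timeout=10)
    try:
        conn.user(user)
        conn.pass_(password)
        msg_count, mailbox_size = conn.stat()  # (message count, total bytes)
        return msg_count
    finally:
        conn.quit()
```

The same function would work against any RFC-compliant POP3 server, which is exactly the interoperability Taylor describes.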
Outsourcing companies and consultancies provide particular benefits in project management, said Colman, who argued that this is one of the key elements of a successful merger. The advantage of outsourcing is that you benefit from the experience of someone who has done it before, explained Savi Arora, director of the UK Network and Desktop Consulting Practice at Unisys. But you must be careful to retain strategic control. You may choose a two-pronged approach, outsourcing parts of your network integration while keeping control over others.

Some areas of the network infrastructure may be impossible to manage in-house. Wan links are often managed by a third party, for example, because most companies cannot afford to lay their own dark-fibre links. Instead, companies such as Energis or Fibernet step in to provide bandwidth from an established set of access points around the country.

Economies of scale

When two companies that each have a Wan infrastructure merge, they face a further issue. It makes no sense to employ service providers for two disparate networks when the size of the new company may allow it to take advantage of economies of scale with a single provider. That said, companies may find it useful to keep a second supplier on hand to provide a back-up service should the primary network fail; whether this is worthwhile depends on how important the data passing over the corporate backbone is.

Before two companies can merge networks, they must ensure that the technology they are using is up to date and compatible with each other's. According to Iain Bell, managing director of network integration product vendor Transition Networks, many companies are running on old, non-standard cabling infrastructures that make it difficult to link at the data transfer and application integration levels. Such organisations include smaller companies that have moved into a serviced building, and educational establishments.
"When you're connecting Ethernet and Token Ring the network operating system shouldn't cause many problems, but the hardware connection could cause companies much grief," Bell said. Installing a bridge isn't enough if you're trying to connect a BNC cabling network to a 10BaseT infrastructure - you need a device that converts between the two media.

As you move up the communications stack to other integration issues, system maturity becomes an increasingly important factor. Year 2000 compatibility should be examined even at this late stage, so that a company can decide which systems are worth keeping and which would require too much investment to bring up to an acceptable standard. Existing projects may also affect how some components of the new infrastructure are treated. If a company is in the middle of moving to Windows 2000 when a merger arises, some older NT-based servers in the other company will need to be replaced too, so that the overall structure becomes easier to manage.

Some of the nitty-gritty integration issues may only become apparent as the project progresses, especially if the companies are inexperienced at handling integration projects. Colman explained that two companies may have problems exchanging data between word processors or spreadsheets when they merge. Microsoft's dominance in the office suite market should ease such worries, because the merging companies will probably be using software from the same supplier.

Integrating server applications is more complex, because bespoke applications will have been developed with their own proprietary data structures, and any database information is likely to be structured differently across the companies being merged. Islands of incompatible data often exist within post-merger IT systems, and the companies are likely to have duplicate applications serving the same purpose but in different formats - payroll applications or customer service databases, for example.
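One common remedy for duplicate applications with different data formats is a one-off translation: records from the system being retired are mapped field by field into the surviving application's schema. A toy sketch, with entirely invented field names on both sides:

```python
# Hypothetical sketch of translating customer records from a retired
# system's format (schema A) into the surviving application's format
# (schema B). All field names here are invented for illustration.

def translate_customer(record_a: dict) -> dict:
    """Map a schema-A customer record into schema-B form."""
    forename, _, surname = record_a["full_name"].partition(" ")
    return {
        "first_name": forename,
        "last_name": surname,
        "account_ref": record_a["cust_id"].upper(),
        # Schema B stores money as integer pence, not pounds.
        "balance_pence": int(round(record_a["balance_gbp"] * 100)),
    }

legacy = {"full_name": "Jane Smith", "cust_id": "c1042", "balance_gbp": 12.50}
print(translate_customer(legacy))
```

The mapping logic is trivial here; in practice the hard part is agreeing which system's conventions (identifiers, units, name handling) survive the merger.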
It might be easier to dump one application and translate its data into the format of the other than to run both over a middleware system. This depends on how much functionality a company is prepared to lose.

Unless a company plans to eradicate one infrastructure completely and transfer all of its data into the other's formats, it will probably need to hook applications from the two companies together. Middleware is the best technique for dealing with disparities in data structure and for handling function calls between different systems, and various options are available depending on individual needs. Data warehousing products exist from a variety of vendors, although data warehousing can bring problems associated with aggregating large volumes of data, some of which may be inaccurate.

International Software Group produces ISG Navigator, a data hub that uses a number of gateway plug-ins to access data from multiple sources on an ad hoc basis.

Finding the middle ground

The product differs from data warehouses in that it does not require information to be collated and scrubbed to produce an initial universal data structure. Instead, it simply fetches the data direct from the legacy source when needed. However, this can create problems where companies are trying to pull data off a legacy machine that is already heavily used. It is possible to create data warehouses with the product too, according to Marcus Hollingshead, UK sales manager at the company.

If companies want applications to talk to each other at a functional level, rather than just throwing data at each other, there are several options. These include transaction monitors, message-oriented middleware and object request brokers. Another option is a newer type of middleware, called Enterprise Application Integration (EAI), which creates a central information hub into which all information flows.
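The data-hub pattern described above can be sketched in a few lines. This is an illustration of the general idea, not ISG Navigator itself: pluggable "gateways" fetch records direct from each source only when a query arrives, rather than pre-loading everything into a warehouse. The source names and records are invented.

```python
# Illustrative sketch (not ISG Navigator itself) of the data-hub idea:
# a gateway per source fetches data on demand, ad hoc, instead of
# collating it all up front as a warehouse would.
from typing import Callable, Dict, List

Gateway = Callable[[str], List[dict]]  # takes a query term, returns rows

class DataHub:
    def __init__(self) -> None:
        self._gateways: Dict[str, Gateway] = {}

    def register(self, source: str, gateway: Gateway) -> None:
        self._gateways[source] = gateway

    def query(self, source: str, term: str) -> List[dict]:
        # Data is pulled live from the legacy source when needed.
        return self._gateways[source](term)

# Two stand-in "legacy" sources with different native record shapes.
payroll_rows = [{"emp": "E7", "salary": 21000}]
crm_rows = [{"customer": "Acme", "tier": "gold"}]

hub = DataHub()
hub.register("payroll", lambda t: [r for r in payroll_rows if t in r["emp"]])
hub.register("crm", lambda t: [r for r in crm_rows if t in r["customer"]])

print(hub.query("payroll", "E7"))  # fetched on demand, not pre-warehoused
```

The trade-off the article notes is visible in the design: every query lands on the legacy source, so a heavily loaded legacy machine takes the extra hit.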
According to EAI developer Constellar, the difference between this and data warehousing is that the information hub can contain a set of business rules, enabling applications to interact via the hub using logical calls that the EAI software translates into function calls. Essentially, a company is warehousing its business processes along with its data.

Another way to simplify application integration is to centralise as much of the application infrastructure as possible. On a Windows NT-based infrastructure this can be achieved using Microsoft Windows NT Terminal Server or Citrix MetaFrame.

A store of knowledge

This pulls much of the application logic away from the desktop and onto the server, making it easier to administer - useful when dealing with the large number of users in a merged organisation. If a company has a mixture of different server operating systems, it may be worth looking for web-based functionality in its legacy applications (some modern applications, such as SAP R/3, have this). Alternatively, a middle-tier delivery system could be written to replace legacy code or wrap it in a component-based infrastructure, creating an application that is served over the web and interfaces with data sources at the back end. Such a system involves a major development cycle and should be seen as a medium-term goal. But if a company wants to enjoy the benefits of centralised computing, the period of redesign and redevelopment associated with the merger may be the right time to restructure its applications.

Making sure that disparate networks come together properly depends on three main things: unifying hardware and cabling infrastructures, marrying business processes and integrating applications. A company should not attempt full application integration until it has finished marrying its business processes. Unifying cable and hardware is a less complex matter and should be tackled early in the process.
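The EAI information-hub idea described earlier - a hub holding business rules that turn one logical call into the concrete function calls of whichever back-end systems need to know - can be sketched as follows. This is a hedged illustration of the pattern, not Constellar's product; the rule names and back-end functions are invented.

```python
# Hedged sketch of an EAI-style information hub (not Constellar's
# product): applications publish a logical call to the hub, and a
# business rule maps it onto concrete function calls in each of the
# merged companies' back-end systems.
from typing import Callable, Dict, List

class EaiHub:
    def __init__(self) -> None:
        self._rules: Dict[str, List[Callable[[dict], None]]] = {}

    def add_rule(self, logical_call: str,
                 handler: Callable[[dict], None]) -> None:
        self._rules.setdefault(logical_call, []).append(handler)

    def publish(self, logical_call: str, payload: dict) -> None:
        # The hub, not the caller, knows which systems are involved.
        for handler in self._rules.get(logical_call, []):
            handler(payload)

audit_log: List[str] = []

# Stand-ins for concrete function calls in the two companies' systems.
def update_billing(p: dict) -> None:
    audit_log.append(f"billing:{p['account']}")

def update_crm(p: dict) -> None:
    audit_log.append(f"crm:{p['account']}")

hub = EaiHub()
hub.add_rule("customer.moved", update_billing)  # business rule: both
hub.add_rule("customer.moved", update_crm)      # systems must be told

hub.publish("customer.moved", {"account": "A42"})
print(audit_log)  # both back ends updated from one logical call
```

The point of the pattern is that the publishing application never names the receiving systems; adding a third back end after a merger means adding a rule to the hub, not changing every caller.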
ENTEC: COMING TO TERMS WITH A NEW ENVIRONMENT

Usually, when you are trying to bring together companies spread over a wide area, there is no choice but to bring in a third party, with its own network infrastructure, to help. This was the case with Entec, an environmental consultancy owned by Northumbrian Water and formed by the acquisition of individual businesses.

Originally the various businesses had 64Kbps links and different Lan infrastructures, explained David Hawkins, IT manager at the merged company. The Lans ran a mixture of NetWare versions, and they all had different hardware and cabling infrastructures. In the early days, Hawkins said, there was an abundance of BNC cabling and some Category 3 copper, but no Category 5 infrastructure.

Before Entec could tie the various companies' offices together with a single Wan, it needed to bring their Lan environments into line. After six months of planning, the company upgraded everything across seven offices. The process took about 18 months, and some work was still going on at the time of writing. "A lot of the first six months involved putting the project together," explained Hawkins. "There were a lot of legal issues such as building control."

Eventually the project paid off. Offices now have a mixture of 10BaseT and Fast Ethernet in their Lans, with NetWare 4.11 running in remote offices and a mixture of NT and NetWare 4.11 at the central headquarters. The company went to service provider Fibernet to tie the buildings together over its ATM network, and Hawkins got Fibernet to talk to his local supplier, XBS, which handled much of the Lan cabling work. "We had three layers in the network," said Hawkins. "The PCs are all Compaqs running NT4. At the next level, everything is Bay products, and the bit that puts it all across the buildings is Fibernet."
He explained that because XBS also supplied the PCs, it minimised the number of suppliers the company had to deal with, making project management easier.

MAKING THE RIGHT MATCH

All the technical know-how in the world won't help unless you marry your technical integration with some good old-fashioned business process re-engineering, according to Dr Robert Thurlby, head of consultancy at ICL's utilities business. Systems are more compatible than they were 10 years ago, so more effort can be spent making sure things work at a cultural and process level.

According to Thurlby, one merger he has seen succeeded because it paid attention to the processes involved, while another failed because of incompatible cultures and processes. "In the failed merger they found that although they had integrated and rationalised their hardware, they had immense difficulty making the processes work," he said. "As a result, the organisation never recovered from its merger."

When two companies merge they must ensure that they share the same set of goals. It is important to decide which sets of business processes the merged company will adopt across the entire organisation, and which will be dropped. This will have a radical effect on how the network and application infrastructure is built. If one company prefers to empower its staff by providing plenty of flexibility at the desktop, while the other prefers a structured, centralised approach, the choice will have ramifications that extend across the switching and server architectures.