While you're cheering or crying over your team's latest performance during Euro 2000, spare a thought for the work going on behind the scenes. One of the hardest working systems during the tournament will be the official Euro 2000 website that Uefa, European football's governing body, has been planning since the previous competition four years ago.
The site, which can be found at www.euro2000.com, is expected to be the focal point for fans looking for up-to-date news and details of how their team is faring during specific matches. The organisers are predicting 1.5 million hits per day, and the site's infrastructure has been built to handle 1.5 billion page impressions.
The site has been planned and built by four companies: Sportal, ArrowPoint, PSINet and CacheFlow. Sportal already hosts sport-related sites in eight countries, using seven languages, and the content for the Euro 2000 site will similarly need to be localised for each country.
Content aside, the site needed to be able to handle a lot of traffic, much of it generated in peaks around the matches. Neil Macdonald, chief operating officer at Sportal, said: "We asked Uefa how much traffic it expected and it said it didn't know. We know how much traffic there was during France 98, but there are a lot more internet users now than there were back then. We also know it's going to peak before, during and after each match has been played."
Coping with traffic
To deal with all the traffic it expects to generate, Sportal recruited PSINet to provide the backbone for the site. The company hosted the official World Cup 98 site and boasts more than one million miles of fibre optic cable throughout the world. In January of this year, Sportal and PSINet began working on a design for a network that would be able to cope with large quantities of traffic. It was felt that the way to guarantee a good response time for each user was to hold the information as close to those users as possible, effectively caching it.
The eventual design that was agreed begins with Sportal's Sun Microsystems Unix servers running Vignette content management software, which are contained in a virtual private network and used to create the content for the site. Data from these servers is then posted to Apache web servers in Holland and Los Angeles, and a content management tool notes when new information has been added.
The next link in the chain is a network of caching devices situated in Amsterdam, London, Los Angeles, Vancouver, Paris, Berlin, Frankfurt, Geneva and Valencia. These include four super PoPs (points of presence), where four or five CacheFlow devices are situated, with just two caching devices in smaller locations. The content management tool then sends a request to the caching servers, telling them to delete the content that has been updated and upload the new content. This action can be performed several times per minute.
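The invalidate-then-upload cycle described above can be sketched in a few lines. This is a minimal illustration, not CacheFlow's actual interface: the class names, the `publish` method and the sample content are all hypothetical.

```python
class EdgeCache:
    """One caching device at a PoP, holding page content keyed by URL path."""

    def __init__(self, location):
        self.location = location
        self.store = {}

    def invalidate(self, path):
        # Delete the stale copy so it can never be served again.
        self.store.pop(path, None)

    def upload(self, path, content):
        # Accept the fresh content pushed out from the origin.
        self.store[path] = content


class ContentManager:
    """Notes when content has changed and pushes the update to every cache."""

    def __init__(self, caches):
        self.caches = caches

    def publish(self, path, content):
        # The article says this can happen several times per minute.
        for cache in self.caches:
            cache.invalidate(path)
            cache.upload(path, content)


caches = [EdgeCache(loc) for loc in ("Amsterdam", "London", "Los Angeles")]
cms = ContentManager(caches)
cms.publish("/scores/eng-ger", "England 1 - 0 Germany (45 min)")
print(caches[1].store["/scores/eng-ger"])  # the London cache now holds the update
```

The key property is that the origin pushes updates out rather than waiting for caches to expire content on a timer, which is what keeps every PoP "bang up to date".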
Nigel Hawthorn, marketing director at CacheFlow, said: "The network infrastructure has the built-in ability to push new information to all of these caches held in Europe and the US, so that they are providing information that's bang up to date. That was the challenge - to be able to make sure the content is always fresh."
Sitting in front of these caching devices are the ArrowPoint load-balancing switches - CS-800s at the super PoPs and CS-100s at other locations.
Their load-balancing function points browsers to their nearest cache of information, thereby speeding up the process. However, a user's request must also be routed to an alternative device if the local cache is in danger of becoming overloaded.
Patrick van Arendonk, systems engineer at ArrowPoint, said: "From a global load balancing view it makes sense for users to be pointed to their nearest PoP. What the ArrowPoint web switches do is look for the best location to provide content. Most of the devices talk to each other about how much load is on each site."
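The decision van Arendonk describes - nearest PoP first, overflow to a lightly loaded one - can be sketched as follows. The PoP list, latencies, load figures and the 90 per cent threshold are all illustrative assumptions, not ArrowPoint's actual algorithm.

```python
# Hypothetical snapshot of the state the web switches share with each other.
POPS = {
    "Amsterdam": {"latency_ms": {"DE": 12, "UK": 10}, "load": 0.95},
    "Paris":     {"latency_ms": {"DE": 18, "UK": 14}, "load": 0.30},
    "Frankfurt": {"latency_ms": {"DE": 5,  "UK": 20}, "load": 0.97},
}
OVERLOAD_THRESHOLD = 0.90  # assumed cut-off, for illustration only


def choose_pop(country):
    # Prefer the PoP with the lowest latency to the user's country...
    by_distance = sorted(POPS, key=lambda p: POPS[p]["latency_ms"][country])
    for pop in by_distance:
        # ...but skip any PoP that is close to overload.
        if POPS[pop]["load"] < OVERLOAD_THRESHOLD:
            return pop
    # If every PoP is overloaded, fall back to the least-loaded one.
    return min(POPS, key=lambda p: POPS[p]["load"])


print(choose_pop("DE"))  # Frankfurt is nearest but overloaded, so Paris
```

With these figures, a German user is sent to Paris because both Frankfurt and Amsterdam are near capacity, which matches the England-Germany scenario described later in the article.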
Rob Enderle, an analyst at Giga Information Group, said caching is a good way of giving companies a presence in a lot of countries, as it has a lower cost than using replicated servers and is easier to maintain.
But he said it is dependent on how much content is specific to each region. "If most content is country specific - for instance local language - the caching technique is not particularly practical," he said.
One of the major fears for the site is that it will have underestimated the amount of traffic that will be generated during the contest. To combat this and as a back-up in case of system failures, the network has a lot of redundancy built into it. Sportal's Macdonald said: "The whole site is fully redundant and everything in the architecture is duplicated - in some cases more than once. This allows for the loss of whole data centres, not just server failures."
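Macdonald's point about surviving the loss of a whole data centre amounts to trying replicas in order until a healthy one answers. The sketch below assumes a simple health flag per data centre; the names and the health-check mechanism are invented for illustration.

```python
# Every component is duplicated, so a request falls through to a replica
# when a data centre is lost. Names are hypothetical.
data_centres = [
    {"name": "Amsterdam-1", "healthy": True},
    {"name": "Amsterdam-2", "healthy": True},
    {"name": "Los Angeles", "healthy": True},
]


def serve_request():
    # Try each replica in order; failures are absorbed as long as
    # at least one data centre survives.
    for dc in data_centres:
        if dc["healthy"]:
            return dc["name"]
    raise RuntimeError("all data centres down")


data_centres[0]["healthy"] = False  # simulate losing a whole data centre
print(serve_request())  # falls through to Amsterdam-2
```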
But the best news for England fans who find themselves away from a TV set but next to a PC is that crucial games such as England versus Germany are in safe hands. Mike Holden, project manager at ArrowPoint, said: "Let's say lots of users were coming from Germany while the England versus Germany match was on, and we hit a capacity issue. If we experience a lot of demand for content going into Germany we can point users off to, say, Paris, which might be lightly loaded."