Recently, there's been a rash of companies boasting that they have permission to export 56-bit encryption from the US, allowing them to offer customers better security.
One such organisation is Gradient, which sells an "enterprise security infrastructure" called Net Crusader. Encryption is only one part of Net Crusader's job, but it is important in securing the confidentiality of data and in building authentication systems that resist certain types of attack.
These announcements about 56-bit encryption are not technical advances; they are a reflection of changes in US laws known as ITAR (the International Traffic in Arms Regulations), which classify encryption as a munition.
The thinking goes back to the Second World War, when encryption played a vital role in military communications. Until recently, complex encryption was in any case the province of large governments: before the advent of the PC, ordinary individuals didn't have enough computing power to handle it, and had relatively few uses for it.
The real battle over cryptography regulation started in 1991, when clauses started to appear in US legislation seeking to ensure wire-tapping access to encrypted communications.
The early Net community, seeing encryption as a vital tool, became angry. It was in this atmosphere that the free program PGP (Pretty Good Privacy) was released and distributed across the world's networks: its supporters wanted to ensure that any restrictive legislation would be unenforceable.
The controversy became a national pastime in 1993 when the Clinton Administration adopted the Clipper Chip as a standard for encryption, only to find that US citizens rebelled at the key escrow built in to the chips.
The final circuit on Clipper was blown when Bell Labs researcher Matt Blaze showed that a sample chip could be persuaded to bypass the built-in law-enforcement back door and encrypt sessions that no one would be able to crack.
Since then, several bills have been introduced into Congress seeking to lift the export restrictions. All have failed, but fresh bills are pending, along with a new government-sponsored bill that aims to regulate the domestic use of encryption for the first time. Similar proposals are pending in the UK for a network of licensed, trusted third parties.
Court challenges fared better: in December 1996, one of them won a ruling classifying encryption as protected speech under the First Amendment.
Shortly afterwards, Clinton announced he would change the rules to allow the export of 56-bit encryption (the previous limit was 40-bit), on the condition that manufacturers promise to have a key escrow system ready within two years.
Such systems have yet to be tested, and experts warn that making minor changes to encryption systems can introduce major security flaws, and that we have no mathematical model for understanding the properties of key escrow systems.
Giving the world 56-bit encryption really isn't improving the situation. For a start, as long ago as January 1996, seven leading cryptographers advised in a report for the Business Software Alliance that, for real security, we should be using 75-bit keys, and that to make systems future-proof we should be looking at 90-bit keys.
Efficient encryption, of course, is not a US monopoly. However, the major problem in deploying encryption is integrating it into the standard applications used every day, and those applications do come from the US.
Microsoft has attempted to create a standardised cryptographic API to which non-US programmers can write. However, according to Gradient, this idea was also ruled illegal under ITAR if it allowed authors to plug in encryption stronger than those same old 40 bits.
So, the battles continue. In the meantime, bear in mind that 56-bit encryption is better than nothing, that the biggest threat to company security comes from disgruntled staff or former employees, and that encryption is only one element of a security system.
Tony Lowrey, European technical manager at Gradient, says: "For our customers, encryption is not the biggest issue; it is impersonation."
The language of data encryption
Algorithm: mathematical process by which data is encrypted. Popular encryption algorithms include DES, RSA, RC4, and many others.
Brute-force attack: attempt to break into encrypted data by trying every possible key.
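To make the idea concrete, here is a toy Python sketch of a brute-force attack. The cipher, the message and the helper names are all invented for illustration: a message is XOR-encrypted with a single-byte key, so the attacker need only try the 256 possible keys.

```python
def xor_encrypt(data, key):
    """Toy cipher: XOR every byte with a single-byte key (also decrypts)."""
    return bytes(b ^ key for b in data)

def brute_force(ciphertext, known_word):
    """Try every possible key; return the first one whose decryption
    contains a word the attacker expects to see in the plaintext."""
    for key in range(256):
        if known_word in xor_encrypt(ciphertext, key):
            return key
    return None

secret_key = 0x5A
ciphertext = xor_encrypt(b"attack at dawn", secret_key)
found = brute_force(ciphertext, b"attack")
print(found == secret_key)  # True
```

A real 56-bit key space holds 2^56 keys rather than 2^8, so the same search takes specialised hardware rather than a for-loop, but the principle is identical.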
Clipper Chip: largely defunct US government encryption device, with built-in key escrow. It used a proprietary algorithm developed by the National Security Agency, and was adopted as a government standard in 1993.
DES: Data Encryption Standard. Secret-key system developed by IBM in the 1970s and endorsed by the US as a government standard. DES's useful life has been extended by the technique known as triple-DES, in which data is encrypted with one key, decrypted with a second, and then re-encrypted with a third.
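The encrypt-decrypt-encrypt (EDE) chaining used by triple-DES can be sketched as follows. A trivial byte-rotation cipher stands in for real DES here; the chaining pattern, not the toy cipher, is the point, and all the names are illustrative.

```python
def toy_encrypt(data, key):
    """Stand-in for DES encryption: rotate each byte forward by the key."""
    return bytes((b + key) % 256 for b in data)

def toy_decrypt(data, key):
    """Stand-in for DES decryption: rotate each byte back by the key."""
    return bytes((b - key) % 256 for b in data)

def triple_ede_encrypt(data, k1, k2, k3):
    # Encrypt with key 1, decrypt with key 2, re-encrypt with key 3.
    return toy_encrypt(toy_decrypt(toy_encrypt(data, k1), k2), k3)

def triple_ede_decrypt(data, k1, k2, k3):
    # Undo the three stages in reverse order.
    return toy_decrypt(toy_encrypt(toy_decrypt(data, k3), k2), k1)

message = b"secret"
ciphertext = triple_ede_encrypt(message, 5, 9, 13)
print(triple_ede_decrypt(ciphertext, 5, 9, 13) == message)  # True
```

One reason for the EDE pattern rather than three straight encryptions: setting the first two keys equal cancels those stages out, which lets triple-DES hardware interoperate with systems that only speak single DES.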
Key escrow: system for storing copies of cryptographic keys by a third party.
Key length: expressed in bits. Each extra bit doubles the number of possible keys, so the longer the key, the longer it would take to decrypt the data by mounting a brute-force attack. As the power of available hardware increases, longer keys are needed. However, key length is not the only issue in designing a good cryptographic system: the strength of the algorithm and the overall design of the security system are equally important.
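The exponential growth is easy to check with a little arithmetic (the key lengths are the ones discussed in the article; the snippet itself is just illustration):

```python
# Key space grows exponentially with key length: each extra bit
# doubles the number of keys a brute-force attack must try.
for bits in (40, 56, 75, 90):
    print(f"{bits}-bit key: about {2 ** bits:.2e} possible keys")

# Moving from 40-bit to 56-bit multiplies the attacker's work
# by a factor of 2**16.
print(2 ** 56 // 2 ** 40)  # 65536
```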
Key recovery: new euphemism for key escrow.
PGP: Pretty Good Privacy. A free program, a commercial product and a company. The software uses public-key techniques and is either sold or distributed free across the Net. The company was founded in March 1996, by the software's author, Phil Zimmermann, after the US Department of Justice dropped an investigation into whether he had contravened the export laws.
Public-key cryptography: allows strangers to communicate securely without prior arrangement. The encryption program generates a pair of complementary keys. One is kept secret; the other is ?signed? by witnesses and distributed widely. Any communication encrypted with the public key can only be decoded by the private key, while any communication encrypted with the private key can only be decrypted by the public key. The system offers confidentiality of data and authentication. Public-key cryptography was originally developed by Whitfield Diffie and Martin Hellman, who envisioned it as a system for real-time communication.
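The complementary-key idea can be shown with textbook RSA and deliberately tiny primes (a toy sketch for illustration only; real keys use primes hundreds of digits long, and the numbers below are a standard worked example, not anything from a real system):

```python
# Generate a toy key pair from two small primes.
p, q = 61, 53
n = p * q                  # modulus, shared by both keys
phi = (p - 1) * (q - 1)
e = 17                     # public exponent (published widely)
d = pow(e, -1, phi)        # private exponent (kept secret)

# A message encrypted with the public key (e, n)...
message = 65
ciphertext = pow(message, e, n)

# ...can only be recovered with the private key (d, n).
recovered = pow(ciphertext, d, n)
print(recovered == message)  # True
```

Swapping the roles of the keys, signing with the private key so anyone can verify with the public one, is what provides the authentication mentioned above.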
RSA: an algorithm and a company named for it. The RSA algorithm was developed at MIT by researchers Rivest, Shamir, and Adleman.