Intel confirmed today it will introduce flash memory products next year that will effectively double the capacity of current devices - but cost will keep them out of computers in the near future.
StrataFlash memory, said Julian Powell, a representative at Intel UK, was already sampling and would be available in volume next year.
Intel has managed to double the number of bits each flash memory cell can store to make its breakthrough, but the major current applications for the product are in mobile phones, digital cameras and personal digital assistants, said Powell.
Higher-capacity 32Mbit and 64Mbit flash memories are rarely appropriate for PC storage because the cost per Mbyte of flash is so high, while hard drives are fast and more cost-effective, Intel agreed.
Although he was unable to give exact market shares, Powell said that Intel dominated flash memory with around 60 per cent.
While the technology will start off at the 0.4 micron level, Intel will shrink it further, he added. The new devices will be backwards compatible with previous Intel flash devices.
But despite Powell's admission that Intel will not position its fast flash against PC hard drives, some companies will use the flash memories in this way if they have particularly heavy-duty requirements. Dana Gross, an executive vice president at M Systems, said: "Our... software allows Intel StrataFlash components to seamlessly emulate a hard disk. This new two-bit-per-cell technology will drastically drive down the costs of flash memory."
Although this technology seems, for now at least, to be targeted outside the PC market, a similar technology, fast static RAM, is already used within the Pentium II family of microprocessors.