Storage managers are struggling to keep control of storage as it expands almost exponentially, according to research conducted by VNU Business Publications.
Sponsored by storage management software provider Sagitta, the survey involved 251 telephone interviews with readers of vnunet.com's sister publication Computing who have a direct influence on their company's storage management strategy.
The results showed that storage needs will double over the next two years, on top of growth of 150 per cent over the past three years. For public sector organisations the growth is even higher.
But poor utilisation of existing storage, a lack of metrics to help manage costs, backup difficulties and an apparent absence of coherent plans for storage upgrades all point to a situation out of control.
The survey confirmed and carried forward research conducted through focus groups in March, which found that few organisations had a clear storage strategy.
Andy Norman, managing director at Sagitta, said: "It's frightening. [Storage managers] are like rabbits in headlights: scared and paralysed.
"To get storage under control probably involves new technology and skills, and they are not sure they have the skills."
Organisations said they are keen to drive storage utilisation up from the current average of 55.8 per cent to nearly 85 per cent, but there was no consensus on how to achieve this.
Responses ranged from: "Buying more digital linear tape automatic tape feeders" and "Consolidating existing servers and storage [with] storage area networks", to "Making the business functions own their requirements".
The options for solving the problem are so diverse that even the most popular approach - the centralisation or consolidation of servers - was identified by only 15 per cent of respondents.
Equally worrying is that 74 per cent were unable to quantify storage costs. Of those who did, 67 per cent put the cost per gigabyte at below £50, but some organisations paying much more pulled up the average to £168.
Only 28 per cent were currently measuring the cost of downtime, and 40 per cent had no plans to do so. Only 33 per cent were attempting to measure upgrade costs, and as many as 44 per cent were not planning to do this.
The organisations represented an even split across public sector, finance, manufacturing/retail, media/professional and other sectors.
All had over 50 employees: 66 were small and medium sized enterprises with fewer than 200 employees, 94 had up to 1,000, and 85 had over 1,000.
Daily back-ups typically took one to six hours within an available window of 10 hours. But 16 per cent were experiencing regular back-up problems, with public sector organisations suffering most at 25 per cent.
It was unclear how businesses supporting 24-hour or multiple time zone working were coping.
In hardware terms, most companies had between 10 and 50 servers and a mix of storage that included direct attached and network attached, or storage area networks.
"But they might as well throw half of it away [without a storage strategy]," said Norman.