Red Hat has laid out its big data manifesto, a series of principles the open source software maker believes will help businesses grapple with the issues of managing burgeoning data volumes.
The company said that its big data rules would underpin the development of its new platforms.
"To take complete advantage of big data, enterprises must take a holistic approach and transform their view of storage from a 'data destination' to a 'data platform'," the company said.
"As a platform for big data and not just a destination for data storage, enterprise storage solutions need to deliver cost-effective scale and capacity; eliminate data migration and incorporate the ability to grow without bound; bridge legacy storage silos; provide global accessibility of data; and protect and maintain the availability of data."
The company's core concepts for big data cover areas such as prioritising the uptime of data and ensuring the global availability of data sets.

Additionally, Red Hat said that it would focus on supporting legacy storage platforms and on the ability to scale platforms cost-effectively to meet user demands.
Other big data issues, such as eliminating data migration, pose bigger problems for firms, Red Hat claimed.
"With enterprise data stores now approaching petabyte sizes, wholesale data migration is no longer logistically or financially feasible," the company said.
"A big data platform must address the requirement for periodic data migration by providing a system with the ability to grow without bound."
Red Hat has been among the most aggressive businesses in the emerging big data market. In addition to releasing dedicated big data products and appliances, the company has made a number of acquisitions to expand its presence in the market.