Davy Nys, Pentaho's European vice president, said the Intel Hadoop distribution combined with Pentaho's analytics capabilities will allow IT departments to recognise potential infrastructure problems before they occur.
"Machines generate so much data and that data is captured on a range of hardware and devices. The log files are there but people generally only look at them when things go wrong. Many organisations want the data before something happens," said Nys, speaking to V3.
Ben Woo, Neuralytix analyst, said the Intel Distribution for Apache Hadoop announcement shows the chip maker is keen to integrate Hadoop at the silicon level.
"Intel know the chip so well so they can truly integrate the Hadoop framework into the depth of the chip. Because there are so many variations of the Intel chip, and alternative chips as well, developers always tend to work with the lowest common denominator. This is Intel's way of ensuring their CPUs are being used," he said.
"Intel has always built microprocessors for high-end processing of data. But what is new for Intel is the extent to which it's moving to touch individual customers. Intel is no longer looking to partner in OEM agreements with computer companies to reach customers. Intel is gently walking the line to get closer to the users' technology."
Meanwhile, Helena Schwenk, an analyst at MWD Advisors, said she was not surprised by Intel's big data push.
"In the last 18 months, Intel has been investing in a number of software start-ups that have some foothold in the data warehousing market, so this seems like a logical move in some respects."
"It's interesting from my perspective to see a different kid on the [big data] block. It looks like they are ramping up their efforts particularly on the encryption and administration side."
Mike Davis, principal analyst at MSMD Advisors, said the announcement is a further sign of Intel moving into the software market.
"First with McAfee, now with this. Intel doesn't want to be known as just a chipmaker ever again."