Hitachi Data Systems (HDS) has unveiled a new engineered system designed for big data projects, combining a hyperconverged hardware platform that integrates compute, storage and virtualisation capabilities with Pentaho's big data and analytics software.
Available now, the HSP 400 series is described as Hitachi's next generation hyper scale-out platform, providing customers with ready-made infrastructure to support big data blending, embedded business analytics and simplified data management.
The HSP 400 is built from 2U rack-mount nodes, much like hyperconverged systems from vendors such as Nutanix. Each node is effectively a twin-socket Intel Xeon E5-2620 server with 192GB of memory, bays for up to 12 SAS drives for data storage, and redundant power supplies. Customers scale out by adding nodes, each of which contributes extra compute and storage to the pool.
HDS acquired Pentaho last year, and the HSP 400 therefore ships with native integration of Pentaho's Enterprise Edition big data and analytics tools. These include the Pentaho Data Integration and Business Analytics server software to simplify data blending, plus the Hortonworks Data Platform for certified Hadoop data services.
The system also comes with Hitachi tools comprising virtual machine management and a scale-out file system with a global namespace for shared storage.
According to Hitachi, this avoids the NameNode bottleneck often seen in Hadoop deployments: metadata is distributed across the cluster, so every node can serve data without relying on centralised metadata management.
The HSP 400 is delivered as a fully configured turnkey appliance, and Hitachi said it can be installed and running production workloads in hours rather than months. It also simplifies the creation of a data lake in which to integrate data sets and run analytics workloads.
However, HSP is capable of running more than Hadoop and analytics workloads. It can be repurposed to operate any workload that runs inside a virtual machine atop the KVM hypervisor, and is compatible with the OpenStack cloud framework, supporting the Glance, Nova and Swift APIs.
Sean Moser, senior vice president for Hitachi's global portfolio, explained that data silos and complexity are major pain points for the firm's large enterprise customers, and that this gets worse as data volumes grow.
"Our HSP appliance gives them a cloud and IoT-ready infrastructure for big data deployments, and a pay-as-you-go model that scales with business growth. Seamless integration with the Pentaho platform will help them put their IT and OT data to work, faster," he said.
Moser added that this is just the first solution of this type from Hitachi and Pentaho. For example, the version shipping now is fitted with SAS hard drives but Hitachi expects to have an all-flash version by the middle of 2016.