IBM Spectrum Conductor

Efficiently analyse, access and protect data with an integrated application- and data-optimised platform

Highlights

Organisations are realising that extracting full value from their data provides a critical competitive advantage. To accelerate business insights from all their data, IT managers are adopting a new generation of open-source frameworks such as Apache Spark and Hadoop MapReduce, and scale-out applications such as MongoDB and Cassandra. However, traditional IT server configurations, hypervisor environments and storage silos do not work well for these modern applications and frameworks because they are not optimised for distributed computing.

IBM Spectrum Conductor with Spark meets this challenge with software-defined infrastructure technology designed for distributed environments. It enables organisations to deploy Apache Spark efficiently and effectively, supporting multiple versions and instances of Spark alongside a broad set of born-in-the-cloud application frameworks. It increases performance and scale, maximises resource utilisation and eliminates the resource silos that would otherwise be tied to separate application implementations.
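
To make the multi-version idea concrete, the following is a minimal PySpark sketch; it uses only standard Apache Spark APIs, not Spectrum Conductor's own management interface. The point it illustrates is that application code is agnostic to which Spark distribution the hosting environment supplies, which is what allows two tenants to run different Spark versions side by side.

    from pyspark.sql import SparkSession

    # A minimal sketch: the application does not care which Spark
    # distribution the platform supplies, so the same code can run in
    # environments pinned to different Spark versions.
    spark = SparkSession.builder.appName("version-probe").getOrCreate()

    # Report the version provided by the hosting environment; under a
    # multi-version deployment, two instances can report different values.
    print(f"Running on Spark {spark.version}")

    # A small sanity workload to confirm the session is functional.
    df = spark.range(1_000_000)
    total = df.groupBy().sum("id").first()[0]
    print(f"Sum of 0..999,999 = {total}")

    spark.stop()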

IBM Spectrum Conductor with Spark can be combined with IBM Spectrum Scale to provide a highly scalable data- and application-optimised fabric that enables organisations to access, analyse and protect their data with maximum efficiency. This speeds deployment and simplifies management across resources while adding enterprise-grade capabilities such as built-in high availability (HA) and optimised utilisation with a global resource manager.
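
As a sketch of what that shared fabric looks like from an application's point of view, the snippet below reads data from a Spectrum Scale file system exposed as an ordinary POSIX mount. The mount point /gpfs/data is a hypothetical example, and only standard Spark APIs are used.

    from pyspark.sql import SparkSession

    # A minimal sketch, assuming IBM Spectrum Scale is mounted as a
    # shared POSIX file system at /gpfs/data on every node (the mount
    # point is hypothetical).
    spark = SparkSession.builder.appName("shared-fabric-read").getOrCreate()

    # Because the namespace is shared, executors on any host read the
    # same path directly; no data is copied into a per-application silo.
    events = spark.read.json("file:///gpfs/data/events/*.json")
    events.printSchema()
    print(f"{events.count()} records read from shared storage")

    spark.stop()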

IBM Spectrum Conductor with Spark offers IT managers and architects a different way of thinking about their infrastructure – one that enables IT to unleash the potential of a new generation of applications.
 

IBM Spectrum Conductor with Spark
Features | Benefits
Multi-tenant, integrated application platform | Eliminates Spark and other application silos
Spark integration | Includes an integrated Apache Spark distribution
Spark lifecycle management | Supports simultaneous deployment of different versions of Spark
Single-pane management and monitoring | Simplifies administration across a scale-out, distributed infrastructure
Multi-dimensional scaling | Scales compute and storage independently, with flexible allocation of compute and memory according to application requirements (see the sketch after this table)
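
As a sketch of the multi-dimensional scaling row above, the snippet below requests executor count, cores and memory independently through standard Spark configuration properties. The values are illustrative only, and how a global resource manager grants such requests is platform-specific.

    from pyspark.sql import SparkSession

    # A minimal sketch: compute resources are requested per application,
    # independently of the storage tier. The values below are illustrative.
    spark = (
        SparkSession.builder
        .appName("resource-shaping")
        .config("spark.executor.instances", "8")  # scale out: more executors
        .config("spark.executor.cores", "4")      # cores per executor
        .config("spark.executor.memory", "8g")    # memory per executor
        .getOrCreate()
    )

    # Confirm the settings attached to this session.
    for key in ("spark.executor.instances",
                "spark.executor.cores",
                "spark.executor.memory"):
        print(key, "=", spark.conf.get(key))

    spark.stop()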