Hewlett Packard Enterprise used its HPE Discover conference in Las Vegas Tuesday to unveil a series of updates around the company's cloud portfolio, discuss its vision for software-defined storage in data centers, and offer an open invitation to developers to participate in creating a new computing architecture the company calls The Machine.
HPE has undergone major changes in recent months. The company was formed when Hewlett-Packard split its PC and enterprise businesses; more recently, HPE announced plans to spin off its IT services business and merge it with CSC.
Tuesday's news was forward-looking, focused on momentum and plans for HPE's core business.
"Hewlett Packard Enterprise will continue to focus on secure next-generation software-defined infrastructure that leverages a world-class portfolio of servers, storage, networking, converged infrastructure, hyperconverged systems, as well as our Helion Cloud Platform and, of course, our world-class software assets," CEO Meg Whitman said at the event. "By bringing together leadership positions in these key data center technologies, we'll help customers run their traditional IT better while building a bridge to a multi-cloud environment."
Here's a rundown of the big announcements from HPE Discover on Tuesday.
The company announced that "The Machine" -- a new computing architecture under development in Hewlett Packard Labs and first revealed in 2014 -- will now be open to developers. In fact, HPE is inviting developers to collaborate to help make The Machine a reality. On a community web page devoted to the project, HPE stated this research project is focused on "reinventing the computer architecture on which all computers have been built for the last 60 years." The company said this opening of the project "extends HPE's longstanding commitment to, and participation in, open source software."
The Machine's architecture puts memory, not processors, at the core, according to the project's community page. "Memory-Driven Computing collapses the memory and storage into one vast pool of memory called universal memory," the company stated.
Instead of using electricity, the new architecture uses light to connect the memory and processing power.
"We're using advanced photonic fabric. Using light instead of electricity is key to rapidly accessing any part of the massive memory pool while using much less energy," the company stated on its community page.
Developers are invited to visit The Machine's community page, where HPE is offering developer tools built around four code contributions:
- A database engine designed to speed up applications by taking advantage of a large number of CPU cores and non-volatile memory
- A fault-tolerant programming model for non-volatile memory, which HPE said bypasses conventional file systems and databases to manage data structures directly in persistent memory, keeping them consistent across system failures
- Fabric-attached memory emulation, which is an environment to let users explore the new architectural paradigm of The Machine
- A DRAM-based performance emulation platform that leverages features available in commodity hardware to emulate the different latency and bandwidth characteristics of this new architecture
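The fault-tolerant programming model above is built around a simple idea: updates to persistent data must be all-or-nothing, so a crash never leaves a half-written structure behind. As a loose conceptual sketch only -- not HPE's actual tooling, which targets byte-addressable non-volatile memory rather than files -- the same failure-atomicity pattern can be illustrated in Python using an ordinary file with `fsync` plus an atomic rename as the commit point:

```python
import json
import os
import tempfile

def save_atomic(path, data):
    """Persist `data` as JSON so that after a crash, storage holds either
    the old version or the new one -- never a torn mixture."""
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dir_name)
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f)
            f.flush()
            os.fsync(f.fileno())   # force the bytes to stable storage first
        os.replace(tmp, path)      # atomic rename: the single commit point
    except BaseException:
        os.remove(tmp)             # a failed update leaves no debris
        raise

def load(path, default=None):
    """Recover the last committed state, or `default` if none exists."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return default

# Usage: update a persistent counter. If the process dies mid-update,
# load() still returns the last fully committed value.
state = load("counter.json", {"count": 0})
state["count"] += 1
save_atomic("counter.json", state)
```

The file names and helper functions here are hypothetical illustrations; a model for true persistent memory would manage in-memory data structures directly, without the file-system detour, which is precisely the indirection HPE says its contribution eliminates.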
HPE Expands Helion Cloud Portfolio
The company introduced several updates to its flagship cloud portfolio during its Discover conference, including the Helion Cloud Suite. The software suite is designed to let organizations deliver and manage all their applications in many forms -- traditional, virtualized, cloud-native, and containerized -- across the full range of infrastructure environments.
HPE Helion CloudSystem 10, a hardware and software environment for rapid deployment of enterprise-grade cloud, offers deep integration with HPE OneView 3.0 for automatic provisioning of cloud resources from bare metal, according to HPE.
HPE Helion Stackato 4.0 is the company's Cloud Foundry-based open platform-as-a-service for developing cloud-native applications.
Composable Data Fabric
HPE took its software-defined storage vision further at the event, too. Pitching its solutions hard against rival EMC's offerings, HPE product leaders spoke about the company's plan to pool all data center storage assets into a single shared resource.
HPE executives said the company's StoreVirtual technology provides a set of common data services embedded in every ProLiant server, hyperconverged appliance, HPE Synergy system, Helion OpenStack deployment, and newer infrastructure for Network Functions Virtualization.
From HPE OneView, administrators can treat that capacity as a single pool and orchestrate it entirely in software, bringing enterprises closer to the goal of a software-defined data center.