IBM announces Elastic Storage and Datapipe rolls out a beta analytics service for AWS users.
I had briefings this week from two companies: IBM, on a beefed-up version of its GPFS platform, now called Elastic Storage; and Datapipe, which has rolled out a new managed service for AWS customers.
IBM Updates GPFS To Elastic Storage
IBM announced Elastic Storage, a software front-end that virtualizes storage hardware and creates a single namespace, allowing multiple applications and services to access a common storage pool. IBM says Elastic Storage can run with any storage hardware, whether from IBM or a competitor. Elastic Storage also provides policy-based storage tiering, in which administrators can set rules to automatically move data among flash, flash cache, spinning disk or tape.
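IBM didn't detail how administrators express those tiering rules in Elastic Storage, but GPFS, on which it is built, uses a SQL-like policy language for information lifecycle management. As a rough sketch only, a rule of the kind described above might look like the following (the pool names 'flash' and 'nearline' and the 30-day cutoff are illustrative assumptions, not anything IBM announced):

```
/* Hypothetical GPFS-style ILM policy: demote files not accessed in 30 days
   from a flash pool to a nearline disk pool. Pool names are assumed. */
RULE 'demote_cold_data'
  MIGRATE FROM POOL 'flash'
  TO POOL 'nearline'
  WHERE (DAYS(CURRENT_TIMESTAMP) - DAYS(ACCESS_TIME)) > 30
```

The appeal of this approach is that the data movement happens automatically and transparently under the single namespace, so applications keep the same path to a file regardless of which tier it currently lives on.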
IBM says Elastic Storage, which is an enhanced version of its GPFS (General Parallel File System) platform, is ideal for high-performance analytics on large volumes of data because it can manage data access and workload distribution across multiple clusters. The company also said Elastic Storage will be available this summer as a service on IBM's SoftLayer platform, either as a standalone public cloud service or as part of a hybrid deployment.
As for pricing, IBM wouldn't provide specifics but said it would be based on the number of CPU sockets in the servers running Elastic Storage. IBM did say customers can start with a simple two-node cluster for tens of thousands of dollars.
Quick Take: IBM is clearly targeting Big Data and high-performance analytics, which may appeal to financial services, retail, healthcare, manufacturing and other verticals that are looking to crunch lots of information.
Datapipe Launches AWS Analytics Service
Datapipe, a managed services provider, is releasing a new managed service, Datapipe Cloud Analytics, for companies using Amazon Web Services (AWS). The service is designed to help customers get a company-wide view of AWS usage across various departments, better manage AWS utilization, and track just how much the organization is spending.
Included in the service are cost dashboards that provide both current and historical tracking of AWS costs. Costs can be broken out by department or function, such as dev/test and marketing. The service can also help organizations track their use of AWS services to see whether deployments are making efficient use of Amazon's resources -- for example, by identifying instances that are under-utilized and could be taken down. The new service is currently available in beta, with free 30-day trials available.
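Datapipe hasn't said how its utilization checks work under the hood, so purely as a sketch, here is the kind of rule-of-thumb check such a tool might apply: flag instances whose average CPU utilization stays below some threshold. The `flag_underutilized` helper, the threshold, and the sample figures are all hypothetical; in a real deployment the averages would be pulled from Amazon CloudWatch metrics rather than hard-coded.

```python
# Hypothetical sketch of an under-utilization check along the lines of what
# Datapipe's service reports. Real CPU averages would come from CloudWatch;
# here they are made-up sample data.

def flag_underutilized(avg_cpu_by_instance, threshold=10.0):
    """Return instance IDs whose average CPU (percent) sits below `threshold`."""
    return sorted(
        instance_id
        for instance_id, avg_cpu in avg_cpu_by_instance.items()
        if avg_cpu < threshold
    )

# Sample 14-day CPU averages (percent) keyed by instance ID -- illustrative only.
sample = {
    "i-0a1b2c": 3.2,    # mostly idle dev box -> shutdown candidate
    "i-0d4e5f": 47.8,   # busy production instance
    "i-0g7h8i": 8.9,    # lightly used test instance -> shutdown candidate
}

print(flag_underutilized(sample))  # -> ['i-0a1b2c', 'i-0g7h8i']
```

The value of a managed service here is less the check itself than doing it continuously, across every account and department, and tying the results back to the cost dashboards.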
Pricing is based on the customer's level of AWS consumption.
Quick Take: Organizations with large AWS deployments might want a hand when it comes to monitoring costs and usage. That said, potential customers should carefully price out the Datapipe service to ensure that tracking AWS costs doesn’t itself get too expensive.
Drew is formerly editor of Network Computing and currently director of content and community for Interop.