Cloud 2012: More Data. More Efficiency. More Devices.
I often get asked for my thoughts on cloud computing and other data center trends. While I'll stop short of calling anything a prediction, I can tell you what is top of mind for me and many of my colleagues this year.
Unrelenting data growth will continue.
There's no stopping data growth. IDC predicts that by 2015, the amount of information managed by enterprise data centers will grow by a factor of 50, and the number of files those data centers must handle will grow by a factor of 75. Mobile data traffic alone will increase 26-fold between 2010 and 2015, reaching 6.3 exabytes per month, by which time nearly 70 percent of Internet users will have more than five network-connected devices.
As enterprises face an avalanche of data triggered by social media, application growth, and a proliferation of mobile devices, they need cost-effective ways to turn bits and bytes into meaningful information. Moreover, with 15 billion connected devices expected by 2015, the volume of data from manufacturing, retail, supply chain, smart grid, and many other applications will require new approaches to both batch and real-time analytics. Driven by this need, many organizations are developing distributed analytics platforms based on frameworks such as Hadoop.
An open-source framework for the distributed processing of large datasets across server clusters, Hadoop enables fast performance for complex analytics through massively parallel processing. It also allows database capacity and performance to be scaled incrementally by adding server and storage nodes. This approach is not without challenges, however: the usability and scalability of distributed analytics frameworks currently inhibit broad adoption.
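For readers who want to see what that programming model looks like, here is a minimal sketch of Hadoop's canonical word-count job, written against the standard org.apache.hadoop.mapreduce API (Hadoop 2.x). The map step runs in parallel across the cluster's nodes and emits (word, 1) pairs; the reduce step sums the counts for each word. The input and output arguments are assumed to be HDFS directory paths.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map step: runs in parallel on each node, emitting (word, 1) pairs.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce step: sums the counts for each word across all mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input dir
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Because the map tasks are independent, adding nodes increases both capacity and throughput, which is exactly the incremental scaling described above.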
Best practices will drive efficiency gains.
Cloud computing is one of the keys to dealing with massive amounts of data in a cost-effective manner while creating a more agile IT infrastructure.
That's the case at Intel IT, where our enterprise private cloud is up and running and has realized $9 million in net savings to date. More than 50 percent of our servers are now virtualized. We've reduced provisioning time from 90 days to 3 hours, and we see the day coming when provisioning will take minutes.
Efficiency isn't important only in the software and compute layers; it's also a focus for best practices at the infrastructure and facility levels. One such best practice is high ambient temperature (HTA) data center operation. HTA raises the operating temperature within a data center to cut the capital and operational costs of cooling, freeing that energy to power servers.
It's not as simple as turning off the air-conditioning, however. System design, rack and facility controls, and even technology component choices are all critical, which is part of the reason we've developed a blueprint of best practices that we share openly.
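As a rough illustration of the economics (the figures below are assumptions chosen for the arithmetic, not measured data), consider power usage effectiveness (PUE), the ratio of total facility power to IT equipment power. Reducing the cooling load lowers PUE, and the difference is facility power that can go to servers instead:

// Illustrative only: assumed loads and PUE values, not measured data.
public class HtaSavingsSketch {
  public static void main(String[] args) {
    double itLoadKw  = 1000;  // assumed IT equipment load in kilowatts
    double pueBefore = 1.8;   // assumed PUE with aggressive chilling
    double pueAfter  = 1.4;   // assumed PUE at higher ambient temperature

    double facilityBefore = itLoadKw * pueBefore;  // total power before
    double facilityAfter  = itLoadKw * pueAfter;   // total power after

    System.out.printf("Facility power freed up: %.0f kW%n",
        facilityBefore - facilityAfter);  // 400 kW under these assumptions
  }
}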
Client-aware computing will become essential.
In response to the proliferation and wide array of devices, client-aware computing will be a key focus in cloud data centers. In a client-aware environment, cloud-based applications both recognize and take advantage of the capabilities of the client device.
Rather than providing services that are dumbed down to a lowest common denominator (the capabilities of the most basic client devices), the cloud service adapts to deliver optimal service based on the device at hand, making full use of the capabilities of both the client and the server. Understanding the compute, graphics, battery life, security, and other attributes of the device can greatly improve the user experience while using data center resources and network bandwidth efficiently.
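As a minimal, hypothetical sketch of the idea (the DeviceProfile fields, thresholds, and tiers below are illustrative assumptions, not a standard API), a cloud service might decide how much work to push to the client based on the capabilities it reports:

// Hypothetical sketch of client-aware service selection. Field names
// and thresholds are assumptions for illustration; in practice the
// client would report capabilities via HTTP headers or a negotiation
// handshake.
public class ClientAwareService {

  static class DeviceProfile {
    int cpuCores;          // reported compute capability
    boolean hasGpuDecode;  // hardware video decode available
    int batteryPercent;    // remaining battery, 0-100
    int bandwidthKbps;     // measured network bandwidth
  }

  enum RenderMode { SERVER_RENDERED, HYBRID, CLIENT_RENDERED }

  // Choose where the work happens based on what the device can do,
  // instead of defaulting to the lowest common denominator.
  static RenderMode chooseRenderMode(DeviceProfile d) {
    if (d.cpuCores >= 4 && d.hasGpuDecode && d.batteryPercent > 30) {
      return RenderMode.CLIENT_RENDERED;  // offload to a capable client
    }
    if (d.bandwidthKbps > 2000) {
      return RenderMode.HYBRID;           // split work between ends
    }
    return RenderMode.SERVER_RENDERED;    // conserve a constrained client
  }
}

The point of the sketch is the decision itself: the server spends cycles and bandwidth only where the client cannot, rather than treating every device identically.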
Technology refresh will reinvigorate data centers.
Organizations will refresh data center technology to pack more computing power into each square foot, drive down power and cooling costs, and increase the security of data and applications.
With those goals in mind, I'm excited by the technology we're delivering in our new Intel® Xeon® processor E5 platforms. We're introducing new technology for the performance and scale of big data, power management for data center efficiency, and Intel® Trusted Execution Technology (TXT) to address some of the security requirements of cloud data centers.
I believe 2012 is going to be a year of tremendous growth and innovation enabled by cloud computing. At Intel, we are thrilled to be a part of it.
Jason Waxman is the General Manager in Intel's Data Center Group responsible for High Density Servers and Intel's initiatives in Cloud Computing. Jason holds executive positions in industry design efforts, including the board of Blade.org and the Server System Infrastructure Forum. He holds bachelor's and master's degrees in engineering and a Master of Business Administration from Cornell University. You can find him on Twitter at @jpwaxman.
The above insights were provided to InformationWeek by Intel Corporation as part of a sponsored content program. The information and opinions expressed in this content are those of Intel Corporation and its partners and not InformationWeek or its parent, UBM TechWeb.