Early Big Data applications tended to be vertically focused: genomics, quantitative geology, high finance, astrophysics, and so on. Today, however, Big Data is optimizing supply chains, deciding who to fire and who to promote, improving strategic planning, and much, much more.
In the coming decade, IT will become the owner of the Big Data platforms, pressed into service to make analysts smarter and to get results faster. In 2012, we'll see Hadoop and its strangely named ilk find their way into testing and pilot projects, and companies will decide whether to do Big Data in house or through third-party providers.
Cloud Trend No. 2: The Move Out To Public Clouds.
If you ask the average person what cloud computing is, he'll say, "It's stuff like email that I don't have to run myself. It's out there in the Internet."
For consumers, cloud computing has always been public. Nobody calls the three machines in his house a private cloud, and small businesses have a huge advantage when it comes to cloud adoption because they have few regulatory concerns and little legacy moss to shed.
Consider a small consulting firm--me, in fact. A hundred gigabytes of my data is backed up across three hard drives in my house, on two operating systems (Windows and Mac OS). It's periodically backed up to versioned discs, with one of them stored offsite. Parts of it are synchronized across a tablet and a phone. It's replicated to a public storage service, which puts copies in multiple geographic locations. And I have access to all previous versions of all files created since mid-2008.
I use Dropbox, and it costs me $99 a year.
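The topology described above can be sketched in a few lines. This is an illustrative model only, not Dropbox's actual mechanism: one working copy is replicated to several destinations, each copy carrying a timestamp so older states stay recoverable. The paths and version-naming scheme are hypothetical.

```python
# Minimal sketch of a multi-destination, versioned backup: copy one
# source file into several destination directories, stamping each copy
# so earlier versions remain recoverable. Hypothetical scheme, not any
# real service's implementation.
import shutil
import time
from pathlib import Path

def replicate(source: Path, destinations: list[Path]) -> list[Path]:
    """Copy `source` into each destination under a timestamped name."""
    stamp = time.strftime("%Y%m%dT%H%M%S")
    copies = []
    for dest in destinations:
        dest.mkdir(parents=True, exist_ok=True)
        target = dest / f"{source.stem}.{stamp}{source.suffix}"
        shutil.copy2(source, target)  # copy2 preserves file metadata
        copies.append(target)
    return copies
```

Run against three destination directories (say, two local drives and one offsite mount), this reproduces the "three copies, one offsite" shape of the setup above; a real service adds deduplication, sync, and geographic replication on top.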
The advantages the cloud brings to a greenfield user are tremendous. I have access to storage, CRM, office suites, productivity tools, intranet and publishing systems, and project management tools, many for free. I don't have to enter into long-term contracts or invest in anything more than a notebook and a broadband router.
This was the promise of the cloud. Along the way, however, companies realized that the reality stopped far short of the promise. Governance, bandwidth constraints, and application inertia prevented enterprise users from reaping these benefits.
Since that time, several things have happened:
-- We've undergone at least one hardware refresh cycle, prompting organizations to compare the cost of re-investment and re-licensing against public systems.
-- Tolerance for public tools has increased, spurred in part by the rise of sharing on the consumer Internet.
-- Compliance standards have emerged, and cloud providers have implemented them, offering a degree of assurance around regulations such as HIPAA and PCI.
-- Architectures for high availability and better performance are now well understood. We know how to make unreliable cloud machines reliable in large numbers.
-- Software vendors have fixed many of the inherent dependencies that tied their applications to physical environments, from licensing systems to hardware requirements. Those that haven't are seeing increased competition from upstarts.
Taken together, these trends point to a resurgence of public cloud adoption in 2012--albeit a wiser and more nuanced one--as the cost/benefit tradeoff and competitive pressures force companies to look outward again.
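The "reliable from unreliable" point deserves a concrete illustration. The pattern is redundancy plus failover: if each node fails independently with probability p, the chance that all n replicas fail is p^n, so stacking cheap, flaky machines yields a dependable service. The replica and failure model below are hypothetical, a sketch of the pattern rather than any specific product.

```python
# Sketch of failover across redundant replicas: try each node (with a
# couple of retries) until one answers. The simulated flaky node is a
# hypothetical stand-in for an unreliable cloud machine.
import random

def call_with_failover(replicas, request, attempts_per_replica=2):
    """Try each replica in turn until one succeeds; raise if all fail."""
    last_error = None
    for replica in replicas:
        for _ in range(attempts_per_replica):
            try:
                return replica(request)
            except ConnectionError as err:
                last_error = err  # this node failed; retry or move on
    raise RuntimeError("all replicas failed") from last_error

def flaky_replica(failure_rate):
    """Simulated node that fails with the given probability."""
    def handle(request):
        if random.random() < failure_rate:
            raise ConnectionError("node down")
        return f"ok: {request}"
    return handle
```

With two replicas that each fail 90% of the time and two attempts apiece, the overall failure probability drops to 0.9^4, about 66%; add a few more nodes and it becomes negligible. That arithmetic, not any single reliable box, is what the proven cloud architectures exploit.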
Cloud Trend No. 1: The Move To Platforms.
Nobody wants to see how the sausage is made.
IT has two jobs. First, it needs to make infrastructure efficient--managing more boxes, with fewer people, more predictably, more of the time. That's accomplished by standardization, virtualization, and automation, and this process is well underway.
But IT also has a second job. It needs to make the organization more effective. In every organization, IT is a means to an end. That end is to convert resources into value worth more than it costs to do so.
If IT simply "gets out of the way" of its business users, it may as well not exist. The business users want to build applications; there's little value in making them manage boxes, networks, policies, and data themselves.
For a decade, we've been talking about the move to "service-centric IT" and "service-oriented architectures." Essentially, this means that rather than giving developers and users machines to play with, IT gives them services, APIs, and functions they can build on.
But the call for service-centric IT has fallen on deaf ears. There was no urgency for IT teams to become more service-driven, and enterprise IT had too much friction. When it took a month to procure and provision a server, the business user, exasperated, simply said, "Just give me the machine and I'll do it myself."
Now that the coefficient of friction has dropped, thanks to virtualization and automation, business users are realizing they don't want to manage machines. They want to write their code, use a few APIs, and have it work. It should scale up and down; stay patched against vulnerabilities; comply with data regulations; and automatically survive outages.
To do this, IT needs to deliver platforms, not machines. As with many things Cloudy, this is nothing new: Mainframe developers once wrote their code, without worrying about the machine on which it ran. More recently, Windows developers called OS functions to print, copy data, or let the user pick a color, rather than develop such functions themselves.
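The machine-versus-platform contrast can be made concrete. In the sketch below, the Platform class and its services are hypothetical stand-ins for what a PaaS exposes; the point is that the developer writes only the handler and never touches boxes, networks, or storage plumbing.

```python
# Toy illustration of platform-centric development: the platform owns
# durability and dispatch as services, and the developer supplies only
# business logic. The Platform class is a hypothetical stand-in for a
# real PaaS, not any vendor's API.
class Platform:
    """Minimal platform offering storage and request dispatch."""
    def __init__(self):
        self._store = {}

    def put(self, key, value):
        self._store[key] = value       # platform's job: durability

    def get(self, key):
        return self._store.get(key)

    def serve(self, handler, request):
        return handler(self, request)  # platform's job: routing, scaling

# The developer's entire contribution: logic against platform services.
def count_visits(platform, user):
    visits = (platform.get(user) or 0) + 1
    platform.put(user, visits)
    return visits
```

This is the mainframe and Windows-API pattern restated: the handler calls platform functions the way a Windows developer once called the OS to print or pick a color, and everything beneath the service boundary is someone else's problem.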
More than anything, 2012 will bring the realization that virtual machines are simply a stepping stone to a world in which we write code rather than wrangle boxes. The best IT teams will be those that embrace private and public platforms as a service and worry about real issues such as lock-in, usage costs, and compliance atop such systems.