The world has gone gaga for big data. Yet, companies still struggle to understand how collecting and analyzing data can provide a deeper view of customers and lead to better business decisions.
In the midst of all the big data hype, we seem doomed to repeat the mistake of the enterprise data warehouse (EDW) era: we don't understand what has changed. You may have noticed that the question often asked today is the same one asked when the EDW came into vogue: "How big is it?"
The question should be: "What can we learn from this data?"
We haven't grasped that computational power -- or big computing -- is the real change that has opened the door to big opportunity. Big computing at small prices allows companies to look at, and deal with, data in ways not possible before. It's this computational capacity that has the real potential to transform data from a compliance burden into a business asset.
Organizations have always collected data, but until recently, large-scale cluster computing and analytic algorithms that could perform at scale were cost-prohibitive. That's no longer the case, and many organizations are now experimenting with big data. But they're not investing in skills and tools to analyze that data, a situation akin to planting a garden and then not watering it.
In 2014, big data was defined by masses of unstructured content. This year, big data will be defined increasingly by sensor data captured from the Internet of Things. Leaders in each industry are beginning to find real value in unlocking the potential in the data they already have. The insatiable desire to visualize data, recognize patterns, and turn data into dollars is being supercharged by high-performance computing.
In the latest wrinkle, the conversation is shifting from big data to machine learning. The foundations of machine learning were established in the 1950s by artificial intelligence pioneers like Alan Turing, but the concepts weren't widely adopted until the 1990s. Why? For one thing, the computations were too expensive for mass adoption at the time. Today, machine learning sits at the intersection of disciplines including artificial intelligence, statistics, and data mining, and we can run billions of simulations to teach a machine to play poker or learn the concepts that define a cat.
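To make the point concrete, consider a small sketch (my illustration, not from any particular vendor's stack): a brute-force Monte Carlo simulation of poker hands that would have strained a machine decades ago now finishes in moments on a laptop. Scale the trial count up by orders of magnitude and the same idea underpins training machines by simulation.

```python
import random

def estimate_pair_probability(trials=100_000):
    """Estimate the probability of being dealt at least one pair
    in a five-card poker hand by brute-force simulation."""
    deck = [(rank, suit) for rank in range(13) for suit in range(4)]
    hits = 0
    for _ in range(trials):
        hand = random.sample(deck, 5)
        ranks = [rank for rank, _ in hand]
        if len(set(ranks)) < 5:  # any repeated rank means a pair or better
            hits += 1
    return hits / trials

# The exact probability is about 0.493; 100,000 trials lands very close.
print(estimate_pair_probability())
```

A hundred thousand simulated deals complete in about a second; the same experiment at a billion deals is a cluster job priced in pennies, which is precisely the shift from big data to big computing described above.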
The principles of statistics, forecasting, and optimization remain the same, but to seize the opportunities in big computing, it's imperative to modernize your playbook in four ways:
Offer education. Invest in educating the IT teams that support data scientists, from business analysts to database architects. These people need to understand the basics of analytics in order to appreciate both the art of the possible and the impact of organizing data for the task of rigorous analysis.
Ensure data access. Focus on data governance to accelerate and improve access to data, not to restrict it. Speed and agility require easy, yet managed, access.
Enable exploration. Invest in analytic "sandbox" environments that support large-scale cluster computing. Provide a toolbox for your data scientists, along with the computing capacity to look deep into your data, so they can find hidden relationships and meaning.
Be agile. Let the explorers fail fast, but also be sure to market their successes. The returns from combining big computing and analytics are like compounding interest: It's a gift that keeps on giving.