Big data has been around a long time. Here are four best practices to help you tap into the big computing power that is finally unlocking value.
The world has gone gaga for big data. Yet, companies still struggle to understand how collecting and analyzing data can provide a deeper view of customers and lead to better business decisions.
In the midst of all the big data hype, we seem doomed to repeat the fate of the enterprise data warehouse (EDW) era, because we still don't understand what has changed. You may have noticed that the question most often asked today is the same one asked when the EDW came into vogue: "How big is it?"
The question should be: "What can we learn from this data?"
We haven't grasped that computational power -- or big computing -- is the real change that has opened the door to big opportunity. Big computing at small prices allows companies to look at, and deal with, data in ways not possible before. It's this computational capacity that has the real potential to transform data from a compliance burden into a business asset.
Organizations have always collected data, but until recently, large-scale cluster computing and analytic algorithms that could perform at scale were cost-prohibitive. That's no longer the case, and many organizations are now experimenting with big data. But they're not investing in skills and tools to analyze that data, a situation akin to planting a garden and then not watering it.
In 2014, big data was defined by masses of unstructured content. This year, big data will be defined increasingly by sensor data captured from the Internet of Things. Leaders in each industry are beginning to find real value in unlocking the potential in the data they already have. The insatiable desire to visualize data, recognize patterns, and turn data into dollars is being supercharged by high-performance computing.
In the latest wrinkle, the conversation is shifting from big data to machine learning. The foundations of machine learning were established in the 1950s by artificial intelligence pioneers such as Alan Turing, but the concepts weren't widely adopted until the 1990s. Why? For one thing, the computations were too expensive for mass adoption at the time. Today, machine learning sits at the intersection of disciplines including artificial intelligence, statistics, and data mining, and we can run billions of simulations to teach a machine to play poker or learn the concepts that define a cat.
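To make the point concrete, here's a minimal sketch of "learning by simulation" in plain Python. It's a hypothetical toy, not any particular vendor's product: an epsilon-greedy agent figures out which of three slot machines pays best purely by running 100,000 cheap simulated trials, the kind of brute-force experimentation that was cost-prohibitive a generation ago and now finishes in about a second on a laptop. The payout rates and parameter values are illustrative assumptions.

```python
import random

# Hidden payout rates for three simulated slot machines (the agent never
# sees these directly -- it must discover them through trial and error).
PAYOUT_RATES = [0.3, 0.5, 0.8]

def simulate(trials=100_000, epsilon=0.1, seed=42):
    """Run an epsilon-greedy agent and return how often each arm was pulled."""
    rng = random.Random(seed)
    pulls = [0, 0, 0]
    wins = [0, 0, 0]
    for _ in range(trials):
        if rng.random() < epsilon:
            # Explore: try a random machine to keep learning.
            arm = rng.randrange(3)
        else:
            # Exploit: play the machine with the best observed win rate.
            rates = [wins[i] / pulls[i] if pulls[i] else 0.0 for i in range(3)]
            arm = rates.index(max(rates))
        pulls[arm] += 1
        if rng.random() < PAYOUT_RATES[arm]:
            wins[arm] += 1
    return pulls

pulls = simulate()
print(pulls.index(max(pulls)))  # the agent should converge on arm 2, the best payer
```

The statistics here are decades old; what's new is that running a hundred thousand (or a hundred billion) of these trials is now a trivial expense.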
The principles of statistics, forecasting, and optimization remain the same, but to seize the opportunities in big computing, it's imperative to modernize your playbook in four ways:
Offer education. Invest in educating the IT teams that support data scientists, from business analysts to database architects. These people need to understand the basics of analytics in order to appreciate both the art of the possible and the impact of organizing data for the task of rigorous analysis.
Ensure data access. Focus on data governance to accelerate and improve access to data, not to restrict it. Speed and agility require easy, yet managed, access.
Enable exploration. Invest in analytic "sandbox" environments that support large-scale cluster computing. Provide a toolbox for your data scientists, along with the computing capacity to look deep into your data, so they can find hidden relationships and meaning.
Be agile. Let the explorers fail fast, but also be sure to market their successes. The returns from combining big computing and analytics are like compounding interest: It's a gift that keeps on giving.