Processor giant uses big data to develop chips faster, identify manufacturing glitches, and warn of security threats.
Intel is finding big value in big data. Over the past two years the company has developed more than a dozen data-intensive projects that have bolstered both its operational efficiency and bottom line.
According to Ron Kasabian, general manager of big data solutions for Intel's data center group, these ongoing efforts have resulted in millions of dollars of cost savings.
"We started the year before last, realizing we had an opportunity to leverage data that's floating around the enterprise today, (data) we weren't dealing with," Kasabian told InformationWeek in a phone interview.
Intel's big data initiatives are summarized in the company's 2012-13 IT annual report, which discusses the chipmaker's efforts in other areas such as cloud computing.
Big data and predictive analytics are helping Intel bring new chips to market faster, said Kasabian.
"We run a huge number of complicated tests on every single chip that comes through the manufacturing process," he said. "And as we're ramping up new chips, we uncover lots of bugs and fix them."
Every chip Intel makes goes through a quality check, which includes an extensive series of tests. By analyzing historical data collected during manufacturing, Intel can reduce the number of tests it conducts.
"We're taking some of the information that's coming out of the manufacturing process for those pre-release chips, and looking at it at the wafer level," said Kasabian. "Instead of running every single chip through 19,000 tests, we can focus tests on specific chips to cut down test time."
This predictive analytics process, implemented on a single line of Intel Core processors in 2012, allowed Intel to save $3 million in manufacturing costs. In 2013-14, Intel expects to extend the process to more chip lines and save an additional $30 million, the company said.
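The wafer-level test reduction Kasabian describes can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the risk heuristic, the edge-die penalty, and the cutoff are invented, and Intel's actual models are far more sophisticated. The idea is simply to route low-risk chips to a reduced test suite based on historical wafer data.

```python
# Hypothetical sketch: use historical wafer-level results to decide which
# chips need the full test suite. The heuristic, thresholds, and field
# names are illustrative assumptions, not Intel's actual pipeline.
from dataclasses import dataclass

@dataclass
class Chip:
    wafer_id: int
    x: int          # die position on the wafer
    y: int

def predicted_fail_risk(chip: Chip, wafer_fail_rates: dict) -> float:
    """Estimate failure risk from the historical fail rate of the chip's wafer."""
    # Assumed heuristic: wafers with a poor history and dies near the
    # wafer edge get a higher risk score.
    base = wafer_fail_rates.get(chip.wafer_id, 0.05)
    edge_penalty = 0.10 if max(abs(chip.x), abs(chip.y)) > 8 else 0.0
    return min(base + edge_penalty, 1.0)

def select_tests(chip: Chip, wafer_fail_rates: dict, risk_cutoff: float = 0.15) -> str:
    """Run the full suite only for high-risk chips; a reduced suite otherwise."""
    risk = predicted_fail_risk(chip, wafer_fail_rates)
    return "full_suite" if risk >= risk_cutoff else "reduced_suite"

# A center die on a historically clean wafer skips most tests
print(select_tests(Chip(wafer_id=1, x=2, y=3), {1: 0.02}))  # reduced_suite
# An edge die on a wafer with a poor history gets the full suite
print(select_tests(Chip(wafer_id=7, x=9, y=1), {7: 0.12}))  # full_suite
```

The savings come from the second branch: if most chips land in the reduced suite, total test time drops without letting risky chips ship unchecked.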
Data-intensive processes also help Intel detect failures in its manufacturing line, which is a highly automated environment. "A lot of what we're doing is pulling log files out of manufacturing and test machines," said Kasabian. "Across our entire factory network, we're talking about 5 terabytes an hour. So it's very big volume."
By capturing and analyzing this information, Intel can determine when a specific step in one of its manufacturing processes starts to deviate from normal tolerances.
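Detecting when a process step drifts out of tolerance is classic statistical process control. A minimal sketch of that idea, assuming a simple 3-sigma control limit computed from a known-good baseline window (the metric, window, and threshold are illustrative, not Intel's):

```python
# Hypothetical sketch: flag readings that drift outside normal tolerances,
# using 3-sigma control limits derived from an in-tolerance baseline window.
from statistics import mean, stdev

def control_limits(baseline: list[float], sigmas: float = 3.0) -> tuple[float, float]:
    """Compute lower/upper control limits from a baseline of normal readings."""
    mu, sd = mean(baseline), stdev(baseline)
    return mu - sigmas * sd, mu + sigmas * sd

def flag_deviations(readings: list[float], baseline: list[float]) -> list[int]:
    """Return indices of readings outside the baseline's control limits."""
    lo, hi = control_limits(baseline)
    return [i for i, r in enumerate(readings) if r < lo or r > hi]

baseline = [10.0, 10.1, 9.9, 10.05, 9.95, 10.0]   # normal process behavior
readings = [10.0, 10.1, 12.5, 9.9]                # 12.5 is well out of tolerance
print(flag_deviations(readings, baseline))        # [2]
```

At 5 TB of logs an hour this check would run as a streaming aggregation rather than over Python lists, but the tolerance logic is the same.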
Big data benefits Intel's security efforts too. The company says its big data platform can process 200 billion server events and provide early warning of security threats within 30 minutes.
"Across the entire network there are these devices called NIDs, or network intrusion devices," said Kasabian. "They're little devices that check packets flying across the network. We're pulling all this data across thousands of these devices and feeding it into Hadoop."
After using Hadoop to capture and classify the data, Intel extracts the relevant information and loads it into an MPP (massively parallel processing) database.
"We're looking for anomalies," said Kasabian. "We started last summer with server anomalies on a small number of servers. By the end of the summer, we (were examining) pretty much all the servers on the Intel network."
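The extract-and-flag step described above can be sketched in miniature: aggregate intrusion-sensor events per server, then flag servers whose event counts stand far above the fleet's typical level. The field names and the median-multiple threshold are illustrative assumptions; Intel's real pipeline runs over Hadoop and an MPP store, not in-memory Python.

```python
# Hypothetical sketch: per-server event aggregation plus a simple
# median-based anomaly flag. Field names and thresholds are illustrative.
from collections import Counter
from statistics import median

def count_events(events: list[dict]) -> Counter:
    """Aggregate raw sensor events into per-server counts."""
    return Counter(e["server"] for e in events)

def flag_anomalous_servers(counts: Counter, factor: float = 5.0) -> list[str]:
    """Flag servers with event counts more than `factor` times the fleet median."""
    med = median(counts.values())
    return sorted(s for s, n in counts.items() if n > factor * med)

events = (
    [{"server": "web-01"}] * 4
    + [{"server": "web-02"}] * 5
    + [{"server": "db-01"}] * 40   # unusually noisy server
)
print(flag_anomalous_servers(count_events(events)))  # ['db-01']
```

A median baseline is deliberately robust here: a few very noisy servers inflate a mean but barely move the median, so they stand out instead of hiding the threshold.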
Based on its experiences thus far, Intel believes in big data's long-term potential. "We started with three or four projects in 2011," said Kasabian. "In 2012 we had between 12 and 14 projects that together represented well over $100 million in value to Intel, either in cost savings or cost avoidance."
He added: "We're just beginning to scratch the surface. Now that the manufacturing and silicon design guys understand what we can do, we're going to 10 times that in the next couple of years, no question."