The vice president of Intel's AI products group talks evolution and new applications for artificial intelligence.

Joao-Pierre S. Ruth, Senior Writer

April 19, 2019

5 Min Read

Intel touches many branches of technology, but lately it has been giving artificial intelligence a push. InformationWeek met with Gadi Singer, vice president in the AI products group and general manager of architecture at Intel, after he spoke at the O’Reilly Artificial Intelligence Conference in New York. Singer has overseen the creation of architecture that includes the Intel Nervana Neural Network Processor. He shared his perspective not only on Intel’s efforts but also on broader trends in AI.

How has the AI scene been evolving lately?

“Three or four years ago, a lot of the work in deep learning was testing, experimentation, and finding out what could be done with the technology. At the time it was a mix of training and inference, and the tools of the trade were not highly optimized. Between then and now, deep learning frameworks such as TensorFlow arrived. That allowed the creation of a lot of new models and topologies, and it allowed us to create an optimized software stack from those deployment frameworks down to the CPU. This optimization, with new libraries and new graph compilers, gives us over a 200x improvement. The CPU that you probably have your enterprise application running on, the one you trust and are familiar with, now performs very well.

“On the new CPU hardware, we have the second generation of the Xeon Scalable processor coming out. We’ve added hardware inside the CPU to accelerate some of the operations. The hardware gives you another 10x, or even 14x, on top of the software improvement. The message is that you can use the infrastructure you already have.
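[Editor's note: as a concrete illustration of the "use the infrastructure you already have" point, here is a minimal sketch, ours rather than Intel's, of an unmodified CPU inference path in TensorFlow. The gains Singer describes come from optimized builds and libraries running underneath a script like this; the model and input here are arbitrary stand-ins.]

```python
# Illustrative only: the same inference script runs unchanged whether or not
# the underlying TensorFlow build uses CPU-optimized kernels; the speedup
# comes from the software stack beneath it, not from changes to this code.
import numpy as np
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights=None)   # untrained stand-in model
batch = np.random.rand(8, 224, 224, 3).astype("float32")  # stand-in for real images

predictions = model(batch)   # runs on the CPU with whatever kernels the build provides
print(predictions.shape)     # (8, 1000) class scores per image
```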

“Philips Healthcare does a lot of diagnostic imaging. We worked with them on two programs: one was on x-rays for bone age prediction and the other was on CT scans for lung segmentation. By optimizing the solution, they were able to achieve a 188x improvement on the x-rays and a 37x improvement on the CT scans. That optimization [helps them work with] a very large infrastructure, all the way from the patient’s bed to the data center. They can use their servers with much, much higher efficiency. They can use the infrastructure that they have and extend it with CPU architecture.

“The most demanding task in deep learning is tensor arithmetic. When you have a 3D image from tomography, it is a multidimensional array with a lot of information. For operations that are very intense and continuous, we are going to have the neural network processors in production later this year. We have two lines: the NNP-L, for learning and training, and the NNP-I, for inference.”
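[Editor's note: the sketch below, our illustration rather than Intel code, shows the kind of tensor arithmetic Singer describes: a tomography volume held as a multidimensional array, with a dense 3D convolution applied across it. The shapes and layer choices are arbitrary.]

```python
# Illustrative only: a CT volume as a multidimensional array, plus one 3D
# convolution over it, the kind of dense, continuous tensor arithmetic
# described above.
import numpy as np
import tensorflow as tf

# Hypothetical volume: batch of 1, 64 slices of 128x128 voxels, 1 channel.
volume = np.random.rand(1, 64, 128, 128, 1).astype("float32")

conv3d = tf.keras.layers.Conv3D(filters=8, kernel_size=3, padding="same")
features = conv3d(volume)   # dense multiply-accumulate across the whole volume
print(features.shape)       # (1, 64, 128, 128, 8)
```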

Image: Gadi Singer, Intel

Are there particular pain points in the market that these innovations speak to?

“In many of those cases, deep learning fundamentally changed the level of service being delivered. One example is the work we’re doing with GE Healthcare. One of the problems we worked on together is looking at digital imaging of pneumothorax, which is a collapsed lung: you have air coming in between the wall of the chest and the lung. Today, you do the digital imaging, send the patient home, invite them back in five days, and do the analysis. The objective [of the technology] was to do it while the patient waits, give the diagnosis, and start treatment the same day. Those few days might be very significant for the patient.”

Who are the users of the Nervana Neural Network Processor? Specialized equipment makers? Others?

“It’s for all the cloud service providers. It’s for Dell. It’s for an enterprise that has a data center. If you look at both lines, there is the training line and the inference line, and in both cases there is a lot of work that we do on software. In terms of operating it, you have one application in which some of the work is channeled to the CPU and you offload the relevant portions to the NNP. The software that sits on top of those makes it easier for the users to do it. So, when they deploy it in their systems, they don’t have to create some special IT infrastructure. They deploy it in the system, and the software overlay takes care of the integration.”
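[Editor's note: in framework terms, channeling some work to the CPU and offloading the rest is a device-placement decision. The sketch below is a generic illustration using TensorFlow's tf.device; the GPU device string is only a stand-in for whatever accelerator a given stack exposes, and Singer's point is that the software overlay normally makes this placement automatic.]

```python
# Generic sketch of splitting one application between a host CPU and an
# attached accelerator. A GPU stands in for the accelerator purely for
# illustration; in practice the software overlay handles this placement.
import tensorflow as tf

def preprocess(images):
    # Lightweight work kept on the CPU.
    with tf.device("/CPU:0"):
        return tf.image.resize(images, (224, 224)) / 255.0

def heavy_inference(model, batch):
    # Dense tensor arithmetic offloaded to an accelerator if one is present.
    device = "/GPU:0" if tf.config.list_physical_devices("GPU") else "/CPU:0"
    with tf.device(device):
        return model(batch)

model = tf.keras.applications.MobileNetV2(weights=None)        # untrained stand-in model
images = tf.random.uniform((4, 256, 256, 3), maxval=255.0)     # stand-in input images
print(heavy_inference(model, preprocess(images)).shape)        # (4, 1000)
```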

What is Intel’s position on data privacy?

“Supporting privacy is a great enabler, because one of the characteristics of deep learning is that you would like to use a lot of real data. When you talk about healthcare, if you want lots of patient data, you have to protect their privacy. If you collect data on the behavior of people on the street, you want to protect their privacy. Because of that dependency on aggregating data, protecting privacy is a good thing, and we are investing in it. An example is an open source project we contributed to on homomorphic transformations. It takes data, encrypts it, and morphs it in one direction in a way that cannot be backtracked to the original.”
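[Editor's note: as a toy illustration of the property behind such work, and not Intel's project or a production scheme, the sketch below uses the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so values can be combined without ever being decrypted individually.]

```python
# Toy Paillier demo of the homomorphic property: E(a) * E(b) decrypts to a + b.
# Tiny fixed primes, for illustration only; never use parameters like these.
# Requires Python 3.8+ for pow(x, -1, n).
import random
from math import gcd

p, q = 293, 433                                 # small primes (insecure, demo only)
n = p * q
n_sq = n * n
g = n + 1                                       # standard choice of generator
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p - 1, q - 1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n_sq)), -1, n)           # modular inverse of L(g^lam mod n^2)

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    return (L(pow(c, lam, n_sq)) * mu) % n

a, b = 17, 25
c = (encrypt(a) * encrypt(b)) % n_sq            # addition happens on ciphertexts
print(decrypt(c))                               # 42, without decrypting a or b separately
```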

Regarding machine learning and neural networks, is Intel working on any “explainability” that lets organizations see inside the black box and understand how decisions and recommendations are reached?

“Explainable AI is very important. Even if the machine gets it right all the time, you want to know why, and you want to know why if it doesn’t get it right once in a while. There are technologies being worked on in academia and industry on explainable AI. Some of it is done by having a partner system that explains what the other system does. Part of it is looking for ways for the system to explain itself, so that when decisions are being made, the reasoning done by the machine can be communicated. That’s something that is not deployed in all systems, but it is definitely being worked on.”
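[Editor's note: one common form of the self-explanation Singer mentions is a saliency map, where the gradient of a model's output with respect to its input shows which input regions drove a decision. The sketch below is a generic example of that technique using TensorFlow's GradientTape, not an Intel tool; the model and input are untrained stand-ins.]

```python
# Minimal gradient-saliency sketch: which pixels most influenced the top class?
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights=None)   # untrained stand-in model
image = tf.random.uniform((1, 224, 224, 3))               # stand-in input image

with tf.GradientTape() as tape:
    tape.watch(image)                                     # track gradients w.r.t. the input
    scores = model(image)
    top_class_score = tf.reduce_max(scores, axis=-1)

grads = tape.gradient(top_class_score, image)             # d(score) / d(pixel)
saliency = tf.reduce_max(tf.abs(grads), axis=-1)          # per-pixel importance
print(saliency.shape)                                     # (1, 224, 224)
```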

About the Author

Joao-Pierre S. Ruth

Senior Writer

Joao-Pierre S. Ruth has spent his career immersed in business and technology journalism, first covering local industries in New Jersey, later as the New York editor for Xconomy delving into the city's tech startup community, and then as a freelancer for such outlets as TheStreet, Investopedia, and Street Fight. Joao-Pierre earned his bachelor's in English from Rutgers University. Follow him on Twitter: @jpruth.
