News | 10/14/2015 10:05 AM

IBM Cognitive Colloquium Spotlights Uncovering Dark Data

This IBM cognitive computing event champions advances in image recognition, captioning, and deep learning to reveal insights from mushrooming IoT and machine learning data.


Data is being generated at a frenetic pace, with the Internet of Things about to multiply the Internet's current production several times over. Despite our passion to collect it, much of this data will remain unknown, un-sampled, and useless. In other words, it should be described as "dark" data.

That was the note on which John Kelly, IBM's "father of Watson" and the senior VP who pushed for Watson's contest with a human on the Jeopardy quiz show, began IBM Research's third Cognitive Colloquium, a San Francisco event that explores the changeover from linear, von Neumann computing to a compute architecture that better mimics the working of the human brain.

Today, "80% of all data is dark and unstructured. We can't read it or use in our computing systems. By 2020, that number will be 93%," he told over 300 neuroscience researchers, computer scientists, hopeful startup executives, and academics interested in cognitive computing.

(Image: DKart/iStockphoto)


At the same time, we are generating a million GB of health data for each person during a lifetime, and there are 7 billion people on Earth. By 2020, cars will be generating 350MB of data per second "and [that data] will need to be assessed," he noted.

[Want to learn more about machine learning on the IoT? See GE CEO: IoT Boosts Safety, Efficiency, Speed.]

Cognitive computing seeks to understand natural language, analyze pictures the way the human eye can, and do other complex tasks. But Kelly said, "This is not a journey to reproduce what the human mind can do," discounting worries that intelligent machines will first match, then surpass what human brains can do. The idea is rather to get computers, doing what they do well, to augment existing human intelligence.

"It's to understand what's out there, to get insights into all that data," he added. The price on not moving ahead with cognitive computing is high. More sophisticated compute power is needed to avoid such complex problems as building a $1 billion oil-drilling platform in the wrong place or having doctors and their healthcare imaging systems come up with the wrong diagnosis.

There's a trillion dollars in waste in the healthcare system, as hit-or-miss treatments and therapies are pursued without anyone knowing for sure whether they will work. In some cases they actually do not. "Think of a radiologist who sits in a room and looks at thousands of images each day. Eventually fatigue sets in." A cognitive system like IBM's Watson can analyze thousands of images tirelessly. Watson is being trained to examine X-ray images from a library of 30 million well-defined medical images. When the training is finished, Watson may be used by some organizations to supplement or inform human radiologists.

When people object that it's expensive to capture and process all that data, Kelly said he asks, "What is the price of not knowing? What is the price of not being able to cure cancer?"

Cognitive computing will usher in a new era, as different from its predecessor as today's computerized banking is from its paper-ledger predecessor. In the last two years, neural cognitive computing networks have transformed natural language processing and language translation from iffy processes into ones in which not only words, but also meanings are captured with much greater fidelity.

"There's not an industry or discipline that's not going to be transformed by this technology over the next decade," Kelly said.

Research into cognitive computing is proceeding along several promising lines. Terrence Sejnowski, a pioneer of computational neuroscience at the Salk Institute and the University of California, San Diego, illustrated the difference between today's cognitive computing skills and linear logic with an anecdote. He was once on his way to a university's computer science event when his escort warned him the faculty members he was about to meet didn't like the idea of cognitive computing and considered it a threat to their work.

As he began speaking to the group, he showed a picture of a honey bee and laid down this challenge: With only a million neurons, the honey bee "can see, fly, and mate. Your supercomputer, with billions of neurons, despite the best efforts of the NSA, can't do that. Why not?" The question stumped his would-be questioners.

The answer, of course, was that the bee's brain had been wired by evolution to accomplish those tasks through its cognitive computing; the supercomputer had not.

Today's cognitive computing systems can look at an image of a person in an outdoor setting doing something and write an accurate caption that says, "Woman throwing a Frisbee in a park." That's a result of analyzing the image in a fashion that better resembles how the human brain works. In the past, image analysis proceeded frame by frame, with the computer discarding the image of the previous frame to analyze the next. Cognitive computing retains the data for further examination. That retained context creates an area of focus and the ability to compare and draw conclusions, better matching the brain's perceptions, which occur as a series of spikes in its electrical activity.
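
The article doesn't detail the captioning system behind examples like this, but the general approach it describes (distilling the image into features, then generating caption words one at a time while retaining context between steps) can be sketched in a few lines. The following is a minimal, illustrative sketch in PyTorch; the class name, layer sizes, and vocabulary size are assumptions for demonstration, not IBM's model.

import torch
import torch.nn as nn

class CaptionModel(nn.Module):
    """Toy encoder-decoder captioner: image features feed a recurrent decoder."""
    def __init__(self, vocab_size, feature_dim=256, embed_dim=128, hidden_dim=256):
        super().__init__()
        # Encoder: a small convolutional stack stands in for a full image backbone.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feature_dim),
        )
        # Decoder: an LSTM keeps its hidden state from word to word,
        # i.e., it retains context rather than discarding each step's result.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.init_h = nn.Linear(feature_dim, hidden_dim)
        self.init_c = nn.Linear(feature_dim, hidden_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        feats = self.encoder(images)              # (batch, feature_dim)
        h0 = self.init_h(feats).unsqueeze(0)      # initial decoder state from image
        c0 = self.init_c(feats).unsqueeze(0)
        emb = self.embed(captions)                # (batch, steps, embed_dim)
        hidden, _ = self.lstm(emb, (h0, c0))      # state carried across steps
        return self.out(hidden)                   # per-step word scores

# Toy usage: one 64x64 image and a five-token partial caption.
model = CaptionModel(vocab_size=1000)
scores = model(torch.randn(1, 3, 64, 64), torch.randint(0, 1000, (1, 5)))
print(scores.shape)  # torch.Size([1, 5, 1000])

In training, the per-step word scores would be compared against the next word of a human-written caption; the point here is only that the decoder consults a retained state at every step instead of starting fresh.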

Sejnowski said the brain isn't analogous to an electronic computer. Rather, it might at best be compared to hundreds of computers wired in different ways to do different things, but able to coordinate their results. Cognitive computing will rely on neural networks, which can do pattern recognition and associative thinking; recurrent networks, which hold their data for repeated consultation by processors; and "deep, multilayered networks" that might aid in processing an image in many ways simultaneously.
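
To make the "recurrent network" idea concrete (data held in a state that is consulted repeatedly), here is a minimal sketch in plain Python with NumPy. The weights and sizes are arbitrary placeholders, not any particular research system.

import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 8
W_in = rng.standard_normal((hidden_dim, input_dim)) * 0.1    # input weights
W_rec = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1  # recurrent weights

def step(h, x):
    # The new state mixes the current input with the retained previous state.
    return np.tanh(W_in @ x + W_rec @ h)

h = np.zeros(hidden_dim)                         # start with an empty "memory"
sequence = rng.standard_normal((5, input_dim))   # five inputs arriving in order
for x in sequence:
    h = step(h, x)                               # the state is updated, not replaced

print(h.round(3))                                # final state reflects the whole sequence

A pattern-recognition network would instead map each input straight to an output; the recurrent version differs only in feeding its own previous state back in at every step.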

Cognitive computing research is also advancing the process of computerized "deep learning." Over the next several decades, he said, neuroscience, cognitive computing, and nano-science together will come closer to how the brain captures data, accesses it, and processes it. That advancement will transform computing as we know it.

"Today's big data is important. The future is going to be even bigger," he said.

But Yoshua Bengio, a professor of computer science at the University of Montreal and an expert in machine learning, said one of cognitive computing's latest efforts, the two-year-old Adam Net, has cognitive powers that ranked

Page 2: Deep learning

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive ...

Comments
Gebildete,
User Rank: Apprentice
10/29/2015 | 8:15:43 AM
Re: Good news! Oracle began to structure data!
I really dream about it. I know it's a very hard process. Konstruktor, in comparison to Oracle, is a very small company. It's simpler to organize all the data there than in Oracle.
Charlie Babcock,
User Rank: Author
10/14/2015 | 2:35:40 PM
Lab deputy director reassures on artificial intelligence
For those who worry about artificial intelligence taking over, Horst Simon, deputy director of Lawrence Berkeley National Laboratory, had a story. When he was in junior high school, he built a computer that could play tic-tac-toe, and after learning its patterns, he beat it ten times in a row. At that point he sat back, suspecting the computer was angry at him. As he looked at it, he realized "it was just a bunch of wires and light bulbs. There's nothing there that can get upset." AI is more capable but no more emotive than his tic-tac-toe machine, he said.
Charlie Babcock,
User Rank: Author
10/14/2015 | 12:25:05 PM
Cognitive computing needs a 'concept expansion'
One of the demos at the colloquium was the Watson Concept Expansion, available as an online service. It analyzes large amounts of text on a subject to build a dictionary of related words around the subject. It can map euphemisms and colloquial expressions to more commonly understood phrases and include them in the dictionary. Cognitive computing would be a good subject for Watson Concept Expansion to map, in my opinion. I didn't learn much about Adam Net. What's that? Part of Microsoft's Project Adam?
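
As a generic illustration of the concept-expansion idea described above, and not the Watson service's actual API, the sketch below collects terms that co-occur with a seed word across a small corpus and returns the most frequent ones as a rough related-word dictionary. Everything here is hypothetical example code.

# Minimal, generic sketch of "concept expansion": collect terms that frequently
# co-occur with a seed term in a corpus. Not the Watson API; purely illustrative.
from collections import Counter
import re

def expand_concept(seed, documents, top_n=5):
    related = Counter()
    for doc in documents:
        words = re.findall(r"[a-z']+", doc.lower())
        if seed in words:
            related.update(w for w in words if w != seed)
    return [word for word, _ in related.most_common(top_n)]

corpus = [
    "Cognitive computing uses neural networks for pattern recognition.",
    "Deep learning and cognitive computing analyze unstructured data.",
    "Cognitive systems augment human intelligence with machine learning.",
]
print(expand_concept("cognitive", corpus))

A real service would filter stop words and use statistical association measures rather than raw counts, but the shape of the idea is the same.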