On February 13, they were thinking about rapid development, when they should have been thinking about rapid evacuation.
They met at a round table of peers, each with a say on the topic of cognitive computing. They should have been listening to voices higher up.
They were talking earnestly face to face when they should have been paying attention to the Internet of Things.
IBM staged a roundtable discussion on "How Cognitive Intelligence and Cloud Are Reshaping App Development." A group of 15 technologists, startup executives and members of the press gathered at 44 Tehama Street in San Francisco. The four-story building in which they met was an old industrial space repurposed as a Galvanize-brand incubator site for startups. It also served as the site of IBM's Bluemix Garage in San Francisco and was filled with both practiced data scientists and unskilled newbies.
Galvanize is located in the South of Market district, in the heart of the city's tech startup territory. South of Market is also a construction zone, given the high demand for office space. Building cranes dot the neighborhood and could be seen on both sides of the site where the cognitive intelligence round table was taking place. Across the narrow street, little more than an alleyway, was one of those sites: 33 Tehama, a Lendlease residential tower project, at that point building out its 37th floor and rising.
Among the participants inside 44 Tehama were Ryan Anderson, IBM Watson architect in residence in San Francisco, and Marek Sadowski, an IBM lead cloud developer. Anderson spoke eloquently of the future role Watson could play in applications needing cognitive computing. The Watson system was so smart that it could combine historical data with current, streaming data and foresee what was about to happen next, he said.
Want to learn more about cognitive computing? See this report on an earlier cognitive colloquium: IBM Cognitive Colloquium Spotlights Uncovering Dark Data.
Given the right data that morning, Watson would have been able to warn Anderson that, with just a nudge of fate, IBM's famed AI system was in danger of being left bereft of one of its more skilled architects.
What Watson needed was sensor data from a construction elevator on the residential tower going up next door. Perhaps there was a slight vibration emanating from one of the nine struts that held the construction elevator in place. Audio data might have detected the groan of metal under severe strain. Gyroscopic data at the top of the elevator might have detected a slight change in pitch of the elevator platform. Or GPS data from the tip of the arm of the crane at the top of the elevator might have detected unwanted movement, presaging something that every construction site manager fears.
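The kind of sensor fusion described above can be sketched in a few lines. What follows is a purely hypothetical illustration, not IBM Watson's actual API or any real structural-monitoring system: the signal names, thresholds and weights are all invented to show how several Internet of Things streams might be combined into a single alert.

```python
# Hypothetical sketch of multi-sensor anomaly alerting.
# All signal names, weights and thresholds are invented for illustration;
# this is not Watson's API or a real structural-monitoring product.

from dataclasses import dataclass

@dataclass
class Reading:
    vibration_g: float   # strut vibration, in g
    strain_db: float     # audio level of metal under strain, in dB
    pitch_deg: float     # change in elevator-platform pitch, in degrees
    gps_drift_m: float   # drift of the crane tip's GPS position, in meters

def risk_score(r: Reading) -> float:
    """Normalize each signal against an assumed danger level, then weight it."""
    return (0.3 * min(r.vibration_g / 0.5, 1.0)
            + 0.2 * min(r.strain_db / 80.0, 1.0)
            + 0.3 * min(abs(r.pitch_deg) / 5.0, 1.0)
            + 0.2 * min(r.gps_drift_m / 0.1, 1.0))

def alert(r: Reading, threshold: float = 0.6) -> bool:
    """Raise the flashing-red-letters alert when the combined score is high."""
    return risk_score(r) >= threshold

# A quiet construction site vs. a strut starting to fail:
normal = Reading(vibration_g=0.02, strain_db=30.0, pitch_deg=0.1, gps_drift_m=0.005)
failing = Reading(vibration_g=0.4, strain_db=75.0, pitch_deg=3.0, gps_drift_m=0.09)
print(alert(normal), alert(failing))  # → False True
```

The design point is the same one Anderson made about Watson: no single stream is alarming on its own, but weighting and combining them surfaces a pattern that any one sensor would miss.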
If Watson had had the data, there is little doubt he would have been issuing a stream of alerts in flashing red letters – with few false positives. Two days after the round table, a strut holding the elevator's platform in place failed, causing the platform to tilt 15 degrees and tipping a heavy piece of machinery for moving liquid cement into a 2,000-pound wall slab. According to a report, the slab, also referred to as a beam, tipped outward slightly. The closest building below was Galvanize's 44 Tehama Street.
Talk about Jeopardy. Where is Watson when you need him?
At mid-afternoon on Feb. 15, firemen burst into 44 Tehama to advise its occupants to leave. The reason wasn't made clear, according to this report, leaving the building's occupants with questions: Where was the fire? Did they have to leave this minute? They were right in the middle of something. Confusion, rather than rapid message iteration, reigned. Paired programmers consulted each other on an unexpected event. There was no smoke. Was it a Fire Department error, a bug? Nevertheless, one by one they decided maybe it was time to go. Collaboration gave way to evacuation. It was every man for himself.
At least ten buildings around the site were eventually cleared of their occupants as city engineers and construction site officials assessed the situation. Lendlease flew in a special building engineer from Seattle. Then at 9 p.m., people were allowed back into their buildings, except for those at 44 Tehama St., whose evacuation continued on Feb. 16. Police sat in a parked cruiser sealing off access to Tehama Street all day on the 16th, and even pedestrians were blocked by yellow police tape stretched across the alley.
At the Feb. 13 round table, my colleague, Chris Preimesberger, an editor at eWeek, asked, "What is cognitive computing anyway?" About six different answers, some of them quite abstract, came from the eager participants.
I wanted to tell Chris that artificial intelligence is what tells you which cards have been played in a poker game and the odds that one of the remaining players is still holding a strong hand. Cognitive computing, on the other hand, is that information combined with an analysis of the expression on each player's face, matched to a historical record of their expressions, with a rating of who's holding the best hand. Cognitive computing can also detect when one of the players is swearing in French and what the oath means. But I didn't get the chance.
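The "artificial intelligence" half of that analogy is nothing more than card counting and combinatorics. As a hedged illustration (a toy calculation, not any real poker engine), here is what those odds look like in code, assuming you can see five cards and want the chance an opponent's two hidden hole cards include at least one ace:

```python
# Toy poker-odds calculation: pure combinatorics, not a real poker engine.
from math import comb

def odds_opponent_has_ace(visible_cards: int, aces_visible: int) -> float:
    """Probability that 2 unseen hole cards contain at least one ace,
    given how many cards and aces are already visible to you."""
    unseen = 52 - visible_cards
    aces_left = 4 - aces_visible
    # P(no ace in 2 cards) = C(non-aces, 2) / C(unseen, 2)
    p_no_ace = comb(unseen - aces_left, 2) / comb(unseen, 2)
    return 1 - p_no_ace

# You see 5 cards (your hand plus the flop), none of them aces:
print(round(odds_opponent_has_ace(5, 0), 3))  # → 0.165
```

That 16.5% figure is the "AI" layer; the cognitive layer would fold in the opponent's face, their history and their French oaths on top of it.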
Cognitive computing could also give you an analysis of construction accident history, company reputation data, an assessment of whether the 33 Tehama Street tower was on schedule, the frequency of fire department warnings and sensor data from multiple devices on the Internet of Things. It can combine algorithmic data with analogue data – groaning metal struts, vibrating elevator towers, quivering arms on cranes – and tell you, "Don't debate. Don't finish that line of code. Never mind the IPO. Get out NOW!"

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive ...