There may be those who would preach that a company needs to turn over its artificial intelligence initiative to data scientists and let them run with it. Wow, would they be wrong! In fact, there may be no limit to the number of organizational roles that need to be involved in planning, building, implementing, scaling, and working with AI in any given enterprise.
Over the past two days, Sam Charrington, founder of CloudPulse Strategies, and a half dozen other expert speakers provided an idea-to-implementation look at the life cycle of an AI initiative, and at where AI technology concepts are heading in the next few years. They spoke during the Interop ITX AI Summit before an audience of AI practitioners, data scientists, IT professionals, and business managers.
Charrington conceded that the challenges in understanding AI and machine learning start with the varied definitions for the concepts, and the "buzzword soup" that surrounds them. To simplify, he said, "AI refers to technologies that allow software to do things that seem human-like." Then, he added, "Machine learning refers to statistical techniques that allow us to make predictions that are trained on data."
From that foundation, the experts carried the audience through the AI process up through more technology topics such as the AI infrastructure within clouds and how to operationalize machine learning. "What we really are trying to do is widen the circle of folks who can take advantage of machine learning and AI," said Charrington.
For those getting started with their first AI projects, Charrington said that even though your competitors might have a head start on you, it's not too late to adopt AI. He said businesses have to view AI along the lines of a Chinese proverb: "The best time to plant a tree was 20 years ago. The second best time is now." In other words, get rid of the defeatist attitude and get moving.
There's general agreement that you need the right business problem to solve when starting out in AI. However, what executives shouldn't do is wait for a problem to appear. "Now is the time to proactively look for problems where you can apply this. Yes, I think it's that important," he said, adding that you could toss a dart at a company org chart and find an area that could benefit from AI.
Identifying the problems to be solved, and the kind of improvement that should result -- be it a new product or service, or a process improvement -- is where business leaders need to work with technologists and data scientists to match business goals with technology capabilities.
Putting AI and machine learning into action is where David Karandish, founder and CEO of Ai Software, took over.
There's been plenty of discussion about how to use intelligent assistants or agents in the corporate world, taking a step beyond the bots that have popped up on websites in recent years. Karandish introduced the audience to his company's "Jane", a chat-based assistant that answers questions for employees and customers when integrated with a client company's internal systems. It's in use at several client companies besides his own.
Jane leverages APIs for applications such as human resources, email, calendar, and CRM, as well as office documents and corporate policies, to answer queries ranging from "How many vacation days do I have?" to questions about the status of customer contracts.
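The kind of routing described above -- mapping an employee's question to the right backend system -- might be sketched as below. The connector names and keyword rules here are invented for illustration; a production assistant would use trained intent models rather than keyword matching.

```python
# Hypothetical sketch of routing a question to a backend connector.
# All names (hr_api, crm_api, etc.) are illustrative, not Jane's actual API.

def route_question(question):
    """Pick a backend connector based on keywords in the question."""
    q = question.lower()
    routes = [
        (("vacation", "pto", "leave"), "hr_api"),
        (("meeting", "schedule", "calendar"), "calendar_api"),
        (("contract", "customer", "deal"), "crm_api"),
        (("policy", "handbook"), "document_search"),
    ]
    for keywords, connector in routes:
        if any(k in q for k in keywords):
            return connector
    return "fallback"  # no match: ask for clarification or escalate

print(route_question("How many vacation days do I have?"))        # hr_api
print(route_question("What is the status of the Acme contract?")) # crm_api
```

In a real deployment each connector would wrap an authenticated API call; the point of the pattern is that one conversational front end fans out to many systems of record.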
After demonstrating Jane's capabilities, Karandish offered seven tips on designing a conversational user experience. A Jane-like interface should:
- Speak plain English
- Convey personality and delight the user where possible, even with a compliment
- Keep responses concise and focused, not wordy
- Provide reassurance that the user's intent was correctly understood when providing an answer
- Confirm what the user wants done when necessary, such as when to delete a file
- Anticipate what a user needs and provide information that is relevant
- Avoid trying to answer too many questions at once
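Two of these tips -- confirming destructive actions and reassuring the user that their intent was understood -- can be sketched in a few lines. This is an illustrative pattern, not Jane's actual code; the intent names are hypothetical.

```python
# Illustrative sketch: confirm before destructive actions, and restate
# the understood intent so the user knows they were heard correctly.

DESTRUCTIVE_INTENTS = {"delete_file", "cancel_meeting"}

def respond(intent, entity, confirmed=False):
    action = intent.replace("_", " ")
    if intent in DESTRUCTIVE_INTENTS and not confirmed:
        # Confirm what the user wants done before doing anything irreversible.
        return f"Just to confirm: {action} '{entity}'?"
    # Reassure the user by restating the understood intent with the answer.
    return f"OK, I will {action} '{entity}'."

print(respond("delete_file", "notes.txt"))                 # asks for confirmation
print(respond("delete_file", "notes.txt", confirmed=True)) # proceeds
```

The design choice worth noting: confirmation is gated on a list of destructive intents, so routine read-only queries stay concise and friction-free.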
He also recommended that a conversational user interface be equipped to route challenging questions to human experts when necessary. What is learned from those human experts then feeds into the system's knowledge base.
Karandish emphasized that once the conversational interface is set up, it's important that it learn from the questions users ask, rather than being instructed ahead of time by developers.
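The human-in-the-loop pattern Karandish described -- escalate unanswerable questions to an expert, then fold the expert's answer back into the knowledge base -- can be sketched minimally as follows. All names and data here are invented for illustration.

```python
# Minimal sketch of learn-from-escalation: unknown questions go to a human
# expert, and the expert's answer is stored so the system improves over time.

knowledge_base = {"how many vacation days do i have?": "You have 12 days left."}

def ask(question, expert_answer=None):
    key = question.strip().lower()
    if key in knowledge_base:
        return knowledge_base[key]       # answer from what we've learned
    if expert_answer is not None:
        knowledge_base[key] = expert_answer  # learn for next time
        return expert_answer
    return "Let me route that to a human expert."

print(ask("Who approves travel requests?"))                  # unknown: escalate
print(ask("Who approves travel requests?", "Your manager.")) # expert answers, KB learns
print(ask("Who approves travel requests?"))                  # now answered from the KB
```

This is why Karandish's point about learning from real user questions matters: the knowledge base grows from what people actually ask, not from what developers guessed in advance.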
Looking to AI's future, Janakiram MSV, analyst, advisor, and architect with Janakiram and Associates, gave attendees a peek into what machine learning can do today, and where it will be in the very near future. It included a homemade, live demonstration of how a device such as a residential doorbell or a toll-gate camera can change the way we interact with visitors or vehicles.
Janakiram pulled together an inexpensive camera, a compact server equipped with a plug-in GPU, image-recognition software, a light bulb, and toy vehicles. Applying his own models, he showed how the system recognized his face and took action -- turning on the light. Then another model, activated through an Alexa query, demonstrated the ability to differentiate between a toy car and a toy bus.
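The control loop behind a demo like this can be sketched as below. The classifier is stubbed out (the real setup ran an image-recognition model on the GPU), and the device and label names are hypothetical.

```python
# Sketch of a recognize-then-act loop like the doorbell demo.
# classify() is a stand-in for a trained image-recognition model;
# "known_face" and the light dict are illustrative stand-ins for real devices.

def classify(frame):
    """Stand-in for running a trained model on a camera frame."""
    return frame.get("label", "unknown")

def on_frame(frame, light):
    """Turn the light on when a known face is recognized in the frame."""
    label = classify(frame)
    if label == "known_face":
        light["on"] = True
    return label

light = {"on": False}
on_frame({"label": "known_face"}, light)
print(light["on"])  # True
```

The interesting part is not the loop itself but where each piece runs: inference on a local GPU at the edge, with models trained and updated in the cloud, which is exactly the integration trend Janakiram went on to describe.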
While it may have sounded frivolous, the demonstration highlighted what can be done today, and what will become easier as models, cloud storage and compute, and devices at the edge of IoT networks are integrated more tightly around new hardware, software, and cloud offerings.
For example, manufacturers are making better use of GPU chips, even to the point that Nvidia plans a GPU-based cloud. He advised attendees to watch developments surrounding the Open Neural Network Exchange (ONNX), an industry standard embraced thus far by the major cloud providers other than Google. ONNX, he noted, will allow organizations to use a wide variety of model types regardless of which cloud they run on. Similarly, he said to expect cloud-agnostic edge platforms.
In envisioning the future, Janakiram presented a view where all types of consumer devices -- from cell phones to doorbells -- will be AI-enabled, and where AI and machine learning will revolutionize sectors such as healthcare and medical imaging. For example, he outlined how images such as MRIs and X-rays will be interpreted by machine learning-based tools, eliminating the need to wait for a specialist to read them. "The future of healthcare is this," he said.