Virtual Instruments' Thompson New Kid On Virtualization Block
The former Symantec CEO compares the science of viewing application performance on a virtual machine to the dark art of viewing its SAN I/O.
Over 40 years, John Thompson, former CEO of Symantec, has watched fundamental changes periodically sweep the computer industry.
“Client server was an inflection point. We’ve reached another inflection point in the march of democratization (in use of computer power),” he said during a recent visit to InformationWeek in San Francisco. By that, of course, he means cloud computing, a sweeping change that delivers more power to more end users than was previously thought possible.
Thompson, an IBM executive for 28 years, has a keen eye for such changes. He was an executive in software development and marketing at IBM, ending up as head of IBM Americas, before being appointed CEO and chairman of Symantec. He took the job because he realized client/server computing had made PC computing ubiquitous in business, which in turn set loose changes that would yield new fortunes.
Those changes generated a vast new industry devoted to producing anti-virus software, malware detection, and various screening mechanisms that would try to re-establish the secure perimeter once offered by the IBM mainframe. Symantec grew to be one of two dominant players in that field, with a stock value that increased 500% from 1999 to 2005. In 2006, Forbes listed Thompson as the eighth best-paid executive in the technology industry. He is still chairman of Symantec.
Now he believes that virtualization and cloud computing are taking root and this disruptive change will unleash even more opportunities downstream from where the now heavily virtualized servers are chugging away. The cloud is a huge disruption, and with big disruptions come big opportunities for small companies, such as his current undertaking, Virtual Instruments. Thompson once again is chairman and CEO.
The field of server virtualization is already crowded with everyone from little VKernel and Veeam up to Oracle. We are still early, however, in the process of virtualizing applications. Gartner said that at the end of 2009 only 16% to 17% of data center workloads were virtualized. InformationWeek Analytics, however, says the pace of virtualization will accelerate over the next two years.
Virtualization, Thompson says, introduces a new layer of abstraction, and to some extent both the application and its performance disappear into that layer. Yes, it’s possible to discover and count virtual machines as they run, but there’s a larger problem of understanding what’s wrong if they’re not thriving the way they used to on their own unfettered servers. It was a horribly wasteful approach, this allocation of one application per server, but at least it gave you a view of what was going on.
The fundamental question for the virtualized data center, he says, is: “How do I keep track of these assets? That market is worth $10-$15 billion.” Virtual Instruments has staked a claim on one slice of it, albeit a potentially big slice: offering a view of storage I/O on the SAN network on an application-by-application basis.