Virtual Instruments' Thompson New Kid On Virtualization Block
The former Symantec CEO compares the science of viewing application performance on a virtual machine to the dark art of viewing its SAN I/O.
Over 40 years, John Thompson, former CEO of Symantec, has watched fundamental changes sweep the computer industry in repeated cycles.
“Client server was an inflection point. We’ve reached another inflection point in the march of democratization (in use of computer power),” he said during a recent visit to InformationWeek in San Francisco. By that, of course, he means cloud computing, a sweeping change that delivers more power to more end users than was previously thought possible.
Thompson, an IBM executive for 28 years, has a keen eye for such changes. He was an executive in software development and marketing at IBM, ending up as head of IBM Americas, before being appointed CEO and chairman of Symantec. He took the job because he realized that client/server computing had made the PC ubiquitous in business, which in turn had set loose changes that would yield new fortunes.
Those changes generated a vast new industry devoted to producing anti-virus software, malware detection, and various screening mechanisms that would try to re-establish the secure perimeter once offered by the IBM mainframe. Symantec grew to be one of the two dominant players in that field, with a stock value that increased 500% from 1999 to 2005. In 2006, Forbes listed Thompson as the eighth best paid executive in the technology industry. He is still chairman of Symantec.
Now he believes that virtualization and cloud computing are taking root and this disruptive change will unleash even more opportunities downstream from where the now heavily virtualized servers are chugging away. The cloud is a huge disruption, and with big disruptions come big opportunities for small companies, such as his current undertaking, Virtual Instruments. Thompson once again is chairman and CEO.
The field of server virtualization is already crowded with everyone from little VKernel and Veeam up to Oracle. We are still early, however, in the process of virtualizing applications. Gartner said that at the end of 2009, only 16% to 17% of data center servers were virtualized. InformationWeek Analytics, however, says the pace of virtualization will accelerate over the next two years.
Virtualization, Thompson says, introduces a new layer of abstraction, and to some extent both the application and its performance disappear into that layer. Yes, it’s possible to discover and count virtual machines as they run, but there’s a larger problem of understanding what’s wrong if they’re not thriving the way they used to on their own unfettered servers. It was a horribly wasteful approach, this allocation of one application per server, but at least it gave you a view of what was going on.
The fundamental question for the virtualized data center, he says, is: “How do I keep track of these assets? That market is worth $10-$15 billion.” Virtual Instruments has staked a claim on one slice of it, albeit a potentially big slice: offering a view of storage I/O on the SAN network on an application-by-application basis.