For starters, analytics may be a big buzzword, but it doesn't yet pervade corporate IT. Continuity asked whether companies were using analytics to measure the performance of internal IT, and a majority (56%) do not. Its survey did find that 57% of large companies (those with more than 2,500 employees) use analytic tools to measure performance. But only 29% of companies with fewer than 2,500 employees use such tools.
At their worst, vendor surveys reinforce company talking points or remind investors just how much upside still exists in a market. The Continuity survey does some of that, since Continuity's product helps companies analyze their systems for risk. But Continuity is not a monitoring company. Executives told InformationWeek they thought the survey was useful for understanding what the market wants, but that no one tool would be likely to fill those needs.
What is surprising is that very few companies appear to measure their cloud services -- only 14% of respondents reported doing this. As for other parts of IT infrastructure, the numbers range from 49% to 75%. Storage and network tied at the top, followed by applications and databases. Clusters were cited by only 49%.
Doron Pinhas, Continuity's CTO, believes the reason many companies aren't monitoring their cloud services is that the technology is still maturing, and it's hard to gauge how these services should be performing. For clusters, Pinhas explained, "The rate of change over the last five years has been spectacular, with [the] rise of virtualization across the board." Such rapid change has created a far more complex IT environment, one that is hard to measure.
Measurement is a mantra of management, and the survey found that uptime was the top metric among organizations that track performance. Here's a breakdown showing the percentage of respondents that track each of the other metrics (the survey allowed multiple responses):
Performance/response time: 80%
Data loss: 56%
Number of open issues: 52%
Average time to fix: 51%
Security breaches: 49%
Mean time between failures: 38%
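The metrics above have standard definitions, and a minimal sketch can show how an IT team might derive them from an incident log. All of the field names and numbers below are hypothetical examples, not data from the Continuity survey:

```python
# Illustrative sketch: computing uptime, average time to fix, and MTBF
# from an incident log. All data here is hypothetical example data.

# Each incident: (hours of downtime, hours spent fixing it)
incidents = [(0.5, 1.0), (2.0, 3.5), (0.25, 0.75)]

period_hours = 30 * 24  # a 30-day reporting window

# Uptime: fraction of the period the service was actually up
downtime = sum(down for down, _ in incidents)
uptime_pct = 100.0 * (period_hours - downtime) / period_hours

# Average time to fix (often reported as MTTR)
avg_fix = sum(fix for _, fix in incidents) / len(incidents)

# Mean time between failures: operating time divided by failure count
mtbf = (period_hours - downtime) / len(incidents)

print(f"Uptime: {uptime_pct:.2f}%")
print(f"Average time to fix: {avg_fix:.2f} h")
print(f"MTBF: {mtbf:.2f} h")
```

Real-time tracking, discussed below, amounts to recomputing figures like these continuously from live monitoring data rather than from a monthly log.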
Fewer than half of the companies track in real time. When they do, uptime remains the most tracked metric, cited by 43% of respondents, and security breaches jump to number two, at 33%.
Respondents said the hardest metrics to meet were performance (29% rated it number one) and uptime (17% rated it number one). Performance was ranked in the top three by 74% of respondents.
Analytics consultants are happy to tell us that many companies are not data-driven. The survey suggests fewer than half of IT departments are: 48% of respondents said that most or all critical decisions are informed by operational analytics, meaning more than half use analytics to inform operational decisions rarely or only some of the time.
The survey, conducted online among companies in Continuity's database, received 90 responses. The majority of respondents were not Continuity customers, according to Eran Livneh, the company's VP of marketing.
Of responding companies, 44% had more than 5,000 employees, 7% had between 2,500 and 5,000, 18% had from 501 to 2,500, and 31% had between 1 and 500 employees. More than half the respondents had more than 500 servers in their datacenters, and of those 25% had more than 2,500 servers. Only 19% of respondents had fewer than 50 servers.
Private clouds are moving rapidly from concept to production. But some fears about expertise and integration still linger. Also in the Private Clouds Step Up issue of InformationWeek: The public cloud and the steam engine have more in common than you might think.