My optimism for the future of computing has never been greater
Looking back at the last 25 years of innovation, it's tempting to think that we're close to the limits of what we can do with computers. That's largely because the progress has been so amazing. We've gone from standalone mainframes to hundreds of millions of incredibly powerful PCs and smart devices. The spirit of Moore's Law has taken processing power from kilohertz to gigahertz, storage from kilobytes to terabytes, and networking speed from mere bits per second to gigabits per second.
Computers have moved out of the IT department into almost every part of our lives. More than 600 million PCs are in use today, a number that will rise to more than a billion in the next five years. Many devices we use every day--from mobile phones to TVs--are becoming like computers, with processing power, storage, and connectivity that meet or even beat the high-end PCs of just a few years ago.
Yet we're only beginning to realize computing's potential. I believe that we're entering an era when software will fundamentally transform almost everything we do. The continued growth of processing power, storage, networking, and graphics is making it possible to create almost any device imaginable. But it's the magic of software that will connect these devices into a seamless whole, making them an indispensable part of our everyday lives.
In the workplace, we're already moving from personally focused software, such as word processors and spreadsheets, to truly collaborative tools that bring teams together and drive a quantum leap in business productivity. Today's productivity software does a good job helping people collaborate, with shared workspaces and management software that helps teams and projects work efficiently. But a coming generation of software will take collaboration a step further, capturing the knowledge and experience of an entire organization, enabling individuals and teams to draw on that information to make better, more strategic decisions.
In the back office, software standards are driving a more model-based approach to developing applications. With the growth of XML and Web services, we're getting closer to being able to visualize any kind of business process and quickly develop software that can adapt to companies' changing needs. For example, today when a firm makes an acquisition or changes a key business process, the IT department often must embark on the time-consuming and expensive task of rewriting and testing the underlying software. But as we move toward a world of rich Web services and development tools that inherently understand business processes, businesses will be able to simply make the changes they want, and the code will take care of itself.
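The core idea here is treating the business process as data rather than code. A minimal sketch of what that might look like, with an invented XML format and invented step names--when the process changes, you edit the model, not the program:

```python
import xml.etree.ElementTree as ET

# Hypothetical process model: the business process is described as
# data (XML) and interpreted by generic code. Step names are invented
# for illustration only.
PROCESS_XML = """
<process name="order-fulfillment">
  <step action="check-credit"/>
  <step action="reserve-stock"/>
  <step action="ship"/>
</process>
"""

def load_steps(xml_text):
    """Return the ordered list of step actions from a process model."""
    root = ET.fromstring(xml_text)
    return [step.get("action") for step in root.findall("step")]

def run_process(xml_text, handlers):
    """Execute each modeled step via a registry of handler functions."""
    return [handlers[action]() for action in load_steps(xml_text)]

handlers = {
    "check-credit": lambda: "credit ok",
    "reserve-stock": lambda: "stock reserved",
    "ship": lambda: "shipped",
}

print(run_process(PROCESS_XML, handlers))
# → ['credit ok', 'stock reserved', 'shipped']
```

Adding or reordering a step is now an edit to the XML; the interpreter and handler registry stay untouched, which is the adaptability the model-based approach promises.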
Although it will be some years before the idea of truly self-managing systems is realized, we're already seeing great progress. For distributed systems, management always has been an afterthought, applied after the servers and applications are in place. Going forward, management intelligence will be built in. The service and health modeling capabilities of the recently released Microsoft Operations Manager 2005, for example, already are helping customers significantly cut the costs of supporting existing systems.
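"Management intelligence built in" boils down to components that report their own health and a supervisor that acts on those reports automatically. A toy sketch of the pattern, with all class and method names invented:

```python
# Hedged sketch of built-in manageability: each component exposes a
# health check, and a supervisor pass repairs failures automatically
# instead of waiting for an operator. Names here are illustrative.

class Service:
    def __init__(self, name):
        self.name = name
        self.healthy = True
        self.restarts = 0

    def health_check(self):
        return self.healthy

    def restart(self):
        self.restarts += 1
        self.healthy = True

def supervise(services):
    """One management pass: restart any service reporting unhealthy."""
    actions = []
    for svc in services:
        if not svc.health_check():
            svc.restart()
            actions.append(f"restarted {svc.name}")
    return actions

web = Service("web")
db = Service("db")
db.healthy = False            # simulate a fault
print(supervise([web, db]))   # → ['restarted db']
```

Real service and health models are far richer--dependencies, thresholds, escalation--but the shift is the same: remediation logic lives in the system rather than in a runbook.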
We're taking a similar approach to reliability, incorporating best practices throughout the software life cycle, educating our engineers to write more reliable code, and creating innovative development tools and technologies to improve software quality. We're also implementing customer-feedback tools in our products that enable us and our partners to gather reliability data from real-world usage scenarios.
Computing also is extending further into the physical world, with emerging technologies such as RFID tags and a growing number of embedded devices and sensors. This means that software can go places it has never gone before--from tracking inventory on the factory floor to the cash register and beyond, to watching your home for intruders or keeping tabs on what's in your refrigerator.
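At its simplest, RFID-style tracking is a stream of (tag, location) read events folded into a current-location map. An illustrative sketch--not any real RFID reader API--of that idea:

```python
# Illustrative only: each tag read is a (tag_id, location) event;
# folding the stream gives every item's most recent location,
# from factory floor to register. Tag IDs are made up.

def track(reads):
    """Return each tag's latest known location from a stream of reads."""
    last_seen = {}
    for tag_id, location in reads:
        last_seen[tag_id] = location
    return last_seen

reads = [
    ("tag-001", "factory"),
    ("tag-002", "factory"),
    ("tag-001", "warehouse"),
    ("tag-001", "register"),
]
print(track(reads))
# → {'tag-001': 'register', 'tag-002': 'factory'}
```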
Ensuring that all these systems are reliable and secure will be a high priority for many years. Windows XP Service Pack 2 is obviously a big step for us, and we're on track to distribute 100 million copies in the first two months after release. We've trained 500,000 IT professionals worldwide on security technology and best practices. And we're already seeing the benefits of automated testing tools that can verify code and help eliminate common security vulnerabilities, as well as services such as Windows Update that can quickly distribute security patches across vast networks.
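The automated tools mentioned above do deep static analysis, but the basic idea can be shown in miniature: mechanically flag code patterns known to cause security bugs. A deliberately simplified sketch that only pattern-matches C source for a few classically unsafe calls:

```python
import re

# Toy scanner, not a real analysis tool: flag calls to C functions
# commonly implicated in buffer overflows. Genuine static analyzers
# track data flow and buffer sizes; this only matches call patterns.
UNSAFE = ("strcpy", "gets", "sprintf")

def scan(source):
    """Return (line_number, function) pairs for unsafe call sites."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), 1):
        for fn in UNSAFE:
            if re.search(rf"\b{fn}\s*\(", line):
                findings.append((lineno, fn))
    return findings

code = "char buf[8];\ngets(buf);\nstrcpy(buf, input);\n"
print(scan(code))
# → [(2, 'gets'), (3, 'strcpy')]
```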