Recently, I was on a panel that was posed the question "What's the world of computing going to be like in ten years?"
Someone said something about computers disappearing in the fabric of life.
But the rest of us dutifully got the required observations in: we all agreed it would be different in a bunch of obvious and semi-obvious ways, I reminded people that programmers need to "Think Parallel" (because multi-core is everywhere), and we each, in different ways, expressed hopes that programming and hardware would evolve together.
So we spent time talking about our individual observations of where software could lead, or hardware could lead, and ultimately agreed that languages need to advance models that are more intrinsically parallel, and that hardware would become more flexible and customizable.
But the observation about computers disappearing into the fabric of life was the more pointed prediction. It is already well underway. All great technologies - automobiles, indoor plumbing, electricity, telephones, airplanes, microwave ovens - go from being head-turning novelties, to being "common and troublesome but worth it," to becoming uneventful, reliable everyday things. The only way for computers to avoid this fate is to end as failures - and that isn't going to happen.
An industry built on PCs generally fears anything that sounds like a move away from PCs. Yet servers and laptops have grown the use of processors, as have cellphones, routers, DVRs, automobiles, and even kitchen appliances. The future is not fewer computers but more of them, put to more uses.
One day we will stop "using a computer" and will speak not of the computer, but of the device the computer makes possible: "navigating" (not using the locator program on a computer), "watching TV," "cooking food," and so on.
I recently built a quad-core system with a terabyte of storage running Mythbuntu to be our household PVR. It is amazingly good and very popular, but it is an embedded system, not our desktop. It is the most powerful computer in our house, for the time being.
While multi-core will change the world of programmers, the more dramatic change already well underway is that computers need to be designed into *everything*. More and more programmers will get to write software that makes devices do what they do, not software exposed as yet another program on the Start menu.
The future for more and more programmers is embedded computing.
I need to remember to mention that first the next time I'm asked about the future of computing. Maybe I'll say "embedded parallel programming." It doesn't hurt that embedded programmers already deal with parallelism a great deal.
A friend of mine, Max Domeika, recently wrote a book on embedded software, "Software Development for Embedded Multi-core Systems." It is an excellent introduction to the topic, with plenty of detail and ideas.
I was really struck by how easy it seems for Max to enumerate the uses for parallelism in embedded computing. I'm also amazed at how popular virtualization is in the embedded world. The book is worth a read, and embedded computing is the future.