A Gartner analyst complains that this is far more cores than the software will probably be able to use, leading to a risk that buyers "will not be able to use all the processors that are thrust upon them" at that time.
Please. I mean, is this the biggest problem that we face?
I am forcibly reminded of an incident I experienced in the sales training class of a now-defunct minicomputer vendor I was working for in 1982. (I was in product publicity, but they sent us to sales training on the theory that everyone works for the sales department.) The teacher was describing the various sizes of storage systems they offered, and I got dizzy as he pressed on from tens of megabytes to seemingly fantastical hundreds of megabytes. Finally he mentioned the word "gigabyte."
"Who would ever need --" I found myself saying.
"-- Don't even think it," he cut me off. And now I'm sitting here, 27 years later, with 160 GB on my desktop, wondering when that will run out.
Here's another analogy: go to the beach. You encounter a vast expanse of water. Since you're on foot, it's a barrier. But as soon as you acquire a boat, that same water becomes a global highway.
If we are presented with an ocean of cores I think we'll find some way to navigate it. Having more and more cores is going to be an opportunity, not a burden.
As for that sales training class, the teacher also handed out copies of "Dress for Success." I wish I'd memorized it.