Intel Blog Warns Of Multicore Crisis
No, it's not what you think. There's no hardware problem with dual- and quad-core processors. The alarm an Intel blogger has sounded is a warning to software developers. This doesn't make it any less serious; here's the deal.
The quick-and-dirty summary is that this is a shot across the bow of applications vendors. The warning is that multicore computing is now so widespread, software sellers who don't optimize their apps to do serious multithreading will be in deep trouble.
OK, now for the dotted i's and crossed t's, which are a bit more boring, but no less important.
The post to which I refer is on the Intel Software Network Blogs. It's written by Kevin Farnham, a regular blogger for Intel, though he doesn't work for the company. He's an online community manager and editor for O'Reilly Media, but has standing to write about this subject since he's got 25 years of software-engineering experience.
Here's how Farnham characterizes the multicore crisis in his post: "There will eventually be an industry shake-out, where the companies that are aware of the multicore crisis will benefit from their foresight, while companies who proceeded in a business-as-usual mode, failing to notice the necessity to multithread their applications/products, will suddenly find themselves shunned in the marketplace because their programs take forever to run compared with competing applications."
It's important to note that Farnham references an earlier post at Bob Warfield's SmoothSpan Blog. Warfield appears to have coined the term "multicore crisis," at least insofar as it's used to refer to software-development issues.
The multicore crisis in software is a negative side effect resulting from the incredible power available in today's cutting-edge processors. Here's how Warfield explains it: "It takes considerable effort at the software end to take advantage of the additional cores. For the most part, we are far from keeping up with the availability of those cores."
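To make that concrete, here's a minimal sketch in C with POSIX threads of what "taking advantage of the additional cores" actually asks of a programmer: the same array sum written serially, then split across four threads. The four-way split and names like sum_worker are my own illustration, not anything from Farnham's or Warfield's posts.

/* A minimal sketch, not from either blogger's post: the same
 * array sum written serially and then split across POSIX
 * threads so each core gets a slice of the work. */
#include <pthread.h>
#include <stdio.h>

#define N           (1 << 22)   /* 4M elements */
#define NUM_THREADS 4           /* assumes a quad-core machine */

static double data[N];

/* Serial version: only one core ever does any work. */
static double serial_sum(void)
{
    double total = 0.0;
    for (long i = 0; i < N; i++)
        total += data[i];
    return total;
}

/* Each thread sums its own contiguous slice of the array. */
struct slice { long begin, end; double partial; };

static void *sum_worker(void *arg)
{
    struct slice *s = arg;
    s->partial = 0.0;
    for (long i = s->begin; i < s->end; i++)
        s->partial += data[i];
    return NULL;
}

static double threaded_sum(void)
{
    pthread_t tids[NUM_THREADS];
    struct slice slices[NUM_THREADS];
    long chunk = N / NUM_THREADS;

    for (int t = 0; t < NUM_THREADS; t++) {
        slices[t].begin = t * chunk;
        slices[t].end = (t == NUM_THREADS - 1) ? N : (t + 1) * chunk;
        pthread_create(&tids[t], NULL, sum_worker, &slices[t]);
    }

    /* Combine the partial results after every worker finishes. */
    double total = 0.0;
    for (int t = 0; t < NUM_THREADS; t++) {
        pthread_join(tids[t], NULL);
        total += slices[t].partial;
    }
    return total;
}

int main(void)
{
    for (long i = 0; i < N; i++)
        data[i] = 1.0;
    printf("serial:   %.0f\n", serial_sum());
    printf("threaded: %.0f\n", threaded_sum());
    return 0;
}

Compile with gcc -pthread. Even this toy has to decide how to split the work and how to combine partial results without the threads stepping on one another; a real application has to repeat that exercise for every performance-critical loop, which is exactly the "considerable effort" Warfield means.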
Farnham and Warfield aren't the first to note that multicore processors are often all dressed up with no place to go when it comes to software. However, the "multicore crisis" coinage makes the problem much more accessible (fun, even).
Indeed, this crisis, where hardware capabilities outstrip software, goes back at least to the "dusty deck" days of early parallel computing. The term refers to the challenge of converting an old Fortran program to run on a parallel system. (For you kids out there, the programs were stored on IBM punch cards, hence the "deck" reference.)
Quite frankly, I haven't heard talk about dusty decks since the mid-1980s, when parallel-computing pioneers like Thinking Machines and Convex Computer were perceived to be the next big thing.
Mostly, the rapidly advancing clock speeds and throughput of x86 microprocessors over the past 20 years have rendered such discussion moot. Faster computers make up for a lot of flaws in how applications are coded.
However, now that we appear to be reaching a processor plateau, where advances in computing power come not from brute-force clock-speed increases but from adding more cores, this stuff is going to get very important very quickly.
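A back-of-the-envelope calculation shows how fast it gets important. Amdahl's law (my illustration; neither blogger invokes it by name) says that if only a fraction p of a program can run in parallel, n cores can speed it up by at most 1 / ((1 - p) + p / n):

/* Amdahl's law, applied here purely for illustration: the best
 * possible speedup on n cores when a fraction p of a program
 * can run in parallel. */
#include <stdio.h>

static double amdahl_speedup(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    int cores[] = { 2, 4, 8, 16 };
    for (int i = 0; i < 4; i++) {
        int n = cores[i];
        printf("%2d cores: 95%% parallel -> %.2fx, 50%% parallel -> %.2fx\n",
               n, amdahl_speedup(0.95, n), amdahl_speedup(0.50, n));
    }
    return 0;
}

The squeeze is plain in the output: a program that's half serial tops out at a 2x speedup no matter how many cores the chipmakers pile on, while a 95%-parallel competitor keeps scaling past 9x at 16 cores. That's the shunned-in-the-marketplace scenario in miniature.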
So, as Warfield and Farnham warn, maybe we are indeed on the cusp of a multicore crisis.