The study uses a methodology designed to estimate the development cost of proprietary software projects; it relies primarily on a per-line cost for code written by professionals earning average salaries of around $75,000 a year.
I'm sure rivers of (virtual) ink will flow as a debate erupts over just how accurate this approach is when one applies it to an open-source project like Fedora.
Here's my opinion: Who cares? Even if $10 billion turned out to be too high by an order of magnitude, it would still represent the tip of a colossal economic iceberg.
What lies below the waterline? The mountains of cash that Linux and other open-source projects have saved companies, including -- no, make that especially -- companies that don't use Linux and other open-source applications.
Some of this is the result of direct competition; there is little doubt, for example, that Linux played a key role in turning the server market into a commodity game. At the same time, open-source projects such as SugarCRM and JBoss mean that a new generation of IT professionals will never know the meaning of the phrase "vendor lock-in."
And then there are the indirect benefits. Microsoft, for example, would still invest in server R&D even if it didn't face stiff competition from Linux. But would Microsoft invest nearly as much as it does without a penguin breathing down its neck? Would companies like Oracle, SAP, or PeopleSoft invest nearly as much in their respective database, ERP, and CRM development efforts without open-source upstarts breathing down their necks?
So, is Linux "worth" the billions invested in its development? Looking at the big picture, I'm not sure it even makes sense to debate the answer to this question.