Commentary | Larry Loeb | 12/12/2015 11:06 AM

Facebook Open Sources Its 'Big Sur' AI Server Designs

Facebook is open sourcing the hardware design for the servers it uses to train artificial intelligence software.


Facebook announced Thursday that it will open source its latest artificial intelligence (AI) server designs. The move continues a course the company began in 2011 when it launched the Open Compute Project to let companies share designs for new hardware.

The server, codenamed Big Sur, is designed specifically to train the newest class of AI algorithms, called deep learning, which mimic the neural pathways found in the human brain.
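To make the "neural pathways" analogy concrete, here is a minimal, purely illustrative sketch (not Facebook's or anyone's actual training code) of what a deep-learning model computes: layers of artificial neurons, each taking a weighted sum of its inputs and passing it through an activation function, with the output of one layer feeding the next. All weights and values below are made up for illustration.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs followed by a sigmoid "activation",
    # loosely analogous to a biological neuron firing.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weight_rows, biases):
    # One layer = many neurons reading the same inputs in parallel.
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

# A "deep" network is simply layers of such neurons stacked on each other.
hidden = layer([0.5, -1.2], [[0.8, 0.2], [-0.5, 0.3]], [0.0, 0.1])
output = layer(hidden, [[1.0, -1.0]], [0.0])
```

Training adjusts the weights and biases; servers like Big Sur exist because real networks have millions of such parameters rather than the handful shown here.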

Google uses this kind of AI technique to recognize spoken words, translate from one language to another, improve Internet search results, and other tasks.

Not coincidentally, last month Google released its machine learning library TensorFlow under the open source Apache 2.0 license. According to the TensorFlow website, it is an "open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs (Graphical Processing Units) in a desktop, server, or mobile device with a single API."
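The data-flow-graph idea that the TensorFlow description refers to can be sketched in a few lines of plain Python. This toy graph is an assumption-laden illustration of the concept only, not TensorFlow's actual API: nodes hold operations, edges carry the values ("tensors") between them, and evaluation walks the graph.

```python
# Toy data-flow graph: nodes are operations, edges carry values ("tensors").
class Node:
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def run(self):
        # Evaluate upstream nodes first, then apply this node's operation.
        return self.op(*(n.run() for n in self.inputs))

class Const(Node):
    # A leaf node that simply emits a fixed value.
    def __init__(self, value):
        self.value = value

    def run(self):
        return self.value

a, b = Const(3.0), Const(4.0)
add = Node(lambda x, y: x + y, a, b)    # edges from a and b into add
mul = Node(lambda x, y: x * y, add, b)  # (3 + 4) * 4
result = mul.run()
```

Because each node only depends on its inputs, independent branches of such a graph can be scheduled across CPUs or GPUs, which is the flexibility the TensorFlow description highlights.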

(Image: ymgerman/iStockphoto)

Here is where the Facebook AI servers come into play. Google didn't release designs for the hardware that TensorFlow runs on, and without purpose-built hardware the software engine is heavily constrained in what it can do. Facebook's move to open source its AI servers fills that gap.

Big Sur, which Facebook is contributing to the Open Compute Project, includes eight GPU boards, each consisting of many chips yet consuming only about 300 watts of power. The hardware is Open Rack-compatible.

It was built with the Nvidia Tesla M40 in mind, but is qualified to support a wide range of PCI-e cards, according to Facebook.

GPUs were originally designed to render images for games and other intensely graphical applications, but have proven adept at deep learning.

Traditional CPUs are also present in these machines, but neural networks run far more efficiently when much of the computational load shifts to the GPUs. GPUs generally deliver far more computational throughput per dollar than traditional CPUs.
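The reason GPUs win at this workload can be seen in the arithmetic itself. A neural-network layer reduces to a matrix multiply, where every output element is an independent dot product; the minimal sketch below (illustrative values only) shows the computation whose thousands of independent multiply-adds a GPU can execute in parallel while a CPU largely works through them serially.

```python
def matmul(A, B):
    # Naive matrix multiply: each output cell is an independent dot
    # product, so all cells could be computed simultaneously on a GPU.
    inner, cols = len(B), len(B[0])
    return [[sum(row[k] * B[k][j] for k in range(inner))
             for j in range(cols)] for row in A]

# One input vector through a layer with 3 units (weights are made up).
X = [[1.0, 2.0]]
W = [[0.1, 0.2, 0.3],
     [0.4, 0.5, 0.6]]
Y = matmul(X, W)  # 1x3 output, one value per unit
```

In a real model the matrices have thousands of rows and columns, which is why dense parallel hardware pays off.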

According to a blog post by Facebook engineers Kevin Lee and Serkan Piantino, Big Sur is twice as fast as the systems that Facebook previously used for training AI software.

[Read Machine Learning: How Algorithms Get You Clicking.]

Facebook previously used off-the-shelf components in its self-designed machines. For Big Sur, it partnered with Taiwanese manufacturer Quanta Computer and leveraged Nvidia's Tesla Accelerated Computing Platform, an arrangement that cuts out any middlemen in the server's design and manufacture.

Facebook says it open sourced its AI hardware for altruistic reasons.

As Lee and Piantino wrote in their blog, "We want to make it a lot easier for AI researchers to share techniques and technologies. As with all hardware systems that are released into the open, it's our hope that others will be able to work with us to improve it. We believe that this open collaboration helps foster innovation for future designs, putting us all one step closer to building complex AI systems that bring this kind of innovation to our users and, ultimately, help us build a more open and connected world."


Larry Loeb has written for many of the last century's major "dead tree" computer magazines, having been, among other things, a consulting editor for BYTE magazine and senior editor for the launch of WebWeek. He has written a book on the Secure Electronic Transaction Internet ...
Comments
larryloeb (Author) | 12/13/2015 5:52:29 PM
Re: Altruistic Reasons
Gee, it seems cheaper to make a human than a machine that will outdo a human.

Easy to make, sure, but the maintenance is ongoing and a bear.
Brian.Dean (Ninja) | 12/13/2015 5:48:14 PM
Re: Altruistic Reasons
Complexity will work in favor of humans. Software might be able to self-improve someday, but hardware will depend on humans for maintenance and improvements. And there is also the electricity bill that will have to be paid. It has been speculated that simulating the human brain (simulating only the neurons) would require 1-10 exaflop/s, and an exaflop-scale machine could consume somewhere in the region of 20 to 30 MW of electricity by 2020.
larryloeb (Author) | 12/13/2015 5:22:20 PM
Re: Altruistic Reasons
Working in favor of what? Humans program the AIs, and the ways they do it are revised all the time.

It's not going to be a machine that self-improves that is the problem. It will be something that man creates.
Brian.Dean (Ninja) | 12/13/2015 5:12:50 PM
Re: Altruistic Reasons
The OpenAI initiative is good as well. Resources should also be allocated toward protection, in the unlikely event that something goes wrong.

One natural form of protection that seems to work in humans' favor is that humans have between 20,000 and 25,000 genes, while rice has around 41,000. With electronics, on the other hand, if anything goes wrong it is cheaper to rebuild from scratch with a new design than to work on the fifth or sixth modification.
larryloeb (Author) | 12/13/2015 4:42:04 PM
Re: Altruistic Reasons
Well, yes.

The OpenAI initiative announced at the end of last week may also be involved here.

Perhaps the future will be corporate AIs vs. open AIs.
Brian.Dean (Ninja) | 12/13/2015 4:13:21 PM
Altruistic Reasons
There is definitely some level of altruism in Facebook's decision to open source the hardware effort. Collaboration is good, as it helps eliminate resources spent on duplication. Altruism aside, there is also the economic reality that creating high-quality AI, comparable to the human brain, would require enormous resources, and no single company can take on the challenge alone.