Prism network is designed for data-intensive research that would cripple the campus's main network.
The University of California, San Diego (UCSD) has announced a high-performance computer network designed for big data-related research projects in a variety of academic areas, including engineering, medicine, science and the arts.
Called Prism@UCSD, the advanced network is being built by researchers from the university's San Diego Supercomputer Center (SDSC) and the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2). The National Science Foundation is providing $500,000 in funding for the project.
The network will assist researchers in several data-intensive subjects, including climate science, electron microscopy, genomic sequencing, oceanography and physics.
According to Philip Papadopoulos, principal investigator for Prism, the project is designed for big data users on the UCSD campus who need bandwidth of at least 10 gigabits per second to manage high-velocity information from numerous scientific instruments such as sequencers, microscopes and computing clusters.
Prism can handle 20 times the traffic of the university's current research network, and 100 times the bandwidth of the main campus network. It isn't the school's first high-speed experimental network, however; it builds on an earlier effort, Quartzite. Prism adds an Arista Networks 7405 switch-router that triples the energy efficiency and quadruples the capacity of Quartzite's switch. In addition, it will expand the current Calit2-SDSC optical-fiber connection.
Although Prism isn't fully operational yet, it will get a significant performance boost next month.
"There are two parts/phases to Prism," Papadopoulos told InformationWeek via email. "The first is a replacement upgrade to our packet switch in Quartzite. This hasn't happened yet. Our latest delivery date is mid-April."
The current Prism-to-SDSC bridge, a 50-Gbps connection, will more than double to 120 Gbps after the upgrade, said Papadopoulos. Other campus sites will get faster connections, too. For instance, Prism's connection to the National Center for Microscopy and Imaging Research (NCMIR) on the UCSD campus will quadruple, from 20 Gbps to 80 Gbps.
After the switch upgrade and movement of all existing Quartzite connections, researchers will add new sites to Prism, such as an 80-Gbps link to the physics site by the end of June. Connections to the university's chemistry, medicine and computer science sites will follow.
Prism will serve both as a production network for everyday use, and as a test bed for experimental networking, the university said.
On the practical side, Prism will help ease congestion on UCSD's main network, which serves more than 30,000 people. The network will siphon off traffic generated by a few hundred data-hog researchers, who will be able to transmit massive data sets without bringing the primary campus network to its knees.
Prism also will function as a big-data thoroughfare to global research networks. For instance, the lab of UCSD physics professor Frank Wuerthwein hosts petabytes of data from the Large Hadron Collider, CERN's massive particle accelerator in Europe. Prism allows the lab to transmit terabytes of data without bringing down the campus network.
The Center for Networked Systems in UCSD's Computer Science and Engineering building will put Prism to good use as well, allowing its researchers to quickly swap massive data sets that can range from 100 to 200 terabytes in size.
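To put those numbers in perspective, a short back-of-the-envelope calculation shows why link speed matters for data sets this large. The sketch below assumes ideal, sustained throughput (no protocol overhead or congestion) and uses decimal terabytes; the function name and figures are illustrative, not from the article.

```python
def transfer_time_hours(terabytes: float, gbps: float) -> float:
    """Ideal transfer time for a data set, ignoring overhead and congestion."""
    bits = terabytes * 1e12 * 8      # decimal terabytes -> bits
    seconds = bits / (gbps * 1e9)    # divide by link rate in bits/second
    return seconds / 3600

# A 200-TB data set over a 10-Gbps link vs. Prism-class links:
for rate in (10, 80, 120):
    print(f"{rate:>3} Gbps: {transfer_time_hours(200, rate):.1f} hours")
```

At 10 Gbps, moving 200 TB takes roughly 44 hours even under ideal conditions; an 80-Gbps Prism link cuts that to under six hours, which is the difference between a multi-day transfer and an overnight one.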
If Prism is a success at UCSD, the school will consider linking it to nearby off-campus labs as well, the university said. "The most data-intensive scientific applications get the most value out of using dedicated 'fat' pipes with the ability to accommodate short, extreme-sized bursts of data," said Papadopoulos in a statement. "We believe Prism will be the forerunner of specialized, big data cyber infrastructures on many research campuses -- and beyond."