Internet2's High-Def LHC Teleconference
My first impression is that it's an odd place to try to unlock the mysteries of the Big Bang: 300 feet below a bucolic French village, in an unprepossessing conference room next to a 10-foot-high tunnel stuffed with catwalks, cables, and a thousand cylindrical, supercooled magnets, linked like sausages in an underground ring 17 miles in circumference.

I was watching a live netcast, using advanced iHDTV high-definition video technology, from Internet2's annual Fall Member Meeting in New Orleans on Wednesday, Oct. 15. Hosted by Dr. Ed Seidel, director of the National Science Foundation Office of Cyberinfrastructure, the 30-minute videoconference provided a live peek behind the scenes at the biggest science device on the planet, the Large Hadron Collider (LHC). The world's largest and highest-energy particle accelerator, the LHC was built by the European Organization for Nuclear Research (CERN) to investigate the origins of the universe and the properties of dark matter, including the existence of the Higgs boson, which Nobel Prize-winning physicist Leon Lederman and others call the "God particle": by observing this hypothetical elementary particle, scientists may be able to figure out how massless subatomic particles combine to form actual matter. The LHC lies beneath the Franco-Swiss border between the Jura Mountains and the Alps near Geneva, Switzerland.
The nonprofit Internet2 consortium develops and deploys advanced network applications and technologies for education and high-speed data transfer via its 100-Gb/s network backbone, which connects more than 210 U.S. educational institutions, 70 corporations, and 45 nonprofit and government agencies. Internet2's high-speed network is expected to play a major role for physicists testing various predictions of high-energy physics when the LHC becomes fully operational next year. The massive LHC project is expected to produce roughly 15 million GB of data annually for analysis by scientists around the globe. More than 70 Internet2 university members will participate in LHC research, and each will have the capability to download or transmit about 2 TB of data over a four-hour window every two weeks.

The virtual Q&A -- with Jim Virdee, LHC CMS spokesperson; Jim Strait, head of the U.S. LHC Accelerator Project and Research Program; and Harvey Newman, professor of physics at the California Institute of Technology -- started off with an audio glitch that took a good five minutes to resolve, giving Internet2 members in New Orleans firsthand insight into some of the kinks the Large Hadron Collider itself has experienced during its commissioning process. (The glitch, I found out later, was caused by a plug that came loose in the New Orleans ballroom, not by any of the Internet2 networking and iHDTV technology that supported the demo.)

On Sept. 10, proton beams were successfully circulated in the main ring of the LHC for the first time, but operations were halted on Sept. 19 due to a serious fault between two superconducting bending magnets. Owing to the already planned winter shutdown, the LHC will not be operational again until the spring of 2009.

The Devil Is In The Details
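Those transfer figures imply a substantial sustained data rate per institution. As a rough back-of-the-envelope check -- assuming "2 TB" means decimal terabytes (10^12 bytes), which may not match the exact units the project uses -- moving 2 TB in a four-hour window works out to just over a gigabit per second:

```python
# Back-of-the-envelope: sustained throughput needed to move 2 TB in 4 hours.
# Assumes decimal terabytes (10**12 bytes); real transfer sizes may differ.
data_bytes = 2 * 10**12          # 2 TB per transfer window
window_seconds = 4 * 60 * 60     # four-hour window

throughput_gbps = data_bytes * 8 / window_seconds / 10**9
print(f"required sustained rate: {throughput_gbps:.2f} Gb/s")
# -> required sustained rate: 1.11 Gb/s
```

A rate of roughly 1.1 Gb/s sustained for four hours is well beyond a typical 2008-era campus uplink, which is why a dedicated high-speed backbone matters for the participating universities.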
The aim of the LHC experiment is to crack the code of the physical world by recreating conditions similar to those that existed at the moment of the Big Bang, which most cosmological theorists say resulted in the creation of the universe. One of the major points of discussion in the interactive Q&A was the need to archive and catalog the enormous amounts of data the LHC project is generating -- a topic Seidel addressed in his presentation following the videoconference, with details of the NSF's DataNet initiative, a proposed $100 million "sustainable digital data preservation and access" network. Seidel also gave the New Orleans audience an overview of how the Internet2 high-speed network is being used in complex hurricane modeling, including models capable of predicting storm surges and levee stresses.