"This is a truly giant instrument," University of Hawaii astronomer John Tonry told the MIT News Service. "We get an image that is 38,000 by 38,000 pixels in size, or about 200 times larger than you get in a high-end consumer digital camera."
Congress in 2005 directed NASA to detect 90% of near-Earth objects larger than 140 meters by 2020. According to a 2003 NASA report, the 60-meter rock that struck the Earth about 50,000 years ago and formed what is now called Meteor Crater in Arizona released an energy equivalent of more than 10 megatons of TNT. It created a hole over a kilometer across and 200 meters deep.
The telescope is one of four that will eventually be housed in the observatory's dome. It is part of a system called Pan-STARRS (Panoramic Survey Telescope and Rapid Response System) that is being developed at the University of Hawaii's Institute for Astronomy.
The first gigapixel camera was sent to Haleakala in August 2007 and mounted on the PS1 telescope, a prototype of the Pan-STARRS system.
The Pan-STARRS cameras each have 1.4 billion pixels on an area approximately 40 centimeters square. A typical consumer camera has about 5 million pixels on a chip that measures a few millimeters.
The camera's focal plane consists of a 60-by-60 array of 600-by-600-pixel CCD cells. The cells are grouped in 8-by-8 arrays on single 5-centimeter chips called orthogonal transfer arrays.
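Taking the stated geometry at face value, a quick back-of-the-envelope calculation shows how the cell layout adds up to the quoted gigapixel figures (this is an illustrative check, not a description of the actual device, which has small gaps between chips):

```python
# Illustrative arithmetic check of the focal-plane geometry described above.
cells_per_side = 60          # 60-by-60 array of CCD cells
pixels_per_cell_side = 600   # each cell is 600 x 600 pixels

pixels_per_side = cells_per_side * pixels_per_cell_side
total_pixels = pixels_per_side ** 2

print(pixels_per_side)        # 36000 pixels per side
print(total_pixels / 1e9)     # 1.296 (gigapixels)
```

The result, roughly 36,000 pixels on a side and 1.3 billion pixels in total, is in the same ballpark as the article's quoted 38,000-by-38,000-pixel images and 1.4 billion pixels; the exact figures depend on details of the real chip layout not given here.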
The CCDs for the camera come from work done over a decade ago by Lincoln Laboratory researchers Barry Burke, Dick Savoye, and Tonry, who was working at MIT at the time.
The trio developed a chip called an orthogonal-transfer charge-coupled device, or OTCCD, which can shift its accumulated charge from pixel to pixel to compensate for the blur of random image motion. This is similar in concept to the physical image-stabilization features on consumer cameras, but OTCCD technology manages the feat electronically, at the pixel level.
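The stabilization idea can be sketched in a toy simulation. This is a hypothetical illustration of the principle only, not the actual OTCCD electronics: as the image drifts, the already-collected charge is shifted on-chip to follow it, so each new increment of light lands on top of the same scene features and the exposure stays sharp.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((8, 8))       # light pattern falling on a tiny toy "chip"
charge = np.zeros((8, 8))        # accumulated charge in each pixel

# Simulated cumulative image drift at each exposure step (whole-pixel
# offsets for clarity; the measured drift would come from a guide star).
drifts = [(0, 0), (1, 0), (1, 1), (2, 1)]

prev = (0, 0)
for dy, dx in drifts:
    # Shift the accumulated charge by the incremental drift; np.roll
    # stands in for the on-chip orthogonal charge transfer.
    charge = np.roll(charge, (dy - prev[0], dx - prev[1]), axis=(0, 1))
    # Integrate the drifted scene on top of the shifted charge.
    charge += np.roll(scene, (dy, dx), axis=(0, 1))
    prev = (dy, dx)

# Because the charge tracked the drift, the result is a sharp image:
# four exposures of the scene at its final position, with no blur.
sharp = np.allclose(charge, 4 * np.roll(scene, (2, 1), axis=(0, 1)))
print(sharp)                     # True
```

Without the shifts, the four exposures would land at four different offsets and smear together; shifting the charge in lockstep with the drift is what lets the OTCCD cancel image motion without any moving parts.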