
Five Disruptive Technologies To Watch In 2007

This year will see the impact of several slow-developing technologies, such as RFID, virtualization, and advanced graphics.
The idea is to run multiple operating systems and applications in the same box, making it easier to provision a new server and to make more productive use of the hardware. Unlike in the mainframe era, running multiple VMs lets IT shops cut the cost of software development and simplify configuration as they deploy new servers. "Two years ago, it wouldn’t have been possible to handle so much workload in a data center," says Rene Wienholtz, CTO of Strato, a German Web-hosting provider that has deployed virtualization software.

Karen Green, the CIO of Brooks Health System, is also a believer in virtualization. "We plan to use virtual server management to reduce our server support efforts, minimize downtime, and reduce the ongoing costs of server replacement, enabling us to support more hardware with existing staff," she says.

The fact that Microsoft and EMC are giving away their virtual machine software, along with preconfigured VM applications known as virtual appliances, makes a strong argument for investigating the technology. Microsoft, for example, offers a virtual disk image containing Windows XP with Service Pack 2 and Internet Explorer 6, for shops that need to run IE 6 and IE 7 side by side.
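A quick way to see the consolidation argument is to size it out. The sketch below estimates how many identical VMs fit on one physical host; the function name, the headroom factor, and the example host specs are all illustrative assumptions, not vendor sizing guidance.

```python
# Rough consolidation sizing: how many VMs fit on one host, given
# per-VM resource needs and a headroom factor. All numbers are
# illustrative assumptions, not vendor guidance.

def vms_per_host(host_cores, host_ram_gb, vm_cores, vm_ram_gb, headroom=0.8):
    """Return how many identical VMs fit on a host, keeping some
    capacity in reserve (headroom) for spikes and the hypervisor."""
    by_cpu = int(host_cores * headroom // vm_cores)
    by_ram = int(host_ram_gb * headroom // vm_ram_gb)
    return min(by_cpu, by_ram)  # the scarcer resource decides

# Example: a 2007-era dual-socket, quad-core host with 16 GB of RAM
# hosting 1-core, 2-GB VMs.
print(vms_per_host(host_cores=8, host_ram_gb=16, vm_cores=1, vm_ram_gb=2))  # → 6
```

Whichever resource runs out first, CPU or memory, caps the count, which is why hosts bought for consolidation tend to be configured memory-heavy.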

[Image: Nvidia’s graphics cards don’t just do nifty 3-D images; they help with computation, too]

MOBILE AND END-POINT SECURITY
The network perimeter is gone. That means companies need to protect themselves from the outside world, not only from intruders but also from infected insiders. The hitch is in delivering consolidated mobile and end-point security across a company that will cover multiple desktop operating systems, nondesktop network devices such as print servers, and various switch and router technologies. That’s a tall order, especially as most IT shops already have some collection of perimeter security devices that will need to work with whatever end-point solution is put together.

Most networks authenticate users via logon credentials but don’t examine the actual desktop or laptop hardware the user is running. So extra steps are needed: scan the file system for Trojans or key-logging programs, check that installed patches and antivirus signature files are up to date, and, if they aren’t, take steps to fix what’s wrong. There are several proposed responses. Microsoft has its Network Access Protection (NAP) architecture, and Cisco Systems has one called Network Admission Control (NAC); each covers slightly different aspects of end-point security. Juniper Networks and other networking vendors offer authentication systems under the Trusted Network Connect architecture from the Trusted Computing Group. That architecture uses open standards and taps into the "trusted" hardware chips incorporated in most new laptops.
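The "health check" these architectures perform can be sketched as a simple policy evaluation. The dictionary fields, the seven-day signature threshold, and the patch identifier below are illustrative assumptions, not any vendor's actual API or policy.

```python
# Hypothetical end-point health check of the kind NAP/NAC-style
# systems run before granting network access. Field names and
# thresholds are illustrative, not a real vendor API.
from datetime import date

def check_endpoint(endpoint, today, max_sig_age_days=7):
    """Return (healthy, problems) for a reported end-point state."""
    problems = []
    if not endpoint["antivirus_running"]:
        problems.append("antivirus not running")
    if (today - endpoint["signature_date"]).days > max_sig_age_days:
        problems.append("antivirus signatures out of date")
    if endpoint["missing_patches"]:
        problems.append("missing patches: " + ", ".join(endpoint["missing_patches"]))
    return (not problems, problems)

laptop = {
    "antivirus_running": True,
    "signature_date": date(2007, 1, 2),   # stale signatures
    "missing_patches": ["MS07-002"],      # hypothetical patch ID
}
healthy, problems = check_endpoint(laptop, today=date(2007, 1, 15))
# An unhealthy end-point would be quarantined for remediation
# instead of being handed a production IP address.
```

The interesting design question, which the competing architectures answer differently, is what happens on failure: deny access outright, or route the machine to a remediation network that can fix it.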

Some shops aren’t waiting for the architecture to settle. The Fulton County, Ga., government is moving forward with Microsoft’s NAP software and began trials about 10 months ago with beta copies of Vista and Longhorn. The county uses IPsec authentication, and its NAP deployment checks a series of health requirements, including that the Norton antivirus client is current, before giving a remote user an IP address on its network.

ADVANCED GRAPHICS PROCESSING
Two developments are changing the nature of graphics in business computing: greater use of 3-D images, and the use of graphics processors for computation. Not only are more applications making use of 3-D, but operating systems are also using 3-D elements as part of their basic tasks. Microsoft’s Windows Vista is a good example. One of the most highly touted features of Vista is its "Aero glass" interface, which layers see-through elements on top of each other. But it doesn’t come cheap: Aero requires 128 Mbytes of dedicated graphics memory, at minimum (256 Mbytes is better).

Andy Keane, the general manager of GPU computing for graphic chipmaker Nvidia, says he’s seen greater adoption of 3-D graphics as a visualization tool in the oil and gas, medical imaging, and computer-aided design industries. Three-dimensional graphics are part of the basic function set for leading interactive applications, Keane adds. "3-D isn’t just about games."

The new graphics cards being developed by Nvidia and ATI (now a part of Advanced Micro Devices) may have a bigger impact on computational processing than the latest chips from Intel and AMD. As graphics processors become more powerful, they’re able to offload computational functions from the computer’s main central processing unit. Nvidia has had a program for several years to assist developers who want to harness its graphics engines for computational applications. Keane says he’s seen applications that once ran only on racks of clustered servers, such as Acceleware’s electromagnetic simulation software used to design cell phone antennas, fit comfortably on a single workstation.
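What makes a workload a good fit for this offloading is data parallelism: one small "kernel" of arithmetic applied independently to every element of a large array. The pure-Python sketch below shows only the pattern, with made-up per-element math; on a real GPU the same kernel would execute across thousands of elements simultaneously.

```python
# Data-parallel pattern behind GPU computing: a small kernel applied
# independently to every element. The math here is illustrative only.

def kernel(sample):
    """Per-element work, e.g. updating one cell of a simulation grid."""
    return 0.5 * sample + 0.25

def run_kernel(data):
    # No element depends on any other, which is exactly what lets a
    # graphics processor parallelize this loop across its many cores.
    return [kernel(x) for x in data]

print(run_kernel([0.0, 1.0, 2.0]))  # → [0.25, 0.75, 1.25]
```

Workloads with heavy branching or tight dependencies between elements, by contrast, stay on the CPU; that division of labor is why the GPU complements rather than replaces the main processor.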

What this means for IT managers is that graphics processing is a key component of their PC strategies and needs to be managed just as carefully as the software and CPU resources. It also means that the days of buying PCs with graphics capabilities integrated on the motherboard are probably numbered, as this configuration doesn’t deliver enough performance.
