5 Disruptive Technologies To Watch In 2007

2007 will be the year when a host of hot technologies that have been percolating at the edges of the mainstream rise high on the radar screens of CIOs and IT managers. We'll look at five of the most significant, including RFID, advanced graphics, and virtualization.

David Strom, President, David Strom Inc.

January 1, 2007

12 Min Read

Disruptive Tech 2007

•  RFID
•  Web Services
•  Server Virtualization
•  Graphics Processing
•  Mobile Security

2007 will be the year when a host of hot technologies that have been percolating at the edges of the mainstream rise high on the radar screens of CIOs and IT managers.

For example, radio-frequency identification, frequently viewed as a standalone tagging technology, will begin to ramp up the data loads IT centers must handle as the tags become more pervasive. Web services, long touted as the next big thing, are poised to begin presenting workaday challenges as managers are tasked with integrating Web-based apps into the enterprise. Mobile security is a no-brainer as a hot technology for the coming year, as far-flung workforces face newer and more troubling threats.

Most challenging may be two technologies that will begin their ascent in 2007 but may take a bit longer to assume a dominant role in the enterprise: virtualization and advanced graphics. The latter will get a big boost from the advent of Microsoft's Vista operating system.

In this article, we'll discuss all five technologies and touch on how they'll affect your ability to deploy applications and manage your infrastructure.

Radio Frequency Identification

After years of promise, Wal-Mart and the Defense Department are among the organizations that have moved RFID into the mainstream, using the technology to track everything from pill bottles to pallets to people. As more vendors get on board, solid enterprise integration efforts are emerging that connect back-end, supply-chain, and inventory systems and can deliver real productivity benefits to savvy enterprises.

Why now? RFID isn't new, having been around in one form or another for more than a decade. However, several factors have come together to make it a big deal. First, there are new developments in the integration of supply chain infrastructure, with products such as Reva Systems' Tag Acquisition Processor making it easier to feed RFID data directly into inventory, supply chain, and manufacturing systems. These changes have stimulated other entrepreneurial efforts and created more of a market for RFID-related products. Second, the standards are solidifying, making it easier to develop applications and get the various pieces to interoperate. And finally, the investments of the two biggest proponents of RFID -- the U.S. government and Wal-Mart -- are finally taking hold. Taken together, it's clear that the time is ripe for RFID.


Anyone trying to master RFID will need to examine its three key components: scanning, radio, and warehousing. Scanning expertise comes first because the transition from bar codes to radio tags is relatively straightforward: think of each item as tagged with its own radio code rather than a physical one printed on a label. Any successful RFID deployment also needs to take into account potential radio issues and how wireless networks are deployed across the enterprise. Finally, warehousing and inventory experience are needed to collect the scanned information and integrate it into existing supply chain applications. "Essentially, what ERP did to the enterprise, RFID will do to the supply chain. It's all about centralization, visibility, and automation," said Marlo Brooke, senior partner at Avatar Partners, a systems integrator.

It also helps to understand the types of goods being tagged. Compare tagging large appliances such as dishwashers with tagging cases of disposable razors: losing track of one or two cases of razors isn't nearly as big a deal as losing a couple of appliances. Ultimately, the IT shops that succeed at RFID will be the ones that can handle the massive data dumps, route that data to the right places within their applications infrastructure, and act on the information as part of their decision support systems.
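To make the data-handling challenge concrete, here's a minimal sketch of the kind of filtering layer that sits between the readers and the inventory system. Treat it as illustrative: the tag identifier follows the EPC SGTIN convention, but the event fields, deduplication window, and inventory hook are hypothetical stand-ins, not any vendor's API.

```python
# Illustrative sketch: smoothing a flood of raw RFID reads into inventory
# updates. Event fields and the inventory hook are hypothetical.
import time
from collections import defaultdict

DEDUP_WINDOW_SECS = 5.0   # readers report the same tag many times per second

last_seen = {}             # tag_id -> last time we forwarded a read
counts_by_dock = defaultdict(int)

def on_tag_read(tag_id, dock_door, timestamp):
    """Forward a read to inventory only if it isn't a duplicate burst."""
    prev = last_seen.get(tag_id)
    if prev is not None and timestamp - prev < DEDUP_WINDOW_SECS:
        return  # same case passing the portal, already counted
    last_seen[tag_id] = timestamp
    counts_by_dock[dock_door] += 1
    update_inventory(tag_id, dock_door, timestamp)

def update_inventory(tag_id, dock_door, timestamp):
    # Placeholder for the real integration point (ERP, WMS, etc.).
    print(f"inventory: tag {tag_id} arrived at {dock_door} @ {timestamp:.0f}")

# Simulated burst: one case read three times as it rolls past a reader.
now = time.time()
for offset in (0.0, 0.2, 0.4):
    on_tag_read("urn:epc:id:sgtin:0614141.112345.400", "dock-3", now + offset)
```

The deduplication window is the key idea: a single pallet rolling past a portal can generate dozens of reads per second, and collapsing those bursts is the first step in taming the data dump before it ever reaches the supply chain applications.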


Web Services

The Web has become a force of nature and a solid applications delivery platform. Whatever you call the latest buzz, Web applications are changing the way we deploy enterprise software.

In 2006, we saw more buzzwords describing the "Webification" of the enterprise. Software-as-a-service (SaaS), mashups, Web 2.0, RSS feeds, wikis, blogs, the rewritable Web, social networking spaces, group chat rooms -- no matter which aspect you're talking about, clearly something new is happening here. The trick is paying attention, because the Web services movement is producing better and more capable enterprise-class applications, which can be deployed in a fraction of the time that more traditional apps took.

IT managers are combining various Web-based applications to piece together what they need. For example, you can now take a mapping service such as Yahoo Maps or Google Maps and tie in the locations of your current sales leads to determine where to deploy your sales force. Many of these efforts begin with one or more hosted applications and build from there. For some leading-edge examples, look at Zimbra for hosting enterprise-class e-mail, Amazon's S3 for offsite disk storage, basecamphq.com for project management, Concur for expense reporting, and Jive Software's Clearspace for document and workflow management. All mix multiple applications using well-known and, in most cases, open-source code.
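As a rough illustration of the mashup pattern, the sketch below geocodes a list of sales leads through a hosted service. The endpoint URL and response fields are placeholders rather than any real provider's API, though Google and Yahoo both offered geocoding interfaces of this general shape.

```python
# Minimal mashup sketch: enrich sales leads with coordinates from a hosted
# geocoding service. The service URL and response shape are hypothetical;
# substitute your mapping provider's actual API.
import json
from urllib.request import urlopen
from urllib.parse import urlencode

GEOCODER = "https://geocoder.example.com/v1/geocode"  # placeholder endpoint

def geocode(address):
    """Return (lat, lon) for a street address via the hosted service."""
    with urlopen(GEOCODER + "?" + urlencode({"q": address})) as resp:
        result = json.load(resp)
    return result["lat"], result["lon"]

leads = [
    {"name": "Acme Corp", "address": "1 Main St, Springfield, IL"},
    {"name": "Globex", "address": "742 Evergreen Ter, Portland, OR"},
]

# Pin each lead to the map; from here a territory planner or mapping
# widget takes over.
for lead in leads:
    lead["lat"], lead["lon"] = geocode(lead["address"])
    print(lead["name"], lead["lat"], lead["lon"])
```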

"Hosted applications definitely provide a new and more flexible opportunity for providing application solutions to my clients," said Dan Parnas, a director at online broker Charles Schwab. "They have significantly lower up-front cost and the ability to bring the application online relatively quickly."

"With SaaS, we've seen this movie before with the invention of the PC. Resistance was futile then and it is futile now. The good news is that we finally we have a software architecture and business model that can meet our growing need for agility," said Doug Neal, a research fellow with Computer Sciences Corp.'s Leading Edge Forum -- Executive Program.

As we recommended in our recent article on Web makeovers, go slowly when deploying these technologies: start with some basic skills in CSS or RSS before moving any deeper, understand what expertise you have in-house versus what you need to purchase, and examine whether your existing portfolio of Web applications needs to be updated with more modern and dynamic content tools.
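If RSS is the starting skill, here's about the smallest possible exercise: pulling headlines from an RSS 2.0 feed with nothing but the Python standard library. The feed URL is a placeholder; point it at any feed you follow.

```python
# Read an RSS 2.0 feed and print its headlines. The URL is a placeholder.
from urllib.request import urlopen
import xml.etree.ElementTree as ET

FEED_URL = "https://www.example.com/feed.rss"  # hypothetical feed

with urlopen(FEED_URL) as resp:
    tree = ET.parse(resp)

# RSS 2.0 nests items under <channel>; each item has title/link children.
for item in tree.getroot().iterfind("./channel/item"):
    print(item.findtext("title"), "->", item.findtext("link"))
```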


Server Virtualization (For Free!)

In 2006, we saw the leading virtual server software vendors begin to give away their products. In 2007, more and more IT shops will consolidate their servers using virtual machines.

The concept behind virtual servers is simply stated but hard to implement: take a single server and divvy it up into separate "virtual" machines, each with its own memory, virtual hardware, drive images, and other resources. It isn't new: IBM has been doing this on its mainframes for more than 30 years, and we've had blade servers for the past five. What is new is that the power of a VM can now be delivered on the PC platform, and the argument is all the more compelling now that Microsoft and EMC are giving away their VM server software, along with pre-configured VMs to make setup even easier.

The idea is to run multiple operating systems and applications on the same box, making it easier to provision a new server and make more productive use of the hardware, just as our mainframe antecedents did in the 1980s. But unlike the mainframe era, having multiple VMs means IT shops can cut the cost of software development and simplify configuration as they deploy new servers. "Two years ago, it wouldn't have been possible to handle so much workload in a data center. Now we can, thanks to this new virtualization software," said Rene Wienholtz, the CTO of Strato, a German Web-hosting provider that has deployed these technologies.

"We plan to use virtual server management to reduce our server support efforts, minimize downtime, and reduce the ongoing costs of server replacement, enabling us to support more hardware with existing staff," said Karen Green, the CIO of Brooks Health System.

Microsoft has even offered a virtual disk image that contains XP with Service Pack 2 and IE 6 for those shops that need to run IE 6 and 7 side-by-side.
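Scripted provisioning is where much of the consolidation payoff lives. The sketch below generates minimal VMware-style .vmx definition files for a batch of consolidated servers. The key names follow the .vmx convention, but treat the whole thing as an illustration: in practice you'd clone a vendor's pre-configured images rather than write definitions from scratch.

```python
# Sketch of scripted provisioning: emit VMware-style .vmx definitions for
# a batch of consolidated servers. Values are illustrative, not a tested
# configuration.
from pathlib import Path

def write_vmx(name, memory_mb, guest_os, out_dir="vms"):
    """Write a minimal virtual machine definition file and return its path."""
    settings = {
        "displayName": name,
        "memsize": str(memory_mb),
        "guestOS": guest_os,
        "numvcpus": "1",
    }
    Path(out_dir).mkdir(exist_ok=True)
    path = Path(out_dir) / f"{name}.vmx"
    path.write_text("\n".join(f'{k} = "{v}"' for k, v in settings.items()))
    return path

# Carve one physical box into three modest virtual servers.
for role in ("web", "mail", "build"):
    print("wrote", write_vmx(f"{role}-01", memory_mb=512, guest_os="winnetstandard"))
```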

Advanced Graphics Processing

In the coming year, we'll see two forces changing the nature of graphics in the enterprise: greater use of 3-D and the use of graphics processors for computation.

Operating systems themselves are using three-dimensional elements as part of their basic tasks, and more applications are making use of 3-D. Microsoft's Windows Vista is a good example. IT managers need to understand the graphics hardware across their entire desktop collection and manage the transition to more graphics-capable PCs.

"We're seeing a greater adoption of 3-D graphics as a visualization tool, especially in the oil and gas, medical imaging, and computer-aided design industries," said Andy Keane, the general manager of GPU computing for graphic chipmaker Nvidia. "3-D is now assumed as part of the basic functional set for leading interactive applications. It isn't just about games."

In addition, as graphics processors become more powerful, they are able to offload more ordinary computational functions from the computer's main central processing unit. In some cases, the new graphics cards being developed by Nvidia and ATI (now a part of AMD) will have a bigger impact on computational processing than the latest chips from Intel and AMD. Nvidia has launched its CUDA program to assist developers who want to harness their graphics engines for computational applications. Keane has seen applications that previously could run only on racks of clustered servers now fit comfortably on a single workstation, such as Acceleware's electromagnetic simulation software used to design cell phone antennas.
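What makes a workload a candidate for this kind of offload is uniform, element-wise arithmetic over huge arrays. The NumPy sketch below runs on the CPU, but its loop-free structure is exactly the pattern a CUDA kernel would spread across thousands of GPU threads; the "field samples" are invented purely for illustration.

```python
# What makes a workload GPU-friendly: the same arithmetic applied
# independently to a million elements. This runs on the CPU, but the
# element-wise structure is what a GPU kernel would parallelize.
import numpy as np

# A million simulated field samples (stand-ins for, say, the grid points
# of an electromagnetic simulation).
samples = np.random.rand(1_000_000).astype(np.float32)

# Element-wise transform: no sample depends on any other, so every one
# can in principle be computed in parallel.
attenuated = np.sqrt(samples) * 0.5 + 0.1

print(attenuated[:5])
```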

What this means for IT managers is that graphics processing is becoming a key component of their PCs and needs to be managed as carefully as, if not more carefully than, the CPU inside. It also means the days of relying on integrated graphics on a PC's motherboard are numbered, as that configuration doesn't deliver much in terms of 3-D performance.


Mobile Security

The perimeter is gone, and the enterprise needs to protect itself from potentially infected remote users. Endpoint security solutions and architectures abound.

However, it's no longer adequate to just authenticate users: today's IT managers have to worry about infected laptops that can bring down their networks. The trick is delivering a consolidated mobile and endpoint security solution across the enterprise that covers multiple desktop operating systems; non-desktop network devices such as Web cameras and print servers; and various switch and router vendors and operating system versions. That's a tall order, especially as most IT shops already have some collection of perimeter security devices that will need to work with whatever endpoint solution is concocted.

Microsoft has its Network Access Protection (NAP) and Cisco its Network Admission Control (NAC) architecture; the two cover slightly different aspects of endpoint security.

There's also a separate scheme from Juniper and other networking vendors under the Trusted Computing Group, called Trusted Network Connect, which uses open standards to leverage the trusted hardware chips (Trusted Platform Modules) inside most new laptops.

Some shops aren't waiting for the architecture wars to settle down. The Fulton County, Ga., government got Microsoft's NAP religion and began trials about 10 months ago with beta copies of Vista and Longhorn. The county uses IPSec authentication, and its NAP deployment checks a series of health requirements, including making sure the remote user's Norton AV client is current, before handing out an IP address on its network. Others are starting with Media Access Control (MAC) layer filtering or 802.1x authentication across their networks.

The trouble is that most networks authenticate users via login credentials but don't examine the actual desktop or laptop the user is running. So extra steps are needed: scan the file system for Trojans or keylogging programs, check that installed patches and antivirus signature files are up to date, and, if they aren't, take steps to fix what's wrong.
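In pseudocode terms, the admission decision looks something like the sketch below: a hypothetical posture check of the kind NAC and NAP systems run before handing out an IP address. The field names and thresholds are invented for illustration and don't correspond to any vendor's API.

```python
# Hypothetical pre-admission "health check" of the sort NAC/NAP systems
# perform. All fields and thresholds are illustrative.
from dataclasses import dataclass
from datetime import date, timedelta

MAX_SIGNATURE_AGE = timedelta(days=7)

@dataclass
class EndpointPosture:
    hostname: str
    av_signature_date: date
    os_patches_current: bool
    suspicious_processes: list

def admit(posture, today):
    """Return (allowed, reason); a failed check means quarantine, not access."""
    if posture.suspicious_processes:
        return False, f"quarantine: {posture.suspicious_processes}"
    if not posture.os_patches_current:
        return False, "quarantine: missing OS patches"
    if today - posture.av_signature_date > MAX_SIGNATURE_AGE:
        return False, "quarantine: stale antivirus signatures"
    return True, "admit to network"

laptop = EndpointPosture("sales-lt-042", date(2007, 1, 10), True, [])
print(admit(laptop, today=date(2007, 1, 12)))
```

A quarantined machine would typically land on a remediation VLAN where patches and signatures are pushed before the check is run again.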

Leading vendors in this market include Lockdown Networks, ConSentry, and Juniper. As Microsoft and Cisco roll out new products this year, expect some further realignment.

About the Author

David Strom

President, David Strom Inc.

David Strom is one of the leading experts on network and Internet technologies and has written and spoken extensively on topics such as cybersecurity, VOIP, convergence, email, cloud computing, network management, Internet applications, wireless, and Web services for more than 35 years. He was the editor-in-chief of Network Computing print, Digital Landing.com, and Tom's Hardware.com. He currently edits the Inside Security daily email newsletter. He has written two computer networking books and appeared on a number of TV and radio shows explaining technology concepts and trends. He regularly blogs at https://blog.strom.com.
