
SmartAdvice: Follower Or Pioneer, X64's Time Is Near

Look at what programs you'll be running and how long you expect to keep the hardware when deciding when to start buying 64-bit PCs and servers, The Advisory Council says. Also, vendors are developing self-healing wide-area file services for delivering data on a wide-area network.

Editor's Note: Welcome to SmartAdvice, a weekly column by The Advisory Council (TAC), an advisory service firm. The feature answers two questions of core interest to you, ranging from career advice to enterprise strategies to how to deal with vendors. Submit questions directly to [email protected]

Question A: When should we start buying 64-bit PCs and servers?

Our advice: With "x64" (32/64-bit, x86-compatible) PCs, servers, and operating systems now readily available, the question is no longer whether to buy 64-bit systems, but how soon.

There are four factors to consider:

  • Whether you're buying PCs or servers;
  • The applications they will run;
  • How rapidly you adopt new software; and
  • How long you expect to keep the hardware.

For as long as there have been servers (even before multiuser, network-accessed computers were called servers), the secret to good performance under heavy loads has been lots of memory. The recommended minimum memory for most modern server software is 512 Mbytes, and servers with several gigabytes are commonplace. Since 64-bit architectures make more effective use of memory larger than 4 Gbytes, most high-end servers have been 64 bit for years. Because memory demands keep growing, and servers are expected to last at least five years, we advise that server purchases, other than for workgroup servers, be 64 bit.
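The 4-Gbyte figure follows directly from pointer width: a 32-bit address can distinguish at most 2^32 bytes. A quick back-of-the-envelope check (a minimal Python sketch; the arithmetic, not the snippet, is the point):

```python
import struct

# A 32-bit pointer can address at most 2**32 distinct bytes.
addressable_32 = 2**32
print(addressable_32 // 2**30, "GiB")   # 4 GiB -- the 32-bit ceiling

# A 64-bit pointer raises that ceiling to 2**64 bytes.
addressable_64 = 2**64
print(addressable_64 // 2**60, "EiB")   # 16 EiB

# struct.calcsize("P") reports the pointer width of the running interpreter,
# so it distinguishes a 32-bit build from a 64-bit one.
print(struct.calcsize("P") * 8, "bit")  # 32 or 64, depending on the build
```

In practice the usable 32-bit limit is even lower, since the operating system reserves part of each process's address space for itself.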

Related Links

Intel Dubs 2005 'The Year Of 64-Bit Computing'

Microsoft To Launch 64-Bit Windows Monday

Windows XP Professional x64 Edition RC2 Preview

In contrast, very few business desktop applications are going to benefit from 64-bit platforms in the near future. Like high-end servers, high-end workstations for scientific research and computer-aided engineering have been 64 bit for years, and we expect those applications to quickly exploit less-expensive x64 platforms. Multimedia authoring applications also will quickly benefit from x64 platforms. These are the only desktop applications for which we advise immediately adopting 64 bit.

For mainstream business desktops, the last two factors interact. We expect it will be circa 2009 before mainstream business applications appear that require a 64-bit platform. If you are a "follower" when it comes to deploying software, you might be able to keep 32-bit PCs in productive service until 2013 or beyond. (Almost half of Microsoft Office users are still running Office 2000 or earlier.) As we expect all new PCs to ship with x64 processors by circa 2008, any 32-bit PCs you still have in 2013 would be fully depreciated.

Conversely, if you're a "pioneer" at deploying software, PCs purchased today won't be fully depreciated when 64-bit-only business applications appear. Unless you plan to replace PCs on an aggressive schedule, we advise that pioneers' next PC purchases be x64. The $100-plus premium for x64 today will be offset by avoiding the labor costs of premature upgrades.

Keep in mind that x64 hardware doesn't require an x64 operating system. One of the x64 architecture's benefits is that 32-bit operating systems (e.g., 32-bit Windows XP) run fine on it. Since the Windows x64 versions don't support backward compatibility with 16-bit legacy software, apps that have 16-bit code buried within them may break. Give your software vendors time to address these issues before you deploy Windows x64.
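One way to audit an install base for this 16-bit risk is to inspect executable headers: 16-bit Windows binaries carry an "NE" signature where 32/64-bit ones carry "PE". A hypothetical Python sketch (the offsets come from the standard MZ/NE/PE file formats; the function name is ours):

```python
import struct

def exe_kind(path):
    """Classify a Windows executable as 16-bit (NE), 32/64-bit (PE), or DOS/other."""
    with open(path, "rb") as f:
        data = f.read(4096)
    if data[:2] != b"MZ":                 # all Windows executables start with the DOS "MZ" stub
        return "not an MZ executable"
    if len(data) < 0x40:
        return "DOS/other"
    # The 4-byte field at offset 0x3C (e_lfanew) points to the extended header, if any.
    e_lfanew = struct.unpack_from("<I", data, 0x3C)[0]
    sig = data[e_lfanew:e_lfanew + 2]
    if sig == b"NE":
        return "16-bit (NE)"              # will not run on Windows x64
    if sig == b"PE":
        return "32/64-bit (PE)"           # runs under WOW64 or natively
    return "DOS/other"
```

Even a PE executable can bundle a 16-bit installer or helper, so a header scan like this narrows the search rather than ending it.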

--Peter Schay

Question B: What better alternatives are there to Windows file sharing and Unix NFS for use on wide-area networks?

Our advice: Managed data storage is essential to any business. As companies globalize their workforces while consolidating data-center functions, users need to access business data that could be located anywhere in the world. They just want critical information delivered quickly and transparently, whenever and wherever they're connected. In response, data-storage vendors are developing wide-area file services (WAFS) optimized for delivering data in a wide-area network environment. The advantages of WAFS are truly self-healing data systems, more flexible allocation of data resources, and improved options for business continuity.

Historically, most file systems were designed to work optimally with local disks, so they often are annoyingly slow in the networked environments where most people now work. In response, several vendors are developing protocols and products to replace the aging Unix Network File System (NFS) and Microsoft Common Internet File System (CIFS). Both NFS and CIFS were originally developed for local-area networks, so they have serious security and performance issues when used in a WAN setting.
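The WAN performance problem is largely chattiness: both protocols assume sub-millisecond round trips, and each request/response exchange costs a full round trip. A rough, hypothetical calculation (the operation counts and latencies are illustrative assumptions, not measurements):

```python
# Chatty file protocols pay one network round trip per request/response exchange.
lan_rtt_s = 0.0005   # ~0.5 ms on a LAN (assumed)
wan_rtt_s = 0.080    # ~80 ms cross-country WAN link (assumed)

# Illustrative workload: a handful of open/metadata exchanges,
# then reading a 4-MB file in 64-KB chunks.
round_trips = 10 + (4 * 1024 * 1024) // (64 * 1024)   # 10 setup + 64 reads = 74

print(f"LAN: {round_trips * lan_rtt_s:.2f} s")   # LAN: 0.04 s
print(f"WAN: {round_trips * wan_rtt_s:.2f} s")   # WAN: 5.92 s -- same file, same protocol
```

The data volume is identical in both cases; only the round-trip count times latency changes, which is why WAFS products focus on eliminating round trips rather than just adding bandwidth.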

Related Links

Watch Out For WAFS

A New Niche For NAS

Cisco Unveils WAN File Access Appliances

A number of vendors, including Cisco Systems (which entered the market by purchasing Actona in 2004), Riverbed Technology, and Tacit Networks, are developing the underlying compression and acceleration technology necessary to efficiently share data over WAN connections. The major network-attached storage and storage-area network vendors are scrambling to integrate WAFS into their product suites for serving branch offices and smaller sites from centralized data centers. These are still expensive, first-generation products that don't have all the bugs worked out. As the technology matures over the next year or two, there will be a push to create protocol standards. For now, however, since the technology is still in its infancy, the vendors are offering proprietary solutions, sold as a combination of storage hardware and software to manage file access over the network.
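Part of what these appliances do is generic stream compression and differencing before data crosses the WAN. A trivial illustration of why that pays off on typical, highly redundant business data (standard zlib here stands in for the vendors' proprietary schemes; the sample data is invented):

```python
import zlib

# Office-style data is highly repetitive; a log/CSV-like sample stands in here.
sample = b"2005-04-25,branch-office,orders.xls,modified\n" * 1000

compressed = zlib.compress(sample, level=6)
ratio = len(sample) / len(compressed)
print(f"{len(sample)} bytes -> {len(compressed)} bytes ({ratio:.0f}x smaller)")
```

Real WAFS products add caching and differencing on top, so repeat access to an unchanged file may cross the WAN hardly at all.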

The need for faster and more reliable delivery of data files in WAN environments will drive the development of new standards over the next few years. In the meantime, for companies that need a WAN-based storage solution, several major vendors offer proprietary products that can provide access to cost-effective, managed, and protected central storage from branch offices.

--Beth Cohen

Peter Schay, TAC executive VP and chief operating officer, has 30 years of experience as a senior IT executive in the IT vendor and research industries. He most recently was VP and chief technology officer of SiteShell Corp. Previously at Gartner, he was group VP of global research infrastructure and support, and launched coverage of client-server computing in the early 1990s.

Beth Cohen, TAC Thought Leader, has more than 20 years of experience building strong IT-delivery organizations from both user and vendor perspectives. Having worked as a technologist for BBN, the company that built the ARPANET, she knows not only where technology is today but where it's heading.
