Sometimes, Fred Langa says, an ultracheap, generic PC is just the right tool for the job.
Chances are, you read InformationWeek.com because PCs are an integral--perhaps even essential--part of your day-to-day life. In fact, there's an excellent chance that you depend on a PC to do your job. Take away the PC, and your job would either vanish or change in a fundamental and probably unpleasant way.
It's the same for me. My business depends utterly on PCs. My work is done on PCs, almost all my communication is via PCs, and my work product exists only in electronic form, intended for display on PCs. Take away the PCs, and I'm dead in the water.
We both need dependable computer systems. That's one reason I spec and buy major-brand-name PCs for my primary workstations. I simply don't have time to fool around with a balky system, and I can't afford to lose work through system problems or unanticipated downtime. In fact, I'm writing this article on a major-brand PC, and I wouldn't want to do it any other way.
Of course, buying a major-brand PC isn't a guarantee of system perfection. However, it does help tilt the odds in your favor. For example, a reputable vendor will have done the proper system integration work to ensure that Component A will work properly with Component B. And there's also the security of having a good warranty and knowing you can get a repair or replacement in short order.
But major-brand systems cost more than no-name systems, and sometimes those extra costs simply aren't justified. There are special circumstances where a no-name, generic, "white-box" system can be a better (and far less expensive) alternative.
The Office 'Food Chain'
Your office probably has an IT "food chain," something like mine. Typically, when we get a new PC, it goes to the person with the most demanding computation needs, replacing an older unit. The old PC then moves on to another station in the office, displacing that machine, and so on. It's a slow-motion domino effect, driven by simple computational need. The positions requiring the greatest horsepower get the newest, fastest systems, and those with lesser needs get the older, slower boxes. (Fortunately, office politics--which in some enterprises can skew resource allocation by awarding PCs as status symbols rather than as work tools--don't play a part in deciding where our systems go.)
But given my earlier statement about the need for dependability, you may be surprised at what's at the bottom of our PC food chain: the LAN server. When a PC is too old and slow to function as a desktop system anymore, it's sent to live out its final years in the server closet. We clean it up, do a minimal install of the operating system, slap in some Internet-sharing/firewall software, load our shared files onto the hard drive, and then forget about it.
Don't laugh. My business, like 98% of all U.S. business establishments, has fewer than 100 employees. Like most of the rest of that overwhelming majority, I don't need the kind of heavy-duty servers designed for megabusinesses. Those heavy-duty servers are perfect for their intended use, but they're overkill for smaller businesses--or even for many departments and branches within larger enterprises.
You see, for smaller LANs--and most especially in peer LANs--a server's life isn't particularly hard or complex. It doesn't take a lot of horsepower to get the job done. Simple machines, and even older, slower machines, can perform perfectly well.
For example, until recently, our LAN's main server was a cast-off, 7-year-old PC based on a 200-MHz Pentium clone chip, with 24 Mbytes of RAM. The PC was way too underpowered for today's mainstream office software, but it was more than capable of providing print, file, and Internet access/firewall services to our LAN. In fact, the CPU rarely had to expend more than 10% of its cycles fulfilling its tasks (we monitor such activity). Very heavy Internet activity might drive that computer to 30% CPU use, and extremely large and complex print jobs might cause a brief, higher spike in activity. But the server was rarely, if ever, the cause of any bottlenecks. It simply doesn't take a lot of horsepower to sling bits from one place to another.
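If you want to do the same kind of monitoring on your own hand-me-down server, the basic arithmetic is simple. Here's a minimal sketch (the function and the counter format are hypothetical, loosely modeled on the cumulative busy/idle tick counters that operating systems expose, such as Linux's /proc/stat): utilization over an interval is the change in busy time divided by the change in total time between two samples.

```python
# Hypothetical sketch: estimate CPU utilization from two samples of
# cumulative (busy_ticks, idle_ticks) counters taken some interval apart.
# Real tools (Task Manager, top, perfmon) do essentially this math for you.

def cpu_utilization(sample1, sample2):
    """Return percent CPU use between two (busy, idle) counter samples."""
    busy1, idle1 = sample1
    busy2, idle2 = sample2
    busy_delta = busy2 - busy1
    total_delta = busy_delta + (idle2 - idle1)
    if total_delta == 0:          # no time elapsed; avoid divide-by-zero
        return 0.0
    return 100.0 * busy_delta / total_delta

# A lightly loaded file/print server: 90 busy ticks against 910 idle
# ticks over the sampling interval works out to 9% utilization.
print(cpu_utilization((1000, 50000), (1090, 50910)))  # → 9.0
```

Numbers in that range are typical of what we see on our server: the box spends the vast majority of its time idle, waiting for the next file or print request.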
And here's the kicker: The system wasn't even a brand-name unit. It was a white-box generic clone I ordered seven years ago in kit form.
I say it was our server because a situation arose recently in which I had to remove that unit from server duty. For software testing, I specifically needed an old, slow system as a benchmark test bed. Thus, I unexpectedly found myself needing another system to act as a server and wasn't ready to shuffle the entire office food chain yet.