Learn why bad code is generally much better than no code at all, except maybe in matters of life and death.

John Edwards, Technology Journalist & Author

February 13, 2017

3 Min Read

Pete Lumbis, a systems engineer with Cumulus Networks, doesn't like bad code. Yet he still believes that lousy code, despite its faults, is often better than having no code at all.

The reason is simple, Lumbis says. "In networking, or anywhere in computer science, we sometimes place a priority on perfection over getting things done," he explains. Yet poorly written code that still manages to accomplish its basic mission is usually far superior to a coding project that drags on and on as its creator strives to develop an elegant masterpiece.

Lumbis acknowledges that perfection is critical in many projects, such as when developing code for a heart monitor or a missile defense system. "But if I’m just trying to move a group of files around for some users, or I’m trying to create a bunch of descriptions on interfaces throughout my network, it doesn’t need to be perfect," he says.
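For illustration only (this is not Lumbis's code), here is a minimal sketch of the kind of throwaway script he is describing, assuming a made-up interfaces.csv file with interface and neighbor columns; it simply prints the configuration lines rather than pushing them to any device:

    # Rough, assumption-laden sketch: read a hypothetical CSV of interfaces and
    # their neighbors, then emit config lines that set a description on each one.
    import csv

    with open("interfaces.csv") as f:  # assumed columns: interface,neighbor
        for row in csv.DictReader(f):
            print(f"interface {row['interface']}")
            print(f" description uplink to {row['neighbor']}")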

There are many variations of bad yet generally functional code. Most often, crummy code is clumsy and hopelessly long. That's okay, Lumbis says. "So what if it’s 200 lines, which somebody with better skills could have done in 10," he observes. "At the end of the day, if it’s saving you time and making your job better and easier, then the fact that the code is bad doesn’t really matter."
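To make that trade-off concrete, here is a hedged illustration (again, not Lumbis's code): the first version below copies and pastes the same two lines for every hypothetical interface, while the loop after it produces identical output in a fraction of the lines. Both work, which is the point:

    # Clumsy but functional: one copy-and-paste block per interface (names are hypothetical).
    print("interface Ethernet1")
    print(" description uplink to spine1")
    print("interface Ethernet2")
    print(" description uplink to spine2")

    # What a more practiced coder might write instead: same output, far fewer lines.
    for i in (1, 2):
        print(f"interface Ethernet{i}")
        print(f" description uplink to spine{i}")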

A Hard Habit to Break

Perfectionism is ingrained into many IT and network professionals from the moment they touch their first keyboard. "Most people who go into a technical discipline have at least a little bit of a perfectionist streak somewhere inside," Lumbis says. "It doesn’t mean that we always produce perfect work, but a lot of us take pride in ownership and so it’s hard to come to terms with the idea of doing something that’s bad."

Yet network professionals in particular are beginning to come around to the idea of settling for code that's subpar but delivered quickly and fully functional. With software-defined networking (SDN) rapidly becoming a technology mainstay, network engineers are now finding themselves performing coding tasks they were never trained to handle. Lumbis believes they shouldn't feel bad when they create code that's rough and unpolished but still meets a project's basic requirements. "Software development is a completely unique and different discipline from network engineering," he says. "Yet even if we write some bad code, we can make our lives a lot better as network engineers."

Developing a New Skillset

Lumbis notes that fast and rough programming is simply a means toward reaching a goal. "We’re not going to create bad programming as a job with a salary range posted on Glassdoor," he says. "It’s about creating a new skillset, just as virtualization was a new skillset; it doesn’t mean that you need to be an expert."

Lumbis believes that network engineers are beginning to realize that programming -- even if it's not at an expert level -- is now a necessary job requirement. "You can see a growing community of network engineers trying to figure the situation out, being willing to take a stab at programming," he says. "Over the next three to five years, we’ll see network engineers not as software developers by any means, but as professionals with the ability to cobble together some bad programming code to accomplish a task."

Lumbis will explore the benefits of bad code in depth when he leads the session "Imposter Syndrome: Bad Code Can Be Better Than No Code" at Interop ITX on May 17 in Las Vegas.

About the Author(s)

John Edwards

Technology Journalist & Author

John Edwards is a veteran business technology journalist. His work has appeared in The New York Times, The Washington Post, and numerous business and technology publications, including Computerworld, CFO Magazine, IBM Data Management Magazine, RFID Journal, and Electronic Design. He has also written columns for The Economist's Business Intelligence Unit and PricewaterhouseCoopers' Communications Direct. John has authored several books on business technology topics. His work began appearing online as early as 1983. Throughout the 1980s and 90s, he wrote daily news and feature articles for both the CompuServe and Prodigy online services. His "Behind the Screens" commentaries made him the world's first known professional blogger.
