What was intended as a set of personal practices has become a doctrine. And despite the mainstream adoption of Agile, the loss of its original intent has undermined its effectiveness.

Andrew Binstock, Editor-in-chief, Dr. Dobb's Journal

March 19, 2014

Many people over the years have expressed their distress at the religious tone that cloaks the implementation of Agile practices. Particularly from the testing side of the world, there is a lot of "should," "should not," and "can do better next time" dialogue that sounds more like a man trying to fix ethical lapses than a developer writing code.

When I speak with adherents of test-driven development (TDD) in particular, there is a seeming non-comprehension that truly excellent, reliable code was ever developed prior to the advent of this one practice. I sense their view that the long history of code that put man on the moon, ran phone switches, airline reservation systems, and electric grids was all the result of luck or unique talents, rather than a function of careful discipline and development rigor.

The disconnect between today's Agile view and earlier reality is equally evident in the wanton bashing of the waterfall model. To get any programmer today to adopt your recommendation, simply state that not doing so is just a new way of doing waterfall. Watch his toes curl, even though he has likely never used waterfall and seems unaware that it served the industry well for decades. What, was everyone in that bygone era a fool?

Alan Kay was entirely right when he said that programming today has become a pop culture: "Pop culture is all about identity and feeling like you're participating. It has nothing to do with cooperation, the past or the future -- it's living in the present. I think the same is true of most people who write code for money. They have no idea where [their culture came from] -- and the Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free?"

Read the rest of this story on Dr. Dobb's.

About the Author

Andrew Binstock

Editor-in-chief, Dr. Dobb's Journal

Prior to joining Dr. Dobb's Journal, Andrew Binstock worked as a technology analyst, as well as a columnist for SD Times, a reviewer for InfoWorld, and the editor of UNIX Review. Before that, he was a senior manager at Price Waterhouse. He began his career in software development.
