Security features in software-testing products can highlight vulnerable areas of already-developed code

Charles Babcock, Editor at Large, Cloud

May 27, 2005

The problem was hidden in the bowels of Abebooks' infrastructure. In building its customer database, Abebooks had tapped a database from an acquired company that lacked country-of-origin data for customers. Without a country of origin, the currency-conversion engine simply returned the original amount, unchanged, as the converted value. Abebooks' own customer database always presented a country of origin.

That find, made while Abebooks was still evaluating Agitator, justified the expense, and the tool was purchased and implemented. An entry-level, 10-seat deployment of Agitator 2.0 is priced at $4,000 per seat. The tool reviews software by generating every test case it can conceive of and running it against the code, initiating tests that developers often don't think of. No one thought to test what would happen in the currency engine if no country of origin were presented, Minard says, because the development team assumed its software would never encounter such a case.
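The blind spot is easy to picture in miniature. Below is a minimal Java sketch of the kind of edge case such input-enumerating tools can expose; the converter class, the stub rates, and the JUnit tests are invented for illustration and are not Abebooks' code or Agitar's output.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;

public class CurrencyConverterTest {

    // Illustrative converter with the silent flaw described above: with no
    // country of origin it falls back to a 1:1 rate, so the "converted"
    // amount is just the original amount echoed back.
    static class CurrencyConverter {
        double convert(double amount, String countryOfOrigin) {
            if (countryOfOrigin == null) {
                return amount;                                        // the hidden defect
            }
            double rate = "CA".equals(countryOfOrigin) ? 0.80 : 1.0;  // stub rates
            return amount * rate;
        }
    }

    // The test a developer would usually write: the happy path.
    @Test
    public void convertsKnownCountry() {
        assertEquals(20.00, new CurrencyConverter().convert(25.00, "CA"), 0.001);
    }

    // The test nobody wrote by hand. A tool that enumerates inputs will
    // eventually try null here; against the converter above this test fails,
    // which is exactly how the silent defect becomes visible.
    @Test
    public void rejectsRecordWithNoCountryOfOrigin() {
        try {
            new CurrencyConverter().convert(25.00, null);
            fail("expected a record without a country of origin to be rejected");
        } catch (IllegalArgumentException expected) {
            // desired behavior once the gap is recognized
        }
    }
}
```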

Some organizations don't need automated testing so much for their own development teams as for code being brought in from the outside. The 406-bed Children's Medical Center Dallas is one of the largest pediatric hospitals in the country. Its core system is Cerner Corp.'s Millennium hospital information system, with many other applications working alongside it. There are patient-registration, billing, clinical, and pharmaceutical systems, all with critical dependencies between them and the core system. At Children's Medical Center, for example, the pharmaceutical system must recognize that the patient is a child and not prescribe an adult dosage, says Alan Allred, group manager of information services.
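That dependency can be sketched in a few lines of Java. The class name, the age cutoff, and the pediatric dose factor below are invented for the sketch and are not drawn from Cerner's Millennium or the hospital's actual rules; a scripted test would exercise exactly this kind of boundary, feeding the check a pediatric record with an adult-sized order and confirming it is rejected.

```java
// Hypothetical illustration of a pediatric-dosage dependency check.
public class PediatricDoseCheck {

    static final int PEDIATRIC_AGE_LIMIT = 18;   // assumed cutoff for the sketch

    // Returns true when the ordered dose is acceptable for the patient's age.
    static boolean isDoseAllowed(int patientAgeYears, double orderedMg, double adultMaxMg) {
        if (patientAgeYears < PEDIATRIC_AGE_LIMIT) {
            // Assume pediatric orders must stay well below the adult maximum;
            // the 50% factor is illustrative only.
            return orderedMg <= adultMaxMg * 0.5;
        }
        return orderedMg <= adultMaxMg;
    }

    public static void main(String[] args) {
        // The registration system supplies the age; the pharmacy check consumes it.
        System.out.println(isDoseAllowed(7, 500.0, 1000.0));   // true  -- within pediatric range
        System.out.println(isDoseAllowed(7, 800.0, 1000.0));   // false -- adult-sized dose rejected
    }
}
```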

Out of an IT staff of 140, 80 members are analysts who are well acquainted with the hospital's policies and procedures and who frequently consult with 105 users. For each new application, updated app, or software patch, these 185 analysts and users review what it's supposed to do and write a test plan for how it needs to work. Software testers then convert the plan into tests for the code and run them.

Until the fall of 2003, this was largely a manual process based on scripts captured in Excel spreadsheets and printed out for the testers. When the tests came back, they carried the testers' handwritten notes interpreting how the software had fared, Allred recounts. Analysts then reviewed the results, recommended changes to the software suppliers, and drew up another round of testing to see whether the revised code could pass muster in a follow-up review. A massive paper archive was established for the test results on each new piece of code added to the hospital's systems.

Then, 21 months ago, the hospital decided to centralize both the test-script creation and test results in a repository called Test Director, supplied by Mercury Interactive Corp. "We can track the total life cycle of a defect until we know it's fixed," Allred says. "We have an audit trail of dates, comments, who did what, and when it was fixed."
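Conceptually, such a repository replaces the paper archive with an append-only history on each defect. The following is a minimal Java sketch, assuming nothing about Mercury's actual data model and using invented names, of a defect record that preserves dates, comments, and who did what until the fix is confirmed.

```java
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.List;

// Hypothetical defect record; field and class names are invented for the
// sketch and do not reflect Test Director's internal schema.
public class DefectRecord {

    record AuditEntry(LocalDateTime when, String who, String status, String comment) {}

    private final String id;
    private final List<AuditEntry> history = new ArrayList<>();

    DefectRecord(String id, String reportedBy, String comment) {
        this.id = id;
        history.add(new AuditEntry(LocalDateTime.now(), reportedBy, "OPEN", comment));
    }

    // Every change appends to the history; nothing is overwritten, so
    // "who did what, and when it was fixed" stays recoverable.
    void update(String who, String status, String comment) {
        history.add(new AuditEntry(LocalDateTime.now(), who, status, comment));
    }

    boolean isFixed() {
        return "FIXED".equals(history.get(history.size() - 1).status());
    }

    List<AuditEntry> auditTrail() {
        return List.copyOf(history);
    }

    public static void main(String[] args) {
        DefectRecord defect = new DefectRecord("DEF-101", "tester", "Dosage screen rejects valid entry");
        defect.update("analyst", "SENT_TO_VENDOR", "Forwarded to supplier for a patch");
        defect.update("tester", "FIXED", "Retest passed in follow-up review");
        System.out.println(defect.id + " fixed: " + defect.isFixed());   // DEF-101 fixed: true
        System.out.println(defect.auditTrail().size());                  // 3 entries in the trail
    }
}
```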

Any software change touching, for example, a patient coming into the hospital for cardiology tests threatens to disrupt the 1,124 software steps that must occur to admit that patient and prepare him or her for testing. Those steps are scripted and tested, with the results visible in Test Director rather than scattered across hundreds of Excel spreadsheets. The result is a more structured and focused testing process, system analyst Don Ingerson says.

"Before, we didn't know for sure when we had closure" on a new piece of software, Allred says. "The confidence with which we deliver a thoroughly tested product has risen tenfold."

Continue to the sidebar:
Cold Code: Chain Tests Outside Apps

About the Author(s)

Charles Babcock

Editor at Large, Cloud

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive Week. He is a graduate of Syracuse University, where he obtained a bachelor's degree in journalism. He joined InformationWeek in 2003.
