Going to Extremes
Adopting Extreme Programming practices, Symantec developers, testers, technical writers and managers took a calculated leap into the agile unknown ... and even the executives are impressed.
It's a sunny May morning in American Fork, Utah. A two-story brick building in a small industrial park is silhouetted against the gray and ochre Wasatch mountains. In a large, windowed room on the second floor, 25 of the 120 people who work here form a circle, surrounding a central grouping of tables and computers.

This is a stand-up meeting, designed to elicit brief updates from every team member without eating up more than 20 minutes of the group's time. Comments include "We're setting up auto testing," "We've completed the update of the database schema," "We're working on HP Manager and we need help from people who know HP-UX," "We've located the race condition" and "The nightly builds are working."
I've been invited by OO guru Robert Martin to observe how these Symantec engineers tackle the year-long development of a new Java-based security incident-response product, code-named Orca; theirs is one of three pilot teams transitioning to the burgeoning methodology called Extreme Programming. Rubicon, a product update, is also in Utah; Jaguar, a new product, is in San Antonio, Texas.
Best known for its Norton AntiVirus consumer software, Symantec also offers vulnerability assessment, firewalls, intrusion prevention, Internet content and E-mail filtering, remote management technologies and security services. Founded in 1982 and headquartered in Cupertino, California, the publicly held company has operations in 36 countries. In December 2000, Symantec acquired Axent Technologies, an enterprise security firm, and in May 2001, Symantec announced it was opening the San Antonio–based security operations center to offer management, monitoring and response services.
But applying Extreme Programming to a company like Symantec evokes more questions than certainties. XP arose out of the practices honed by Kent Beck and his colleagues in 1996, while working on the C3 (Chrysler Comprehensive Compensation) human resources project. How does an independent software vendor finesse the "customer" role required by XP? Can geographically separated teams do XP? How do you write a requirements document to satisfy senior managers when your product's complexity is evolving in two-week iterations? Do separate quality assurance and documentation departments fit into a code-centric, test-first approach? How do you know if XP is working? Finally, can a journalist adequately portray a company's development process—especially when a nondisclosure agreement precludes talking too specifically about the technology and the domain?
In search of answers, I'll spend the next six months watching Symantec's progress.
Not a Mandate, but a Good Idea
As if the requisite integration pains of their merger weren't enough, the American Fork lab, acquired for its expertise in intrusion detection and vulnerability assessment, has just embarked on a 12-week "XP Transition" program. This is only the third such exercise conducted by Object Mentor, a Vernon Hills, Illinois–based training firm founded in 1991 and led by Martin. Why rock the boat? Is there an executive mandate to shake things up?
"I was hired to improve our predictability, performance, content and quality. It wasn't necessarily a mandate, but that was the situation," says Carlos Cortez, director of program management at Symantec. Cortez, who has responsibility for teams in Utah and Texas, had worked with Russell Stay, Symantec's vice president of product delivery, for well over a decade at Structural Dynamics Research Corp., in Cincinnati. Not long after Stay's 2000 move to Symantec, he contacted Cortez for help in improving the development process.
Skeptical that self-help books alone could propel programmer productivity, Cortez recalls that he told Stay, "Changing your development culture and methodology and going from C to C++ or Java will cost you this much per person, based on the success I've had in dealing with Object Mentor—Robert Martin, particularly. I've watched Martin bring light to a highly contextual problem in a short time, showing very bright engineers who thought they already knew everything about OO how to really do OO." At Stay's request, Cortez soon joined him at Symantec to spearhead process improvement.
Conquering the Competition
Extreme Programming wasn't the only methodological candidate under consideration, however. The Personal Software Process (PSP), created by Watts Humphrey of Capability Maturity Model fame, held a certain attraction for Cortez, whose IT experience spans 25 years. And Symantec had implemented its own "Solution Centered Process," comprising six phases (explore, define-assess-refine, plan, implement, deliver and measure) and five checkpoints (opportunity, requirements, solution, launch readiness and post launch), leaving development details up to the team. In the end, the lure of XP was too strong for a metrics-oriented self-improvement program like the PSP to overcome—especially considering that some American Fork developers were already converts.
"We've been doing XP for two years on a grassroots level," says five-year Symantec veteran Joseph Shull, director of development. "We've had buy-in from upper management on and off. When this training started, only 25 percent of the engineers had an idea of what XP was about, but now there's some real excitement."
"I've had experience with older methods," says Stay, speaking one month into the transition, "and they were so disciplined, so tight, that people felt they were cogs in a machine. In going to XP from a modified waterfall approach, there were some similar complaints, but they died down as time went by, whereas in the old processes, they just got louder. But the biggest cultural adjustment is pair programming. Developers can have a sense of 'I've lost my personal world.' I wonder--are we going to have a backlash in six to 12 months?"
How Long Will This One Take?
It's 10:00 a.m. For this, only the second release planning meeting they've ever conducted, the Orca developers have divided into two groups of 10 members each and are estimating user stories—simple features that can be implemented in two weeks or less—written on 3-by-5-inch index cards. Since Symantec's products are designed to meet commercial, not internal, needs, Karen Rowell, product manager, is acting as the surrogate customer. She strides back and forth between the groups, answering questions.
"We're defining platforms," Rowell explains. "HP-UX, Solaris, Windows. The estimate is for one person doing the whole user story." In a huddle, the team members sort quickly through five cards, labeling each with the number of weeks (one or two) it will take to first write acceptance tests and then build the feature described. Suddenly, they find themselves stumped by one card's functionality. "I know that's a 'spike,'" Rowell interjects, meaning that it will require a brief experiment to determine how long it might take to build the real thing. "This story is entirely new code, and it may well be thrown away after this release," she says.
Given that there's only one Rowell, two teams and a limited amount of time, "We don't want them to be too dependent on the customer," says David Farber, the Object Mentor senior consultant who's guiding this exercise. "They should pass on a card if they can't figure it out without her." Though at the moment he's pressuring the developers to work quickly, he's by no means discouraging the intense interaction with the customer that XP hinges on.
In addition to developers and managers, there are several testers and one technical writer in the room. "We're listening for sanity," Matt Sellers, senior manager of QA engineering, tells me. "If we hear them say one day and we know it's going to take three days to test, we'll speak up."
In fits and starts, the process continues. By noon, the teams have estimated 74 user stories, Farber reports with elation. This is a great improvement over the first release planning meeting, several weeks earlier, in which the group finished only about 20 stories. "This morning was a tremendous level-set for the whole team," he explains. "Up to this point, they'd only heard the high-level concepts: this is Web-based, it's XML-based, it's response management. Now they've all had a chance to get their arms around the problem." The team has also enjoyed one of the advantages of XP user stories: avoiding unnecessary complexity. As it happens, one story Rowell had wanted falls outside Orca's scope; in discussing it, the developers explain that the feature could take six months to implement. Rowell axes it.
Things may be going swimmingly with Orca, but according to Cortez, Rubicon, a product update that is the second XP project at the American Fork site, isn't having such an easy time. Perhaps under the misapprehension that I possess analytical superpowers, he tells me I may be able to give some insight into Rubicon's lack of progress. All I can ascertain, once we pop in to their meeting, is that the only people smiling are James Grenning, Object Mentor's business coach, and the project's surrogate customer. Guided by this pair, team members attempt to define user stories, but repeatedly disagree on their purported worth. We leave without seeing a user story completed.
From Requirements to Stories
Like most of her colleagues, Karen Rowell has a computer science degree and loves the security field. She's spent 12 years as a Unix systems engineer and six and a half years working with Symantec. She had never heard of XP before learning that the team was considering officially adopting the method. As the surrogate customer, she's used some of her traditional requirements-gathering techniques to build a business plan for the Orca incident-response product, seeking market intelligence from her customer advisory council and from government and academic security centers.
Rowell has discovered that writing user stories instead of scenarios and focusing on two-week iterations means that all requirements can be addressed in terms of business value. "It's not a linear approach—we already have results. Two screens have been completed," she declares proudly.
User stories themselves are not just requirements, however. "It's a challenge coming up with releasable components at every iteration," Joseph Shull says. "We're still gravitating toward nonreleasable slices—we didn't see that until we did the testing. We need an infrastructure for test-first programming." His QA counterparts are working on just such a project—for better or for worse.
Painting QA Into the Picture
One of the novel aspects of this XP pilot is that it seeks to integrate a traditional quality assurance department into the definition, estimation and coding phases rather than after the fact in traditional "throw it over the wall" fashion. Doing so turns out not to be such an easy task, however.
"Following our old process, engineering work would be complete, then QA would start testing the product," senior program manager Bruce Ackerson explains. "Now we have the whole team together, including interactive QA (IQA) and automated QA (AQA), operating in parallel with development." Engineering writes the unit tests, AQA writes the acceptance tests and IQA does user-focused product evaluation. But achieving this partnership has taken a good eight or nine meetings, Cortez laments, and the roles and responsibilities are still unclear.
"There's still a QA versus development mindset," Object Mentor's Farber observes. "Still a lot of 'they' instead of 'us.'"
Matt Sellers, who at this time manages seven AQA employees, is excited about the possibilities, however. He's planning to move toward a more automated QA approach, citing "the pink book" (Extreme Programming Installed, by Ron Jeffries, Ann Anderson and Chet Hendrickson; Addison-Wesley, 2001): "It says that the customer supplies acceptance tests and should automate them, so we've written stories for automated testing and have started developing, in an XP-like way, a framework to run them." He envisions a database to hold test cases. "Before Orca, there was no demand for us to build a library. We'll have data-driven script components; there will be more reuse."
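What might one of those data-driven acceptance checks look like? Here's a minimal sketch in the JUnit style the Java teams use; the event names, expected severities and the severityFor() call are hypothetical stand-ins, since the real tests fall under the nondisclosure agreement, and a production framework would pull its rows from the test-case database Sellers envisions rather than from an inline table.

import junit.framework.TestCase;

// Hypothetical data-driven acceptance check: one test method walks a table of
// cases instead of hard-coding each scenario, so adding coverage means adding rows.
public class DataDrivenAcceptanceTest extends TestCase {

    // Each row: incoming event type, expected severity label (illustrative data).
    private static final String[][] CASES = {
        { "port-scan",       "low"      },
        { "root-login-fail", "medium"   },
        { "buffer-overflow", "critical" },
    };

    public void testSeverityClassification() {
        for (int i = 0; i < CASES.length; i++) {
            String event = CASES[i][0];
            String expected = CASES[i][1];
            assertEquals("severity for " + event, expected, severityFor(event));
        }
    }

    // Stand-in for the product call under test; a real framework would invoke
    // the incident-response engine here.
    private String severityFor(String event) {
        if ("buffer-overflow".equals(event)) return "critical";
        if ("root-login-fail".equals(event)) return "medium";
        return "low";
    }
}

Adding coverage then means adding rows rather than writing new test classes, which is where the reuse Sellers predicts would come from.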
Status: Three Months
Hoping to detect signs of the kind of slow change that blooms only in a stop-motion film, I'm back in American Fork, this time on a searing day in August. Still awed by the mountains ringing the city, I wistfully wheel the rental car past them and check in with Carlos Cortez for my second site visit.
Though the new product, Orca, is still on track, Rubicon, the major update to a separate product, hasn't fared as well. Several weeks before my visit, developers were pulled off of that project and put on the current product release, which needs a boost to finish on time. Orca is now the only XP team at American Fork, though Cortez anticipates restarting the Rubicon team in mid-October.
"The issue was the difficulty in defining the direction of this product," he says. "People were pulling different ways. There was a struggle with the user stories—you saw that."
Part of that struggle lay in the fact that Rubicon had violated a key XP tenet: Keep the customer on site. The product manager who initially filled that role was commuting four days a week from the East Coast—a mistake Cortez says they won't make next time.
The Orca developers have also successfully handed off an intrusion detection product to a San Antonio team responsible for Blackbird, an appliance, now in manufacturing, that combines a firewall, filtering and intrusion detection—the last being the only portion of Blackbird's software built under XP.
"We pushed San Antonio hard with XP, and they delivered exactly on time—it was a four-month project—and built more functionality in than was planned," boasts Stay. The San Antonio team, he says, is slightly smaller than Orca, and was much more intransigent when it came to process changes. "Three lead architects there wanted to do the Rational Unified Process, but we let them do some reference checks. There were also three remote people. We're moving away from telecommuting for developers, but XP makes it even more necessary to be in the office," he says. According to Stay, one star developer who had been working from Boulder, Colorado, has begun to shine twice as brightly since he moved to San Antonio to pair program with his coworkers in person. Back in American Fork, they're also exploring the idea of "pair writing" for the documentation department. My initial reaction to the "pair writing" concept—a shudder—gives me an insight into many developers' visceral fear of pair programming; upon further thought, I realize that many writing-related activities can be collaborative. Nonetheless, Cortez admits there's some resistance to the idea.
The hand-off to San Antonio was done in true XP fashion, using cross-site pair programming, CRC cards and UML diagrams to exchange knowledge. With that transition out of the way, five people have been added to the Orca project, bringing the total to 10 developers, one and a half technical writers, five QA people and one customer. Turnover hasn't been an issue—one tester has left to pursue a microbiology degree, but that's it.
Cortez admits that it's also been tricky learning how to measure and report an XP project, especially in accordance with the existing Solution Centered Process (SCP). "The Symantec process folks are trying to understand estimation, but the problem's not with estimation. They're treating software development as a deterministic set of events, when in fact it's a stochastic process." According to Cortez, following a visit in which she was impressed by the team's enthusiasm, Gail Hamilton, senior vice president of the Enterprise Solutions Division, asked him to better define the SCP. At this point, he tells me, they'll treat the requirements document that the executives are clamoring for—which also contains predicted income, licensing terms and market success factors—as just another user story.
Turf Tussles
There are also continuing facility problems, as the conference room the team has commandeered for its XP war room—where all meetings and pair programming occur—proves to be too confining. Cortez leads me downstairs to the open floor, once riddled with cubicles, that now holds the team. They plan to divide the space, although the developers are happy with the several-thousand-square-foot void. Above the requisite central work tables hangs an inflatable whale. White boards, XP posters and colorful progress graphs (showing acceptance testing success rates and user story completion velocity) cling haphazardly to the walls.
Every other Wednesday, the Orca team completes a release. In July, the group showed working code to executive management at a much earlier stage of development than was previously possible. "This is the opposite of our usual experience, where we have a lot of method function but no user function," Cortez enthuses. Iteration planning meetings, however, are still longer than the ideal one to two hours. "Each user story should not require a half-hour discussion—and if it does, then that should be called training, not the planning meeting," he says. It's also still unclear who will emerge as the XP coach—the internal team member who becomes the keeper of the methodology flame and ensures group fidelity.
QA, Take Two
The testers have been busy the past few months, building a framework for XML-formatted test results from scripts in HTTPUnit, JUnit, Perl, Python and Java. I'm sitting in a darkened room, watching a projection of a command-line screen scrolling rapidly through a series of unit-testing status lines. Mike Kirsch, senior QA engineer, and QA manager Matt Sellers show me how XML output from the acceptance tests, which run after the nightly build is done, populates an automated test results Web page.
"Because each group has its own testing tool--Orca uses JUnit; San Antonio prefers Perl--we've set up a program we call shell, which reports the results back in XML," Kirsch explains. They generate a colorful graph--the same kind I've seen posted on the walls of the war room--showing progress in terms of success and failure rates for acceptance tests. At the beginning of an iteration, acceptance tests fail, but increasingly succeed until they're all passing at the end of the two-week phase--that is, unless a user story proves too tricky to implement on that go-round.
Unfortunately, though the QA team feels good about its tool, the testers and Karen Rowell, the surrogate customer, don't feel good about the actual tests. "I'm not a Java developer; I don't read Java. These tests aren't descriptive enough. How does the team and the company know we're successful?" she asks. "I want them to do what you do in any other methodology—I want QA to specify tests in every possible permutation." Sure enough, in the meeting I observe today, Rowell holds up an index card labeled "#382: Understandable Acceptance Tests" and announces, "I want to spend this iteration on acceptance testing, because it's hurting. I'm giving you time to work on it." Alex Smith, manager of development, tells the group that they also need to figure out what to do with obsolete tests.
Palpable Progress
If the testing process is still tenuous, however, the results of test-first programming are not. It's early afternoon, and I'm sitting in a circle of chairs with Shull, Ackerson, Smith, Rowell, Cortez and Grenning. We're only a few steps from the central work table, where copies of Steve Holzner's Inside XML (New Riders, 2001) and Larry Constantine and Lucy Lockwood's Software for Use (Addison-Wesley, 1999) lie prominently among the scattered books; the developers are pair programming in Visual SlickEdit as we speak.
There's a feeling of accomplishment in the air—either that or the heightened sense of being on good behavior for a guest—because the Orca team has successfully sent the latest release to the Symantec internationalization team in Ireland. The group there is synchronized with Orca's progress so it can consistently receive "guaranteed stable code," in Ackerson's words, and now has all the necessary strings for localization.
Not only that, but XP is starting to become second nature. "The team runs itself better," a smiling Joe Shull says. "It feels easier. Software development is always in a period of adjustment, and I've been doing XP for a number of years, but this is the first time we have management support for it." Thus far, pair programming hasn't caused anyone to run screaming from the room, and, as Alex Smith says, "There's peer pressure to be the first to commit your code, because otherwise you're stuck with more work for the merge. Unfortunately, Perforce, a configuration management tool, is slowing me down—though we're trying to resolve the problem."

Though XP places a load on version control systems like Perforce, it all but eliminates the need for debuggers. "We used to use the debugger all the time; we haven't touched it for months," Shull says proudly.
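For readers who haven't tried it, here is a minimal test-first example in the JUnit style the team uses. IncidentQueue and its behavior are invented for illustration (the product's real classes are off-limits under the nondisclosure agreement); the tests are written first, fail, and then the smallest implementation that makes them pass is added.

import junit.framework.TestCase;
import java.util.ArrayList;
import java.util.List;

// Hypothetical test-first example: the tests below are written before
// IncidentQueue exists, then the smallest implementation that passes is added.
public class IncidentQueueTest extends TestCase {

    public void testNewQueueIsEmpty() {
        IncidentQueue queue = new IncidentQueue();
        assertEquals(0, queue.size());
    }

    public void testIncidentsComeBackInArrivalOrder() {
        IncidentQueue queue = new IncidentQueue();
        queue.add("port scan on HP-UX host");
        queue.add("failed login burst");
        assertEquals(2, queue.size());
        assertEquals("port scan on HP-UX host", queue.next());
        assertEquals("failed login burst", queue.next());
    }
}

// Just enough code to turn the tests green; refactoring comes afterward.
class IncidentQueue {
    private final List incidents = new ArrayList();

    public void add(String incident) { incidents.add(incident); }

    public int size() { return incidents.size(); }

    public String next() { return (String) incidents.remove(0); }
}

With tests like these running on every change, a failure points straight at the offending code, which is why the debugger gathers dust.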
Before I leave, Smith gives me a demo of the user screens now running on the prototype incident response system—there's definitely a product there.
Status: Six Months
It's late October, and I'm typing furiously as Carlos Cortez gives me, via telephone, an Orca team status update. "We had a great meeting today," he tells me. "We finished putting the incident list into a GUI that we like. You can manipulate, sort and do dynamic formatting of reports by dragging the mouse." I ask him if there are any metrics to describe the process improvement.
"Well, on the joint two-lab development, we were able to deliver a hardware appliance, which was a first, on time, and there were only five bugs found at beta, which was radical. By another measure, Orca today, after six months of development, had only 14 bugs across 13 two-week iterations." This, in a product that's roughly 5,000 lines of Java code, plus another 4,000 LOC in the acceptance and unit tests. "I've seen the lines of code shrink in the acceptance tests, and that got me digging out an article by Grady Booch that says you should only look at lines of code when the numbers shrink, which means you're successfully refactoring," Cortez says. The Booch whitepaper recommends measuring the number of classes in each release, and then tracking lines of code in comparison to that number to discover whether the architecture is becoming more streamlined in a growing project. Cortez has recommended a similar approach to a committee working on changes to the Symantec Solution Centered Process.
Most importantly, he thinks they've finally solved the QA conundrum. "We were asking, 'Why is the test creation rate so slow and the cost so high?' The infrastructure (AQA) group felt distant from the product, and there was friction. We rearranged how QA is organized two weeks ago. We broke up the infrastructure group and sprinkled them into the three existing product teams. The new mix of skills allows the new product-oriented QA teams to produce cleaner test classes and better coverage tests through the pair-programming process. The test readability and status issues are 75% resolved," Cortez states—meaning that Rowell can now understand them.
Betting the Farm
"You can't impose process on a team," Vice President Stay says. "We had a few doubters, so we said we'd only do the XP transition if everyone is bought-in." Symantec has paid for the training by diverting funding away from incremental hires, in hopes that the productivity gains will offset the reduced head-count. "If you can't get a 3 to 4 percent productivity increase out of 100 people, you're going down the wrong path." Interestingly, the most vehement protesters, at the San Antonio site, are now fervent converts.
San Antonio's XP infusion was driven by senior engineers, Cortez explains, and they've now successfully delivered two smaller (three- to four-man-month) projects. "Recently, the development director and the director of program management there—the two who were the most skeptical going in—told me they would never again do anything other than XP. I said, 'Maybe you're too bought-in. No process is perfect.'"
Status: A Year and Beyond ...
Praise for Orca's productivity and pleasurable atmosphere is spreading, according to Cortez; Stay anticipates having Orca, Rubicon and Jaguar train additional teams looking to adopt XP. Senior developer Stan Paulson has been tapped as the XP coach, responsible for keeping the methodology pure and preventing a slow slide back into old habits. To avoid confusion due to his existing management responsibilities, Alex Smith, initially considered the de facto coach, is now the tracker instead—the person who keeps tabs on individual progress with all the tasks listed in the iteration plan. These two will help educate other groups, even as their own XP skills grow.
The transition has also highlighted the importance of tying QA groups more tightly to development teams—even where XP is not being practiced.
What will the next six months hold for the Symantec teams? If the last six are any indication, Orca will continue to hit its marks, delivering "releasable" slices of software every two weeks. What makes XP so attractive? Perhaps the fact that it's a low-ceremony, iterative process that, unlike other agile methodologies, recommends very specific programming practices. It's clear from my visits, however, that the decision to adopt Extreme Programming is anything but a low-level one. Having senior development managers run interference with the executives who follow the existing company process is a key ingredient in Symantec's success thus far. More importantly, making an investment in quality training has encouraged a whole-hearted commitment to the change. And, in Kent Beck's words, "embracing change" is what XP is all about. —Alexandra Weber Morales, Software Development Magazine