Educating customers to safeguard personal information helps prevent phishing thefts and builds loyalty, The Advisory Council says. Also, test to make sure systems are compatible with the upcoming Windows XP Service Pack 2 release, and follow code-review practices to make sure your developers write secure code.

InformationWeek Staff, Contributor

July 29, 2004


Question C: How can we train our application developers to write secure code?

Our advice: Think of "crackers" as expert code reviewers; if there's a security hole in your code, they'll be more than happy to uncover it for you. Even if you're a vendor with a market monopoly, do you really want your customers to learn about your insecure coding practices? Educating your developers in secure coding principles makes good business sense.

Crackers will, of course, happily exploit bugs from other sources, such as typos, but programmers frequently make certain assumptions that create potential security weaknesses crackers can take advantage of. Generally, crackers exploit one of three types of weaknesses:

  • Failure to establish, follow, or enforce secure conditions, practices, and policies.

  • Software features whose security consequences haven't been fully considered.

  • Software bugs that result from developers' unjustified assumptions.

While developers should, as a matter of course, include features that help users avoid the first weakness (e.g., mechanisms for enforcing password policies, as sketched below), eliminating the latter two weaknesses is often more problematic.
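
As a rough illustration of that first point, a minimal C sketch of such a mechanism might look like the following. The function name and the specific rules are our own assumptions, not a recommended policy:

    #include <ctype.h>
    #include <string.h>

    /* Hypothetical policy check: require at least eight characters with a
       mix of upper case, lower case, and digits. The rules are illustrative
       only; a real policy belongs in a configurable, centrally managed place. */
    int password_meets_policy(const char *pw)
    {
        size_t len = strlen(pw);
        int has_upper = 0, has_lower = 0, has_digit = 0;
        size_t i;

        if (len < 8)
            return 0;   /* too short */

        for (i = 0; i < len; i++) {
            if (isupper((unsigned char)pw[i])) has_upper = 1;
            if (islower((unsigned char)pw[i])) has_lower = 1;
            if (isdigit((unsigned char)pw[i])) has_digit = 1;
        }

        return has_upper && has_lower && has_digit;
    }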

To fully consider the security costs of including certain features in software applications, developers need to understand how crackers have been able to use apparently innocuous features in the past. Careful study of books such as Howard and LeBlanc's Writing Secure Code (Microsoft Press; 2002) is the best way for developers to learn about the many common weaknesses that already have been identified and solved.

Often, seemingly reasonable development assumptions can provide a window of opportunity to those with malicious intent. A common source of exploitable bugs, for example, is the "buffer overflow" problem: the developer simply assumes that no data input will ever be larger than some, usually very generous, number of bytes. Under normal circumstances, the probability of that assumption being violated may well be vanishingly small. But a cracker who knows or suspects the assumption exists can deliberately supply a large amount of garbage input to exploit it. Data, or even code, placed at the end of the input ends up in memory beyond the input buffer, changing information to the cracker's advantage. A simple case might be to replace a secret password in memory with one selected by the cracker.
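
For illustration only, a C sketch of that kind of unjustified assumption might look like this. The routine, the buffer size, and the use of strcpy() are our own examples, not drawn from any particular product:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical routine: the developer assumes no name will ever be
       longer than 63 characters, so the fixed buffer "cannot" overflow. */
    void greet_user(const char *input)
    {
        char name[64];

        /* strcpy() copies until it reaches a terminating NUL and pays no
           attention to the size of the destination. Input longer than 63
           characters runs past the end of name[] and overwrites whatever
           the compiler placed after it in memory. */
        strcpy(name, input);

        printf("Hello, %s\n", name);
    }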

In principle, the solution to this kind of problem is simple: test, don't assume. In the case of buffer overflow, for example, the developer needs to recognize the problem and test the length of the input to prevent reading more than the buffer can hold.
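
Continuing the same hypothetical example, a "test, don't assume" version checks the input length against the buffer size before any copy takes place:

    #include <stdio.h>
    #include <string.h>

    /* The same hypothetical routine, rewritten to test rather than assume. */
    int greet_user_safely(const char *input)
    {
        char name[64];

        if (input == NULL || strlen(input) >= sizeof(name)) {
            /* Refuse oversized input outright rather than silently
               truncating or overflowing; the caller decides how to recover. */
            return -1;
        }

        strcpy(name, input);   /* now provably within bounds */
        printf("Hello, %s\n", name);
        return 0;
    }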

This is more easily said than done, however. It requires a conscious effort to recognize that one is making coding assumptions that may not be justified. Encourage your developers to systematically re-examine their own code for these assumptions. Since it's often easier to spot other people's assumptions than one's own, a powerful tool for creating secure software is extensive use of formal code reviews. Every line of code should be carefully read and checked for unintentional assumptions by at least one developer other than the original author. This requires management buy-in, but it has been shown to provide very high payback, producing code that is not only more secure but also generally more reliable, maintainable, and usable.

In conclusion, in today's security-sensitive world, learning what security threats to be on the lookout for, constantly questioning assumptions, instituting systematic code reviews, and following other secure-coding practices make good sense from both technical and business perspectives.

-- Beth Cohen

Sanjay Anand, TAC Expert, has more than 20 years of IT and business-process management experience as a strategic adviser, certified consultant, professional speaker, and published author. His more than 100 clients, large and small, have spanned an array of industries and geographies, from academia to technology. He's often referred to as a "consultant's consultant" for his training and mentoring skills. At the age of 17, he created Asia's first best-selling computer-assisted learning software package.

Peter Schay, TAC executive VP and chief operating officer, has 30 years of experience as a senior IT executive in IT vendor and research industries. He was most recently VP and chief technology officer of SiteShell Corp. Previously at Gartner, he was group VP of global research infrastructure and support, and launched coverage of client/server computing in the early 1990s.

Beth Cohen, TAC Thought Leader, has more than 20 years of experience building strong IT-delivery organizations from both user and vendor perspectives. Having worked as a technologist for BBN, the company that literally invented the Internet, she not only knows where technology is today but also where it's heading.
