The 25 Most Dangerous Programming Errors
Posted by fusionbox on June 30, 2014, 6 p.m.
Just today, a team consisting of the SANS Institute, MITRE, and "many top software security experts in the US and Europe" released their annual list of the 25 most dangerous programming errors (view the list here). Interestingly enough, the #1 and #2 errors (along with several others) deal directly with web development.
My goal today is not to delve into these programming threats directly. We've already explained how SQL injection works, and if you have a deep desire to understand the rest, MITRE's list is fairly easy to read. Rather, my aim is to help the non-tech-savvy crowd understand some of the threats web developers deal with on a day-to-day basis and what can be done to help in that fight.
Since the dawn of coding time, there have been two groups of programmers:
- Insanely smart people who journey to create faster algorithms, more efficient designs, and ironclad code
- Equally smart people who spend every waking moment discovering how to obliterate the work of group #1
It is because of group #2 (herein referred to as "Bad Guys") that security - particularly web security - has become such a massive issue. To make matters worse, Bad Guys tend to play on one of the unfortunate realities of programming: all too often, programmers trust the users of their software.
"But wait!" you say. "Users aren't bad people! Isn't that a tad unfair?" Yes, dear reader; in a perfect world, that would be an extremely unfair statement. However, the real world dictates a different standard: if programmers trust their users, they will be more lax in examining data (such as, say, a form) that the user sends. This lax security, in turn, leads to programmers taking this user-submitted data as is.
How does this relate to Bad Guys? Take another look at MITRE's list. More than half of the most dangerous (and, as far as Bad Guys are concerned, most exploitable) programming errors relate to improper handling and sanitization of user data. This reveals the sad nature of programming: because it does not discriminate, trusting the average user means also trusting Bad Guys. These two are inextricably linked, and Bad Guys have made a living exploiting this relationship.
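To make this concrete, here is a minimal sketch of what "taking user-submitted data as is" looks like in practice, using SQL injection (the attack mentioned above) as the example. The table and the malicious form input are hypothetical, invented purely for illustration:

```python
import sqlite3

# A throwaway in-memory database with a hypothetical 'users' table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

# A Bad Guy types this into a login form instead of a real name:
malicious_input = "nobody' OR '1'='1"

# UNSAFE: trusting the user and pasting their input straight into the query.
# The input rewrites the SQL itself, and the WHERE clause matches every row.
unsafe_query = "SELECT * FROM users WHERE name = '%s'" % malicious_input
print(len(conn.execute(unsafe_query).fetchall()))  # 1 -- matched alice anyway

# SAFE: a parameterized query. The database driver treats the input purely
# as data, never as SQL, so the trick clause is just a weird literal name.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (malicious_input,)
).fetchall()
print(len(rows))  # 0 -- nobody is literally named "nobody' OR '1'='1"
```

The fix is not cleverness; it is refusing to let user data and code share the same string.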
Sadly, the dangers of programming do not stop at programmers. As many of us know, mistakes in coding can often lead to unhappy clients, which in turn affects everyone in the organization. This ties every member of a tech company together, and while the responsibility for secure code ultimately falls to programmers, it behooves every member of that organization to take an interest in development security. All too often, no one but programmers cares about or understands the code that drives a product. Smart organizations realize that this is a road to disaster and that better testing comes about when average users (read: employees other than the developers) are involved.
What does this look like? Here are some ideas that we at Fusionbox try to employ:
- Be insane. In web development (and software development in general), it is no longer sufficient to test a piece of software by using it in the way you would normally use it. The rest of the world thinks differently than you do; a better strategy is needed. Adopting a truly thorough testing scheme requires you to drop your preconceived notions of how the product does or should work.
How? Let loose. Try submitting Japanese characters in a form's text field and see what happens. If an application requires a number, try entering a negative one (or characters other than numbers). Pretend to be a cat and let your paws jump all over the keyboard (we don't recommend that you actually jump on the keyboard). The point is to introduce a hint of randomness into your strategy - this may just reveal some unexpected, yet hideous bug that would not otherwise have been found.
- Learn, learn, learn. I know - the idea of keeping up to speed on tech-savvy topics is about as fun to non-geeks as punching a pit viper. However, it is absolutely vital that non-programmers have a basic understanding of the technologies that drive the products they promote, sell, etc. Not only does this give those individuals a greater appreciation of what programmers do (which, as a programmer, I can say I need), it allows every member of the organization to understand how testing can be improved.
- Think like the bad guys. This one is crucial. As your knowledge of Bad Guy tactics increases, it becomes important to test your products by trying to rip them apart. Sadistically. You need to try to exploit that beautiful piece of code that your development team has created. Yes, it feels dirty; however, much as detectives learn when "thinking like a criminal," understanding your opponents will help you promote ideas that thwart their efforts.
- Get ready for the long haul. Unlike many other tasks, programming is never truly complete. There will always be new vulnerabilities and exploits to worry about. Your programmers are going to be on top of them; they'll also expect you to rinse and repeat your testing assistance. Yes, you have other things to do, but consistent support for your development team will go a long way toward creating a sustainable, viable product.
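The "be insane" advice above can even be automated. Here is a tiny sketch of randomized (fuzz-style) testing: `parse_age` is a hypothetical form-field validator invented for this example, and we pelt it with cat-on-keyboard input to check one simple invariant:

```python
import random
import string

def parse_age(raw):
    """Hypothetical validator: accept ages 0-120, reject everything else."""
    try:
        age = int(raw.strip())
    except (ValueError, AttributeError):
        return None
    return age if 0 <= age <= 120 else None

# Hand-picked "insane" inputs: negatives, huge values, Japanese text,
# whitespace, junk suffixes, and a classic injection string.
weird_inputs = ["-5", "999", "ニャー", "", "  42  ", "12abc",
                "'; DROP TABLE users; --"]

# Plus a hundred random keyboard-mash strings.
random.seed(1)
weird_inputs += ["".join(random.choices(string.printable, k=10))
                 for _ in range(100)]

for raw in weird_inputs:
    result = parse_age(raw)
    # The invariant: whatever we throw at it, the result is either a
    # rejection (None) or a sane age -- never a crash, never garbage.
    assert result is None or 0 <= result <= 120, raw

print("survived", len(weird_inputs), "weird inputs")
```

A loop like this takes minutes to write and runs on every change, so the "rinse and repeat" part of the long haul doesn't have to be done entirely by hand.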
Programmed products are often a beast to deal with. However, with a dedicated team, many of the tricks Bad Guys use can be subverted. And trust me: there is no greater feeling than stopping evil in its tracks.
Ready to make it happen? In the words of Mad-Eye Moody: "CONSTANT VIGILANCE!"