Chances are you’ve read something, somewhere, about the increasing number of security breaches in web applications. Hackers are targeting these applications due to the amount of sensitive business (and personal) data they control. “Hacktivism” isn’t just a buzzword — groups of thousands of hackers are determined to take down organizations, which are targeted for reasons only the attackers themselves understand. Yet they are serious about it. And it’s time you become serious about it too.
Most organizations don’t think about application security until after the application has been designed and architected. They may layer on security testing after the application is developed. They may decide to test the application in its runtime state, just before it goes to production. Some even test it after it’s in production. Many high-profile companies do this; you can read all about them (and the corresponding security breaches in their applications) in the latest news headlines.
To put the root cause of this issue bluntly, there’s a disconnect between security and development that can lead to serious security vulnerabilities down the road. Developers aren’t security experts, and most security experts aren’t developers. There needs to be a better mutual understanding — and earlier cooperation — between the two parties to better eradicate security issues.
Focus on software quality early and often
Usually, Quality Assurance (QA) is an “after process,” or something that happens late in the development cycle. A project that spans 180 days or more may get a 20-day cycle to perform QA testing, and security might get just three of those days.
This creates several problems, since QA and security are then testing a fully configured application that meets the functional specifications of whatever business unit ordered it. This can pose a serious issue, since it’s the developers who need to fix the problems.
And herein lies the rub: by the time these problems are flagged, the developers have probably closed the project, been paid for meeting the deadlines and specifications of the application and moved on to another project. Re-architecting completed applications to meet new security concerns is likely not at the top of their list.
Engage devs with security issues sooner
So what exactly does “developer-first” security mean?
Developer-first security means ensuring that the people who know the most about the application, and who have the most time and the strongest incentive to get it right, actually get it right!
They can’t be expected to think about all these things at the end of the development cycle, once they’ve moved on to another project. Their processes are all incremental and cumulative (not really an oxymoron!) — and testing has to be, as well.
Most folks have heard that developers resist security: that they don’t have the time, knowledge, or understanding to do all of the requisite legwork and create a great application at the same time. If you were developing a product and forced to tear it apart at the same time, you would likely feel the same way.
Seems logical, right?
Well, there’s more than meets the eye here. How can you expect developers to be testing or security experts, when the testing experts aren’t development experts?
It seems that most of the security companies out there have had “Year of the Developer” or Cross-Site Scripting (XSS) eradication campaigns. Guess what? We’re reaching fewer developers, and there are more flaws out there. Some of these flaws have been around for 15 or more years. Yet these campaigns continue to fail because organizations aren’t engaging development in an understandable manner.
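XSS is a good example of how concrete developer-facing guidance can be. Here is a minimal Python sketch of the flaw and its fix; the function names and payload are illustrative, not from any particular codebase:

```python
import html

def render_comment_unsafe(comment):
    # Vulnerable: user input is interpolated directly into HTML,
    # so a comment containing a <script> tag runs in the browser.
    return f"<p>{comment}</p>"

def render_comment_safe(comment):
    # Fixed: escape user input before it reaches the page.
    return f"<p>{html.escape(comment)}</p>"

payload = "<script>alert('xss')</script>"
print(render_comment_safe(payload))
```

A flaw this old persists not because the fix is hard, but because the finding rarely reaches the developer in terms like these, at the moment they can act on it.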
Learn and speak the language
This isn’t to suggest quid pro quo here. Instead, testing experts should make a greater effort to understand how an application is developed. Give the developers requirements in a manner that they understand, using processes and technology purpose-built for development to test the quality and security of the application—and the underlying software code.
Don’t add something to a project once they’re done, and don’t expect them to be something they’re not. Test the code they write as they write it, and give them the defects in a manner they’re familiar with, either on a daily basis or as they complete their assignments (e.g. code check-in). And give them specific guidance on how to fix the issue in language they understand and in the context of their code.
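As one hedged illustration of what check-in-time feedback might look like, here is a toy Python scanner that flags a few well-known risky patterns and reports them with file and line context a developer can act on immediately. A real tool (a security linter wired into a pre-commit hook or CI) would go far beyond this; the patterns and advice strings here are illustrative assumptions:

```python
import re

# Illustrative patterns only; a production scanner would use real
# static analysis, not line-by-line regular expressions.
RISKY_PATTERNS = {
    r"\beval\(": "eval() on untrusted input can execute arbitrary code",
    r"\bpickle\.loads\(": "unpickling untrusted data can execute arbitrary code",
    r"password\s*=\s*['\"]": "hard-coded credential; load secrets at runtime instead",
}

def scan(filename, source):
    """Return human-readable findings, one per match, with line numbers."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, advice in RISKY_PATTERNS.items():
            if re.search(pattern, line):
                findings.append(f"{filename}:{lineno}: {advice}")
    return findings

# A check-in hook would run this over each changed file and block the
# commit on any findings, while the code is still fresh in the
# developer's mind.
print(scan("example.py", "result = eval(user_input)"))
```

The point is the delivery, not the detection: the defect arrives in the developer’s own workflow, named by file and line, with a suggested fix in the context of their code.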
That’s why more organizations need to switch over to testing that starts with developers — a.k.a. “developer-first” testing. Developers know more about the application than anybody else in a given organization, and can fix it in real time.
Adopt emerging best practices for dev-first security
While many organizations don’t typically think of application security as a functional business requirement, there are a number of growing companies that do.
What separates these organizations from others? They understand development. They can spare a smart developer to coach the team on QA or security. (Think embedded Security Evangelists!) And they employ a number of creative tactics to promote a developer-first approach to security, including training programs and internal incentives, from specialized titles (who doesn’t want to be a “Security Guru”?!) to contests that promote cooperation between development, QA, and security.
Organizations that adopt some or all of these methods will create better, more secure applications.