Recently I was tweeting away, and I noticed a comment
@******** See, security should be taught in CS101. It's not an immediate fix, but imagine how much better things would be in 10 or 15 years.
I thought to myself: “isn’t basic baked-in security an essential practice – not only for the sake of security, but for general functionality?” Then I thought back to my college days, or what I could remember of them; secure coding was part of the curriculum. In my intro-level programming class, we were taught to trust nothing – be it an automated feed or manual user input. If an input field was not populated in the proper format, our code was to catch it and throw an exception. This was taught as a measure to ensure that, if the code became part of a “bigger picture” application, an application failure could be resolved quickly.

However, this basic functionality was also basic SECURITY. Now, my curriculum never spelled out that omitting this functionality might result in “arbitrary code execution” or “unauthorized access to confidential resources” – but it was made clear that such coding shortcomings could lead to functionality issues. Perhaps if the security impact were spelled out as well, we would see better-rounded developers in the future. I seriously doubt it, however… developers frequently rank functionality as more significant than security.
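To make that “trust nothing” lesson concrete, here is a minimal sketch of the kind of validation we were taught – reject malformed input up front and throw an exception, rather than letting bad data propagate into the larger application. The field names and formats here are hypothetical, purely for illustration:

```python
import re

# Hypothetical format rule: a very rough email shape check (illustration only)
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def parse_record(raw: dict) -> dict:
    """Validate one input record, whether it came from a feed or a form.

    Raises ValueError on any field that does not match the expected
    format, so the caller fails fast instead of processing bad data.
    """
    email = raw.get("email", "")
    if not EMAIL_RE.match(email):
        raise ValueError(f"malformed email field: {email!r}")

    age = raw.get("age", "")
    if not (age.isdigit() and 0 < int(age) < 130):
        raise ValueError(f"age out of range or non-numeric: {age!r}")

    return {"email": email, "age": int(age)}
```

The same check that keeps the “bigger picture” application from crashing on a garbage feed is also what keeps attacker-controlled input from wandering somewhere it shouldn’t.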
But wait! At this point, a good many developers are NOT college-educated and do NOT have this formal background. I could spend days preaching about how corporations should favor formally-educated individuals when hiring, or about how they should train employees themselves, but that’s not the real issue. Every day, I see exploit PoCs written by folks who may or may not have a formal education (trust me, many do not). Good developers and programmers know functional code, and how to produce it. The true challenge is separating the good coders from the not-so-good… and from the apathetic “this is just my day job” type. Employers could fold this into their employee rating systems through a regular application security audit regimen.
As for the mechanisms by which this is accomplished, I shall forever continue to press for mandatory code audits, with proper (read: secure, effective, and available) functionality in mind. Every code object should be tracked from declaration to destruction, and both manual methods (code audit, QA, etc.) and automated methods (fuzzers, vulnerability scanning tools, etc.) should be used to test the code for errors. If you haven’t got the time to do it right the first time, what makes you think you’ll have the time to do it right when it gets compromised?
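On the automated side, even a toy fuzzer illustrates the idea: hammer a parser with random input and flag anything other than a clean, expected rejection. This is a naive sketch, not a real fuzzing tool – the `parse` callback and the choice of `ValueError` as the “expected” rejection are assumptions for illustration:

```python
import random
import string

def fuzz(parse, rounds=1000, seed=0):
    """Throw random printable strings at `parse` and collect surprises.

    An expected ValueError means the input was cleanly rejected;
    any other exception is an unexpected crash worth auditing,
    since in real code such crashes often point at exploitable bugs.
    """
    rng = random.Random(seed)  # seeded so failures are reproducible
    failures = []
    for _ in range(rounds):
        blob = "".join(rng.choice(string.printable)
                       for _ in range(rng.randint(0, 40)))
        try:
            parse(blob)
        except ValueError:
            pass  # clean rejection of malformed input: the desired behavior
        except Exception as exc:
            failures.append((blob, exc))  # unexpected crash: log it for audit
    return failures
```

Real-world fuzzers (coverage-guided ones especially) are far smarter about generating inputs, but the pass/fail logic – “did it reject cleanly, or did it fall over?” – is the same.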
Good code is always secure code; secure code is sometimes good code.
My Twitter feed can be browsed via this link: