Before joining Core Security as a full-time customer service rep, I spent some time working as a freelance web-app penetration tester. It was generally enjoyable, rewarding work, and I learned quite a bit, not only about the technical aspects of the job, but also about why proactively managing assets and performing regular assessments is truly such an important process to embrace.
One of the most valuable lessons I learned is the importance of looking in dark, dirty places when you want to find potential vulnerabilities that can be easily exploited.
By dark, dirty places I mean that "/old/" directory you still have on your production web site to keep backups of the old version of the site. You know the one, it calls out to you late at night, long after you’ve turned off your computer and gone home and you lie awake thinking: “What could I have possibly forgotten?”
By dark, dirty places I mean that CMS you installed a year ago and never used. Perhaps it’s the same one that can manage the entire site, and that no one is directly maintaining from a security perspective.
By dark, dirty places I mean the password-free admin directory that you named oddly and figured no one would ever find. Yeah, no one; just like that “Dungeons and Dragons” fan site you used to maintain that all your friends still needle you about. (BTW, the picture of you with the robe and pendant is priceless, and it’s currently hanging up on my cubicle wall)
All joking aside, the truth is that just about anything that you haven't looked at in over three months is likely vulnerable, or at least more so than you might have already imagined. Why?
Because the applications in that "/old/" directory will likely give me clues about the current version of your applications, and probably contain unmaintained (read as: vulnerable) code.
Because maybe you never changed the default password on that CMS, and I can use it to deface your web site or gain a better beachhead on the system. Or maybe, since you installed it, there’s been a security update fixing a flaw that allows remote code execution or authentication bypass.
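Checking a forgotten login form for vendor defaults is one of the first things an attacker tries, and it's trivial to script. Here is a minimal sketch of that idea; the credential list and the `stub_check` function are illustrative assumptions, not tied to any particular CMS:

```python
# A few common vendor default credentials (illustrative, not exhaustive).
DEFAULT_CREDS = [("admin", "admin"), ("admin", "password"), ("root", "root")]

def find_default_login(check_login, creds=DEFAULT_CREDS):
    """Try each default credential pair; return the first that works."""
    for user, password in creds:
        if check_login(user, password):
            return (user, password)
    return None

# Stand-in for a real POST to the CMS login form; this stub simulates
# an install whose admin password was never changed.
def stub_check(user, password):
    return (user, password) == ("admin", "admin")

print(find_default_login(stub_check))  # → ('admin', 'admin')
```

The same handful of requests an admin never got around to preventing is all an attacker needs, which is why abandoned installs are such attractive targets.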
Because maybe someone decided to put that password-free admin directory in the robots.txt file so that web spiders wouldn't index it. This actually happened to RIAA.org, and I highly recommend reading the amusing account at http://www.theregister.co.uk/2002/09/21/want_to_know_how_riaa/, especially if you don’t understand why this is a bad idea.
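The problem is that robots.txt is a public file, so every Disallow entry meant to hide a path from spiders also hands that path to an attacker. A short sketch of how easily those entries are harvested (the sample file contents are invented for illustration):

```python
# Hypothetical robots.txt like the RIAA example: the paths someone
# wanted hidden are listed in plain text for anyone to read.
SAMPLE_ROBOTS = """\
User-agent: *
Disallow: /temp/
Disallow: /secret-admin-xyz/
Disallow: /old/
"""

def disallowed_paths(robots_txt):
    """Collect every path named in a Disallow directive."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.strip()
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

# Each "hidden" path becomes a candidate target worth probing.
print(disallowed_paths(SAMPLE_ROBOTS))
# → ['/temp/', '/secret-admin-xyz/', '/old/']
```

In a real assessment you'd fetch the file over HTTP first, but the takeaway is the same: robots.txt is an index of things you'd rather not index.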
I have seen and exploited all of these problems on many occasions, and attackers will too. The key to thwarting them is to be far more aggressive and inclusive about what you keep an eye on from an exposure perspective, and to test frequently so you understand your many points of potential risk. Maintaining acceptable levels of IT security is a complex process in almost any context, but knowing who’s responsible for testing what, and when, goes a long way toward helping yourself out.
Now, not to sound like marketing (no offense, guys!), but Core IMPACT v10 is going to include expanded functionality to help you discover all these vintage artifacts sitting around on your web server, and I for one am pretty excited about it, especially because it uses the engine from one of my favorite tools, Nikto.
I love Nikto because it will not only scan for known vulnerabilities and known indicators of potential issues, but will also enumerate files and directories with common names, finding things like your “/include/” directory and the “backup.sql” file you have sitting around (the contents of which are hanging up next to the picture of you, thanks again for that). You can find more information on it at http://cirt.net/nikto2 if you’re interested.
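The enumeration side of that scan boils down to a very simple loop: request a wordlist of common names and note which ones the server answers for. Here is a minimal sketch of that technique; the wordlist and the `stub_fetch` function are illustrative assumptions, not Nikto's actual database or HTTP engine:

```python
# A handful of common names that forgotten artifacts tend to live under.
COMMON_NAMES = ["/old/", "/include/", "/admin/", "/backup.sql", "/test/"]

def enumerate_paths(base_url, names, fetch):
    """Return the paths the server responds to with HTTP 200."""
    found = []
    for name in names:
        if fetch(base_url + name) == 200:
            found.append(name)
    return found

# Stand-in for a real HTTP GET (e.g. urllib.request); this stub
# simulates a server with a forgotten /old/ directory and a SQL dump.
def stub_fetch(url):
    return 200 if url.endswith(("/old/", "/backup.sql")) else 404

print(enumerate_paths("http://example.com", COMMON_NAMES, stub_fetch))
# → ['/old/', '/backup.sql']
```

Real scanners layer a lot on top of this (response fingerprinting, custom 404 detection, a much larger database), but the core idea really is this straightforward, which is exactly why those forgotten files get found.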
Test regularly. Understand what you’re facing. Catalog your assets and make sure that someone’s responsible for them. Keep vigilant. Most of all, keep on fighting the good fight, info-warriors.
-Dan Crowley, Technical Support Engineer