
It’s that time of year again!  I just got my hands on the 2011 edition of the Verizon/USSS Data Breach Report, and I figured I’d take a moment to share my thoughts.

First of all, note that the scope of the report now includes approximately 800 “incidents” from the year prior; last year’s report was comparable in size, covering 761 events.  Next, I observe that this report is by no means “complete”; while a good deal of the year’s most significant incidents are covered, there are likely thousands of noteworthy data points that were overlooked or otherwise left out.

Now, the report:

The Good - Improvements

Improvements Based upon USSS/Verizon Breach Investigation Report

Verizon has some good news and some bad news; the good news: only 76% of recorded data breach targets were servers in 2010, compared to much higher percentages in 2009 and 2008.  However, this implies that the focus has shifted towards endpoint and social targets, which is very bad news indeed.  Probably nothing ground-breaking at this point, but it demonstrates the consistent challenge corporations face in raising enterprise-wide security awareness; we have erected multi-million-dollar defense systems, and we continue to monitor our logs for interesting traffic, but we cannot fix “people” problems with products.  Additionally, note that, of the breaches reported, we continue to see a steady decline in those involving multiple parties, as well as in business partner attacks.  This is good news for corporations, as it indicates continued success in the technical and business measures used to control outsider access to enterprise resources.

The Bad - Deficiencies

Deficiencies Based upon USSS/Verizon Breach Investigation Report

Next, I’d like to take a look at some of the numbers that rose consistently across the three most recent years.  Specifically, I’d like to dwell on the “Employed Physical Attacks” metric; over a three-year window, this percentage has tripled (with little fluctuation in data set size over the prior two years), suggesting that attackers are responding to our continued focus on technical security by going after the physical layer instead.  While improved technical security may prevent a good deal of data breaches, it is not a holistic solution, and it often leaves “sore thumb” deficiencies elsewhere.

The Ugly - Inconsistent Findings

Trends that are Not Necessarily Consistent Based upon USSS/Verizon Breach Investigation Report

Finally, I’d like to focus on the metrics that seemed to fluctuate between the reports issued in 2009, 2010, and 2011; note that, in 2010, the size of the breach “pool” increased tremendously with the inclusion of the US Secret Service data.  Because of this, I would like to focus primarily on the metrics that rose between the 2010 and 2011 reports.  Most specifically, I am concerned by the HUGE rise in the percentage of breaches discovered by a third party (+25% over one year, +17% over two years).  While I’m sure corporate log monitoring initiatives have started to kick off, what is being done today is NOT enough.  With “blended” attacks on the rise, there is a growing business case for event correlation and collective log management and review; if enterprise shops do not take action on this item, this number will only continue to climb.  On a similar note, I observe that a steady (though slightly rising) portion of the reported breaches have been deemed avoidable, in retrospect, via simple or intermediate controls.  These controls may include password policy enforcement, stateful packet inspection on firewalls, and security-focused Quality Assurance for web application content, among others.  The effectiveness of such measures should be audited periodically.
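
To make “audited periodically” a bit more concrete, here is a minimal Python sketch of what checking one simple control (password age and complexity against policy) might look like.  The CSV export layout, field names, and thresholds are my own assumptions for illustration, not anything prescribed by the report.

```python
# Minimal sketch: audit an exported list of accounts against an assumed
# password policy (90-day refresh, complexity flag). The CSV layout, field
# names, and thresholds are illustrative assumptions.
import csv
from datetime import datetime, timedelta

MAX_PASSWORD_AGE = timedelta(days=90)   # assumed refresh requirement

def audit_accounts(path):
    """Yield (username, reason) for accounts violating the assumed policy."""
    today = datetime.now()
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            last_set = datetime.strptime(row["password_last_set"], "%Y-%m-%d")
            if today - last_set > MAX_PASSWORD_AGE:
                yield row["username"], "password older than 90 days"
            if row["meets_complexity"].strip().lower() != "yes":
                yield row["username"], "password fails complexity requirement"

if __name__ == "__main__":
    for user, reason in audit_accounts("account_export.csv"):
        print(f"{user}: {reason}")
```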

Wrapping up:

  • Shift in focus from Servers to Endpoints and Staff
  • Shift to Physical Compromise, as opposed to Technical
  • Social Compromise percentage consistent between 2009 and 2011 reports, although 2010 report indicates huge increase
  • VAST majority of breaches are avoidable through simple controls
  • Insider attacks are down, as are business partner breaches
  • Third parties are disclosing breaches before first parties

 

Action Items: 

  • Know your assets
    • Accurate, comprehensive, and authoritative inventory is encouraged
    • Not just servers and endpoints, but identity assets as well
    • Prerequisite to the next item:
  • Monitor your logs
    • Consider Event Collaboration & Correlation tools (not necessarily a product or a service; this can be a series of well-crafted scripts, as sketched after this list); note that the return presented by a product can be quite limited, depending on organizational structure.  From my limited perspective, most enterprise organizations need comprehensive identity and asset inventory systems in place to get the most out of vendor SIEM products.  Even with SIM/SEM, individuals still need to review their relevant logs frequently
  • Invest in simple, easily monitored controls (such as account usage policies, password complexity and refresh requirements, etc.)
    • If they are already in place, audit your controls for effectiveness; more importantly, adjust accordingly
  • Continue to raise enterprise awareness of breach indicators; consider random employee awareness drills
  • Continue to raise enterprise awareness of physical security threats, and enforce physical security policies (for example, laptops must be locked and docked within the office)
  • Secure your endpoints; aggregate event logs, AV logs, etc. from workstations into a common environment for review
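
As referenced in the log-monitoring item above, a “series of well-crafted scripts” can go a long way before any SIEM purchase.  The sketch below is one hedged example: it flags source addresses that show up in failure events across more than one log source.  The log locations, formats, and threshold are assumptions for illustration only.

```python
# Rough sketch of a home-grown correlation script: flag source IPs that
# appear in failure events across more than one log source.
# Log locations, formats, and the threshold are illustrative assumptions.
import re
from collections import defaultdict

LOG_SOURCES = {
    "vpn": "/var/log/vpn.log",
    "web": "/var/log/httpd/access.log",
    "auth": "/var/log/auth.log",
}
FAILURE_PATTERN = re.compile(r"(failed|denied|invalid)", re.IGNORECASE)
IP_PATTERN = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def correlate(sources=LOG_SOURCES):
    """Return {ip: set(source names)} for IPs seen in failure events."""
    sightings = defaultdict(set)
    for name, path in sources.items():
        with open(path, errors="replace") as handle:
            for line in handle:
                if FAILURE_PATTERN.search(line):
                    for ip in IP_PATTERN.findall(line):
                        sightings[ip].add(name)
    return sightings

if __name__ == "__main__":
    for ip, seen_in in correlate().items():
        if len(seen_in) > 1:  # the same address failing against multiple systems
            print(f"review {ip}: failures seen in {', '.join(sorted(seen_in))}")
```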

Grab the full report here


Defcon 201 North New Jersey (DC201) Chapter Goes Live!

by Jason Stultz on October 16, 2010


It is my pleasure to announce the go-live of the DefCon Group 201 (DC201) North New Jersey chapter. After quite a bit of planning, we are excited to hold our inaugural meeting in Paramus on Thursday, December 2, 2010! For more details, such as the time and location, please check the website – http://www.dc201.org/ – we will post additional details there as they become available.

From dc201.org:
DefCon Group 201 (DC201) is a meeting for hackers, industry professionals, technophiles, and anyone with an interest in technology and its continuously changing role in today’s world. Lectures, panels, and other presentations are of a highly technical nature and are geared towards giving attendees both in-depth knowledge of theory as well as practical skills that can be put to use in real-world scenarios.


Google Boards The Multi-Factor Bandwagon

by Jason Stultz on September 22, 2010


Recently, Google announced that they have made “strong” authentication available to Google Apps users by way of a downloadable soft token. As per the announcement, this service will cost nothing to enable. While Google Apps already supported multi-factor authentication via SmartCards and hardware tokens, the move to software tokens represents a significant convenience boost. No more keyfobs or SmartCards means potentially more willing customers…

While I support the jump towards convenience, I am still a bit concerned. SmartPhones are, and will continue to be, a target for malicious users; every day, naive, non-technical users unknowingly install rootkits (read: jailbreaks) on their mobile devices and remove intrinsic security functionality. Typically, these “jailbreak”-type exploits leave the system open, with default access credentials, and ready for compromise at any moment. For example, many of the iPhone jailbreak tools create an SSH listener with “alpine” as the root password. Knowing this, it shouldn’t be too difficult to compromise a few thousand smartphones; I wonder how many of those folks are using soft tokens.
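
For administrators who want to verify that their own managed devices are not sitting in exactly this state, a rough defensive sketch follows. It assumes the third-party paramiko SSH library and a plain-text inventory of device addresses, and it should only ever be pointed at devices you own or manage.

```python
# Defensive sketch: check managed devices for an SSH listener that still
# accepts the well-known default root credential. The inventory file name
# and the use of paramiko are assumptions; run only against devices you manage.
import paramiko

DEFAULT_USER = "root"
DEFAULT_PASSWORD = "alpine"  # default set by several iPhone jailbreak tools

def accepts_default_credential(host, port=22, timeout=5):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    try:
        client.connect(host, port=port, username=DEFAULT_USER,
                       password=DEFAULT_PASSWORD, timeout=timeout,
                       allow_agent=False, look_for_keys=False)
        return True
    except Exception:  # auth failure, no listener, timeout, etc.
        return False
    finally:
        client.close()

if __name__ == "__main__":
    with open("device_inventory.txt") as handle:
        for host in (line.strip() for line in handle if line.strip()):
            if accepts_default_credential(host):
                print(f"{host}: accepts default credential -- quarantine and remediate")
```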

A key fob, by contrast, is a dedicated, single-purpose device, and will typically not be vulnerable to such an attack.

One possible control for corporations would be to require that any SmartPhone on which the soft token is to be installed comply with accepted software build requirements, both before being permitted to install the token application and on an ongoing basis afterwards. Even this can prove difficult to manage.
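
To make that idea concrete, here is a hypothetical sketch of such a gate. The device attributes, baseline values, and policy checks are illustrative assumptions; in practice they would come from whatever mobile device management or inventory system the organization already runs.

```python
# Hypothetical sketch of a build-compliance gate consulted before (and
# periodically after) a soft token is provisioned to a device. The Device
# fields, baseline values, and checks are illustrative assumptions, not a
# real MDM API.
from dataclasses import dataclass

@dataclass
class Device:
    owner: str
    os_version: str
    jailbroken: bool          # as reported by the management agent
    passcode_enabled: bool
    days_since_checkin: int

MIN_OS_VERSION = "2.2"        # assumed corporate baseline
MAX_CHECKIN_AGE_DAYS = 7

def version_tuple(version):
    return tuple(int(part) for part in version.split("."))

def token_install_allowed(device):
    """Return True only if the device meets the assumed build requirements."""
    return (
        not device.jailbroken
        and device.passcode_enabled
        and version_tuple(device.os_version) >= version_tuple(MIN_OS_VERSION)
        and device.days_since_checkin <= MAX_CHECKIN_AGE_DAYS
    )

if __name__ == "__main__":
    phone = Device(owner="jdoe", os_version="2.2.1", jailbroken=False,
                   passcode_enabled=True, days_since_checkin=2)
    print("token provisioning permitted" if token_install_allowed(phone)
          else "token provisioning denied")
```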

What do you see as an effective means to control soft token distribution? Beyond that, what do you see as a strong SmartPhone security policy (without eliminating them entirely)?

The token software is currently available for BlackBerry and Android phones, and is projected to be available for the iPhone in the near future.

PCMag Commentary on The Announcement
Another Interesting Article Regarding Soft Tokens



Recently I was tweeting away, and I noticed a comment:

@******** See, security should be taught in CS101. It not an immediate fix, but imagine how much better things would be in 10 or 15 years.

I thought to myself: “Isn’t basic, baked-in security an essential practice, not only for the sake of security, but for general functionality?”  Then I went back to my college days, or what I could remember of them; secure coding was part of the curriculum.  In my intro-level programming class, we were taught to trust nothing, be it an automated feed or manual user input.  If an input field was not populated in the proper format, our code was to catch it and throw an exception.  This was taught as a measure to ensure that, if the code became part of a “bigger picture” application, there could be quick resolution in the event of an application failure.  However, this basic functionality was also basic SECURITY.  Now, my curriculum never spelled out that omitting this functionality could result in “arbitrary code execution” or “unauthorized access to confidential resources”; it was only clarified that such coding shortcomings could lead to functionality issues.  Perhaps if the security impact were spelled out, we might see better-rounded developers in the future.  I seriously doubt it, however… functionality is frequently ranked by developers as more important than security.
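
As a toy illustration of that “trust nothing” lesson, here is a minimal sketch in Python; the field being validated (a US ZIP code) and the exception type are my own choices, not anything from a particular curriculum.

```python
# Toy illustration of "trust nothing": validate an input field against the
# expected format and raise rather than passing bad data downstream.
import re

ZIP_PATTERN = re.compile(r"^\d{5}(-\d{4})?$")

class InvalidInputError(ValueError):
    """Raised when a field does not match the expected format."""

def parse_zip_code(raw):
    value = raw.strip() if isinstance(raw, str) else ""
    if not ZIP_PATTERN.match(value):
        # Fail loudly here instead of letting malformed (or crafted) input
        # flow into the rest of the application.
        raise InvalidInputError(f"not a valid ZIP code: {raw!r}")
    return value

if __name__ == "__main__":
    print(parse_zip_code("07652"))               # prints the validated value
    try:
        parse_zip_code("'; DROP TABLE users--")  # malformed / hostile input
    except InvalidInputError as err:
        print(f"rejected: {err}")
```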

But wait!  At this point, a good deal of developers are NOT college-educated, and would NOT have this formal educational background.  I could spend days preaching about how corporations should favor formally educated individuals when hiring, or about how they should educate employees on their own, but that’s not the real issue.  Every day, I see exploit PoCs written by folks who may or may not have a formal education (trust me, many do not).  Good developers and programmers know functional code, and how to render it.  The true challenge is separating the good coders from the not-so-good… and from the apathetic “this is just my day job” type.  This could be integrated into employee rating systems by way of an application security audit regimen, at the discretion of the employer.

As far as the mechanisms by which this is accomplished, I shall forever continue to press for mandatory code audits, with proper (read: secure, effective, and available) functionality in mind.  Every code object should be tracked from declaration to destruction, and both manual (code audit, QA, etc.) and automated methods (fuzzers, vulnerability scanning tools, etc.) should be used to test the code for errors.  If you haven’t got the time to do it right the first time, what makes you think you’ll have the time to do it right when it gets compromised?
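
For the automated half of that, here is a naive sketch of a fuzz-style check against the hypothetical validator from the previous example (its definitions are repeated so the snippet stands alone): the function must either return a well-formed value or raise the expected exception, and anything else counts as a defect worth investigating.

```python
# Naive fuzz-style harness for the hypothetical parse_zip_code validator
# shown earlier (definitions repeated here so the snippet stands alone).
import random
import re
import string

ZIP_PATTERN = re.compile(r"^\d{5}(-\d{4})?$")

class InvalidInputError(ValueError):
    pass

def parse_zip_code(raw):
    value = raw.strip() if isinstance(raw, str) else ""
    if not ZIP_PATTERN.match(value):
        raise InvalidInputError(f"not a valid ZIP code: {raw!r}")
    return value

def random_input(max_len=16):
    return "".join(random.choice(string.printable)
                   for _ in range(random.randint(0, max_len)))

def fuzz(iterations=10000):
    for _ in range(iterations):
        candidate = random_input()
        try:
            result = parse_zip_code(candidate)
            assert ZIP_PATTERN.match(result), f"accepted malformed value {result!r}"
        except InvalidInputError:
            pass  # expected rejection path
    print(f"{iterations} iterations completed without unexpected failures")

if __name__ == "__main__":
    fuzz()
```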

Good code is always secure code; secure code is sometimes good code.

</rant>

My Twitter feed can be found here:

http://twitter.com/_jBlog_


Verizon and USSS Release 2010 Data Breach Report

by Jason Stultz on July 29, 2010


Recently Verizon, in collaboration with the United States Secret Service, released their 2010 Data Breach Report.  I would like to take a moment to share my praise, concerns, and general findings.

I’ll begin with business practice findings.  In the past, it was emphasized that there was a gap in termination procedures as they pertain to removing access to network assets.  Based upon the metrics brought forth in this report (an astounding 26% increase in breaches attributed to “insider” threats), this is still a persistent issue.  Another concern arises when one mentions segregation of duties; trusted “insiders” often have unhindered (or UNDERhindered) access to a broad pool of resources.  As long as corporations fail to recognize this and to provide corresponding resource access controls and limitations, this will continue to be an issue.

Interestingly enough, the percentage of breaches implicating business partners has dropped by 23%.  One may attribute this to increased business awareness and to the legal controls implemented during the contract phase over the past year.  If this trend continues (which it should, as the public is more aware than ever of the threats “in the wild”), this number should continue to drop, albeit at a decreasing rate.

Additionally, the report indicates that a full 48% (a 26% increase) of breaches discovered over the past reporting period involved privilege misuse to some extent, while only 40% involved “hacking” proper (-24%).  This makes it obvious that nefarious users do not necessarily have to be “hackers,” and may instead employ conventional information-gathering tactics to procure sensitive data.  This can be attributed to the presence of the inevitable “human layer,” and can only be mitigated through a strong, broad-scale employee education policy.  If the point is still unclear, it was reported that 28% of breaches (a sizable increase since 2009) made use of social engineering tactics at some point.  A corporation may have the most “locked-down” and “secure” internet presence, yet a loose-lipped employee may still unknowingly play a role in facilitating a data breach.

On a rather interesting (read: disturbing) note, 79% of reported victims that were subject to the Payment Card Industry Data Security Standard (PCI-DSS) had NOT achieved compliance, and 86% of breaches were preventable via reasonable, simple-to-intermediate controls.  While PCI may only provide a baseline data security model, following the standard ensures that basic defense mechanisms are in place; and, if a breach does happen, the standard ensures that the incident will at least be tracked to some extent.  On a somewhat related note, 86% of breach victims had substantial evidence in their logs, yet 61% of breaches were reported by a third party.  This indicates to me that log correlation/SIEM tools are either not in place or under-used in many environments; avoid becoming a victim by implementing a strong log review policy.  The burden of sorting through logs can be eased significantly by common string parsing tools.  Some examples of commercial-grade log/event correlation and management tool vendors include LogLogic, ArcSight, and Q1 Labs.  By the way, PCI requirement 10.6 mandates regular log review.
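
To show how far “common string parsing tools” can go before (or alongside) a SIEM purchase, here is a small sketch that summarizes notable event categories from an aggregated log for daily review; the log path and category patterns are assumptions for illustration.

```python
# Small sketch of "common string parsing" for daily log review: count the
# notable event categories in an aggregated log so a human can triage them.
# The log path and category patterns are illustrative assumptions.
import re
from collections import Counter

AGGREGATED_LOG = "/var/log/aggregated/events.log"
CATEGORIES = {
    "failed logon": re.compile(r"fail(ed|ure).*(logon|login|auth)", re.IGNORECASE),
    "privilege change": re.compile(r"(sudo|privilege|admin group)", re.IGNORECASE),
    "malware detection": re.compile(r"(virus|malware|quarantine)", re.IGNORECASE),
}

def summarize(path=AGGREGATED_LOG):
    counts = Counter()
    with open(path, errors="replace") as handle:
        for line in handle:
            for label, pattern in CATEGORIES.items():
                if pattern.search(line):
                    counts[label] += 1
    return counts

if __name__ == "__main__":
    for label, count in summarize().most_common():
        print(f"{label}: {count} events since last review")
```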

As far as demographics are concerned, the report continues to indicate that the focus of data breaches remains within the Financial Services, Hospitality, and Retail sectors.  This does not surprise me, and it should not surprise anybody; Cash is King.  Note, however, that this may be attributed in part to the fact that, in the United States (the primary source of the data contained within this report), these sectors are subject to strict breach reporting requirements under regulatory standards such as PCI and HIPAA.

On a closing note, the report indicates that approximately 13% of the reported breach cases involved organizations that had recently been through a merger or acquisition (as opposed to 9% in 2009).  This points to the all-too-obvious truth that, in the flurry that accompanies large-scale corporate change, security assurance is frequently sacrificed.  Based upon this report, I believe that, in a world where cyber crime continues to rise, large companies need to take a moment to smell the coffee: making small sacrifices in project deadlines and procuring additional software resources (e.g. log correlation tools, which are useful for far more than just security) can help ensure their bottom lines are not only met but exceeded, all while maintaining brand stability.

The 2010 report may be found here

Verizon’s 2009 report (not collaborated with USSS) may be found here


Default Web Pages – and Why You Should Eliminate Them

July 8, 2010

Default web pages are bad, especially when you don’t even know they exist. Here’s why.

Read the full article →