
The OWASP Application Security Metrics Project

Bob Austin
Application Security Metrics Project Lead
KoreLogic, Inc.
bob.austin@korelogic.com
804.379.4656

OWASP AppSec Seattle
Oct 2006

Copyright © 2006 - The OWASP Foundation
Permission is granted to copy, distribute and/or modify this document
under the terms of the Creative Commons Attribution-ShareAlike 2.5
License. To view this license, visit
http://creativecommons.org/licenses/by-sa/2.5/

The OWASP Foundation
http://www.owasp.org/
Presentation Objectives

 Drivers for Security Metrics
 Review the Project Plan: Work Accomplished to Date, Next Steps
 Provide Application Security Metrics Resources
 Solicit Feedback and Participation
OWASP AppSec Seattle 2006 2


The Best Metrics… Can Answer Hard Questions

 How secure am I?
 Am I better than this time last year?
 Am I spending the right amount of money?
 How do I compare to my industry peers (senior management's favorite question)?

Source: Dr. Dan Geer

Forrester Survey: "What are your top three drivers for measuring information security?"

 Justification for security spending: 63%
 Regulations: 51%
 Loss of reputation: 37%
 Better stewardship: 26%
 Report progress to business managers: 23%
 Manage risk: 11%

Base: 40 CISOs and senior security managers.
Source: "Measuring Information Security Through Metrics And Reporting", Forrester Research, Inc., May 2006
Forrester Survey: What do CISOs want to measure?

"As a CISO, if you have a choice of measuring and monitoring up to five areas in security, which ones would you measure?"

 Regulatory compliance: 62%
 Incident handling and response: 59%
 Corporate governance: 55%
 Risk management process adherence: 52%
 Security awareness: 52%
 Vulnerability and patch management: 52%
 Application security: 34%

Base: 34 CISOs and senior security managers.
Source: "Measuring Information Security Through Metrics And Reporting", Forrester Research, Inc., May 2006
Project Goal and Roadmap

Project Goal: Address the current lack of effective application security metrics by identifying, sharing and evolving useful metrics and metric processes to benefit the OWASP community.

Current Project Contributors: Jeff Williams (Aspect Security), Cliff Barlow (KoreLogic), Matt Burton (Mitre)

Phase One: Develop Project Approach ➨ Conduct research; identify leading practices, standards ➨ Develop/Conduct Initial Survey ➨ Publish Survey Results

Current Project Status

Phase Two: Solicit OWASP Feedback and Perform Gap Analysis ➨ Identify Short List of Needed Metrics ➨ Create Approach to Develop Metrics

http://www.owasp.org/index.php/Category:OWASP_Metrics_Project
Phase One – Application Security Metrics Baseline Survey Plan

Information Capture ➨ Analysis ➨ Survey Results

Information Capture:
 Interviews
 Research
 Metrics Survey

Analysis:
 Identify key findings (common themes, barriers, concerns)
 Assess survey participant-provided metrics for applicability for OWASP community use

Survey Results:
 Key findings (from survey results)
 Application security metrics
 Provide set of "best practices" associated with establishing an application security metrics program
 Use results to design Phase Two of the Project

http://www.owasp.org/index.php/Metrics_Survey_Form


Useful Resources from Research
 OWASP CLASP Project – “Monitor Security Metrics”
 Dr. Dan Geer’s “Measuring Security” Tutorial
 Other Initiatives: Securitymetrics.org, Metricon 1.0
 Secure Software Development Life Cycle:
   "The Security Development Lifecycle", Howard and Lipner
   "Security in the Software Lifecycle", DHS, Cybersecurity Div.
 Information Security Metrics Standard - ISO 27004
 Dr. Larry Gordon, Cybersecurity Economics Research Projects
 Resources from NIST:
   Security Metrics Guide for Information Technology Systems
   Guide for Developing Performance Metrics for Information Security
 NIST Software Assurance Metrics and Tool Evaluation (SAMATE)



Organizing Metric Types

Process Metrics: Information about the processes themselves. Evidence of maturity.
Examples:
 Secure coding standards in use
 Avg. time to correct critical vulnerabilities

Vulnerability Metrics: Metrics about application vulnerabilities themselves.
Examples:
 By vulnerability type
 By occurrence within a software development life cycle phase

Management Metrics: Metrics specifically designed for senior management.
Examples:
 % of applications that are currently security "certified" and accepted by business partners
 Trending: critical unresolved, accepted risks
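The three metric types above lend themselves to a simple tagged catalog. As a minimal sketch (the metric entries, field names, and grouping code here are all illustrative, not part of the OWASP project):

```python
from dataclasses import dataclass

# Illustrative sketch only: metric names and fields are invented,
# not taken from the OWASP Metrics Project.
PROCESS, VULNERABILITY, MANAGEMENT = "process", "vulnerability", "management"

@dataclass
class Metric:
    name: str
    metric_type: str  # one of the three types described above
    unit: str         # e.g. "days", "percent", "count"

catalog = [
    Metric("secure coding standards in use", PROCESS, "boolean"),
    Metric("avg. time to correct critical vulnerabilities", PROCESS, "days"),
    Metric("findings by vulnerability type", VULNERABILITY, "count"),
    Metric("% of applications security-certified", MANAGEMENT, "percent"),
]

# Group the catalog by type, mirroring the organization above.
by_type = {}
for m in catalog:
    by_type.setdefault(m.metric_type, []).append(m.name)

print(sorted(by_type))  # ['management', 'process', 'vulnerability']
```

A tagged catalog like this makes it easy to report each audience's metrics separately while collecting them in one place.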


Opportunities for Metrics – Secure Development Life Cycle (SDL)

Were software assurance activities conducted at each lifecycle phase?

Lifecycle phases: Concept ➨ Designs Complete ➨ Test Plans Complete ➨ Code Complete ➨ Deploy ➨ Post Deployment

Assurance activities along the timeline (each a metrics opportunity): team member training (on-going), secure questions during interviews, threat analysis, security & least-privilege review, external review, data mutation tests, secure coding guidelines, use tools, check-ins checked, review old defects, security push/audit, learn & refine.

Source: Microsoft
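The slide's guiding question ("were assurance activities conducted at each phase?") reduces naturally to a coverage percentage. A minimal sketch, where the phase names follow the slide but the per-phase completion data is invented:

```python
# Sketch only: phase names follow the slide; completion data is invented.
phases = ["concept", "designs complete", "test plans complete",
          "code complete", "deploy", "post deployment"]
activities_done = {
    "concept": True, "designs complete": True, "test plans complete": False,
    "code complete": True, "deploy": True, "post deployment": False,
}

# One simple SDL process metric: fraction of phases with evidence
# that assurance activities were actually conducted.
coverage = 100.0 * sum(activities_done[p] for p in phases) / len(phases)
print(f"SDL phase coverage: {coverage:.0f}%")  # SDL phase coverage: 67%
```

Tracked release over release, a number like this turns the yes/no audit question into a trend a team can act on.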
Examples of Application Security Metrics

Process Metrics
 Is an SDL process used? Are security gates enforced?
 Secure application development standards and testing criteria?
 Security status of a new application at delivery (e.g., % compliance with organizational security standards and application system requirements)
 Existence of developer support website (FAQs, code fixes, lessons learned, etc.)?
 % of developers trained, using organizational security best practice technology, architecture and processes

Management Metrics
 % of applications rated "business-critical" that have been tested
 % of applications which business partners, clients, regulators require be "certified"
 Average time to correct vulnerabilities (trending)
 % of flaws by lifecycle phase
 % of applications using centralized security services
 Business impact of critical security incidents
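Two of the management metrics above are straightforward to compute from an application inventory. A hedged sketch (the inventory, field names, and fix-time data are invented; only the two metric definitions come from the slide):

```python
# Illustrative sketch: the inventory and field names are invented.
apps = [
    {"name": "billing",  "business_critical": True,  "security_tested": True},
    {"name": "portal",   "business_critical": True,  "security_tested": False},
    {"name": "intranet", "business_critical": False, "security_tested": True},
]

# "% of applications rated business-critical that have been tested"
critical = [a for a in apps if a["business_critical"]]
pct_tested = 100.0 * sum(a["security_tested"] for a in critical) / len(critical)
print(f"business-critical apps tested: {pct_tested:.0f}%")  # 50%

# "Average time to correct vulnerabilities", from invented fix times in days.
fix_times_days = [14, 30, 7, 21]
avg_fix = sum(fix_times_days) / len(fix_times_days)
print(f"avg days to correct: {avg_fix:.1f}")  # 18.0
```

The same pattern extends to the other percentage-style metrics on the slide once the inventory records the relevant attribute.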


Examples of Application Security Metrics

Vulnerability Metrics
 Number and criticality of vulnerabilities found
 Most commonly found vulnerabilities
 Reported defect rates based on security testing (per developer/team, per application)
 Root cause of "Vulnerability Recidivism"
 % of code that is re-used from other products/projects*
 % of code that is third party (e.g., libraries)*
 Results of source code analysis**:
   Vulnerability severity by project, by organization
   Vulnerabilities by category by project, by organization
   Vulnerability +/- over time by project
   % of flaws by lifecycle phase (based on when testing occurs)

Source: * WebMethods, ** Fortify Software
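Several of the vulnerability metrics above are roll-ups over raw findings. A minimal sketch, assuming an invented findings list (the records and category names are illustrative, not real scan output):

```python
from collections import Counter

# Illustrative sketch: findings records are invented; the roll-ups
# mirror the vulnerability metrics listed above.
findings = [
    {"project": "portal",  "category": "XSS",  "severity": "high"},
    {"project": "portal",  "category": "SQLi", "severity": "critical"},
    {"project": "portal",  "category": "XSS",  "severity": "medium"},
    {"project": "billing", "category": "CSRF", "severity": "high"},
]

# "Most commonly found vulnerabilities"
by_category = Counter(f["category"] for f in findings)
print(by_category.most_common(1))  # [('XSS', 2)]

# "Vulnerability severity by project"
by_project_severity = Counter((f["project"], f["severity"]) for f in findings)
print(by_project_severity[("portal", "high")])  # 1
```

Comparing the same counters across successive scans gives the "+/- over time by project" trend with no extra machinery.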
The Path Forward

 Complete KoreLogic-sponsored surveys
 Encourage others to complete survey forms
 Create metrics taxonomy. Test drive it.
 Collaborate/share with other metrics initiatives
 "Will Work for Metrics": volunteers needed!
   Solicit survey participants. Collect survey data.
   Help analyze survey data
   Donate useful application security metrics
   Help plan Phase Two


Our Security Metrics Challenge

 "A major difference between a 'well developed' science such as physics and some of the less 'well-developed' sciences such as psychology or sociology is the degree to which things are measured."
Source: Fred S. Roberts, ROBE79

 "Give information risk management the quantitative rigor of financial information management."
Source: CRA/NSF, 10 Year Agenda for Information Security Research, cited by Dr. Dan Geer


Supplemental Slides and Metrics Resources



Resources – Security Metrics
 Security Metrics Standards:
   ISO 27004 - a new ISO standard on Information Security Management Measurements
 Other metrics initiatives - Securitymetrics.org
   Metricon 1.0 presentations, http://www.securitymetrics.org/content/Wiki.jsp?page=Metricon1.0
 Dan Geer's "Measuring Security" tutorial (PDF), http://geer.tinho.net/usenix
 Developing metrics programs:
   Security Metrics Guide for Information Technology Systems, http://csrc.nist.gov/publications/nistpubs/800-55/sp800-55.pdf
   Guide for Developing Performance Metrics for Information Security, http://csrc.nist.gov/publications/drafts/draft-sp800-80-ipd.pdf
   Establishing an Enterprise Application Security Program, Tony Canike, OWASP 2005
 Metrics-related Tools:
   NIST Software Assurance Metrics and Tool Evaluation (SAMATE), http://samate.nist.gov/index.php/Main_Page
 Metrics-related Models, Frameworks:
   http://www.sse-cmm.org/model/model.asp
 Current Articles on Metrics:
   www.csoonline.com/metrics/index.htm
 Metric-related Financial and Econometric Resources:
   Economics and Security Resource Page, Ross Anderson, http://www.cl.cam.ac.uk/~rja14/econsec.html
   Dr. Larry Gordon, University of Maryland, Cybersecurity Economics Research Projects, http://www.rhsmith.umd.edu/faculty/lgordon/Cybersecurity%20Economics%20Research%20Projects.html


Resources – Software Assurance
 "A Clinic to Teach Good Programming Practices", Matt Bishop, http://nob.cs.ucdavis.edu/bishop/talks/2006-cisse-2/clinic.html
 Team Software Process for Secure Systems (TSP-Secure), http://www.sei.cmu.edu/tsp/tsp-security.html
 OMG's Software Assurance Workshop 2007, http://www.omg.org/news/meetings/SWA2007/call.htm
 DHS Cybersecurity Division Software Assurance Initiatives:
   Software Assurance Measurement Workshop, Oct 2006
   Software Assurance Program, http://www.psmsc.com/UG2006/Presentations/11_DHS_SwA_Overview_for_PSM.pdf
   Software Assurance Forum, https://buildsecurityin.us-cert.gov/daisy/bsi/events/521.html
 CERT Secure Coding Standards, https://www.securecoding.cert.org/confluence/display/seccode/CERT+Secure+Coding+Standards
 CRA Conference on "Grand Research Challenges in Information Security & Assurance", http://www.cra.org/reports/trustworthy.computing.pdf


Resources – General Software Measures & Metrics
 Measures and Metrics Web Sites, http://www.stsc.hill.af.mil/crosstalk/1999/06/measuresites.asp
 Software Process Metrics Organizations:
   http://www.totalmetrics.com/cms/servlet/main2?Subject=List&ID=3
   http://www.swmetrics.org/ (Software Metrics Symposium)
 Capability Maturity Model Integration (CMMI)
 Tenth Annual PSM Users' Group Conference, Performance and Decision Analysis, http://www.psmsc.com/UsersGroup2006.asp
 History of Software Measurement (Horst Zuse), http://irb.cs.tu-berlin.de/~zuse/metrics/History_00.html
 NASA Software Engineering Laboratory, Experience Factory, http://sel.gsfc.nasa.gov/website/exp-factory.htm
 ISO/IEC 15939, Software Engineering - Software Measurement Process
 Software Metrics Glossary, http://www.totalmetrics.com/cms/servlet/main2?Subject=List&ID=12
 2006 State of Software Measurement Practice Survey, http://www.sei.cmu.edu/sema/presentations/stateof-survey.pdf


Really Bad Metrics Advice
 According to my data, roughly 122.45 percent of this journal's 347,583,712 readers need some sharpening up on
how to effectively collect and use metrics. There is less than a 0.0345 percent chance that this column will help.
 Q: I'm a manager who believes in keeping metrics simple, which is why I've limited the number we collect to 62.
But I also want to simplify their collection—do you know where I can find timecard readers designed for bathroom
stalls?
 A: Try voice print-activated stalls with timed door locks. But first, are you really trying to collect 62 metrics? 62?
[snicker snort chortle] You're obviously clueless about the "KISS" principle: Keep It Stupefyingly Strenuous. You can
collect a lot more than 62 different metrics. The accepted rule of thumb for the number of metrics you can
reasonably work with is this: "Seven, plus or minus the square of the number of door knobs in your home."
Remember, if something can be measured, it must be measured, and all metrics are equally critical.
 Q: I feel vindicated. Now I can introduce additional metrics for every obscure area of our process improvement
model. Naturally, I plan to drop the whole wad as an enforced edict and then make myself unavailable for a few
weeks.
 A: Bravo! But be sure you don't overcomplicate things by defining every minute detail, such as data integrity
standards or what you plan to do with the data. People learn nothing from constant handholding. Your job is to sit
back and wait for those reliable numbers to start pouring in.
 Q: Great! What do you suggest I do with all that data?
 A: What should you do with the data? Do? That question implies that metrics are a means to some end. Don't
waste resources—time spent analyzing metrics is time that could have been spent collecting even more metrics.
 Q: My boss keeps asking for data on stuff I don't think can be quantified—and it's often common sense stuff he
could just ask us! Aren't metrics just a big sham?
 A: Shhh! You're right, metrics are actually an extensive conspiracy—but an extremely helpful one. When people
want to make decisions based on "facts" rather than "opinions," you need metrics to push your personal agenda
under the guise of unassailable objectivity. Perception is everything:
 Politicized emotional drivel: "Let's try my approach. Her plan isn't working."
 Objective insight: "A consumptive analysis of my plan projects an 84.67 percent increased density of pro-active
rationals within six months. However, her key preambulatory vindicators are creating a 24.38 percent downward
sloping polymorphic trend. Plus, she wears really cheesy business suits."

Source: http://www.stsc.hill.af.mil/crosstalk/1998/08/backtalk.asp
