Dan Cornell,
Denim Group
Toolkits for testers and devs based on open-source OWASP (Open Web Application Security Project) tools: ZAP, ThreadFix, and an Eclipse plugin.
- Information security - can't fix the problem; wrong skillset
- Audit and compliance - healthcare, financials
- Risk management - same
- Software developers - write the code, so they have to take on security tasks.
- Physical security - old
- Information security - relevant
- App security - new discipline with immature metrics. New tools, code scanning, etc.
- Legacy code
- Quantity of apps
- Not a lot of qualified devs
- Must automate.
- Gather data - Compare money spent on infrastructure vs dev
- Communicate with stakeholders
- Automate
- Repeat
- Automated static and dynamic tests as well as manual.
- Vulnerabilities can persist.
- No easy way for teams to work together.
- Site has metrics on vulnerability remediation, e.g. where time is spent: confirmation, environment setup, and making sure not to break code all rank high.
- Open source. Loads and normalizes scan data and lets teams interact. Integrates with FindBugs, Jira, and many more.
- Web-based.
- Shows apps with most vulnerabilities, hotspots.
- Divide vulnerabilities into teams and apps.
- Loads scan, does diff to see change tracking, and produces stats. Can mark false positive. Identifies vulnerabilities caught by multiple scans.
- Slice data by technology, e.g. show only vulnerabilities found by 2 scanners, or only cross-site scripting vulnerabilities.
- Single view for security analyst.
- Load via REST, CLI, and a Jenkins plugin. Can be automated.
- Creates a consolidated view, allows you to prioritize, and translates into dev tools. Manage security tasks in Jira.
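The "load via REST, can be automated" bullet above can be sketched as a small helper. The endpoint path and `apiKey` parameter here are assumptions for illustration, not the verified ThreadFix API:

```python
# Sketch of the automated scan-upload step over REST.
# Endpoint path and apiKey parameter are hypothetical.

def build_upload_request(base_url, app_id, api_key):
    """Return the URL and query params for a hypothetical scan-upload call.

    With the `requests` library, the actual call would be roughly:
        requests.post(url, params=params, files={"file": open(scan, "rb")})
    """
    url = f"{base_url.rstrip('/')}/rest/applications/{app_id}/upload"
    return url, {"apiKey": api_key}
```

A Jenkins job could call a script like this after each nightly scan, which is the "Automate ... Repeat" loop from earlier in the talk.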
Unique vulnerability:
- Common Weakness Enumeration (CWE), relative URL - directory misconfiguration
- CWE, relative URL, injection point - SQL injection
- Injection points - GET/PUT parameters, cookies
- CWEs - the OWASP Top 10 is a good starting point.
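The identity scheme above (CWE + relative URL + injection point) is what lets findings from multiple scanners be merged and counted as one vulnerability. A minimal sketch, with field names of my own choosing:

```python
from collections import defaultdict

def vuln_key(finding):
    """Identity of a finding: CWE id, relative URL, injection point."""
    return (finding["cwe"], finding["url"], finding.get("param"))

def merge_scans(*scans):
    """Group findings from multiple scanners by vulnerability identity.

    Each scan is a (scanner_name, findings) pair. A key present in two or
    more scanners' results marks a vulnerability caught by multiple scans,
    as in the change-tracking/diff step described above.
    """
    merged = defaultdict(set)
    for scanner, findings in scans:
        for f in findings:
            merged[vuln_key(f)].add(scanner)
    return merged
```

Diffing two loads of the same app then reduces to set operations on the keys: new keys are new vulnerabilities, vanished keys are (probably) remediated ones.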
Top-down obvious apps + custom + mobile + cloud = Attack Surface Model
- Dynamic (running) and static (code) analysis.
- E.g. dataflow analysis
- Automated scanning, static analysis, manual app and static testing.
- E.g. authenticated vs not authenticated tests
- Breadth-first; automation is key; baseline testing of everything. Has limitations.
- Results are diffed, normalized and false positives identified.
- Take friction out of the process. See a dynamic vulnerability, match it to the changed code. E.g. matching HP Fortify results to IBM AppScan results.
- Dynamic - spider to enumerate the attack surface, then fuzz to identify vulnerabilities based on e.g. request/response pairs.
- Static - taint analysis: tainted input is the source of SQL injections.
- Standardize vulnerability types. Match dynamic and static. Improve static parameter parsing.
- Info used: Git URL. Framework type, e.g. JSP and/or Spring. Extra info if available.
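The spider-then-fuzz bullet above can be illustrated with a toy reflection check: inject a marker payload into each parameter and flag it when the response echoes it back. The `app` callable here is a stand-in for a real HTTP target, not part of any scanner's API:

```python
PAYLOAD = "<xss-probe-123>"  # marker we look for in responses

def fuzz_params(app, url, params):
    """Flag parameters whose value is reflected in the response body.

    `app` is a stand-in callable (url, params) -> response body, so the
    sketch stays self-contained; a real fuzzer would issue HTTP requests.
    Reflection of the marker suggests potential XSS (CWE-79).
    """
    hits = []
    for name in params:
        fuzzed = dict(params, **{name: PAYLOAD})
        body = app(url, fuzzed)
        if PAYLOAD in body:
            hits.append({"cwe": 79, "url": url, "param": name})
    return hits
```

The spider's job is to produce the (url, params) pairs this loop consumes, which is why enumerating the attack surface comes first.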
The Attack Surface Model can know things the dynamic scanner does not. A plugin for OWASP ZAP connects to the server given a base URL; results can then be imported into the Eclipse-based app. Fuzzing can find other potential vulnerabilities.
- Filters can show by type, fix time, etc.
- Ship data to a defect tracker, e.g. Jira (attached to the application).
- Can create multiple defects.
- ThreadFix will poll status.
- Pushed to dev teams.
- Map to line of code, load the code, open the ThreadFix view to see vulnerabilities per line.
- Benchmark against other organizations. See stats and make decisions on technologies.
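Shipping a vulnerability to Jira, as in the bullets above, amounts to mapping the normalized finding onto Jira's standard create-issue payload shape. The field values here are illustrative, not what ThreadFix actually emits:

```python
def vuln_to_jira_issue(vuln, project_key):
    """Map a normalized vulnerability to a Jira create-issue payload.

    Uses the standard Jira REST API "fields" structure; summary and
    description formats are my own illustration.
    """
    return {
        "fields": {
            "project": {"key": project_key},
            "issuetype": {"name": "Bug"},
            "summary": f"CWE-{vuln['cwe']} at {vuln['url']}",
            "description": f"Injection point: {vuln.get('param', 'n/a')}",
        }
    }
```

POSTing this to Jira's create-issue endpoint yields the defect whose status ThreadFix then polls.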
My thoughts:
- A very good presentation with no marketing hype.
- Well-evolved software process.
- Might be of more use if we used FindBugs more consistently, but creating Jira tasks from FindBugs reports sounds doable.