Practice: Dynamic Application Security Testing (DAST)

Purpose and Strategic Importance

Dynamic Application Security Testing (DAST) analyses running applications to detect vulnerabilities from the outside in. By simulating real-world attack vectors without access to source code, DAST identifies runtime issues such as misconfigurations, input validation gaps, and authentication flaws.

This practice strengthens defence-in-depth by complementing static code analysis and catching issues other methods miss - especially those that only appear at runtime.


Description of the Practice

  • DAST scans are black-box tests performed against deployed web applications or APIs.
  • They detect issues like cross-site scripting (XSS), SQL injection, authentication flaws, and insecure headers (a simple header check is sketched after this list).
  • Tests are typically run in staging or test environments as part of pre-release security validation.
  • Tools include OWASP ZAP, Burp Suite, Detectify, Acunetix, and commercial DAST platforms.
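
One of the simplest findings named above is an insecure or missing response header, and it illustrates DAST's outside-in view well. The following minimal sketch, assuming the third-party requests library and an illustrative staging URL, probes a deployed endpoint for a few commonly expected security headers; it is a teaching aid, not a replacement for a full scanner.

  import requests

  # Headers most scanners expect to see on a hardened web response.
  EXPECTED_HEADERS = [
      "Content-Security-Policy",
      "Strict-Transport-Security",
      "X-Content-Type-Options",
      "X-Frame-Options",
  ]

  def check_security_headers(url: str) -> list:
      """Return the expected security headers missing from the response."""
      response = requests.get(url, timeout=10)
      return [h for h in EXPECTED_HEADERS if h not in response.headers]

  if __name__ == "__main__":
      for header in check_security_headers("https://staging.example.com"):  # illustrative target
          print(f"Missing security header: {header}")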

How to Practise It (Playbook)

1. Getting Started

  • Select a DAST tool that aligns with your tech stack, application type, and compliance needs.
  • Run exploratory scans against your test or staging environment (a scripted example follows this list).
  • Configure authentication (e.g. session tokens, login scripts) to scan protected routes.
  • Analyse results and prioritise remediation based on severity and exploitability.
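
These first steps can be scripted end to end. The sketch below assumes an OWASP ZAP daemon is already running locally on port 8080 and uses its Python client (zapv2, from the python-owasp-zap-v2.4 package) to spider a target, run an active scan, and print alerts ordered by risk. The target URL and API key are placeholders, and authenticated scanning of protected routes needs additional per-tool setup (contexts, login scripts, or header injection) that is not shown.

  import time
  from zapv2 import ZAPv2  # Python client from the python-owasp-zap-v2.4 package

  TARGET = "https://staging.example.com"  # placeholder staging URL
  API_KEY = "changeme"                    # must match the key the ZAP daemon was started with

  # Assumes a ZAP daemon is already running locally and proxying on port 8080.
  zap = ZAPv2(apikey=API_KEY,
              proxies={"http": "http://localhost:8080",
                       "https": "http://localhost:8080"})

  # Spider first so the active scanner knows which URLs exist.
  spider_id = zap.spider.scan(TARGET)
  while int(zap.spider.status(spider_id)) < 100:
      time.sleep(2)

  # Active scan: send attack payloads against the discovered URLs.
  scan_id = zap.ascan.scan(TARGET)
  while int(zap.ascan.status(scan_id)) < 100:
      time.sleep(5)

  # Pull alerts and prioritise by reported risk.
  RISK_ORDER = {"High": 0, "Medium": 1, "Low": 2, "Informational": 3}
  alerts = zap.core.alerts(baseurl=TARGET)
  for alert in sorted(alerts, key=lambda a: RISK_ORDER.get(a["risk"], 4)):
      print(f'{alert["risk"]:>13}  {alert["alert"]}  ->  {alert["url"]}')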

2. Scaling and Maturing

  • Integrate DAST into CI/CD pipelines to run automatically after deployments (a pipeline gate is sketched after this list).
  • Schedule scans to run regularly, especially before major releases.
  • Tune scanners to avoid noisy or irrelevant findings - focus on actionable risks.
  • Correlate DAST results with logs, observability data, and production incidents for deeper insights.
  • Track recurring issues and patterns to inform training and secure design practices.
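
A common way to wire DAST into a pipeline is a post-deployment gate that fails the build when untriaged high-risk findings appear. The sketch below is hypothetical: it assumes the scanner wrote a JSON report as a list of findings with name, risk, and url fields (adapt the parsing to your tool's actual output), and it shows how a tuned allowlist keeps known, accepted findings from blocking releases.

  import json
  import sys

  # Hypothetical report format: a JSON list of findings, each with
  # "name", "risk" and "url" fields. Adapt the parsing to your scanner's output.
  ALLOWLIST = {"X-Frame-Options header not set on /healthz"}  # tuned, accepted findings
  BLOCKING_RISKS = {"High", "Critical"}

  def gate(report_path: str) -> int:
      with open(report_path) as fh:
          findings = json.load(fh)
      blocking = [
          finding for finding in findings
          if finding["risk"] in BLOCKING_RISKS and finding["name"] not in ALLOWLIST
      ]
      for finding in blocking:
          print(f'BLOCKING: [{finding["risk"]}] {finding["name"]} at {finding["url"]}')
      return 1 if blocking else 0  # a non-zero exit code fails the pipeline step

  if __name__ == "__main__":
      sys.exit(gate(sys.argv[1]))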

3. Team Behaviours to Encourage

  • Treat DAST results as shared engineering responsibility, not just a security task.
  • Validate scanner findings - confirm exploitable vulnerabilities and dismiss false positives.
  • Document secure fixes and share patterns with peers.
  • Embed runtime security checks earlier through test automation or negative testing (example tests follow this list).
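
Negative tests are one way to shift these runtime checks left. The sketch below, assuming the requests library, pytest as the test runner, and hypothetical /search and /login endpoints on a staging host, sends classic XSS and SQL injection probes and asserts the application rejects them cleanly.

  import requests

  BASE_URL = "https://staging.example.com"  # illustrative; point at your test environment

  def test_search_does_not_reflect_script_payload():
      """Reflected XSS probe: the raw payload must never come back unescaped."""
      payload = "<script>alert(1)</script>"
      response = requests.get(f"{BASE_URL}/search", params={"q": payload}, timeout=10)
      assert response.status_code < 500, "malformed input should not crash the service"
      assert payload not in response.text, "payload reflected unescaped - possible XSS"

  def test_login_rejects_sql_injection_probe():
      """Classic SQLi probe: must be rejected, not treated as valid credentials."""
      response = requests.post(
          f"{BASE_URL}/login",
          data={"username": "admin' OR '1'='1", "password": "x"},
          timeout=10,
      )
      assert response.status_code in (400, 401, 403), "injection probe was not rejected"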

4. Watch Out For…

  • Scanning live production environments without safeguards or rate limits (a simple throttle is sketched after this list).
  • High volumes of false positives reducing trust in the process.
  • Lack of authentication coverage leaving blind spots.
  • Findings that don’t link to code or clear ownership for remediation.
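
If a scan must touch a shared or production-like environment, throttle it. A minimal sketch of that safeguard, assuming the requests library and an illustrative target, wraps HTTP calls in a client that enforces a minimum delay between requests.

  import time
  import requests

  class ThrottledSession:
      """Enforces a minimum delay between requests so a scan cannot
      overwhelm a shared or production-like environment."""

      def __init__(self, min_interval_seconds: float = 0.5):
          self._session = requests.Session()
          self._min_interval = min_interval_seconds
          self._last_request = 0.0

      def get(self, url: str, **kwargs):
          elapsed = time.monotonic() - self._last_request
          if elapsed < self._min_interval:
              time.sleep(self._min_interval - elapsed)
          self._last_request = time.monotonic()
          return self._session.get(url, timeout=10, **kwargs)

  # Usage: crawl at no more than ~2 requests per second.
  session = ThrottledSession(min_interval_seconds=0.5)
  response = session.get("https://staging.example.com/")  # illustrative target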

5. Signals of Success

  • Vulnerabilities are identified and resolved before release.
  • DAST findings are tracked and actioned like functional defects.
  • Runtime security posture improves over time with reduced exposure.
  • Teams use DAST data to refine design, coding, and testing practices.
  • Security testing becomes a normal part of quality assurance.
Associated Standards
  • Access is continuously verified and contextual
  • Codebases consistently meet high standards of quality
  • Credentials are short-lived and auditable
  • Decision-making authority follows the work, not the hierarchy
  • Security is considered from the start
  • Sensitive data and credentials are managed securely
  • Teams understand the threat models relevant to their domain
Associated Measures
  • Percentage of Services Scanned
