
Getting Started with Public Surface Analysis

A beginner-friendly walkthrough of what you can responsibly learn from a public URL.

Published: Apr 20, 2026

Public surface analysis is the practice of reading the signals a website chooses to expose — HTTP headers, DNS records, robots and sitemap files, response codes — and turning those signals into hypotheses.

Why bother?

Because most questions you'll ask about a website — "is this login page safe to integrate with?", "is this partner likely to be technically competent?", "has this site been migrated recently?" — have partial answers visible on the public surface. You don't need credentials or privileged access to read them.

A minimum viable workflow

  1. Start with DNS. A, MX, TXT, and CAA records tell you a lot in four quick queries.
  2. Ask for the homepage. Record the status code, the final URL after redirects, and the response headers.
  3. Ask for /robots.txt. Note any Disallow rules and the declared sitemaps.
  4. Read the HTML <head>. Note the canonical link, <meta name="robots">, and any Open Graph tags.
  5. Write down what you saw, with timestamps. Surfaces change.
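Steps 2 and 4 can be sketched in Python with only the standard library. The class and function names below are illustrative, not from any particular tool, and the DNS queries in step 1 need a resolver library (dnspython, for example) that isn't shown here:

```python
import urllib.request
from html.parser import HTMLParser


def fetch_homepage(url: str, user_agent: str):
    """Step 2: record the status code, final URL after redirects, and headers."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = resp.read().decode("utf-8", errors="replace")
        return resp.status, resp.geturl(), dict(resp.headers), body


class HeadSummary(HTMLParser):
    """Step 4: collect canonical, meta-robots, and Open Graph signals."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.meta_robots = None
        self.open_graph = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")
        elif tag == "meta" and a.get("name") == "robots":
            self.meta_robots = a.get("content")
        elif tag == "meta" and a.get("property", "").startswith("og:"):
            self.open_graph[a["property"]] = a.get("content")


def summarize_head(html: str) -> dict:
    """Reduce an HTML document to the head signals worth logging."""
    parser = HeadSummary()
    parser.feed(html)
    return {
        "canonical": parser.canonical,
        "meta_robots": parser.meta_robots,
        "open_graph": parser.open_graph,
    }
```

Keeping the parsing separate from the fetch means you can re-run the summary against saved responses later, which also makes step 5's timestamped log easier to maintain.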

Responsible defaults

  • Use a realistic, identifying User-Agent.
  • Respect robots.txt for anything beyond a handful of manual requests.
  • Never brute-force paths or credentials against a target you don't own.
  • Keep a log — if a finding turns out to be sensitive, you want to know when and how you observed it.
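The first two defaults can be sketched with the standard library's urllib.robotparser, which answers "may I fetch this path?" before you make the request. The User-Agent string and contact URL here are placeholders you would replace with your own:

```python
from urllib.robotparser import RobotFileParser

# Identify yourself: a realistic product token plus a contact URL.
# Both values below are placeholders.
USER_AGENT = "surface-notes/0.1 (+https://example.com/about-this-bot)"


def allowed(robots_txt: str, path: str, agent: str = USER_AGENT) -> bool:
    """Return True if the given robots.txt text permits `agent` to fetch `path`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, path)
```

Parsing the robots.txt text you already fetched in step 3, rather than fetching it again, keeps your request count down and keeps the check consistent with what you logged.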