# Getting Started with Public Surface Analysis
Public surface analysis is the practice of reading the signals a website chooses to expose — HTTP headers, DNS records, robots and sitemap files, response codes — and turning those signals into hypotheses.
## Why bother
Because most questions you'll ask about a website — "is this login page safe to integrate with?", "is this partner likely to be technically competent?", "has this site been migrated recently?" — have partial answers visible on the public surface. You don't need credentials or privileged access to read them.
## A minimum viable workflow

- Start with DNS. `A`, `MX`, `TXT` and `CAA` records tell you a lot in a few queries.
- Ask for the homepage. Record the status code, the final URL after redirects, and the response headers.
- Ask for `/robots.txt`. Note any `Disallow` rules and the declared sitemaps.
- Read the HTML `<head>`. Note the canonical, `<meta name="robots">`, and Open Graph tags.
- Write down what you saw, with timestamps. Surfaces change.
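The steps above can be sketched with the Python standard library. This is a minimal sketch, not a complete tool: the `surface-survey/0.1` User-Agent string is a placeholder, and the stdlib resolver only covers `A`/`AAAA` lookups — `MX`, `TXT` and `CAA` queries need a dedicated DNS library such as dnspython.

```python
import socket
import urllib.request
from html.parser import HTMLParser

# Placeholder agent string; identify your real project and a contact point.
USER_AGENT = "surface-survey/0.1"

def resolve_a(hostname):
    """Step 1 (partial): A/AAAA records via the stdlib resolver.
    MX, TXT and CAA lookups need a resolver library such as dnspython."""
    infos = socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

def fetch(url):
    """Steps 2-3: record status code, final URL and headers after redirects."""
    req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = resp.read().decode("utf-8", "replace")
        return resp.status, resp.geturl(), dict(resp.headers), body

class HeadParser(HTMLParser):
    """Step 4: collect canonical, robots meta and Open Graph tags from <head>."""
    def __init__(self):
        super().__init__()
        self.signals = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.signals["canonical"] = a.get("href")
        elif tag == "meta" and a.get("name") == "robots":
            self.signals["meta_robots"] = a.get("content")
        elif tag == "meta" and (a.get("property") or "").startswith("og:"):
            self.signals[a["property"]] = a.get("content")
```

Typical use would be `status, final_url, headers, body = fetch("https://example.com/")`, then feeding `body` to a `HeadParser` and writing the collected signals to a timestamped log.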
## Responsible defaults

- Use a realistic, identifying `User-Agent`.
- Respect `robots.txt` for anything beyond a handful of manual requests.
- Never brute-force paths or credentials against a target you don't own.
- Keep a log — if a finding turns out to be sensitive, you want to know when and how you observed it.
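The first two defaults can be enforced in code. A minimal sketch using the stdlib `urllib.robotparser` — the agent string and contact URL are placeholders you should replace with your own:

```python
import urllib.robotparser
from urllib.request import Request, urlopen

# Placeholder: identify your project and give operators a way to reach you.
USER_AGENT = "surface-survey/0.1 (+https://example.com/contact)"

def polite_get(url, robots):
    """Fetch url only if robots.txt allows our agent, always identifying ourselves.

    `robots` is a RobotFileParser that has already loaded the target's
    /robots.txt (via read() or parse())."""
    if not robots.can_fetch(USER_AGENT, url):
        raise PermissionError(f"robots.txt disallows {url}")
    req = Request(url, headers={"User-Agent": USER_AGENT})
    return urlopen(req, timeout=10)
```

Checking `can_fetch` before every request keeps the "handful of manual requests" exception honest: the moment you automate, the check is already in place.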