This isn't about keywords or vibes. This is about whether Google's crawlers can actually read your code — and whether your server is actively fighting against your rankings. These are the five checks I run before I do anything else on a new site. In this order.
The Google "Health Check" — The site: Operator
The first thing I do is check what Google actually thinks exists on your domain.
The Check: Go to Google and type site:yourdomain.com
What to look for: Does the number of results roughly match your actual page count? The count is an estimate, but the order of magnitude should match. If you have a 5-page portfolio and Google shows 400 results, you've either been hacked or your "Hello World" demo content is indexed. Both are bad.
The Fix: Use a noindex meta tag on utility pages, or delete the junk entirely. Go to Google Search Console and request removal of anything you can't delete.
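A noindex directive is a one-line addition to the page's <head>. A minimal example:

```html
<!-- Tells crawlers to drop this page from the index
     while still following its links -->
<meta name="robots" content="noindex, follow">
```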
Core Web Vitals — The Speed Trap
Google doesn't just care if you're fast — they care if you're stable and predictable. Core Web Vitals measure three things: how fast the biggest element loads (Largest Contentful Paint, or LCP), how much the layout shifts while loading (Cumulative Layout Shift, or CLS), and how quickly the page responds to input (Interaction to Next Paint, or INP).
The Check: Run your homepage through PageSpeed Insights (pagespeed.web.dev) and read the field data, not just the lab score.

What to look for: An LCP over 2.5 seconds means your visitors are leaving before the page loads. A CLS above 0.1 means your layout is jumping around while loading, which destroys trust. An INP above 200 milliseconds means the page feels sluggish to interact with.
The Fix: Compress your images (WebP format, under 200 KB for most images). Implement server-side caching. If you're on a VPS, check your PHP memory limit — an underpowered server is the silent killer of CWV scores.
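That 200 KB image budget is easy to audit before you touch a compression tool. A minimal sketch using only the standard library — the uploads path and the byte budget are assumptions, so adjust both for your own site:

```python
from pathlib import Path

BUDGET_BYTES = 200 * 1024  # the 200 KB ceiling suggested above (an assumed budget)
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".webp"}

def oversized_images(root: str, budget: int = BUDGET_BYTES) -> list[tuple[str, int]]:
    """Return (path, size_in_bytes) for every image under `root` over the budget,
    largest first. A missing directory simply yields an empty list."""
    root_path = Path(root)
    if not root_path.is_dir():
        return []
    flagged = [
        (str(p), p.stat().st_size)
        for p in root_path.rglob("*")
        if p.suffix.lower() in IMAGE_EXTS and p.stat().st_size > budget
    ]
    return sorted(flagged, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    # "wp-content/uploads" is a placeholder path for a WordPress install.
    for name, size in oversized_images("wp-content/uploads"):
        print(f"{size // 1024:>6} KB  {name}")
```

Run it from the site root and work down the list from the top — the biggest files are usually the ones tanking LCP.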
The Robot Barrier — robots.txt
I've seen developers accidentally leave the "Discourage search engines" box checked in WordPress for months after launch. The client wonders why their traffic is zero. This is why.
The Check: Navigate to yourdomain.com/robots.txt
What to look for: Does it contain User-agent: * followed by Disallow: /? If yes, you've told every search engine crawler to leave. You are invisible.
The Fix: Remove the blanket Disallow: / (an empty Disallow: line or an explicit Allow: / both work) and make sure your sitemap URL is listed at the bottom. On WordPress, also uncheck "Discourage search engines" under Settings → Reading.
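You can sanity-check the corrected file before deploying it with Python's built-in robots.txt parser. A quick sketch — the domain and sitemap URL are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The corrected robots.txt: allow everything, point crawlers at the sitemap.
ROBOTS_TXT = """\
User-agent: *
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot should be allowed to fetch the homepage and any inner page.
print(parser.can_fetch("Googlebot", "https://yourdomain.com/"))        # True
print(parser.can_fetch("Googlebot", "https://yourdomain.com/about/"))  # True
```

Paste in the exact text you plan to ship; if can_fetch returns False for a page you care about, fix the file before Googlebot sees it.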
Schema Validation — The Translator
Structured data is how you talk to both search engines and large language models. If it's broken, they fall back on guessing — and they often guess wrong.
The Check: Run the page through Google's Rich Results Test or the Schema Markup Validator at validator.schema.org.

What to look for: Red errors. Warnings are acceptable. Errors mean your structured data is being ignored entirely.
The Fix: Correct the JSON-LD syntax in your <head> block. Start with Person or WebSite on the homepage and Article on blog posts.
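A minimal Person-plus-WebSite block for a homepage might look like this — every value is a placeholder to swap for your own:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Person",
      "name": "Jane Developer",
      "url": "https://yourdomain.com/",
      "jobTitle": "Web Developer"
    },
    {
      "@type": "WebSite",
      "name": "Jane Developer Portfolio",
      "url": "https://yourdomain.com/"
    }
  ]
}
</script>
```

The most common syntax errors are a trailing comma after the last property and a missing "@context" — both will fail validation.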
Mobile Usability — The Real World
Most of your clients in Boulder are looking at your site on an iPhone while walking down Pearl Street. If your site doesn't work on mobile, your ranking doesn't matter.
The Check: Open your site on your phone and try to tap two links that are close together. (Google retired the Mobile Usability report from Search Console in late 2023, so a real device is now your most reliable test.)
What to look for: "Clickable elements too close together" or horizontal scrolling. Both are Google ranking signals — not just UX annoyances.
The Fix: Make touch targets at least 44×44px (Apple's guideline; Google's Material Design recommends 48px). Use padding, not font-size changes, to hit that target.
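In CSS, that means growing the hit area on the link itself rather than inflating the text. A minimal sketch — the nav selector is an assumption, use whatever wraps your tappable links:

```css
/* Grow the tap target without changing the visual text size. */
nav a {
  display: inline-block;   /* lets min-height/min-width apply to inline links */
  min-width: 44px;
  min-height: 44px;
  padding: 12px 16px;      /* padding, not font-size, expands the hit area */
}
```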
Technical SEO isn't magic — it's maintenance. Run this check the day a site launches and once a month after that. You'll stay ahead of 90% of the competition, who either never check or wait until something breaks.
The goal isn't a perfect score. It's a site that Google can read, trust, and rank. These five checks are the foundation. Everything else — keywords, content, backlinks — is built on top of this.