The Technical SEO Checklist You Can Run Without an Engineer
Seventy-two percent of websites fail at least one critical technical SEO factor, according to Semrush's 2025 Website Health Benchmark Report. Most published checklists respond with 40+ items that require a developer to implement. This guide covers the 11 checks a marketing team can run in under two hours using Search Console, PageSpeed Insights, and their CMS. No engineering tickets required.
The Problem With 47-Point Checklists
From what I've seen, most published technical SEO checklists share a quiet assumption: you have a developer sitting nearby, ready to implement whatever you flag. For in-house marketing teams, especially at startups and mid-market companies, that assumption is generous. You have a shared engineering backlog, a sprint cycle that prioritizes product features, and an SEO ticket that's been sitting in "planned" since Q1.
The result is predictable. Marketing knows something is wrong with their site's technical health. They can't fix it themselves. The ticket sits. Traffic slowly erodes. Nobody connects the dots until the quarterly review, and by then nobody remembers what was on the checklist.
I think the fix isn't "hire a dedicated SEO engineer." For most teams under 50 people, that's not a realistic budget line. The fix is knowing which technical SEO problems you can solve without writing code, and which ones genuinely need a developer. That way you file a focused, prioritized request instead of a vague "please fix SEO" ticket that engineers rightfully deprioritize.
Roughly 60-70% of the items on a typical checklist can be handled with CMS access and a Search Console login. The remaining 30-40% require actual code changes. Understanding how Google evaluates ranking signals helps you decide where your time goes first.
Search Console Already Knows What's Broken
If you do one thing from this entire checklist, make it this: open Google Search Console, navigate to Pages under Indexing, and look at the "Not indexed" section.
Two statuses matter most.
"Discovered, currently not indexed" means Google found the URL but decided it wasn't worth crawling. If you see more than a handful of pages here that should be indexed, Google is telling you those pages look too thin, too similar to other pages, or too low-priority to bother with.
"Crawled, currently not indexed" is actually worse. Google crawled the page, looked at the content, and actively chose not to index it. That's an editorial judgment. The page exists, Google read it, and it passed. This is usually a content quality or duplicate content signal, not a technical one, which means it's something you can fix without a developer.
Here's the benchmark, measured against your submitted URLs (the ones in your sitemap): under 5% showing a "not indexed" status is normal. Between 5% and 15% is worth investigating page by page. Over 15% means something systemic is happening, and you have a structural problem to diagnose.
The five-minute version: in Search Console, filter the Pages report to URLs from your sitemap and export the list of non-indexed URLs. You'll probably spot patterns immediately: thin tag pages, empty category archives, paginated pages with almost no content. Most of these can be fixed by noindexing the pages you don't actually want Google spending time on and improving the content on the ones you do.
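If the export is long, a short script can do the pattern-spotting for you. Here's a minimal sketch, assuming you saved the export as a CSV with a URL column; Search Console's export format varies by report, so adjust the file name and column name to match yours.

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Group non-indexed URLs by their first path segment so that
# patterns like /tag/ or /category/ bloat jump out immediately.
def bucket_by_path(csv_path, url_column="URL"):
    buckets = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            path = urlparse(row[url_column]).path.strip("/")
            buckets[path.split("/")[0] if path else "(root)"] += 1
    return buckets

if __name__ == "__main__":
    # "not_indexed.csv" is a placeholder for your Search Console export
    for segment, count in bucket_by_path("not_indexed.csv").most_common(10):
        print(f"{count:>4}  /{segment}/")
```

If /tag/ or /category/ dominates the output, you've found your noindex candidates.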
Core Web Vitals Stopped Being a Tiebreaker
This is the section where I want to be honest about what you can and can't control without a developer.
Google's March 2026 core update changed the math on performance. Core Web Vitals used to function as a tiebreaker: two pages with similar content, similar authority, the one with better performance scores got a small edge. That's no longer how it works. Performance is now a filter. Sites with INP above 200ms are seeing measurable ranking drops averaging 0.8 positions on competitive queries. Sites above 500ms are dropping 2-4 positions.
The three numbers to know right now:
- LCP (Largest Contentful Paint): under 2.0 seconds. Google quietly lowered this threshold from 2.5s earlier this year. If you were barely passing before, check again. You might be failing now.
- INP (Interaction to Next Paint): under 200ms to stay out of "needs improvement" territory, under 150ms for ranking stability. This replaced FID and is significantly harder to pass because it measures every interaction on the page, not just the first one.
- CLS (Cumulative Layout Shift): under 0.1. This is the "content jumping around as ads load" metric, and it's the one most marketing teams can actually fix themselves.
What you can fix yourself: oversized images (compress and serve WebP format), too many third-party scripts (audit your tag manager and remove anything not actively driving revenue), and lazy-loading images below the fold. On WordPress or Shopify, most of this is plugin-level work. No code required.
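For reference, here's roughly what those image fixes look like at the markup level. This is a sketch with placeholder file names and dimensions; on WordPress or Shopify, a plugin typically injects equivalent markup for you.

```html
<!-- Explicit width and height reserve the image's space before it
     loads, which is the main CLS fix you control directly. -->
<!-- WebP with a fallback for browsers that don't support it. -->
<picture>
  <source srcset="team-photo.webp" type="image/webp">
  <img src="team-photo.jpg" alt="Team photo"
       width="1200" height="800" loading="lazy">
</picture>
<!-- Only lazy-load below-the-fold images. Lazy-loading the hero
     image at the top of the page makes LCP worse, not better. -->
```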
What genuinely needs a developer: render-blocking JavaScript, slow server response times, complex INP issues caused by heavy client-side frameworks. If your INP is consistently above 300ms, that's almost certainly a code problem and your developer ticket should say exactly that, with the PageSpeed Insights screenshot attached.
The 15-minute version: open PageSpeed Insights, test your top 10 landing pages by organic traffic. Screenshot the results. If LCP and INP are green across all 10, move on to the next section. If you see red or orange, sort by traffic volume and work down. The pages with the most traffic and the worst scores are where you get the biggest return per hour spent.
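If clicking through PageSpeed Insights ten times feels tedious, the same data is available from the public PageSpeed Insights v5 API. A minimal sketch, assuming the documented v5 response shape (verify the metric keys against a live response); the page URLs are placeholders, and an API key is optional for light use but recommended beyond a handful of requests.

```python
import json
import time
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def check_page(url, api_key=None):
    params = {"url": url, "strategy": "mobile"}
    if api_key:
        params["key"] = api_key
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{urllib.parse.urlencode(params)}") as resp:
        data = json.load(resp)
    # loadingExperience holds field data from real Chrome users; it can
    # be empty for low-traffic pages, in which case these come back None.
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    return {
        "LCP_ms": metrics.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile"),
        "INP_ms": metrics.get("INTERACTION_TO_NEXT_PAINT", {}).get("percentile"),
        "CLS_x100": metrics.get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}).get("percentile"),
    }

if __name__ == "__main__":
    top_pages = ["https://example.com/", "https://example.com/pricing"]  # your top 10 here
    for page in top_pages:
        print(page, check_page(page))
        time.sleep(1)  # be polite to the unauthenticated quota
```

The field data in that loadingExperience block is what the Core Web Vitals assessment is based on, which makes it more useful for this check than the lab scores.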
Crawl Budget Leaks That Look Invisible
Crawl budget sounds like a concept that only matters for sites with millions of pages. It's not. If your site has faceted navigation, URLs with query parameters, or a blog with dozens of tag and category pages, you might be wasting Google's attention on pages that shouldn't be indexed at all.
The three most common leaks on marketing sites:
Redirect chains. A page moved once, then moved again, creating a 301-to-301-to-301 chain. Google follows these, but each hop costs crawl budget and dilutes the link equity being passed. One redirect on a handful of pages is fine; chains across hundreds of pages add up. Check for these in Search Console under "Page indexing," use Screaming Frog's free version (which crawls up to 500 URLs), or spot-check suspect URLs with the script after this list.
Internal links pointing to dead pages. Every 404 your internal links point to is a wasted crawl. Google follows the link, hits a dead end, moves on. Whatever authority that link was passing just disappears. In most CMS platforms, a broken link checker plugin handles this in about 10 minutes.
Tag and category page bloat. If your blog has 200 tags and each generates its own indexable archive page, most of those pages have 1-2 posts and zero search value. They dilute crawl budget and send thin content signals. Either noindex them, consolidate to fewer tags, or both. This is one of those fixes that sounds minor but I've seen it improve indexation rates noticeably on mid-size blogs.
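If you'd rather script the redirect and dead-link spot check than click through a crawler, here's a minimal sketch using the requests library. The URL list is a placeholder; for anything beyond spot checks, Screaming Frog remains the easier tool.

```python
# pip install requests
import requests

# Follow each URL to its final destination, flagging redirect chains
# (more than one hop) and links that dead-end in a 404.
def trace(url):
    resp = requests.get(url, timeout=10, allow_redirects=True)
    # resp.history holds every intermediate redirect response in order
    hops = [r.url for r in resp.history] + [resp.url]
    return hops, resp.status_code

if __name__ == "__main__":
    # Paste suspect URLs here, e.g. from a crawler export
    for url in ["https://example.com/old-page"]:
        hops, status = trace(url)
        if status == 404:
            print(f"DEAD END: {url}")
        elif len(hops) > 2:  # more than one redirect hop = a chain
            print(f"CHAIN ({len(hops) - 1} hops): " + " -> ".join(hops))
```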
Your Robots.txt Has a New Job in 2026
Robots.txt used to be straightforward. Block admin pages, block staging, let Google crawl everything else. In 2026, there's a whole new category of decisions you need to make: AI crawlers.
The distinction that matters: GPTBot crawls your site to train OpenAI's models. OAI-SearchBot crawls your site to show results in ChatGPT's search feature. These are two separate bots with two separate purposes, and you can block one while allowing the other.
The strategic move most sites should be making: block AI training bots (GPTBot, ClaudeBot, CCBot, Bytespider) to prevent your content from becoming free training data. Allow AI search bots (OAI-SearchBot, PerplexityBot) so your content still surfaces in AI-powered search results. According to a Q1 2026 analysis of robots.txt files across Cloudflare's network, GPTBot blocking has already plateaued because most sites that intended to block it have done so.
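Here's roughly what that split looks like in a robots.txt file. Treat the bot list as a starting point rather than gospel: user-agent names change and new crawlers appear, so each provider's documentation is the source of truth.

```
# Block AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Bytespider
Disallow: /

# Allow AI search crawlers so your pages can surface in AI search
User-agent: OAI-SearchBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Everything else stays as it was; keep your existing rules below
User-agent: *
Disallow:
```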
If you haven't touched your robots.txt since before 2024, there's a good chance none of these rules exist. That means every AI crawler is freely scraping your content for model training, and you're getting nothing in return. The fix takes five minutes and doesn't require a developer on most hosting platforms.
Internal Links Are the Architecture Fix You Actually Own
I'm putting this last because it's the highest-return item on the list, and it requires zero technical knowledge.
Internal linking tells Google which pages on your site matter most. If your top revenue page has 3 internal links pointing to it and your company holiday party recap has 15, Google is getting a confusing signal about what's important. And honestly, this is more common than you'd think.
The 20-minute audit: pull up your top 10 pages by organic traffic in Search Console. For each one, check how many other pages link to it internally. Search Console's Links report shows internal link counts directly, or you can search your own site (site:yoursite.com "page title") as a rough proxy. If your most important pages have fewer than 5 internal links, you've found the fix: add contextual links from relevant blog posts, resource pages, and related content.
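If you want more than a spot check, a small crawler can count the internal links pointing at your priority pages. A minimal sketch using only the standard library; the page and target URLs are placeholders, and a real run would pull the page list from your sitemap.xml.

```python
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

# Collect every anchor href on a page.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(href)

# Count how many pages link to each target URL. Using a set per page
# means each page counts once per target, no matter how many times it links.
def internal_links_to(pages, targets):
    counts = Counter()
    for page in pages:
        parser = LinkCollector()
        with urlopen(page) as resp:
            parser.feed(resp.read().decode("utf-8", errors="replace"))
        for href in parser.links:
            absolute = urljoin(page, href).rstrip("/")
            if absolute in targets:
                counts[absolute] += 1
    return counts

if __name__ == "__main__":
    pages = ["https://example.com/blog/post-1", "https://example.com/blog/post-2"]
    targets = {"https://example.com/pricing"}  # normalized, no trailing slash
    for url, n in internal_links_to(pages, targets).items():
        print(f"{n:>3} pages link to {url}")
```

Any priority page showing a low count is your to-do list.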
Internal linking structure directly influences ranking outcomes, and it's one of the few ranking levers where the marketing team has full control. Orphan pages (the ones with zero or one internal link) almost never rank regardless of how good the content is. Every blog post you publish is an opportunity to reinforce your most important pages with a relevant, contextual link.
And to be fair, this probably isn't news to anyone who's been doing content marketing for a while. But the gap between knowing it and actually auditing it on a regular schedule is where most teams fall short. Quarterly is enough. Just block an hour, pull the data, and add the links. It compounds.
Where to Start Tomorrow
If your team has been staring at a 40-item technical SEO backlog and waiting for dev resources, stop. Open Search Console. Check your indexation ratio. Run PageSpeed Insights on your top 10 pages. Look at your robots.txt for AI crawler rules. Audit your internal links to your highest-value pages.
That's maybe two hours of work across a marketing team, and it covers the majority of the technical SEO surface area that actually moves rankings. Save the developer ticket for INP problems and render-blocking JavaScript. Everything else, you probably already have the access you need to fix.