A developer built 10,000 pages in two weeks and got 33,000 Google impressions before the month was out. No backlinks. No existing domain authority. Just structured data, dynamic sitemaps, and a lot of database rows.
The write-up on Dev.to is detailed - tech stack, what worked, what failed, and the exact sequence of steps. It's one of the clearest breakdowns of programmatic SEO in practice that I've seen.
Here's what actually happened.
The Stack and the Strategy
The site is built on Next.js with PostgreSQL as the database. Each page is dynamically generated from real data - not templates filled with placeholder text, but actual structured information that provides value.
The key insight: comparison pages ranked fastest. Pages comparing two specific things (Product A vs Product B, Feature X vs Feature Y) started appearing in search results within days. Google seems to prioritise these because searcher intent is clear and the content directly answers a question.
The developer used IndexNow to notify search engines immediately when new pages went live. That's a protocol supported by Microsoft Bing and Yandex that lets you push URLs to search engines in real-time instead of waiting for them to crawl your sitemap. Google doesn't officially support IndexNow, but the faster Bing picks up your content, the faster Google notices it exists.
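An IndexNow ping is just an HTTP POST. A minimal sketch, assuming a hypothetical example.com host and key - the payload shape (host, key, keyLocation, urlList) follows the IndexNow protocol, and the key file must be served from your own domain:

```typescript
interface IndexNowPayload {
  host: string;
  key: string;
  keyLocation: string;
  urlList: string[];
}

function buildIndexNowPayload(host: string, key: string, urls: string[]): IndexNowPayload {
  return {
    host,
    key,
    keyLocation: `https://${host}/${key}.txt`, // key file hosted at the site root
    urlList: urls,
  };
}

// Push new URLs to the shared IndexNow endpoint (forwards to Bing, Yandex, etc.)
async function pingIndexNow(payload: IndexNowPayload): Promise<number> {
  const res = await fetch("https://api.indexnow.org/indexnow", {
    method: "POST",
    headers: { "Content-Type": "application/json; charset=utf-8" },
    body: JSON.stringify(payload),
  });
  return res.status; // 200 or 202 means the submission was accepted
}

const payload = buildIndexNowPayload("example.com", "abc123", [
  "https://example.com/compare/product-a-vs-product-b",
]);
```

The write-up doesn't publish the exact client code, so treat this as the general shape of the protocol rather than the author's implementation.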
Structured data was added to every page - Schema.org markup that tells search engines exactly what the page is about. Product comparisons got Product and FAQPage schema. Review pages got Review schema. This helps pages appear in rich snippets and increases click-through rates.
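For example, FAQPage markup is just a JSON-LD object serialised into a `<script type="application/ld+json">` tag. A hypothetical helper for a comparison page might look like this:

```typescript
type Faq = { question: string; answer: string };

// Build Schema.org FAQPage JSON-LD from question/answer pairs in the database
function faqPageJsonLd(faqs: Faq[]) {
  return {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: faqs.map((f) => ({
      "@type": "Question",
      name: f.question,
      acceptedAnswer: { "@type": "Answer", text: f.answer },
    })),
  };
}

const jsonLd = faqPageJsonLd([
  { question: "Is Product A better than Product B?", answer: "It depends on the use case." },
]);
```

The same pattern applies to Product and Review schema - swap the `@type` and fields for the relevant Schema.org type.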
What Failed First
The initial attempt used generic templates. Pages felt thin - technically unique, but not useful. Google indexed them, but didn't rank them. Impressions stayed flat.
The fix was switching to real data. Instead of "Product A has features, Product B has features", the pages showed actual differences - pricing, specifications, user reviews, feature matrices. That gave each page substance.
Another early mistake: submitting the entire 10,000-page sitemap at once. Google choked on it. The solution was splitting the sitemap into chunks and submitting them progressively. Smaller sitemaps (500-1,000 URLs each) got crawled faster than one massive file.
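The chunking itself is trivial - plain array slicing, no library needed. A sketch, using the 1,000-URL upper bound from the write-up:

```typescript
// Split a large URL list into sitemap-sized chunks
function chunkUrls(urls: string[], size = 1000): string[][] {
  const chunks: string[][] = [];
  for (let i = 0; i < urls.length; i += size) {
    chunks.push(urls.slice(i, i + size));
  }
  return chunks;
}

// 10,000 URLs become ten sitemap files; a sitemap index would point at all ten
const files = chunkUrls(
  Array.from({ length: 10_000 }, (_, i) => `https://example.com/page-${i}`)
);
```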
Dynamic sitemaps also helped. Instead of generating a static XML file, the sitemap route queries the database and serves fresh URLs on every request. As new pages get added, the sitemap updates automatically. No manual regeneration needed.
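In the Next.js app router, that's a route handler that queries the database and serialises the result as XML on every request. A sketch - `getPageSlugs()` and the domain are hypothetical stand-ins, not the author's code:

```typescript
// Serialise a URL list as sitemap XML
function buildSitemapXml(urls: string[]): string {
  const entries = urls
    .map((u) => `  <url><loc>${u}</loc></url>`)
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${entries}\n</urlset>`;
}

// In app/sitemap.xml/route.ts this would be wired up roughly as:
// export async function GET() {
//   const slugs = await getPageSlugs();  // e.g. SELECT slug FROM pages
//   const urls = slugs.map((s) => `https://example.com/${s}`);
//   return new Response(buildSitemapXml(urls), {
//     headers: { "Content-Type": "application/xml" },
//   });
// }

const xml = buildSitemapXml(["https://example.com/compare/a-vs-b"]);
```

Because the route reads the database at request time, a new page row is in the sitemap the moment it's inserted.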
The Indexing Speed Trick
Here's the sequence that worked:
1. Build the page with real data and structured markup.
2. Add it to the database.
3. Trigger IndexNow to notify search engines.
4. Let the dynamic sitemap pick it up.
5. Submit the relevant sitemap chunk to Google Search Console.
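The five steps above can be sketched as one publish function. The step functions here are hypothetical stubs standing in for real side effects, just to show the ordering:

```typescript
type Page = { slug: string };
const steps: string[] = []; // records what ran, in order

function saveToDatabase(page: Page) { steps.push(`db:${page.slug}`); }
function notifyIndexNow(url: string) { steps.push(`indexnow:${url}`); }
function submitSitemapChunk(file: string) { steps.push(`gsc:${file}`); }

function publishPage(page: Page) {
  saveToDatabase(page);                               // step 2: page row exists
  notifyIndexNow(`https://example.com/${page.slug}`); // step 3: push to Bing/Yandex
  // Step 4 is free: the dynamic sitemap reads the database on every request
  submitSitemapChunk("sitemap-comparisons.xml");      // step 5: nudge Search Console
}

publishPage({ slug: "compare/product-a-vs-product-b" });
```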
This combination meant new pages appeared in search results within 48-72 hours. Some comparison pages ranked on the first page for long-tail queries within a week.
The speed comes from redundancy. You're not waiting for Google to crawl your sitemap - you're sending multiple signals (IndexNow, sitemap submission, internal linking) so Google notices the page exists from several angles.
What Actually Drives Impressions
The 33,000 impressions came mostly from long-tail queries. Not high-volume keywords, but specific searches like "Product A vs Product B for [use case]" or "Is Feature X better than Feature Y?"
These queries have lower competition because most sites don't bother creating dedicated pages for every possible comparison. Programmatic SEO wins here because you can generate thousands of these pages at once, covering combinations that would be tedious to write manually.
The click-through rate was decent but not spectacular - around 3-5%. That's normal for new domains. The impressions mattered more than the clicks at this stage. Getting into the index and appearing in search results is step one. Ranking higher comes later as the domain builds authority.
The Technical Details That Matter
Every page has a unique title, meta description, and H1 tag. These aren't generated from a template - they're built dynamically from the data itself. A comparison page for Product A vs Product B has a title like "Product A vs Product B: Which is Better for [Category]?" where [Category] comes from the database.
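A hypothetical meta builder along those lines - the title, description, and H1 all derive from the database row rather than a fixed string per page type:

```typescript
interface ComparisonRow { a: string; b: string; category: string; }

// Build per-page metadata from a comparison row
function pageMeta(row: ComparisonRow) {
  return {
    title: `${row.a} vs ${row.b}: Which is Better for ${row.category}?`,
    description: `Compare ${row.a} and ${row.b} on pricing, specs, and reviews for ${row.category}.`,
    h1: `${row.a} vs ${row.b}`,
  };
}

const meta = pageMeta({ a: "Product A", b: "Product B", category: "Note-Taking" });
```

In Next.js, a function like this would feed `generateMetadata` for the title and description, and the page component for the H1.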
Internal linking is automatic. Each page links to related comparisons and category pages. This creates a web of connections that helps search engines understand the site structure and makes it easier for crawlers to discover new pages.
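One way to automate that, sketched below: from any comparison page, link to every other comparison sharing a product. The slug scheme and `/compare/` route are assumptions, not details from the write-up:

```typescript
const slugify = (s: string) => s.toLowerCase().replace(/\s+/g, "-");

// Return routes for every other comparison that shares a product with the current page
function relatedLinks(
  current: [string, string],
  all: [string, string][]
): string[] {
  return all
    .filter(([a, b]) => !(a === current[0] && b === current[1]))    // skip self
    .filter(([a, b]) => current.includes(a) || current.includes(b)) // shared product
    .map(([a, b]) => `/compare/${slugify(a)}-vs-${slugify(b)}`);
}

const links = relatedLinks(
  ["Product A", "Product B"],
  [["Product A", "Product B"], ["Product A", "Product C"], ["Product D", "Product E"]]
);
```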
The pages are server-rendered with Next.js, so search engines get fully-formed HTML on the first request. No client-side JavaScript required to see the content. That matters for indexing speed - Google doesn't have to execute JavaScript to understand what the page is about.
What This Means for Builders
The technique isn't new - programmatic SEO has been around for years. Zillow does it for property listings. TripAdvisor does it for hotel comparisons. Nomad List does it for city pages.
What's changed is the tooling. Next.js makes server-rendering trivial. PostgreSQL is fast enough to generate thousands of pages on-demand. IndexNow speeds up discovery. Structured data helps pages stand out in search results.
The barrier to building a 10,000-page site is no longer technical - it's data. If you have structured information that answers specific questions, you can turn it into a programmatic SEO site in a weekend.
The risk is thin content. Google's algorithms are good at detecting pages that exist purely for SEO. If the pages don't provide value, they won't rank - or worse, the entire site gets penalised.
The developer in this case avoided that by using real data and making sure each page answered a genuine question. The comparison pages worked because people actually search for those comparisons. The content wasn't keyword-stuffed fluff - it was useful.
The Next Phase
33,000 impressions in two weeks is a strong start, but it's not the endgame. The real test is whether those impressions convert into clicks, and whether those clicks convert into users who find the site valuable enough to return.
Programmatic SEO gets you into the index fast. It doesn't guarantee you stay there. Google's algorithms are constantly shifting, and sites that rely purely on volume without genuine utility tend to get filtered out over time.
The smart move is using the initial traffic surge to validate which pages perform well, then doubling down on those patterns. If comparison pages are working, build more comparison pages. If review pages aren't getting traction, stop generating them.
Programmatic SEO is a useful tool. It lets you test thousands of hypotheses at once and see which ones work. But it's not a replacement for understanding what people actually want. It's just faster than writing 10,000 pages by hand.