Studio Founder
Reverse-Engineering Competitor SEO Without Paying for Semrush
Key Takeaway
We built a complete competitive SEO intelligence pipeline (domain authority, keyword gaps, backlink profiles, traffic-driving URL patterns) using free Apify actors and two Mr.Chief skills. Zero paid subscriptions. Better output than most agencies deliver.
The Problem
Semrush costs $230/month. Ahrefs is $199/month. For a venture studio operating 8 portfolio companies, that's $2,388 to $2,760 per year per subscription, before you factor in the 3 people who each "need" their own seat.
We were paying for it. Then we looked at what we actually used it for:
- Check competitor domain authority
- Find keyword gaps (what they rank for that we don't)
- Identify their top traffic-driving URLs
- Audit their backlink profiles
All of that data is publicly available. It's just scattered across Google Search Console, Moz's free DA checker, SimilarWeb's free tier, and competitor sitemaps. The problem wasn't data availability; it was the aggregation and analysis layer. That's exactly what AI agents are good at.
The Solution
We built a pipeline from three components chained together:
- Web Scraper (Apify actor): crawls target domains, pulls sitemap data, extracts meta tags and heading structures
- SEO Analyst skill: synthesizes raw crawl data into structured competitive intelligence
- Content Gap Finder skill: compares keyword coverage and surfaces uncontested opportunities
No paid subscriptions. The Apify free tier covers 5 actor runs per month. The Mr.Chief skills are pure reasoning on top of scraped data.
The Process
Step 1: Define your competitor set.
```bash
# competitors.txt
hexa.com
joinef.com
stationf.co
```
Step 2: Run the Apify web scraper on each domain.
```javascript
// apify-seo-scraper-config.js
const input = {
  startUrls: [{ url: "https://hexa.com/sitemap.xml" }],
  maxCrawlDepth: 2,
  maxCrawlPages: 500,
  extractors: [
    "title", "metaDescription", "h1", "h2",
    "canonicalUrl", "internalLinks", "wordCount"
  ]
};
```
The scraper returns a JSON array of every URL with its on-page signals. For a 500-page site, this takes about 4 minutes on Apify's free tier.
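Before handing that dump to the next step, it's worth sanity-checking it. A minimal sketch in Node: the field names (`title`, `wordCount`) mirror the extractor list in the config above, but the real output schema depends on which Apify actor you run.

```javascript
// summarize-crawl.js: quick health check on the scraper's JSON output.
// Field names (title, wordCount) are assumptions based on the extractor
// config above, not a fixed Apify schema.
function summarizeCrawl(pages) {
  const missingTitles = pages.filter((p) => !p.title || !p.title.trim()).length;
  const totalWords = pages.reduce((sum, p) => sum + (p.wordCount || 0), 0);
  return {
    pageCount: pages.length,
    missingTitles, // pages with no <title> are cheap fixes worth flagging
    avgWordCount: pages.length ? Math.round(totalWords / pages.length) : 0,
  };
}

// Example with two crawled records:
const crawl = [
  { url: "https://hexa.com/blog/a", title: "Post A", wordCount: 1200 },
  { url: "https://hexa.com/blog/b", title: "", wordCount: 800 },
];
console.log(summarizeCrawl(crawl));
// { pageCount: 2, missingTitles: 1, avgWordCount: 1000 }
```

A malformed or truncated crawl shows up immediately here, before you burn an analysis run on bad input.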
Step 3: Feed the crawl data into the SEO Analyst skill.
The skill generates a domain comparison report:
| Domain | Domain Authority | Est. Monthly Organic Traffic | Ranking Keywords | Backlinks |
|---|---|---|---|---|
| hexa.com | 62 | ~48,000 | 3,200+ | 12,400 |
| joinef.com | 51 | ~18,500 | 1,800+ | 5,600 |
| pyratzlabs.com | 34 | ~2,100 | 380 | 1,200 |
Step 4: Run the Content Gap Finder.
The skill cross-references crawl data against our own sitemap and surfaces keyword opportunities our competitors own that we don't:
| Keyword Cluster | Competitor Owning It | Est. Monthly Volume | Our Current Rank | Gap Priority |
|---|---|---|---|---|
| "AI agent deployment" | hexa.com | 2,400/mo | Not ranking | Critical |
| "multi-agent orchestration" | joinef.com | 1,800/mo | Page 4 | High |
| "venture studio AI tools" | Neither | 880/mo | Not ranking | Quick Win |
| "startup automation stack" | hexa.com | 1,200/mo | Not ranking | High |
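The core of this comparison is plain set logic. A hedged sketch of what the Content Gap Finder does under the hood; the input shape (keyword-to-rank maps with estimated volume) is our own assumption, not the skill's actual interface.

```javascript
// keyword-gap.js: surface keywords a competitor ranks for that we
// effectively don't (no ranking at all, or buried past page 3).
function findGaps(theirKeywords, ourRanks) {
  return Object.entries(theirKeywords)
    .filter(([kw]) => !(kw in ourRanks) || ourRanks[kw] > 30)
    .map(([kw, d]) => ({ keyword: kw, volume: d.volume, theirRank: d.rank }))
    .sort((a, b) => b.volume - a.volume); // highest-volume gaps first
}

const theirs = {
  "ai agent deployment": { volume: 2400, rank: 3 },
  "venture studio ai tools": { volume: 880, rank: 7 },
};
const ours = { "venture studio ai tools": 5 }; // already page 1, not a gap
console.log(findGaps(theirs, ours));
// [ { keyword: "ai agent deployment", volume: 2400, theirRank: 3 } ]
```

The rank-30 cutoff is arbitrary; tighten it if you want to treat anything past page 1 as a gap.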
URL Pattern Analysis
The agent doesn't just find keyword gaps; it also identifies the structural patterns competitors use to capture traffic at scale:
hexa.com traffic-driving URL patterns:
- /blog/[tool-name]-alternatives/: 31 pages, avg ~800 visits/mo each
- /compare/[tool-a]-vs-[tool-b]/: 18 pages, avg ~1,200 visits/mo each
- /for-[industry]/: 12 pages, avg ~600 visits/mo each
joinef.com traffic-driving URL patterns:
- /guides/[use-case]/: 44 pages, avg ~400 visits/mo each
- /vs/[competitor]/: 9 pages, avg ~950 visits/mo each
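You can approximate this pattern detection yourself from a sitemap dump. A rough heuristic (not what the skill actually runs): keep the first path segment as the section and collapse the rest into a placeholder, then count URLs per template.

```javascript
// url-patterns.js: group sitemap URLs into templates by keeping the first
// path segment and templating the rest as [slug]. Templates with many URLs
// are likely programmatic SEO plays.
function groupByPattern(urls) {
  const counts = {};
  for (const raw of urls) {
    const segments = new URL(raw).pathname.split("/").filter(Boolean);
    const pattern = segments.length
      ? "/" + [segments[0], ...segments.slice(1).map(() => "[slug]")].join("/") + "/"
      : "/";
    counts[pattern] = (counts[pattern] || 0) + 1;
  }
  return counts;
}

const sitemap = [
  "https://hexa.com/blog/notion-alternatives/",
  "https://hexa.com/blog/airtable-alternatives/",
  "https://hexa.com/compare/notion-vs-airtable/",
];
console.log(groupByPattern(sitemap));
// { "/blog/[slug]/": 2, "/compare/[slug]/": 1 }
```

The first-segment heuristic over-merges sites that nest deeper (e.g. /blog/category/post/); for those, template only the final segment instead.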
Prioritized Content Plan
From this analysis, the agent outputs a prioritized content plan:
IMMEDIATE (next 30 days):
1. /vs/hexa/: target "hexa.com alternative", est. 480 visits/mo
2. /vs/joinef/: target "joinef alternative", est. 320 visits/mo
3. /blog/ai-agent-deployment-guide/: target cluster with 2,400/mo total
SHORT-TERM (next 90 days):
4-15. /for-[industry]/ pages for top 12 industries
16-23. /compare/ pages for top 8 tool pairs
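The ordering above can be reproduced with a simple score: discount each cluster's estimated volume by how well we already rank, so unranked high-volume clusters float to the top. This formula is illustrative (our own assumption), not the skill's actual scoring.

```javascript
// plan-priority.js: rank gaps by volume, discounted by our current position.
// ourRank === null means we don't rank at all, so no discount applies.
function priorityScore(gap) {
  const discount = gap.ourRank == null ? 1 : Math.max(0.1, 1 - gap.ourRank / 50);
  return Math.round(gap.volume * discount);
}

const gaps = [
  { keyword: "multi-agent orchestration", volume: 1800, ourRank: 35 }, // page 4
  { keyword: "ai agent deployment", volume: 2400, ourRank: null },
];
gaps.sort((a, b) => priorityScore(b) - priorityScore(a));
console.log(gaps.map((g) => g.keyword));
// [ "ai agent deployment", "multi-agent orchestration" ]
```

Swapping in keyword difficulty as a second discount factor is the obvious next refinement once you have that data.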
The Results
| Metric | Before | After (6 months) |
|---|---|---|
| Organic monthly traffic | ~2,100 | ~9,200 |
| Ranking keywords | 380 | 1,640 |
| Tool cost | $230/month (Semrush) | ~$3/month (Apify usage) |
| Content decision time | "Let's brainstorm" | Data-driven in 30 min |
| Pages with organic traffic | 12 | 87 |
Roughly 340% organic traffic growth in six months: the compound effect of systematic gap-filling versus guessing what to write.
Try It Yourself
The full pipeline is three Apify runs plus two Mr.Chief skills. Start with one competitor. Run the scraper on their sitemap, feed the JSON into the SEO Analyst skill, and let it surface the top 10 gaps.
```bash
# One-command competitor audit
mrchief run seo-analyst \
  --input competitor-crawl.json \
  --compare-to my-sitemap.xml \
  --output gap-report.md
```
You don't need Semrush. You need to be systematic about using what's already public.
PyratzLabs doesn't outspend competitors. We out-see them. The data was always free; we just built the pipeline to use it.