Product Manager

Monitoring 100 Competitor Pages for Changes — Weekly Diff Report

100 pages monitored weekly · Productivity & Security · 5 min read

Key Takeaway

An AI agent scrapes 100 competitor pages every week, diffs them against the previous snapshot, and flags every change — new features, pricing shifts, team hires, blog posts — in a single consolidated report delivered to Telegram.

The Problem

We compete in the AI agent infrastructure space. That means we compete with LangChain, CrewAI, AutoGen, Composio, and a dozen others. Each of them has a website with pricing pages, feature lists, team pages, blog archives, careers sections, and documentation.

These pages change. Pricing tiers get restructured. Features get added. Key engineers get hired. Blog posts signal new strategic directions. Partnership announcements reveal where the market is moving.

If you're not tracking these changes, you're reacting instead of anticipating. By the time someone mentions a competitor's new feature in a sales call, they've had it for three weeks and you look uninformed.

The manual approach? Open 100 URLs in a browser, eyeball each one, try to remember what changed since last week. Nobody does this. It's humanly impossible at scale.

Competitive intelligence tools exist — Crayon, Klue, Kompyte. They cost $15,000 to $50,000 per year. For a startup, that's absurd.

The Solution

Scrapling for data collection + a simple diff engine built by the agent. Scrape 100 pages weekly, store snapshots, compare current to previous, flag changes, and deliver a consolidated report.

Total infrastructure cost: the compute time to run 100 scrapes once a week. About 30 minutes of server time.

The Process

The system works in three phases:

Phase 1: Page inventory

```yaml
# competitors.yaml — what we track
competitors:
  - name: LangChain
    pages:
      - url: https://langchain.com/pricing
        category: pricing
      - url: https://langchain.com/features
        category: features
      - url: https://langchain.com/about
        category: team
      - url: https://blog.langchain.dev
        category: blog
      - url: https://langchain.com/careers
        category: careers
  - name: CrewAI
    pages:
      - url: https://crewai.com/pricing
        category: pricing
      # ... 20 pages per competitor
  # ... 5 competitors × 20 pages = 100 pages
```
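The inventory above is what the agent works from each week. A minimal sketch of how it might be flattened into a scrape queue, assuming the YAML schema shown and PyYAML for parsing; `flatten_inventory` and `load_inventory` are hypothetical helpers, not part of Scrapling or Mr.Chief:

```python
# Sketch: turn competitors.yaml into a flat list of (competitor, url, category)
# rows that the scraper can iterate over. The schema matches the example above.
def flatten_inventory(data):
    """Flatten the parsed competitors.yaml dict into scrape-queue rows."""
    return [
        (comp["name"], page["url"], page["category"])
        for comp in data["competitors"]
        for page in comp["pages"]
    ]

def load_inventory(path="competitors.yaml"):
    """Load and flatten the inventory file (assumes PyYAML is installed)."""
    import yaml  # imported here so flatten_inventory stays dependency-free
    with open(path) as f:
        return flatten_inventory(yaml.safe_load(f))
```

Keeping the parsing and flattening separate makes the flattening logic trivial to test without touching the filesystem.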

Phase 2: Weekly scrape + snapshot

```bash
# Cron: every Monday at 6:00 AM UTC
0 6 * * 1 mrchief cron run --task "Run weekly competitive monitoring.
Scrape all pages in competitors.yaml using scrapling (stealth mode for
protected sites). Save snapshots to competitor-data/YYYY-MM-DD/.
Diff against previous week. Report changes to Telegram."
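One sketch of how weekly snapshots could be laid out on disk, keyed by date as in the task above. The URL-to-filename scheme and the `.html` suffix are assumptions for illustration; the agent may use its own layout:

```python
# Sketch: compute where a page's weekly snapshot lands, e.g.
# competitor-data/2026-03-10/langchain-com-pricing.html
import datetime
import pathlib
import re

def snapshot_path(url, base="competitor-data", date=None):
    """Build a dated snapshot path from a page URL (hypothetical scheme)."""
    date = date or datetime.date.today().isoformat()
    # Strip the scheme, then slugify everything non-alphanumeric to hyphens
    slug = re.sub(r"[^a-zA-Z0-9]+", "-", url.split("//", 1)[-1]).strip("-")
    return pathlib.Path(base) / date / f"{slug}.html"
```

Date-stamped directories make "diff against previous week" a matter of comparing two sibling folders.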

The agent uses the right Scrapling mode per page:

Static pages (blog, about) → simple mode (~200ms each)
Protected pages (pricing, features) → stealth mode (~3s each)
JS-heavy pages (interactive demos) → dynamic mode (~10s each)

Total scrape time for 100 pages: ~25-35 minutes
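The mode selection above can be sketched as a simple dispatch table. The category-to-mode mapping and per-page timings come from the figures in this article; they are planning estimates, not Scrapling defaults:

```python
# Sketch: choose a Scrapling mode per page category and estimate runtime.
# Mapping and timings are the assumptions described in the text above.
MODE_BY_CATEGORY = {
    "blog": "simple", "team": "simple",
    "pricing": "stealth", "features": "stealth",
    "demo": "dynamic",
}
SECONDS_PER_PAGE = {"simple": 0.2, "stealth": 3.0, "dynamic": 10.0}

def pick_mode(category):
    """Map a page category to a scrape mode, defaulting to the safe choice."""
    return MODE_BY_CATEGORY.get(category, "stealth")

def estimated_runtime(categories):
    """Estimated sequential scrape time in seconds for a list of categories."""
    return sum(SECONDS_PER_PAGE[pick_mode(c)] for c in categories)
```

Defaulting unknown categories to stealth mode trades speed for reliability: a slow scrape is recoverable, a blocked one is not.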

Phase 3: Diff + report

The agent compares each page's current content against the previous week's snapshot:

```python
# The agent generates and runs diff logic like this:
# For each page:
#   1. Load previous snapshot (text content, extracted elements)
#   2. Load current scrape
#   3. Compute structural diff (added/removed/changed sections)
#   4. Classify change significance: major / minor / cosmetic
#   5. Generate human-readable summary
```
A real weekly report looks like this:

📊 Weekly Competitive Intelligence Report
March 10–16, 2026 — 100 pages across 5 competitors

🔴 MAJOR CHANGES (3)
━━━━━━━━━━━━━━━━━━━━
1. LangChain /features
   ADDED: "Enterprise SSO" — new section under Security
   ⚠️ We don't have this feature yet

2. CrewAI /pricing
   CHANGED: Pro tier $49→$79/month (+61%)
   ADDED: New "Team" tier at $149/month

3. AutoGen /careers
   ADDED: 4 new positions — 2 ML Engineers, 1 DevRel, 1 Sales
   Signal: expanding engineering + go-to-market

🟡 MINOR CHANGES (7)
━━━━━━━━━━━━━━━━━━━━
4. Composio /blog — 3 new posts (RAG tutorial, partnership, changelog)
5. LangChain /about — new VP Engineering headshot/bio added
6. CrewAI /features — reworded "Autonomous Agents" → "Agentic Workflows"
...

⚪ NO CHANGES (90 pages)

📈 30-Day Trends:
- Most active: CrewAI (14 changes across all pages)
- Fastest moving: Pricing pages (5 changes in 4 weeks)
- Hiring signal: AutoGen ramping aggressively (8 new roles in 30 days)
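The grouped layout above is straightforward to generate once changes are classified. A sketch of the rendering step, where the change-dict shape (`competitor`, `page`, `summary`, `significance`) is a hypothetical schema for illustration:

```python
# Sketch: render classified changes into the grouped report format shown
# above. The input dict shape is an assumption, not the agent's real schema.
def render_report(changes):
    """Group changes by significance and number them continuously."""
    major = [c for c in changes if c["significance"] == "major"]
    minor = [c for c in changes if c["significance"] == "minor"]
    unchanged = sum(1 for c in changes if c["significance"] == "none")
    lines = [f"🔴 MAJOR CHANGES ({len(major)})"]
    for i, c in enumerate(major, 1):
        lines.append(f"{i}. {c['competitor']} {c['page']} — {c['summary']}")
    lines.append(f"🟡 MINOR CHANGES ({len(minor)})")
    for i, c in enumerate(minor, len(major) + 1):
        lines.append(f"{i}. {c['competitor']} {c['page']} — {c['summary']}")
    lines.append(f"⚪ NO CHANGES ({unchanged} pages)")
    return "\n".join(lines)
```

Continuous numbering across the major and minor sections matches the sample report, where minor changes pick up at 4.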

The Results

Pages monitored: 100
Competitors tracked: 5
Weekly scrape time: ~30 minutes
Changes detected: 8-15 per week (average)
Major changes: 2-4 per week (average)
False positives (cosmetic flagged as major): ~1 per week
Report delivery: Monday 7:00 AM to Telegram
Human review time: 5-10 minutes

Cost comparison:

| Approach | Annual Cost | Pages | Update Frequency |
|---|---|---|---|
| Crayon/Klue | $15,000–$50,000 | Varies | Daily |
| Manual monitoring | $0 (but impossible) | 10-20 max | Weekly if lucky |
| Scrapling + agent | $0 | 100+ | Weekly (configurable) |

The LangChain SSO detection alone delivered real value. We saw the feature within 48 hours of it appearing on their site, which gave us two weeks to discuss whether to prioritize SSO — before our sales team heard about it from prospects.

Try It Yourself

```bash
# Install scrapling via the Mr.Chief dashboard after signing up at
# mrchief.ai/setup, or from the CLI:
#   clawhub install scrapling

# Create your competitor page inventory
# (the agent can help you build this from competitor URLs)
mrchief run --task "I want to monitor these 5 competitors:
[competitor1.com, competitor2.com, ...]. Find their pricing, features,
about, blog, and careers pages. Save as competitors.yaml."

# Run the first baseline scrape
mrchief run --task "Scrape all pages in competitors.yaml using scrapling.
Save snapshots to competitor-data/$(date +%Y-%m-%d)/"

# Set up weekly cron
mrchief cron add --schedule "0 6 * * 1" \
  --task "Run weekly competitive monitoring from competitors.yaml.
  Diff against previous week. Report to Telegram." \
  --channel telegram
```

First week is baseline. Second week you start getting diffs. By week four, you'll wonder how you ever competed without this.


100 pages. 5 competitors. 30 minutes of compute. Every change, caught.

Competitive Intelligence · Web Scraping · Monitoring · Diff Reports · Scrapling

Want results like these?

Start free with your own AI team. No credit card required.
