PricePulse MCP - Live Competitor Pricing and Feature Matrix Context Server for Claude and Cursor
You are writing a pricing page in Claude, but your competitor changed their plans last Tuesday and Claude has no idea. PricePulse is an MCP server that gives Claude and Cursor live access to the competitor pricing pages you define, scraped and normalized daily, so every pricing decision you make is grounded in what the market actually charges today.
Difficulty
intermediate
Category
MCP & Integrations
Market Demand
High
Revenue Score
7/10
Platform
MCP Server
Vibe Code Friendly
No
Hackathon Score
🏆 8/10
What is it?
Founders and product managers make pricing decisions constantly — writing landing page copy, drafting investor decks, debating tier structure — but Claude's knowledge is stale and manually checking 5 competitor pages is tedious. PricePulse is a Model Context Protocol server that exposes tools for querying a live competitor pricing database: get_competitor_pricing, compare_feature_matrix, and get_price_history. The MCP host (Claude Desktop or Cursor) calls these tools during any pricing-related task and gets structured, up-to-date data without the user leaving their AI session. Scrapers run on a daily Supabase cron, storing normalized pricing tiers as structured JSON. Fully buildable with the MCP SDK, Playwright for scraping, and Supabase for storage — no proprietary data required.
Why now?
The April 2026 MCP adoption wave has Claude Desktop MAUs surging, and the MCP server ecosystem is still thin — a live competitor pricing tool is an obvious gap that will get natural distribution through the Claude Discord and the MCP registry before the market fills in.
- ▸MCP tool get_competitor_pricing returns structured JSON of all tracked competitors' pricing tiers, updated daily.
- ▸MCP tool compare_feature_matrix returns a markdown table comparing features across tracked competitors.
- ▸MCP tool get_price_history returns a 30-day price change log for any tracked competitor.
- ▸Admin web UI to add competitor URLs, view scrape logs, and manage tracked feature list.
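The markdown table returned by compare_feature_matrix can be assembled with a small pure function. A minimal sketch, assuming normalized rows have already been loaded from Supabase; the row shape and function name here are illustrative, not the final API:

```typescript
// Sketch of the compare_feature_matrix table builder. Inputs are illustrative;
// the real tool would load these rows from Supabase on each call.
type FeatureRow = { feature: string; support: Record<string, string> };

function buildFeatureMatrix(competitors: string[], rows: FeatureRow[]): string {
  const header = `| Feature | ${competitors.join(" | ")} |`;
  const divider = `|${"---|".repeat(competitors.length + 1)}`;
  const body = rows.map(
    (r) =>
      `| ${r.feature} | ${competitors
        .map((c) => r.support[c] ?? "n/a")
        .join(" | ")} |`
  );
  return [header, divider, ...body].join("\n");
}

// Example: two competitors, one feature row.
console.log(
  buildFeatureMatrix(["Notion", "Linear"], [
    { feature: "Unlimited docs", support: { Notion: "Yes (Plus)" } },
  ])
);
```

Returning plain markdown keeps the tool result directly usable in Claude's response without any client-side rendering.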
Target Audience
SaaS founders, product managers, and growth marketers actively using Claude Desktop or Cursor for product work — roughly 300k Claude Desktop MAUs as of the April 2026 MCP adoption wave.
Example Use Case
Rachel, a SaaS founder, asks Claude to help her rewrite her pricing page and Claude automatically calls get_competitor_pricing, retrieves last Tuesday's Notion and Linear pricing tiers, and writes copy that positions her plan correctly against the current market.
User Stories
- ▸As a SaaS founder, I want Claude to know my competitors' current pricing when I ask for help writing my pricing page, so that I do not accidentally undercharge or use outdated comparisons.
- ▸As a product manager, I want to ask Claude for a feature comparison matrix against 5 competitors, so that I can prepare for a positioning meeting in 2 minutes instead of 30.
- ▸As a growth marketer, I want to be alerted when a competitor changes their pricing tier, so that I can update our battle card the same day.
Done When
- ✓Tool call: done when the user asks Claude about competitor pricing and Claude invokes get_competitor_pricing, returning a structured response with tier names and prices from the current week.
- ✓Feature matrix: done when the user asks for a comparison table and Claude returns a correctly formatted markdown table with real feature data for all tracked competitors.
- ✓Admin UI: done when the user pastes a competitor URL and sees a successful scrape log with extracted tier data within 24 hours.
- ✓Stripe gate: done when user tries to add a 4th competitor and is redirected to Stripe checkout, then the competitor is added immediately after payment.
Is it worth building?
$29/month x 80 users = $2,320 MRR at month 3. $29/month x 300 users = $8,700 MRR at month 9.
Unit Economics
CAC: $8 via Claude Discord and MCP registry organic. LTV: $348 (12 months at $29/month). Payback: 1 month. Gross margin: 92%.
Business Model
SaaS subscription
Monetization Path
Free tier: 3 competitors tracked. Paid $29/month: up to 20 competitors, daily refresh, price history.
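The tier limits above reduce to a one-line gate check before a competitor is added. A minimal sketch, with limits taken from this section; the function and constant names are ours:

```typescript
// Plan limits per the monetization path: free = 3 tracked competitors, paid = 20.
const LIMITS: Record<"free" | "paid", number> = { free: 3, paid: 20 };

// Returns true when the user may add another competitor; a false result on the
// free plan is where the admin UI would redirect to Stripe checkout.
function canAddCompetitor(plan: "free" | "paid", trackedCount: number): boolean {
  return trackedCount < LIMITS[plan];
}
```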
Revenue Timeline
First dollar: week 3. $1k MRR: month 3. $5k MRR: month 9.
Estimated Monthly Cost
Vercel: $20, Supabase: $25, Playwright scraper compute: $20, Stripe fees: $15. Total: ~$80/month.
Profit Potential
Strong side-income at $5k–$10k MRR.
Scalability
High — add MCP resources exposing pricing trend charts, G2 review sentiment per competitor, and Slack alerts on competitor price changes.
Success Metrics
Week 2: 50 MCP server installs. Month 1: 20 paid users. Month 3: 75% retention.
Launch & Validation Plan
Post in Claude Discord and Indie Hackers MCP thread asking if founders want live competitor pricing in Claude, recruit 15 beta users, validate with 5 paid conversions before full build.
Customer Acquisition Strategy
First customer: post setup tutorial in the Anthropic Claude Discord MCP channel and offer free 60-day access to first 20 responders. Ongoing: MCP server registry listing, Twitter MCP community, ProductHunt, r/ClaudeAI, Indie Hackers.
What's the competition?
Competition Level
Low
Similar Products
Crayon tracks competitor changes but is $15k/year enterprise software. Kompyte sits in a similar price tier. Neither exposes data as an MCP tool callable from Claude mid-session.
Competitive Advantage
No existing MCP server exposes live structured competitor pricing data — most pricing tools are standalone dashboards, not Claude-native context servers.
Regulatory Risks
Low regulatory risk. Web scraping publicly visible pricing pages is generally legal in the US. Do not scrape pages behind login walls. Include robots.txt compliance check in scraper.
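The robots.txt compliance check mentioned above can start as a small helper run before each scrape. A sketch that only handles the common `User-agent: *` / `Disallow:` case; a production scraper should use a full robots.txt parser that also handles Allow overrides, per-bot groups, and wildcards:

```typescript
// Minimal robots.txt check: is `path` disallowed for user-agent "*"?
// Sketch only — ignores Allow overrides, per-bot groups, and wildcard rules.
function isDisallowed(robotsTxt: string, path: string): boolean {
  let inStarGroup = false;
  for (const raw of robotsTxt.split("\n")) {
    const line = raw.split("#")[0].trim(); // strip comments
    const [key, ...rest] = line.split(":");
    const value = rest.join(":").trim();
    if (key.trim().toLowerCase() === "user-agent") {
      inStarGroup = value === "*";
    } else if (inStarGroup && key.trim().toLowerCase() === "disallow" && value) {
      if (path.startsWith(value)) return true;
    }
  }
  return false;
}
```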
What's the roadmap?
Feature Roadmap
V1 (launch): three MCP tools, daily scraper, admin UI, Stripe billing. V2 (month 2-3): price change Slack alert, 30-day history chart in admin. V3 (month 4+): Cursor MCP support, G2 review sentiment tool, team workspaces.
Milestone Plan
Phase 1 (Week 1-2): MCP server with all three tools and scraper pipeline functional. Phase 2 (Week 3-4): admin UI live, Stripe billing, 10 beta users with real competitor data. Phase 3 (Month 2): listed on MCP registry, ProductHunt launch, 30 paid users.
How do you build it?
Tech Stack
MCP SDK (TypeScript), Playwright, Supabase, Vercel Cron, Next.js admin UI, Stripe — build with Cursor for MCP server logic and scraper pipeline.
Suggested Frameworks
MCP SDK TypeScript, Playwright, Supabase
Time to Ship
2 weeks
Required Skills
MCP SDK TypeScript, Playwright scraping, Supabase, structured JSON schema design.
Resources
Anthropic MCP SDK docs, Playwright docs, Supabase cron docs, Claude Desktop MCP configuration guide.
MVP Scope
mcp-server/index.ts (MCP server entry point), mcp-server/tools/get-pricing.ts (pricing query tool), mcp-server/tools/compare-matrix.ts (feature matrix tool), mcp-server/tools/get-history.ts (price history tool), scraper/playwright-scraper.ts (daily pricing scraper), scraper/normalizer.ts (JSON pricing normalizer), lib/db/schema.ts (competitors, pricingTiers, priceHistory tables), app/admin/page.tsx (competitor management UI), app/api/cron/scrape/route.ts (daily scrape trigger), .env.example.
Core User Journey
Install MCP server in Claude Desktop config -> add 3 competitor URLs in admin UI -> ask Claude a pricing question -> Claude calls get_competitor_pricing and returns live data in response.
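The install step is an entry in Claude Desktop's claude_desktop_config.json. A sketch of what it might look like; the package name pricepulse-mcp and the env var names are assumptions, not published identifiers:

```json
{
  "mcpServers": {
    "pricepulse": {
      "command": "npx",
      "args": ["-y", "pricepulse-mcp"],
      "env": {
        "PRICEPULSE_API_KEY": "your-api-key",
        "SUPABASE_URL": "https://your-project.supabase.co"
      }
    }
  }
}
```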
Architecture Pattern
Vercel cron triggers daily scrape -> Playwright visits competitor pricing URLs -> normalizer parses tiers to structured JSON -> stored in Supabase -> MCP server reads Supabase on tool call -> Claude receives structured pricing data as tool result -> Claude uses data in response.
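The normalizer stage in this pipeline uses the Claude API for flexible extraction, but the shape it must emit is fixed. A deterministic sketch for scraper/normalizer.ts that parses common price strings into that shape; the type and function names are ours:

```typescript
// Deterministic fallback for scraper/normalizer.ts: parse a scraped price
// string like "$12 per user / month" into integer cents plus a billing period.
// The real pipeline uses the Claude API for messier HTML; this shows the
// normalized shape the MCP tools read back out of Supabase.
type NormalizedPrice = { amountCents: number; period: "month" | "year" } | null;

function normalizePrice(raw: string): NormalizedPrice {
  const m = raw.match(/\$\s*(\d+(?:\.\d{1,2})?)/);
  if (!m) return null; // e.g. "Contact us" tiers carry no numeric price
  const amountCents = Math.round(parseFloat(m[1]) * 100);
  const period = /year|annual|\/\s*yr/i.test(raw) ? "year" : "month";
  return { amountCents, period };
}
```

Storing cents as integers avoids floating-point drift when diffing prices for the history log.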
Data Model
User has many Competitors. Competitor has many PricingTiers. Competitor has many PriceHistoryEntries. PricingTier has many Features.
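The relationships above can be written down as TypeScript types for lib/db/schema.ts. A sketch; only the entity relationships come from this plan, while the field names are assumptions:

```typescript
// Sketch of the data model. Field names are illustrative; the relationships
// (User -> Competitors -> PricingTiers/PriceHistory, PricingTier -> Features)
// follow the plan above.
interface User { id: string; email: string; plan: "free" | "paid"; }

interface Competitor {
  id: string;
  userId: string; // owner — RLS scopes rows to this user
  name: string;
  pricingUrl: string;
}

interface Feature { id: string; tierId: string; label: string; included: boolean; }

interface PricingTier {
  id: string;
  competitorId: string;
  name: string; // e.g. "Pro"
  amountCents: number;
  period: "month" | "year";
  features: Feature[];
}

interface PriceHistoryEntry {
  id: string;
  competitorId: string;
  tierName: string;
  oldAmountCents: number;
  newAmountCents: number;
  changedAt: string; // ISO date
}
```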
Integration Points
MCP SDK for tool exposure to Claude and Cursor, Playwright for daily competitor page scraping, Supabase for structured pricing storage, Vercel Cron for daily scrape trigger, Stripe for billing.
V1 Scope Boundaries
V1 excludes: Cursor MCP integration beyond Claude Desktop, Slack price change alerts, G2 review ingestion, white-label, team workspaces.
Success Definition
A SaaS founder installs the MCP server, asks Claude a pricing question, and Claude correctly references a competitor price change from the previous week without any manual copy-paste from the founder.
Challenges
Playwright scrapers break when competitor sites change their markup — requires a monitoring layer and fast repair loop. The hardest non-technical problem is convincing Claude Desktop users to configure a custom MCP server when the setup still involves editing a JSON config file.
Avoid These Pitfalls
Do not try to scrape dynamic JS-heavy competitor sites with a simple fetch — Playwright is required from day one. Do not over-engineer a flexible pricing schema — start with a fixed JSON shape and let users flag when it is wrong. MCP server discoverability is near zero outside the Anthropic Discord — budget 80% of launch effort on that channel alone.
Security Requirements
Supabase Auth for admin UI, API key auth for MCP server tool calls, RLS scoping competitor data to owner, rate limit tool calls to 60/minute per API key, no login-gated competitor pages scraped.
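The 60 calls/minute limit per API key can be enforced with a small in-memory sliding window. A sketch for a single server process; a multi-instance deployment would need a shared store (e.g. Redis or Supabase-backed counters) instead:

```typescript
// In-memory sliding-window rate limiter: 60 tool calls per minute per API key.
// Fine for one MCP server process; use a shared store when scaling out.
const WINDOW_MS = 60_000;
const MAX_CALLS = 60;
const calls = new Map<string, number[]>();

function allowCall(apiKey: string, now: number = Date.now()): boolean {
  // Keep only timestamps inside the current window.
  const recent = (calls.get(apiKey) ?? []).filter((t) => now - t < WINDOW_MS);
  if (recent.length >= MAX_CALLS) {
    calls.set(apiKey, recent);
    return false; // over limit — reject the tool call
  }
  recent.push(now);
  calls.set(apiKey, recent);
  return true;
}
```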
Infrastructure Plan
Vercel for Next.js admin UI and cron, Supabase for Postgres, GitHub Actions for CI, Sentry for scraper error tracking, dev/staging/prod via Vercel environments.
Performance Targets
500 MCP tool calls/day at launch, tool response under 200ms from Supabase query, scraper completes per competitor under 30 seconds, admin UI load under 2s.
Go-Live Checklist
- ☐Security audit complete
- ☐Payment flow tested end-to-end
- ☐Sentry live
- ☐Monitoring configured
- ☐Custom domain with SSL
- ☐Privacy policy and robots.txt compliance statement published
- ☐5 beta users confirmed tool calls working in Claude Desktop
- ☐Rollback plan documented
- ☐Claude Discord and ProductHunt posts drafted
First Run Experience
On first run, three seeded demo competitors (Notion, Linear, Loom) with fake pricing tiers are pre-loaded. The user can configure the MCP server and immediately ask Claude what Notion charges for its Pro plan without adding any real competitors. No admin setup required: demo data loads from seed.ts, and the MCP server runs locally with npx.
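The seed.ts demo data could look like this. A sketch; the prices below are deliberate placeholders, not the vendors' real pricing:

```typescript
// seed.ts sketch: demo competitors pre-loaded on first run.
// Prices are fake placeholders, not real vendor pricing.
const DEMO_COMPETITORS = [
  { name: "Notion", tiers: [{ name: "Pro", amountCents: 1000, period: "month" }] },
  { name: "Linear", tiers: [{ name: "Standard", amountCents: 800, period: "month" }] },
  { name: "Loom", tiers: [{ name: "Business", amountCents: 1250, period: "month" }] },
] as const;

// Tiny lookup the demo tools can answer from before any real scrape has run.
function demoTiers(name: string) {
  return (
    DEMO_COMPETITORS.find((c) => c.name.toLowerCase() === name.toLowerCase())
      ?.tiers ?? []
  );
}
```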
How to build it, step by step
1. Define the schema in lib/db/schema.ts: User, Competitor, PricingTier, Feature, and PriceHistory tables with a JSON tier structure.
2. Build scraper/playwright-scraper.ts to visit a competitor URL and extract the pricing section HTML.
3. Build scraper/normalizer.ts to parse the extracted HTML into structured PricingTier JSON, using the Claude API for flexible extraction.
4. Create app/api/cron/scrape/route.ts, triggered by Vercel Cron daily at 2am UTC.
5. Scaffold the MCP server in mcp-server/index.ts using the TypeScript MCP SDK with three tool registrations.
6. Build mcp-server/tools/get-pricing.ts to query Supabase for the latest tiers of the requested competitor.
7. Build mcp-server/tools/compare-matrix.ts to return a markdown feature comparison table across all tracked competitors.
8. Build mcp-server/tools/get-history.ts to return the 30-day price change log.
9. Build app/admin/page.tsx for adding competitor URLs, viewing scrape logs, and managing feature labels.
10. Verify: configure the MCP server in the Claude Desktop JSON config, add one real competitor URL, wait for the scrape, ask Claude what the competitor charges for their Pro plan, and confirm Claude returns the correct current price from the database.
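The get-history tool from step 8 reduces to filtering and formatting stored history rows. A sketch of the tool body's core logic; the row shape is an assumption:

```typescript
// Sketch of the get_price_history tool body: turn stored history rows into
// the 30-day change log Claude receives. The row shape is an assumption.
interface HistoryRow {
  tierName: string;
  oldCents: number;
  newCents: number;
  changedAt: string; // ISO date
}

function priceHistoryLog(rows: HistoryRow[], now: Date = new Date()): string[] {
  const cutoff = now.getTime() - 30 * 24 * 60 * 60 * 1000; // 30 days back
  return rows
    .filter((r) => new Date(r.changedAt).getTime() >= cutoff)
    .sort((a, b) => b.changedAt.localeCompare(a.changedAt)) // newest first
    .map(
      (r) =>
        `${r.changedAt}: ${r.tierName} $${(r.oldCents / 100).toFixed(2)} -> $${(r.newCents / 100).toFixed(2)}`
    );
}
```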
Generated
April 18, 2026
Model
claude-sonnet-4-6
Disclaimer: Ideas on this site are AI-generated and may contain inaccuracies. Revenue estimates, market demand figures, and financial projections are illustrative assumptions only — not financial advice. Do your own research before making any business or investment decisions. Technology availability, pricing, and market conditions change rapidly; always verify details independently.