RivalPrice — Paste 10 Competitor URLs, Get a Price Drop Email Before Your Customer Does
E-commerce sellers are manually refreshing competitor product pages every morning like it is 2009. RivalPrice scrapes up to 10 competitor URLs on a schedule, detects price drops over a configurable threshold, and fires a plain-English email alert before you lose the sale.
Difficulty
intermediate
Category
E-Commerce
Market Demand
Very High
Revenue Score
7/10
Platform
Web App
Vibe Code Friendly
No
Hackathon Score
6/10
Validated by Real Pain
— sourced from real community discussions
E-commerce sellers manually track competitor prices daily in spreadsheets or hire VAs because no simple paste-and-alert tool exists for non-Amazon product URLs.
What is it?
Reddit's r/ecommerce is full of sellers admitting they track competitor prices in a spreadsheet updated by a VA hired specifically for this task. Keepa covers Amazon only. PriceSpy is consumer-facing. There is no dead-simple tool where a Shopify seller pastes 10 competitor product URLs and gets a daily digest email if any price moves more than 10%. RivalPrice is exactly that: URL input, automatic CSS-selector detection for price elements via Playwright, a daily cron job, a per-URL threshold config, and a Resend email alert. No AI required. An MVP is buildable in about two weeks with Next.js, Playwright scraping on a serverless cron, Supabase for job tracking, and Resend for alerts.
Why now?
As of 2025, Playwright's headless Chromium runs reliably in Vercel serverless functions, which turns scheduled scraping into a $20/month infrastructure problem instead of a $500/month server problem: the cost floor just dropped.
- ▸Playwright auto-detects price CSS selector from any product URL without manual config
- ▸Configurable alert threshold per URL (e.g. alert only if price drops more than 10%)
- ▸Daily and hourly cron scrape jobs via Vercel Cron with job status dashboard
- ▸Resend plain-English email digest listing which competitor moved and by how much
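The "auto-detect" feature above can be sketched in two parts: a short list of selector candidates that Playwright tries in order, and a pure parser that turns the matched text into a number. The selector list, function names, and parsing rules here are illustrative assumptions, not the product's actual implementation.

```typescript
// Candidate selectors tried in order against page.locator(...) —
// an assumed priority list, roughly "structured data first, CSS classes last".
export const PRICE_SELECTORS = [
  '[itemprop="price"]',
  'meta[property="product:price:amount"]',
  '[data-price]',
  '.price, .product-price, .price__current',
];

// Extract a numeric price from raw text like "$1,299.99" or "1.299,99 €".
export function parsePrice(text: string): number | null {
  const cleaned = text.replace(/[^\d.,]/g, "");
  if (!/\d/.test(cleaned)) return null;
  const lastSep = Math.max(cleaned.lastIndexOf("."), cleaned.lastIndexOf(","));
  const frac = lastSep >= 0 ? cleaned.slice(lastSep + 1) : "";
  // Heuristic: a trailing 2-digit group is treated as cents; any other
  // separators are assumed to be thousands marks and stripped.
  const value =
    lastSep >= 0 && frac.length === 2
      ? Number(cleaned.slice(0, lastSep).replace(/[.,]/g, "") + "." + frac)
      : Number(cleaned.replace(/[.,]/g, ""));
  return Number.isFinite(value) ? value : null;
}
```

Keeping the parser pure (no Playwright dependency) means the locale-handling heuristics can be unit-tested against sample price strings without launching a browser.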
Target Audience
Independent Shopify and WooCommerce store owners doing $10k-$500k/month GMV, roughly 200k in the US who actively watch competitor pricing.
Example Use Case
Jake runs a Shopify kitchenware store and pastes in 8 competitor product URLs on Monday. On Wednesday RivalPrice emails him that a competitor dropped their knife set price by 15%, and he matches it before the weekend sale traffic hits.
User Stories
- ▸As a Shopify store owner, I want to paste competitor URLs and get email alerts on price drops, so that I can reprice before losing sales.
- ▸As an e-commerce seller, I want to set a percentage threshold per URL, so that I only get alerted on meaningful price changes, not minor fluctuations.
- ▸As a store owner on the paid plan, I want hourly price checks, so that I can react to flash sales before my customers notice.
Done When
- ✓URL input: done when user pastes a product URL and the app displays the scraped current price within 10 seconds.
- ✓Alert email: done when a simulated price drop triggers a Resend email to the user's inbox within 5 minutes of the cron run.
- ✓Threshold config: done when user sets a 10% threshold and a 9% price drop does not trigger an alert but an 11% drop does.
- ✓Billing gate: done when a free-tier user adding a 4th URL sees an upgrade modal and is blocked until they pay.
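The threshold criterion above (9% drop silent, 11% drop alerts at a 10% threshold) maps to one small pure function. A minimal sketch, assuming the threshold is stored as a whole percentage; the function name is an assumption:

```typescript
// Returns true when the price dropped by strictly more than thresholdPct
// percent relative to the previous snapshot. Price increases never alert.
export function shouldAlert(
  previous: number,
  current: number,
  thresholdPct: number, // e.g. 10 means "alert on a drop of more than 10%"
): boolean {
  if (previous <= 0) return false; // no usable baseline yet
  const dropPct = ((previous - current) / previous) * 100;
  return dropPct > thresholdPct; // strict: an exactly-10% drop stays silent
}
```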
Is it worth building?
$29/month x 50 users = $1,450 MRR at month 2. $29/month x 150 users = $4,350 MRR at month 5. Math assumes 3% conversion from ProductHunt plus targeted r/ecommerce outreach at 5% cold reply rate.
Unit Economics
CAC: $18 via r/ecommerce posts and cold DMs. LTV: $348 (12 months at $29/month). Payback: 0.6 months. Gross margin: 87%.
Business Model
SaaS subscription
Monetization Path
Free tier: 3 URLs, daily check. Paid $29/month: 10 URLs, hourly checks, email + Slack alerts. Pro $79/month: 50 URLs, 15-minute checks, historical price chart.
Revenue Timeline
First dollar: week 2 via first paid beta user. $1k MRR: month 3. $5k MRR: month 7.
Estimated Monthly Cost
Vercel Pro (cron + Playwright): $20, Supabase: $25, Resend: $20, Proxy service if needed: $30. Total: ~$95/month at launch.
Profit Potential
Full-time viable at $5k-$10k MRR targeting Shopify seller communities.
Scalability
High — add Slack webhooks, Shopify auto-reprice integration, and multi-user team plans at month 3.
Success Metrics
Week 1: 30 beta store owners paste their first URLs. Week 3: 15 paid conversions. Month 2: 85% week-4 retention.
Launch & Validation Plan
Post in r/ecommerce asking 'how do you track competitor prices?' — collect 30 replies confirming the manual spreadsheet pain before building scraper infrastructure.
Customer Acquisition Strategy
First customer: DM 15 active r/ecommerce posters who mentioned manually checking competitor prices, offer 60 days free. Broader: ProductHunt launch, Shopify App Store submission at month 2, r/fulfillment and r/smallbusiness weekly posts.
What's the competition?
Competition Level
Medium
Similar Products
Keepa covers Amazon only with no custom URL input. Visualping tracks page changes but not structured price fields. Prisync is $59/month and targets enterprise — none offer a $29 paste-and-go option for indie sellers.
Competitive Advantage
Zero setup — paste URL and done, no CSS selectors to configure — beats every existing scraper tool that requires technical setup.
Regulatory Risks
Web scraping sits in a legal grey area in some jurisdictions (see the hiQ v. LinkedIn litigation in the US). Scraping publicly visible prices is generally low risk. Violating a scraped site's terms of service is primarily a business risk (blocks, cease-and-desist letters) rather than a criminal one, but it is not zero. Advise users to check competitor ToS.
What's the roadmap?
Feature Roadmap
V1 (launch): URL input, daily scrape, threshold email alerts, 10-URL limit. V2 (month 2-3): hourly checks, Slack webhook, price history chart. V3 (month 4+): Shopify auto-reprice integration, proxy rotation, team seats.
Milestone Plan
Phase 1 (Week 1-2): scraper and cron working, prices stored in Supabase. Phase 2 (Week 3): alert emails live, Stripe billing gate active. Phase 3 (Month 2): 20 paying users, Shopify App Store listing submitted.
How do you build it?
Tech Stack
Next.js, Playwright for scraping, Supabase, Resend, Stripe, Vercel Cron — build with Cursor for scraper logic and cron routes, v0 for URL management dashboard.
Suggested Frameworks
Playwright, Cheerio as fallback parser, Supabase JS client.
Time to Ship
2 weeks
Required Skills
Playwright scraping, Vercel Cron, Supabase, Resend email templates.
Resources
Playwright docs, Vercel Cron docs, Supabase quickstart, Resend API docs.
MVP Scope
app/page.tsx (URL input dashboard), app/api/scrape/route.ts (Playwright price extraction), app/api/cron/route.ts (Vercel Cron daily job), app/api/checkout/route.ts (Stripe billing), lib/db/schema.ts (Drizzle: urls, price_snapshots, alerts, users), lib/email/alert.tsx (Resend alert template), components/UrlCard.tsx (URL row with current and previous price), seed.ts (3 demo URLs with mock price history), .env.example.
Core User Journey
Paste competitor URLs -> set alert threshold -> receive first price drop email -> see historical price chart -> upgrade to paid.
Architecture Pattern
User pastes URL -> Supabase insert -> Vercel Cron triggers -> Playwright scrapes price -> compare to last snapshot in Supabase -> if delta exceeds threshold -> Resend alert email fires.
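One way to keep that pipeline testable is to isolate the compare-and-decide step from the scraping and email side effects, so the cron route is a thin wrapper around a pure function. A sketch under assumed names and shapes:

```typescript
// One scraped URL in a cron run; lastPrice is null on the first scrape.
export interface ScrapeResult {
  url: string;
  lastPrice: number | null;
  newPrice: number;
  thresholdPct: number;
}

export interface Alert {
  url: string;
  oldPrice: number;
  newPrice: number;
  dropPct: number;
}

// Decide which alerts to email for one cron run. No I/O: the caller
// scrapes, calls this, then fires Resend for each returned alert.
export function alertsForRun(results: ScrapeResult[]): Alert[] {
  const alerts: Alert[] = [];
  for (const r of results) {
    if (r.lastPrice === null || r.lastPrice <= 0) continue; // no baseline yet
    const dropPct = ((r.lastPrice - r.newPrice) / r.lastPrice) * 100;
    if (dropPct > r.thresholdPct) {
      alerts.push({ url: r.url, oldPrice: r.lastPrice, newPrice: r.newPrice, dropPct });
    }
  }
  return alerts;
}
```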
Data Model
User has many TrackedURLs. TrackedURL has many PriceSnapshots (price, scrapedAt). TrackedURL has one AlertConfig (threshold, frequency). User has one BillingPlan.
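The model above can be mirrored as plain TypeScript shapes plus a small baseline helper. Names are illustrative assumptions (the MVP scope keeps the real Drizzle tables in lib/db/schema.ts):

```typescript
export interface PriceSnapshot {
  trackedUrlId: string;
  price: number;
  scrapedAt: Date;
}

export interface TrackedUrl {
  id: string;
  userId: string;
  url: string;
  thresholdPct: number;          // AlertConfig.threshold
  frequency: "daily" | "hourly"; // AlertConfig.frequency
}

// Most recent snapshot for a URL: the baseline the cron job compares against.
export function latestSnapshot(snapshots: PriceSnapshot[]): PriceSnapshot | null {
  if (snapshots.length === 0) return null;
  return snapshots.reduce((a, b) => (b.scrapedAt > a.scrapedAt ? b : a));
}
```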
Integration Points
Playwright for headless price scraping, Supabase for URL and snapshot storage, Vercel Cron for scheduled jobs, Resend for alert emails, Stripe for billing.
V1 Scope Boundaries
V1 excludes: Shopify auto-reprice, Slack alerts, proxy rotation, mobile app, historical chart beyond 30 days.
Success Definition
A Shopify seller pastes 10 URLs, receives their first real price drop alert within 24 hours, and upgrades to paid without contacting support.
Challenges
Anti-bot detection on major retailer sites (Cloudflare, PerimeterX) will block naive Playwright scrapers. Start with smaller direct-to-consumer competitor sites and add rotating proxies only when anti-bot blocking becomes a paying customer complaint.
Avoid These Pitfalls
Do not try to scrape Amazon or Walmart in V1: their anti-bot defenses are a full-time engineering problem. Do not build proxy rotation before a paying customer hits a block. Expect the first 10 paying customers to take 3x longer to find than the scraper took to build.
Security Requirements
Supabase Auth magic link. RLS on all URL and snapshot tables. Rate limit URL submission to 20/day per user. Sanitize all URL inputs to prevent SSRF.
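The SSRF guard above can start as a hostname-shape check before any user-supplied URL reaches the scraper. This is a minimal sketch with an assumed blocklist, not an exhaustive defense:

```typescript
// Accept only public http(s) URLs; reject loopback, link-local,
// and RFC 1918 private ranges by hostname shape.
export function isSafeUrl(raw: string): boolean {
  let url: URL;
  try {
    url = new URL(raw);
  } catch {
    return false; // not a parseable absolute URL
  }
  if (url.protocol !== "http:" && url.protocol !== "https:") return false;
  const host = url.hostname.toLowerCase();
  if (host === "localhost" || host.endsWith(".local")) return false;
  if (/^(127\.|10\.|192\.168\.|169\.254\.|0\.)/.test(host)) return false;
  if (/^172\.(1[6-9]|2\d|3[01])\./.test(host)) return false;
  return true;
}
```

A hostname check alone is not a complete SSRF defense (a public DNS name can still resolve to a private IP), so production code should also validate the resolved address before fetching.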
Infrastructure Plan
Vercel Pro for cron and Playwright serverless. Supabase for DB. Resend for email. Sentry for scrape failure alerts. GitHub Actions for CI.
Performance Targets
200 DAU at launch. Playwright scrape under 5s per URL. Cron job completes full URL batch under 3 minutes. Dashboard load under 1.5s.
Go-Live Checklist
- ☐SSRF URL validation tested.
- ☐Stripe billing flow tested end-to-end.
- ☐Sentry scrape error alerts live.
- ☐Vercel Cron verified firing on schedule.
- ☐Custom domain with SSL active.
- ☐Privacy policy and terms published.
- ☐5 beta sellers confirmed alert emails received.
- ☐Rollback plan documented.
- ☐r/ecommerce and ProductHunt launch posts ready.
First Run Experience
On first run: dashboard shows 3 pre-seeded demo competitor URLs with 7 days of fake price history and one mock alert email preview. User can immediately paste their own URL and see a live scrape result. No CRM or third-party accounts required to see the core value.
How to build it, step by step
1. Define the Drizzle schema for TrackedURL and PriceSnapshot before any UI.
2. Scaffold the Next.js app with Supabase auth (magic link).
3. Build the Playwright price extraction function returning a numeric price from a given URL.
4. Build the Vercel Cron route that iterates all active URLs and stores snapshots.
5. Build the alert comparison logic: if the new price differs from the last snapshot by more than the threshold, queue an alert.
6. Build the Resend email template showing old price, new price, and competitor URL.
7. Build the URL input dashboard showing current price, last checked, and alert status per URL.
8. Add Stripe checkout gating URLs beyond 3.
9. Seed 3 demo URLs with 7 days of fake price history so the dashboard looks alive on first load.
10. Verify: paste a real product URL, trigger a manual scrape, confirm the price appears in the dashboard and the alert email lands in the inbox.
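The scheduled trigger in step 4 is declared in vercel.json. A minimal sketch; the route path and the 6:00 UTC daily schedule are assumptions:

```json
{
  "crons": [
    { "path": "/api/cron", "schedule": "0 6 * * *" }
  ]
}
```

Paid-tier hourly checks would add a second entry with a `0 * * * *` schedule, with the route itself deciding which URLs are due.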
Generated
May 11, 2026
Model
claude-sonnet-4-6
Disclaimer: Ideas on this site are AI-generated and may contain inaccuracies. Revenue estimates, market demand figures, and financial projections are illustrative assumptions only — not financial advice. Do your own research before making any business or investment decisions. Technology availability, pricing, and market conditions change rapidly; always verify details independently.