VendorCast - Automated Supplier Lead Time and MOQ Change Monitor So Your Ops Team Stops Getting Blindsided
Your supplier quietly updated lead times from 14 days to 35 days on their portal two weeks ago and you found out when a customer complained about a late order. VendorCast monitors your supplier portals and email threads for lead time, MOQ, and pricing changes and alerts your ops Slack channel before they become your problem.
Difficulty
intermediate
Category
Supply Chain Tech
Market Demand
High
Revenue Score
7/10
Platform
Web App
Vibe Code Friendly
No
Hackathon Score
6/10
What is it?
Small e-commerce brands and DTC operators managing 5–30 SKUs across multiple suppliers are constantly blindsided by lead time changes, minimum order quantity updates, and price adjustments communicated inconsistently via supplier portals, email PDFs, and WhatsApp messages. VendorCast connects to supplier portal URLs via browser automation, parses emailed PDF attachments with the Claude API, diffs the extracted values against a stored baseline, and flags any detected change in a Slack or email alert with the exact change highlighted. This is a pure ops pain point with direct revenue impact: a missed lead time change cascades into stockouts, customer refunds, and emergency air freight. The April 2026 supply chain volatility spike following the tariff uncertainty wave has pushed DTC ops teams to actively seek monitoring tools, and the build is entirely achievable with Playwright, the Claude API for document parsing, and Supabase.
Why now?
The April 2026 tariff uncertainty wave has pushed supply chain volatility to its highest in 3 years, and Playwright plus Claude API now make portal scraping and document parsing cheap enough to offer at $79/month — a price DTC ops teams pay without a budget conversation.
Key Features
- ▸Supplier dashboard where users add supplier portal URLs and upload baseline lead time, MOQ, and pricing data as a structured profile (Implementation note: stored in Supabase with field-level baseline values).
- ▸Playwright-based scheduled portal scraper that visits supplier URLs every 6 hours, extracts key fields, and diffs them against the stored baseline (see the scraper sketch after this list).
- ▸Claude API email PDF parser that accepts forwarded supplier emails with attached PDFs and extracts changed terms into the same baseline diff engine.
- ▸Slack webhook and email alert with a highlighted before/after diff of every detected change, sent within 1 hour of detection.
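A minimal sketch of the scheduled portal scrape described above, assuming each supplier profile stores a CSS selector per monitored field (the `Supplier` shape and selector names are illustrative, not a fixed API):

```ts
// workers/scraper.ts: sketch of one scrape run for a single supplier.
import { chromium } from "playwright";

interface Supplier {
  id: string;
  portalUrl: string;
  // Map of monitored field name -> CSS selector on the supplier's portal page.
  fieldSelectors: Record<string, string>; // e.g. { lead_time_days: ".lead-time", moq: "#moq" }
}

// Visit the portal and return the current value of each monitored field.
export async function scrapeSupplier(supplier: Supplier): Promise<Record<string, string>> {
  const browser = await chromium.launch({ headless: true });
  try {
    const page = await browser.newPage();
    await page.goto(supplier.portalUrl, { waitUntil: "domcontentloaded", timeout: 30_000 });

    const scraped: Record<string, string> = {};
    for (const [field, selector] of Object.entries(supplier.fieldSelectors)) {
      const text = await page.locator(selector).first().textContent();
      scraped[field] = (text ?? "").trim();
    }
    return scraped;
  } finally {
    await browser.close();
  }
}
```

Keeping the selectors in the supplier profile (rather than hardcoded) is what allows one worker to cover many portal layouts without code changes.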
Target Audience
DTC e-commerce operators and Amazon sellers managing 5–30 active suppliers, estimated 200,000+ in this profile in North America and EU.
Example Use Case
A 7-figure Shopify brand ops manager adds their top 8 suppliers to VendorCast, and when Supplier 3 updates its portal lead time from 21 days to 45 days, the ops Slack channel gets an alert within 4 hours with the exact field that changed highlighted in red, before any PO is placed against the outdated lead time.
User Stories
- ▸As a DTC ops manager, I want automatic alerts when my supplier's lead time changes on their portal, so that I can adjust my purchase orders before placing them against an outdated lead time.
- ▸As a Shopify brand founder, I want to forward supplier PDF emails to VendorCast and have changes extracted automatically, so that I do not have to read every supplier email attachment manually.
- ▸As an ops team lead, I want change alerts posted to our Slack channel with the exact before/after diff, so that the whole buying team is notified simultaneously without a manual relay.
Done When
- ✓Supplier monitoring: done when user adds a supplier URL with two baseline fields and receives a Slack alert within 6 hours when one field value changes on the live page.
- ✓Email PDF parsing: done when user forwards a supplier email with a PDF attachment to their VendorCast inbox and sees extracted changed terms appear in their alert history within 10 minutes.
- ✓Alert format: done when Slack alert shows supplier name, changed field name, old value in red, new value in green, and a link to the supplier dashboard entry.
- ✓Billing: done when free-tier user tries to add a third supplier, sees upgrade prompt, completes Stripe checkout, and immediately gains access to 20 supplier slots.
Is it worth building?
$79/month × 20 users = $1,580 MRR at month 2. $79/month × 100 users = $7,900 MRR at month 5. Assumes 4% conversion from targeted LinkedIn and Shopify community outreach to DTC ops managers.
Unit Economics
CAC: $25 via LinkedIn cold outreach. LTV: $948 (12 months at $79/month). Payback: 0.4 months. Gross margin: 85%.
Business Model
SaaS subscription
Monetization Path
Free: monitor up to 2 suppliers. Pro at $79/month: up to 20 suppliers, Slack alerts, email PDF parsing. Business at $199/month: unlimited suppliers, API access, custom alert rules.
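A sketch of how these tier limits could be enforced when a user adds a supplier; the plan names mirror the tiers above, and the helper itself is illustrative (the real gate would read the user's Stripe subscription status):

```ts
// lib/plan.ts: supplier-count gating per subscription tier (assumed limits from the pricing above).
type Plan = "free" | "pro" | "business";

const SUPPLIER_LIMITS: Record<Plan, number> = {
  free: 2,            // Free: monitor up to 2 suppliers
  pro: 20,            // Pro ($79/month): up to 20 suppliers
  business: Infinity, // Business ($199/month): unlimited suppliers
};

// True if the user may add another supplier; false means show the upgrade prompt.
export function canAddSupplier(plan: Plan, currentSupplierCount: number): boolean {
  return currentSupplierCount < SUPPLIER_LIMITS[plan];
}
```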
Revenue Timeline
First dollar: week 2 via LinkedIn cold DM beta conversion. $1k MRR: month 2. $5k MRR: month 5. $10k MRR: month 9.
Estimated Monthly Cost
Fly.io Playwright worker: $30, Claude API: $20, Supabase: $25, Vercel: $20, Resend: $10, Stripe: $15. Total: ~$120/month at launch.
Profit Potential
Full-time viable at $7k–$15k MRR. High LTV due to ops-critical use case with low churn.
Scalability
High — supplier portal templates marketplace, 3PL integration, and purchase order auto-adjustment recommendations are natural V2 expansions.
Success Metrics
Week 1: 10 beta users add at least 3 suppliers. Week 3: 3 detected real changes alert correctly. Month 2: 20 paying users, 0 false positives reported.
Launch & Validation Plan
Cold DM 30 DTC ops managers on LinkedIn with a one-sentence question: 'Have you ever been blindsided by a supplier lead time change?' Target 15 yes replies before writing code.
Customer Acquisition Strategy
First customer: DM 20 Shopify Plus ops managers on LinkedIn offering free 60-day Pro in exchange for a case study. Then: Shopify Community forums, r/ecommerce, LinkedIn content targeting DTC ops managers, ProductHunt.
What's the competition?
Competition Level
Low
Similar Products
Sourcemap for supply chain visibility (enterprise, not change monitoring), Anvyl for supplier management (PO-focused, not change alerting), no direct competitor monitors portal field changes and email PDFs in one tool.
Competitive Advantage
No existing tool combines portal scraping with email PDF parsing into a unified change monitor — competitors like Sourcemap focus on supply chain mapping, not operational change alerting at the SKU level.
Regulatory Risks
Low regulatory risk. No PII beyond supplier contact names. GDPR data deletion endpoint required for EU users. Scraping terms: respect each supplier portal's terms of service and robots.txt, and rate limit requests to avoid IP blocks.
What's the roadmap?
Feature Roadmap
V1 (launch): portal scraping, email PDF parsing, manual CSV baseline upload fallback, Slack and email alerts, Stripe billing. V2 (month 2-3): login-required portal support via credential vault, custom alert rules per field. V3 (month 4+): PO adjustment recommendations, 3PL integration, supplier risk score.
Milestone Plan
Phase 1 (Week 1-2): Playwright scraper, diff engine, and Slack alert working end-to-end with 5 beta suppliers. Phase 2 (Week 3-4): email PDF parsing live, Stripe billing, 10 beta users monitoring real suppliers. Phase 3 (Month 2): ProductHunt launch, LinkedIn content push, 20 paying users.
How do you build it?
Tech Stack
Next.js, Playwright for portal scraping, Claude API for email PDF parsing, Supabase, Resend for email alerts, Slack API for Slack alerts, Stripe — build with Cursor for Playwright automation, v0 for the supplier dashboard UI.
Suggested Frameworks
Playwright, Anthropic SDK, pdf-parse
Time to Ship
2 weeks
Required Skills
Playwright browser automation, Claude API document parsing, Supabase cron for scheduled monitoring runs, Slack webhook integration.
Resources
Playwright docs, Anthropic SDK docs, Slack incoming webhook docs, pdf-parse npm library.
MVP Scope
app/page.tsx (landing + supplier add form), app/dashboard/page.tsx (supplier list and alert history), app/api/suppliers/route.ts (CRUD for supplier profiles), app/api/scrape/route.ts (Playwright scrape job trigger), app/api/parse-email/route.ts (Claude PDF parsing endpoint), lib/db/schema.ts (suppliers, baselines, alerts schema), lib/diff.ts (field-level baseline diff logic), workers/scraper.ts (Playwright worker on Fly.io), .env.example (Playwright, Claude, Slack webhook keys).
Core User Journey
Sign up -> add first supplier URL and baseline data -> receive first scheduled scrape confirmation -> get Slack alert when any field changes -> upgrade to Pro for more suppliers.
Architecture Pattern
Supabase pg_cron fires every 6 hours -> Edge Function triggers Playwright scraper on Fly.io per supplier URL -> scraped fields diffed against Supabase baseline -> changed fields trigger Slack webhook and Resend email alert -> alert stored in Supabase alerts table -> user sees alert history in dashboard.
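A minimal sketch of the "diffed against Supabase baseline" step in this pipeline, assuming both the stored baseline and the scraped result are flat maps of field name to string value (the `FieldChange` shape is illustrative):

```ts
// lib/diff.ts: compare a freshly scraped field map against the stored baseline.
export interface FieldChange {
  field: string;
  oldValue: string | null; // null when the field was not present in the baseline yet
  newValue: string;
}

export function diffAgainstBaseline(
  baseline: Record<string, string>,
  scraped: Record<string, string>,
): FieldChange[] {
  const changes: FieldChange[] = [];
  for (const [field, newValue] of Object.entries(scraped)) {
    const oldValue = baseline[field] ?? null;
    if (oldValue !== newValue) {
      changes.push({ field, oldValue, newValue });
    }
  }
  return changes;
}
```

An empty result means no alert is sent; anything else feeds the Slack webhook and Resend email steps.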
Data Model
User has many Suppliers. Supplier has one Baseline (key-value field store). Supplier has many ScrapeRuns. ScrapeRun has many Alerts. Alert stores field name, old value, new value, and notification status.
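A sketch of this data model as a Drizzle schema; table and column names are illustrative, and for brevity the baseline is folded into the supplier row as a jsonb column rather than a separate table:

```ts
// lib/db/schema.ts: sketch of the core tables (Drizzle ORM, Postgres).
import { pgTable, uuid, text, timestamp, jsonb } from "drizzle-orm/pg-core";

export const suppliers = pgTable("suppliers", {
  id: uuid("id").defaultRandom().primaryKey(),
  userId: uuid("user_id").notNull(),     // owning Supabase auth user
  name: text("name").notNull(),
  portalUrl: text("portal_url"),
  baseline: jsonb("baseline").notNull(), // key-value store, e.g. { lead_time_days: "21", moq: "500" }
  createdAt: timestamp("created_at").defaultNow().notNull(),
});

export const scrapeRuns = pgTable("scrape_runs", {
  id: uuid("id").defaultRandom().primaryKey(),
  supplierId: uuid("supplier_id").references(() => suppliers.id).notNull(),
  scrapedFields: jsonb("scraped_fields").notNull(),
  ranAt: timestamp("ran_at").defaultNow().notNull(),
});

export const alerts = pgTable("alerts", {
  id: uuid("id").defaultRandom().primaryKey(),
  scrapeRunId: uuid("scrape_run_id").references(() => scrapeRuns.id).notNull(),
  field: text("field").notNull(),
  oldValue: text("old_value"),
  newValue: text("new_value").notNull(),
  notificationStatus: text("notification_status").default("pending").notNull(),
  createdAt: timestamp("created_at").defaultNow().notNull(),
});
```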
Integration Points
Playwright for portal scraping, Claude API for email PDF parsing, Supabase for supplier profiles and alert storage, Slack API for channel alerts, Resend for email alerts, Stripe for billing, Fly.io for Playwright worker hosting.
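A sketch of the Slack integration point, assuming a standard Slack incoming-webhook URL stored per workspace. The message layout is illustrative: plain incoming webhooks cannot literally color text red and green, so old values are struck through and new values bolded instead.

```ts
// lib/alerts/slack.ts: post a before/after change alert to a Slack incoming webhook.
type FieldChange = { field: string; oldValue: string | null; newValue: string };

export async function sendSlackAlert(
  webhookUrl: string,
  supplierName: string,
  dashboardUrl: string,
  changes: FieldChange[],
): Promise<void> {
  // One line per changed field: old value struck through, new value bolded.
  const lines = changes.map(
    (c) => `• *${c.field}*: ~${c.oldValue ?? "n/a"}~ → *${c.newValue}*`,
  );
  const res = await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text:
        `:rotating_light: ${supplierName} changed ${changes.length} field(s)\n` +
        `${lines.join("\n")}\n<${dashboardUrl}|View supplier in VendorCast>`,
    }),
  });
  if (!res.ok) throw new Error(`Slack webhook failed: ${res.status}`);
}
```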
V1 Scope Boundaries
V1 excludes: login-required portal scraping, purchase order auto-adjustment, 3PL integrations, mobile app, team user management, API access tier.
Success Definition
An ops manager adds their suppliers on Monday, receives a real supplier change alert on Thursday with the correct before/after diff, forwards it to their buyer, and upgrades to Pro without any founder involvement.
Challenges
Playwright scraping breaks every time a supplier portal updates its HTML; keeping scrapers alive is the biggest ongoing maintenance cost, so V1 must include a manual CSV upload fallback for portals that cannot be scraped reliably or that require a login.
Avoid These Pitfalls
Do not build login-required portal scraping in V1: Playwright with auth is a maintenance nightmare, so restrict V1 to public supplier portals and email PDF uploads only. Do not skip the manual CSV upload fallback, or 40% of early users will churn because their portal requires a login. Finding the first 10 paying customers takes 3x longer than building the scraper, so prioritize LinkedIn outreach over Playwright feature work.
Security Requirements
Supabase Auth with Google OAuth, RLS on suppliers and alerts by user_id, supplier portal credentials never stored in V1, rate limit Playwright scraping to 1 request per supplier per hour, GDPR data deletion endpoint.
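A sketch of the per-supplier scrape rate limit mentioned above, assuming the worker records the last scrape time for each supplier (the helper name is illustrative):

```ts
// workers/rateLimit.ts: skip suppliers scraped within the last hour (1 request per supplier per hour).
const MIN_INTERVAL_MS = 60 * 60 * 1000;

export function isScrapeAllowed(lastScrapedAt: Date | null, now: Date = new Date()): boolean {
  if (!lastScrapedAt) return true; // never scraped before
  return now.getTime() - lastScrapedAt.getTime() >= MIN_INTERVAL_MS;
}
```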
Infrastructure Plan
Vercel for Next.js, Fly.io for Playwright worker, Supabase for Postgres and pg_cron, GitHub Actions for CI, Sentry for scraper error tracking.
Performance Targets
100 DAU at launch, 600 scrape runs/day. Scrape job completion under 30s per supplier. Alert delivery under 5 minutes after change detected. Dashboard page load under 2s.
Go-Live Checklist
- ☐Playwright scraper tested against 5 real supplier portal formats.
- ☐Stripe payment tested end-to-end.
- ☐Sentry error tracking live on Fly.io worker.
- ☐Slack webhook delivery confirmed on test workspace.
- ☐Custom domain with SSL configured.
- ☐Privacy policy published.
- ☐5 beta ops managers monitoring real suppliers.
- ☐Rollback plan: disable pg_cron scrape jobs via Supabase dashboard.
- ☐LinkedIn outreach sequence and ProductHunt post drafted.
First Run Experience
On first run: a demo supplier named 'Sample Textiles Co.' is pre-loaded with baseline lead time 21 days and MOQ 500 units, and a pre-generated alert shows a detected change to lead time 35 days from a simulated scrape. User can immediately browse the alert history and see the diff format. No manual config required: demo data seeded, real monitoring activates after first supplier URL is added.
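A sketch of the demo data described above, assuming the jsonb baseline shape from the schema sketch earlier; the values mirror the 'Sample Textiles Co.' example and would be inserted for each new account by a seed routine:

```ts
// lib/seedDemo.ts: demo data seeded on signup (no real scraping involved).
export const DEMO_SUPPLIER = {
  name: "Sample Textiles Co.",
  portalUrl: null, // demo supplier has no live portal
  baseline: { lead_time_days: "21", moq_units: "500" },
};

export const DEMO_ALERT = {
  field: "lead_time_days",
  oldValue: "21",
  newValue: "35", // simulated change shown in the alert history
  notificationStatus: "delivered",
};
```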
How to build it, step by step
1. Run npx create-next-app with TypeScript and Tailwind.
2. Define the Drizzle schema for suppliers, baselines, scrape_runs, and alerts tables in lib/db/schema.ts with a field-level key-value baseline store.
3. Build the supplier add form at app/page.tsx to capture the portal URL and baseline field values.
4. Write the Playwright scraper in workers/scraper.ts on Fly.io that visits the supplier URL, extracts configurable CSS-selector-defined fields, and returns a JSON field map.
5. Build lib/diff.ts to compare the scraped field map against the stored Supabase baseline and return changed fields with old and new values.
6. Build app/api/parse-email/route.ts to accept a forwarded email with a PDF attachment, extract text via pdf-parse, and send it to the Claude API for structured field extraction (see the sketch after this list).
7. Build the Slack webhook sender and Resend email alert template with a before/after diff display in app/api/alert/route.ts.
8. Set up Supabase pg_cron to fire scrape jobs every 6 hours per active supplier.
9. Add Stripe billing with a 2-supplier free tier and a Pro plan gate at 20 suppliers.
10. Verify: add a real supplier URL with a known field, manually change the test page value, confirm the Slack alert fires with the correct diff within 6 hours, and confirm the Stripe upgrade unlocks the 20-supplier limit.
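A sketch of step 6, extracting changed terms from a forwarded supplier PDF with pdf-parse and the Anthropic SDK. The prompt, model id, and expected JSON shape are assumptions, not a fixed contract:

```ts
// app/api/parse-email helper: extract supplier terms from a PDF attachment.
import Anthropic from "@anthropic-ai/sdk";
import pdf from "pdf-parse";

const anthropic = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

export async function extractTermsFromPdf(pdfBuffer: Buffer): Promise<Record<string, string>> {
  const { text } = await pdf(pdfBuffer); // plain-text layer of the PDF

  const response = await anthropic.messages.create({
    model: "claude-sonnet-4-5", // model id is an assumption; use whatever is current
    max_tokens: 1024,
    messages: [
      {
        role: "user",
        content:
          "Extract lead time (days), MOQ (units), and unit price from this supplier document. " +
          'Reply with only a JSON object like {"lead_time_days": "35", "moq_units": "500", "unit_price": "4.20"}.\n\n' +
          text,
      },
    ],
  });

  const block = response.content[0];
  if (block.type !== "text") throw new Error("Unexpected response type");
  return JSON.parse(block.text) as Record<string, string>;
}
```

The returned field map can be fed straight into the same lib/diff.ts baseline diff used for scraped portal data.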
Generated
April 27, 2026
Model
claude-sonnet-4-6
Disclaimer: Ideas on this site are AI-generated and may contain inaccuracies. Revenue estimates, market demand figures, and financial projections are illustrative assumptions only — not financial advice. Do your own research before making any business or investment decisions. Technology availability, pricing, and market conditions change rapidly; always verify details independently.