Video 1 — Claude Code Lead Machine
Video 1 Analysis: "How I Get Unlimited Leads Using Claude Code (For Cold Email)"
Channel: Sales Automation (likely Eric Nowoslawski / Fixer team)
URL: https://youtu.be/Vo9VUnzYqpw
Topic: Building custom lead generation infrastructure with Claude Code to replace Clay at scale
📋 OVERVIEW
A cold email agency operator who was Clay's largest user (17.3M API hits/week) explains how his team—none of whom are developers—used Claude Code to vibe-code an entire custom lead generation system in roughly a week. The system processes 272,000 rows/second (vs Clay's 27 hours for 1M leads), costs ~$2,000/month to run, and includes Google Maps scraping, AI lead enrichment, ad library scraping, campaign analytics, and a 50M-lead private database. The speaker's core message: you don't need to code to build this—Claude Code and a clear pain point are all you need to start.
The ONE takeaway: You can replace expensive SaaS tooling (Clay, Apollo) by vibe-coding custom internal tools with Claude Code, even with zero coding experience, and the cost/speed advantage at scale is massive.
🎯 MAIN POINTS
1. Why They Left Clay [0:00-2:18]
- Processing 9 million leads/month for Fixer AI alone
- Clay limits: 50,000 rows per table, 12.5M rows per workspace
- Deleted tables take days to actually clear
- Clay was exploring charging per custom HTTP row/column
- Clicking "run all" thousands of times, waiting days for processing
- Decision: Build ahead of the pricing change, not after
2. The Switch from Cursor to Claude Code [2:20-2:54]
- Started with Cursor ~2 months ago, spent $3K/month
- Both speaker + James were spending $450/day on Cursor combined
- Switched to Claude Code when it launched (~1-2 weeks prior to video)
- Claude Code: $200/month per seat — "no-brainer"
- James (never touched these tools before) built the entire core system in ONE WEEK
3. The Tech Stack [2:54-6:12]
- GitHub — code storage & version control
- Railway — deploys workers (processing engine), hosts Postgres databases. ~$2,000/month for the lead processor
- 50 workers running simultaneously — "little robots that process leads 24/7"
- Postgres → migrating to Convex for real-time data handling
- Vercel — hosts dashboards and visual interfaces, syncs with GitHub, auto-deploys when Claude pushes code
- Claude Code — the builder/IDE ($200/month)
- Railway for N8N — previously used Railway for N8N processing workers too
4. Speed Comparison [6:12-6:50]
- Clay: 1 million leads = 27 hours (if perfect, often errors requiring reruns)
- Custom system: 272,000 rows/second = 1 million leads in 5 seconds
- This lets them pivot campaigns almost instantly when something isn't working
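The throughput claim rests on fanning leads out across many workers at once (the video mentions 50 on Railway). A minimal Python sketch of that fan-out pattern; the enrichment step and pool size here are illustrative stand-ins, not their actual code:

```python
from concurrent.futures import ThreadPoolExecutor

NUM_WORKERS = 50  # the video mentions 50 simultaneous Railway workers


def enrich(lead):
    # Placeholder for the real per-lead work (vendor API calls,
    # AI lookups, email validation, etc.)
    return {**lead, "enriched": True}


def process_batch(leads):
    # Fan the batch out across a fixed worker pool; map() preserves order.
    with ThreadPoolExecutor(max_workers=NUM_WORKERS) as pool:
        return list(pool.map(enrich, leads))


leads = [{"id": i} for i in range(1000)]
results = process_batch(leads)
```

In production this fan-out would live across separate Railway worker processes rather than threads in one process, but the batching idea is the same.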
5. Custom Tools Built [7:06-14:23]
a) Google Maps Scraper [7:23-8:17]
- Scrapes local business leads zip code by zip code (not city/state)
- 32,000+ zip codes to ensure results for every query
- James built it in Cursor in 3-4 hours
- After scraping companies, runs AI enrichment to find contacts
- AI scrapes any public database to locate contacts at the company
- Another AI layer for segmentation (multi-location? years in business? confidence scores)
- Cost: $0.002 (1/5 of a penny) to find 3 qualified leads per company
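The zip-by-zip approach is just a loop over ~32,000 zip codes with deduplication, since a business can show up in adjacent zip queries. A sketch under stated assumptions: `fetch_places` is a hypothetical stand-in for whatever Maps scraping call they actually use.

```python
def scrape_by_zip(query, zip_codes, fetch_places):
    """Query per zip code so each request stays under Maps result caps.

    fetch_places(query, zip_code) is a hypothetical scraper call that
    returns a list of dicts, each with a unique "place_id".
    """
    seen, results = set(), []
    for zip_code in zip_codes:
        for place in fetch_places(query, zip_code):
            key = place["place_id"]
            if key not in seen:  # businesses can span zip boundaries
                seen.add(key)
                results.append(place)
    return results
```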
b) AI Lead Finder [8:32-10:03]
- When primary vendor (ARarc) doesn't have a match, AI searches the entire internet
- Always tries to find 3 contacts per company
- If it can't find a valid email for a person, looks for personal emails or emails at their other jobs
- Returns LinkedIn profiles, confidence level, reasoning for fit
- Runs LinkedIn URL through waterfall process to get email
- Gets 95% valid email rate vs typical 30% from Apollo/LinkedIn
c) Ad Library Scrapers [10:16-10:57]
- Scrapes Google and LinkedIn ad libraries at scale
- Finds everyone running ads in last 30/60 days
- Filter by how many ads a company is running
- Logic: if running ads → actively trying to grow → has budget for client acquisition → warmer leads
d) Executive Summary / Analytics System [11:00-12:31]
- AI analyzes all campaigns daily
- Gives daily report on 15 randomly selected clients
- Average rates, email-to-lead ratios, meetings generated
- Building toward: analyzing copy by ICP — "what copy elements give highest multiplier for director of marketing at paper manufacturing company?"
- Matches every email to a schema: subject lines, hooks, CTAs, body messaging categories, social proof categories, ICP categorization
- Will generate weekly reports for clients
- Looks at all sends from last week + historical data + what's working for similar ICPs
e) Instant Workspace Cleaner [12:32-14:05]
- Automatically cleans Instantly workspaces (Instantly bills per stored lead)
- Deletes old leads, loads proper lead lists, tracks everything
- Sends "Friday analytics" report every Friday
- Looking at moving to Email Bison to avoid this billing model
f) Private Lead Database [12:53-14:23]
- Nearly 50 million leads
- Catalogs every lead pulled for every client (unless NDA prevents it)
- Most of the time don't need to buy new data — only fetch new market entrants
- Can analyze which data vendor has best data for each ICP
- e.g., "Lead Magic has best data for cybersecurity, Wiza performs better for e-commerce"
- Auto-refill: when client gets low, system deletes old leads from Instantly and uploads fresh ones
- Clients never run out of leads — runs on autopilot
6. Advice for Getting Started [14:23-16:00]
- Start with ONE pain point — your biggest bottleneck
- Don't try to build everything at once
- Ask Claude Code "how can I do this?" — it'll walk you step by step
- You don't need to understand the code, just know what you want to accomplish
- Bought Claude Code licenses for entire team including executive assistant
- EA uses it just to process CSVs faster
- Used WorkOS for authentication on internal tools
- Caution: wouldn't release these to the public — security isn't fully understood yet
7. Results / Social Proof [16:35-17:25]
- RB2B: $0 → $4M ARR in 4 months, 42% of revenue from cold email
- Fixer AI: $4.3M annual pipeline, could have been $32.2M ARR if hitting full TAM every 60 days
- Directive Consulting: 15-20 meetings/day through cold email
💎 DEEP DIVE
Gold Nuggets — Things Most People Will Miss
- "We were Clay's largest user, hitting their platform 17.3 million times per week" — This isn't just bragging. It means they have deep knowledge of what breaks at scale. If you're building cold email systems, listen to the person who actually stressed the infrastructure to its limits.
- "James, who had never touched any of these tools before, built out the entire core system in a week" [2:48] — The real insight: the person who built their million-dollar lead system learned Claude Code 3 WEEKS ago. The barrier to entry for building custom SaaS-replacement tools is essentially zero now.
- "When you build something for yourself, you can go crazy with it... if Clay errors 1% of the time that's hundreds of thousands of users impacted" [4:24-4:38] — This is the fundamental insight about internal tools vs. SaaS: you only need it to work for YOUR use case. Custom tools can be 1000x faster because they don't need to be safe for everyone.
- "We're looking at moving everything over to Email Bison just because of that particular pain point" [12:38-12:42] — Casual mention that Instantly's billing model (per stored lead) is a big enough pain that they're migrating an entire operation. Email Bison and Smart Lead don't charge this way. Huge signal for anyone choosing a sequencer.
- "She'll even use it just to process CSVs faster" [14:40-14:42] — The executive assistant using Claude Code. This reveals that the tool isn't just for building apps — it's becoming the default way to handle ANY data task for the entire team.
- "For us, it was about processing speed and row limits... we'd have to do a bunch of pre-work for a month to actually get the leads ready. Now we don't need a month. We just need a few days" [15:01-15:15] — The real bottleneck wasn't just speed — it was LAUNCH TIME. Going from 1 month of pre-work to a few days means they can onboard high-value clients ($100K/month accelerator plan, 5M emails) in 1-3 weeks. The speed advantage compounds into a revenue advantage.
- "All of these features to be put into Outfound if and when we decide to release Outfound.io" [14:14-14:21] — They're essentially R&D-ing their next product (Outfound) using their agency as the test bed. Every custom tool they build internally becomes a potential feature in a SaaS product.
Insinuated & Implied Information
- Clay is about to become more expensive — The mention of Clay "playing around with charging per custom HTTP row and custom columns" suggests insider knowledge of upcoming pricing changes. They built ahead of this.
- Outfound.io is the real play — The agency is the cash cow funding the SaaS product. Every tool they vibe-code becomes Outfound IP. The agency validates the tools at extreme scale before they become product features.
- Their competitive moat is data, not code — 50M leads + performance data on which vendors work for which ICPs = compounding advantage. The code can be replicated; the data cannot.
- They're making their entire cold email operation autonomous — The progression (auto-refill leads → auto-clean workspaces → auto-generate campaigns from data analysis → auto-executive summaries) points toward a system that needs minimal human intervention per client.
- ARarc has replaced Apollo as their primary data vendor — Mentioned casually at [8:48] — "one of our favorite vendors now, we just recently found them and they've basically completely replaced Apollo for us."
- The waterfall enrichment approach — They don't rely on one data source. They stack: primary vendor → AI internet search → personal email lookup → other-job email lookup. This layered approach is why they hit 95% valid email rate.
- WorkOS for authentication — They're using WorkOS to secure internal tools, which means these dashboards are accessible to their team via the web, not just locally. This is more sophisticated than most vibe-coded tools.
Small Details That Matter
- Zip code by zip code scraping — The key to Google Maps scraping is zip-code-level queries (32,000+ zips) rather than city/state, ensuring you get results for every query without Google's result limits cutting you off
- 50 workers running simultaneously — This is the Railway configuration that enables the 272K rows/second processing
- Convex vs Postgres decision — Convex for real-time data, Postgres for complex products like Outfound. Different databases for different needs.
- 3 leads per company — Their target is always 3 contacts per company, not 1. This triples their chances of reaching someone.
- Confidence scores on enrichment — The AI doesn't just find leads; it scores how confident it is in the match. This prevents bad data from entering campaigns.
- Multi-company contact discovery — If Joe works at Company A (target) and Company B, and Company A email bounces, they'll email Joe at Company B about the Company A opportunity. Clever and unusual.
- Friday analytics report — Automated weekly report to the team. Named specifically "Friday analytics."
- Vercel auto-deploys from GitHub — When Claude Code pushes code changes, Vercel automatically deploys the updated dashboard. Zero manual deployment.
Mistakes, Warnings & Lessons Learned
- Cursor was expensive — $3K/month for one person, $450/day for two. Claude Code at $200/month was a massive cost reduction.
- Don't release vibe-coded apps publicly — Speaker explicitly warns about security concerns. These are internal tools only.
- Clay's row limits will bite you — 50K rows/table, 12.5M/workspace, tables take days to delete. If you're scaling, you'll hit these walls.
- Start with one pain point — Don't try to build everything. Build for your biggest bottleneck first, then expand.
Competitive & Market Intelligence
- Clay — Still recommended for smaller users, but has fundamental scale limitations
- Apollo — Being replaced by ARarc for lead data
- ARarc — New favorite vendor, seems to have better coverage/pricing than Apollo
- Lead Magic, Wiza — Different vendors have different strengths by industry
- Instantly — Current sequencer but likely being replaced by Email Bison due to billing model
- Smart Lead — Alternative sequencer they also use
- Email Bison — Rising favorite, doesn't charge per stored lead
- Outfound.io — Their upcoming SaaS product that will unify all these tools
- Convex — Being adopted for real-time database needs
🔧 COMPLETE BREAKDOWN
Full Tool & Tech Stack
| Tool | Role | Cost |
| --- | --- | --- |
| Claude Code | Vibe coding / building all tools | $200/mo per seat |
| GitHub | Code storage & version control | - |
| Railway | Worker deployment + Postgres hosting | ~$2,000/mo |
| Vercel | Dashboard hosting, auto-deploy | - |
| Convex | Real-time database (migrating to) | - |
| Postgres | Lead database (50M leads) | Hosted on Railway |
| ARarc | Primary lead data vendor | - |
| Lead Magic | Secondary data vendor (good for cybersecurity) | - |
| Wiza | Secondary data vendor (good for e-commerce) | - |
| Instantly | Email sequencer (being phased out) | - |
| Smart Lead | Email sequencer (backup/alternative) | - |
| Email Bison | Email sequencer (likely next primary) | - |
| WorkOS | Authentication for internal dashboards | - |
| N8N | Previously used for workflow automation | Hosted on Railway |
Key Numbers
| Metric | Value |
| --- | --- |
| Leads processed per second | 272,000 |
| 1M leads processing time (custom) | 5 seconds |
| 1M leads processing time (Clay) | 27 hours |
| Monthly lead volume (Fixer AI peak) | 9 million |
| Weekly Clay API hits (peak) | 17.3 million |
| Railway monthly cost | ~$2,000 |
| Claude Code monthly cost | $200/seat |
| Previous Cursor monthly cost | $3,000 |
| Previous Cursor daily cost (2 users) | $450 |
| Cost per 3 enriched leads | $0.002 (1/5 of a penny) |
| Private lead database size | ~50 million |
| US zip codes scraped | 32,000+ |
| Simultaneous workers | 50 |
| Clay rows per table limit | 50,000 |
| Clay workspace row limit | 12.5 million |
| Valid email rate (their system) | ~95% |
| Valid email rate (Apollo/LinkedIn) | ~30% |
| RB2B revenue (4 months) | $4M ARR |
| RB2B cold email contribution | 42% of revenue |
| Fixer AI pipeline | $4.3M annual |
| Fixer AI potential (full TAM/60 days) | $32.2M ARR |
| Directive meetings/day | 15-20 |
| Accelerator plan client spend | $100K/month |
| Accelerator plan emails | 5 million |
| Client launch time | 1-3 weeks |
⚡ ACTIONABLE TAKEAWAYS
If You Want to Replicate This:
- Get Claude Code ($200/mo) — this is the builder
- Identify your #1 bottleneck — for them it was processing speed + row limits
- Start with one tool — e.g., a lead enrichment script or a custom scraper
- Use GitHub for version control (even if you don't know what it is — Claude Code will handle it)
- Deploy on Railway — cheap, handles scaling, hosts databases
- Build dashboards on Vercel — auto-deploys from GitHub
- Stack your data sources — don't rely on one vendor. Build a waterfall: primary vendor → AI search → personal email fallback
- Process leads in bulk with parallel workers — Railway lets you run 50+ simultaneously
- Build analytics from day one — track email-to-lead ratios, vendor performance by ICP, campaign performance
Minimum Budget to Start:
- Claude Code: $200/mo
- Railway: starts small, scales with usage
- Data vendors: varies
- Total MVP: probably $300-500/month
Key Transferable Principles:
- Build for yourself, not for everyone — internal tools can be ugly and fragile, they just need to work for you
- Speed of iteration > perfection — being able to reload lists instantly is worth more than a polished UI
- Data compounds — every lead, every campaign result, every vendor comparison makes the next decision better
- Non-coders can build production systems — the barrier is knowing what you want, not knowing how to code