
Adding AI Features to SaaS Products Without Breaking the Bank

How fluxLab.dev integrates Claude AI into Jobber for job parsing, resume matching, and cover letter generation while keeping costs under control.

Tags: AI · Claude · SaaS · LLM

Introduction

When we added AI features to Jobber — our job application tracking platform — we had a clear goal: make the product smarter without blowing up our infrastructure costs. Here's how we integrated Claude AI for job parsing, resume-to-job matching, and cover letter generation while keeping API costs predictable.

The Features

Jobber uses AI in three areas:

  1. Job Import — Paste a URL, AI extracts job title, company, location, salary, and requirements
  2. Resume Match Score — Compare a resume against a job posting for a compatibility percentage
  3. Cover Letter Generation — Generate tailored cover letters based on job description and resume content

Architecture: The Layered Approach

We don't call Claude AI for every request. Instead, we use a layered parsing strategy that minimizes API costs:

Layer 1: Structured Data Extraction

Before touching AI, we check if the job page has JSON-LD structured data. LinkedIn, Indeed, and Glassdoor all embed this. When available, we extract everything we need without any AI call — zero cost.
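As a minimal sketch of this layer (not Jobber's actual extractor), a stdlib HTML parser can pull `JobPosting` objects out of `application/ld+json` script tags before any AI call is even considered:

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collect JSON-LD JobPosting blobs embedded in a job page."""

    def __init__(self):
        super().__init__()
        self._in_ld = False
        self.jobs = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_ld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_ld = False

    def handle_data(self, data):
        if self._in_ld:
            try:
                obj = json.loads(data)
            except ValueError:
                return  # malformed JSON-LD: ignore, other layers will try
            if obj.get("@type") == "JobPosting":
                self.jobs.append(obj)

# Toy page fragment in the shape LinkedIn/Indeed/Glassdoor embed:
html = """<script type="application/ld+json">
{"@type": "JobPosting", "title": "Backend Engineer",
 "hiringOrganization": {"name": "Acme"}}
</script>"""
parser = JsonLdExtractor()
parser.feed(html)
```

When `parser.jobs` is non-empty, the import completes with zero API spend.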

Layer 2: DOM Pattern Matching

For known job boards, we use site-specific extraction rules. These are simple pattern matchers that know where each site puts the job title, company name, and description.
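A hypothetical version of such a rule table (the real matchers are more involved, and the hostname and class names here are made up) keys extraction patterns by job-board hostname and refuses partial matches so a bad parse falls through to the AI layer:

```python
import re
from urllib.parse import urlparse

# Illustrative per-site rules: each entry knows where one board puts each field.
SITE_RULES = {
    "boards.example.com": {
        "title": re.compile(r'<h1 class="job-title">([^<]+)</h1>'),
        "company": re.compile(r'<span class="company">([^<]+)</span>'),
    },
}

def extract_with_rules(url, html):
    rules = SITE_RULES.get(urlparse(url).hostname)
    if rules is None:
        return None  # unknown board: hand off to the next layer
    fields = {}
    for name, pattern in rules.items():
        match = pattern.search(html)
        if not match:
            return None  # partial match: don't trust it, hand off instead
        fields[name] = match.group(1).strip()
    return fields

page = '<h1 class="job-title">Data Analyst</h1><span class="company">Acme</span>'
result = extract_with_rules("https://boards.example.com/jobs/1", page)
```

Returning `None` on any miss keeps this layer honest: it either extracts every field or gets out of the way.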

Layer 3: AI Fallback

Only when layers 1 and 2 fail do we call Claude Haiku. We chose Haiku over Sonnet for cost efficiency — it's 90% as capable for structured extraction tasks at a fraction of the price.

Prompt Engineering

Our prompts are structured for consistent, parseable output:

Extract the following fields from the job posting text.
Return ONLY valid JSON with these exact keys:
{
  "title": "string",
  "company": "string",
  "location": "string",
  "salary": "string or null",
  "description": "string (first 500 chars)"
}

Key principles:

  • Constrain the output format — always request JSON with exact field names
  • Limit response size — cap description length to reduce token usage
  • Include examples — one-shot examples improve accuracy significantly
  • Validate responses — parse the JSON and verify required fields exist
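The validation step can be sketched like this (field names from the prompt above; the truncation and strict key check are illustrative, not Jobber's exact code):

```python
import json

REQUIRED_KEYS = {"title", "company", "location", "salary", "description"}

def parse_ai_response(raw):
    """Accept the model's reply only if it is JSON with exactly the expected keys."""
    try:
        data = json.loads(raw)
    except ValueError:
        return None  # model ignored the format: treat as a failed parse
    if not isinstance(data, dict) or set(data) != REQUIRED_KEYS:
        return None
    if data["description"] and len(data["description"]) > 500:
        data["description"] = data["description"][:500]  # enforce the size cap
    return data

ok = parse_ai_response(
    '{"title": "SRE", "company": "Acme", "location": "Remote",'
    ' "salary": null, "description": "On-call rotation..."}'
)
bad = parse_ai_response("Sure! Here is the JSON you asked for: ...")
```

A `None` result is treated like any other layer failure rather than trusted blindly.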

Cost Control

Per-User Quotas

Each subscription tier has AI usage limits:

  • Free: 10 AI job imports per month
  • Pro: 100 AI job imports, unlimited match scores, 50 cover letters
  • Enterprise: Unlimited everything
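A quota check along these lines runs before any API call; the tier names mirror the table above, while the in-memory counter and the exact per-feature limits beyond AI imports are stand-ins for whatever the backend actually stores:

```python
# None means unlimited; limits reset each billing month.
QUOTAS = {
    "free": {"ai_imports": 10},
    "pro": {"ai_imports": 100, "cover_letters": 50, "match_scores": None},
    "enterprise": {"ai_imports": None, "cover_letters": None, "match_scores": None},
}

usage = {}  # (user_id, feature) -> count this billing month

def try_consume(user_id, tier, feature):
    """Reserve one unit of a feature, or refuse before any API spend."""
    limit = QUOTAS[tier].get(feature, 0)
    used = usage.get((user_id, feature), 0)
    if limit is not None and used >= limit:
        return False  # over quota: reject without calling the API
    usage[(user_id, feature)] = used + 1
    return True
```

Checking the quota first means an over-limit request costs nothing.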

Model Selection by Task

  • Job parsing: Claude Haiku (fast, cheap, good enough for extraction)
  • Resume matching: Claude Haiku (scoring doesn't need deep reasoning)
  • Cover letter generation: Claude Sonnet (needs better writing quality)
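In code, this routing is just a lookup table; the model names below are shorthand placeholders, not necessarily the exact model ID strings passed to the API:

```python
# Task-to-model routing per the list above (placeholder model names).
MODEL_FOR_TASK = {
    "job_parsing": "claude-haiku",
    "resume_matching": "claude-haiku",
    "cover_letter": "claude-sonnet",
}

def pick_model(task):
    return MODEL_FOR_TASK[task]  # KeyError on unknown tasks is deliberate
```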

Caching

We cache AI results by content hash. If a user imports the same job URL twice, the second call hits the cache. For resume match scores, we cache by (job_id, resume_id) pair with a 24-hour TTL.
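Both caches can be sketched in a few lines; the dict stores here stand in for whatever cache backend is actually used, and the callables stand in for the real AI calls:

```python
import hashlib
import time

_import_cache = {}   # content hash -> parsed job
_score_cache = {}    # (job_id, resume_id) -> (score, cached_at)
SCORE_TTL = 24 * 60 * 60  # 24-hour TTL for match scores

def cached_import(page_text, parse_fn):
    """Key job imports by a hash of the page content; only misses hit the AI."""
    key = hashlib.sha256(page_text.encode()).hexdigest()
    if key not in _import_cache:
        _import_cache[key] = parse_fn(page_text)
    return _import_cache[key]

def cached_score(job_id, resume_id, score_fn, now=None):
    """Key match scores by (job_id, resume_id), expiring after SCORE_TTL."""
    now = time.time() if now is None else now
    entry = _score_cache.get((job_id, resume_id))
    if entry and now - entry[1] < SCORE_TTL:
        return entry[0]
    value = score_fn(job_id, resume_id)
    _score_cache[(job_id, resume_id)] = (value, now)
    return value

# Demo: the second identical import never reaches the (fake) AI call.
calls = []
def fake_parse(text):
    calls.append(text)
    return {"title": "Backend Engineer"}

first = cached_import("same page text", fake_parse)
second = cached_import("same page text", fake_parse)  # served from cache
```

Because the import cache is keyed by content rather than URL, the same posting imported from different URLs still hits the cache.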

Results

After three months of production usage:

  • Average AI cost per user: $0.03/month
  • Job parsing success rate: 94% (Layer 1: 45%, Layer 2: 30%, Layer 3: 19%)
  • Match score accuracy: Users report 87% agreement with AI scores
  • Cover letter satisfaction: 72% of generated letters used without major edits

Lessons Learned

  1. Don't default to AI — deterministic extraction is faster, cheaper, and more reliable
  2. Use the smallest model that works — Haiku handles 80% of our AI tasks
  3. Enforce quotas early — users will explore AI features aggressively at first
  4. Cache aggressively — same job postings get imported by multiple users
  5. Always have a fallback — when AI is unavailable, the app still works (just without enrichment)

Conclusion

AI features are a powerful differentiator for SaaS products, but they don't have to be expensive. By layering deterministic approaches before AI, selecting the right model for each task, and caching results, we keep Jobber's AI costs manageable while delivering real value to users.