Job Search Agent


FastAPI · React · PostgreSQL · Redis · Docker · LangChain

Job Search Agent is an AI-powered assistant that takes your CV, understands your skills and experience, then searches the web for jobs that actually match your profile. Instead of scrolling through job boards and guessing what fits, you upload a PDF, and the agent does the searching, scoring, and ranking for you — all through a simple chat interface.

The interesting part of this project is the agent architecture. It is built on LangChain DeepAgents, which is essentially an agent harness on top of LangGraph. Think of it like a manager (the orchestrator) who delegates work to specialist workers (sub-agents). When you upload a CV, the orchestrator figures out the intent — is the user uploading a resume, asking to search jobs, or just chatting? Based on that, it hands the task off to the right sub-agent. Each sub-agent has its own set of tools and works in its own context, so they don’t step on each other’s toes.
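The routing described above can be sketched as a small dispatcher. This is a toy illustration, not the project's actual code: the real orchestrator classifies intent with the LLM, and the handler names here are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical intent labels mirroring the orchestrator's four cases.
INTENTS = ("cv_upload", "search_jobs", "refine_search", "general_chat")

def detect_intent(message: str, has_attachment: bool) -> str:
    """Toy heuristic stand-in for the LLM-based intent classifier."""
    if has_attachment:
        return "cv_upload"
    text = message.lower()
    if "refine" in text or "filter" in text:
        return "refine_search"
    if "search" in text or "find" in text:
        return "search_jobs"
    return "general_chat"

@dataclass
class Orchestrator:
    """Routes each message to the sub-agent registered for its intent,
    so each specialist runs in its own context."""
    handlers: dict[str, Callable[[str], str]]

    def handle(self, message: str, has_attachment: bool = False) -> str:
        intent = detect_intent(message, has_attachment)
        return self.handlers[intent](message)
```

In DeepAgents the equivalent wiring is declarative: sub-agents are registered with a name, description, and tool set, and the orchestrator delegates by calling them as tools.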

What makes DeepAgents useful here is the built-in support for planning, sub-agent spawning, and durable execution through LangGraph. The orchestrator can break down a complex request into steps, delegate each step to a specialized agent, and keep track of everything through PostgreSQL-backed checkpointing. If something fails mid-way, the agent can pick up where it left off. This was especially important for the two-phase job search flow, where the first phase fetches quick results and the second phase deep-scrapes selected job postings for details like salary and requirements.

I also added human-in-the-loop (HITL) approval before any external API calls. The agent pauses and asks for your confirmation before it starts searching, so you always stay in control. The frontend shows real-time progress through Server-Sent Events (SSE), so you can watch exactly what the agent is doing as it works.
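For reference, SSE is just a plain-text framing over a long-lived HTTP response with `Content-Type: text/event-stream`. A minimal formatter for one frame (the event and field names below are examples, not the project's actual event schema):

```python
import json

def sse_event(event: str, data: dict) -> str:
    """Format one Server-Sent Events frame. The browser's EventSource
    dispatches it to a listener registered for `event`; the blank line
    terminates the frame."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"
```

A FastAPI endpoint would yield frames like these from an async generator wrapped in a `StreamingResponse`, emitting one per agent step so the UI can render live progress instead of a spinner.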

Features

  • CV Parsing - Upload a PDF resume and the agent extracts a structured profile with skills, experience, and job titles
  • Two-Phase Job Search - Quick search returns 15 scored jobs, then detail scrape pulls salary, requirements, and benefits for jobs you pick
  • Job Scoring - Each job gets a 0-100 match score with clear reasons explaining why it fits your profile
  • Human-in-the-Loop - The agent asks for your approval before making external API calls
  • Real-Time Streaming - Watch the agent work live through SSE events, not just a loading spinner
  • Session Persistence - Chat history stays across page reloads with full session management
  • Dark/Light Mode - Clean UI with Indigo accent, Outfit + DM Sans typography
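To make the 0-100 match score concrete, here is a deliberately simple skill-overlap scorer. The real agent scores with the LLM and weighs experience and titles as well; this sketch only shows the shape of the output (score plus human-readable reasons).

```python
def score_job(profile_skills: set[str], job_skills: set[str]) -> tuple[int, list[str]]:
    """Return a 0-100 match score and the overlapping skills as reasons.

    Toy overlap metric: fraction of the job's required skills that the
    candidate's profile covers."""
    if not job_skills:
        return 0, []
    matched = sorted(profile_skills & job_skills)
    score = round(100 * len(matched) / len(job_skills))
    return score, matched
```

Returning the matched skills alongside the number is what makes the score explainable: the UI can say *why* a job fits, not just that it does.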

Tech Stack

Backend

  • FastAPI - Async web framework handling REST endpoints and SSE streaming
  • LangChain DeepAgents - Agent harness for orchestrator and sub-agent architecture
  • LangGraph - Runtime engine for durable agent execution and checkpointing
  • DeepSeek-V3 - LLM powering the agent reasoning
  • PostgreSQL - Persistent storage for user data, sessions, and LangGraph checkpoints
  • Redis - Rate limiting backend
  • SQLAlchemy - ORM with Alembic migrations

Frontend

  • React - Single-page chat interface with live SSE updates, session management, and dark/light theming

Search & Scraping

  • Tavily - Primary web search API
  • Brave Search - Optional backup search provider
  • Firecrawl - Optional deep scraping of selected job postings for salary, requirements, and benefits

Infrastructure

  • Docker - Compose setup with nginx, backend, PostgreSQL, and Redis
  • nginx - Reverse proxy serving the frontend and routing API calls
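An illustrative nginx server block for this layout; the service name `backend`, port `8020`, and paths are assumptions matching the local-dev commands below, and the real config in the repo may differ:

```nginx
server {
    listen 80;

    # Serve the built React frontend
    location / {
        root /usr/share/nginx/html;
        try_files $uri /index.html;
    }

    # Proxy API and SSE traffic to the FastAPI backend
    location /api/ {
        proxy_pass http://backend:8020;
        proxy_http_version 1.1;
        proxy_set_header Connection '';
        proxy_buffering off;   # flush SSE events immediately instead of batching
    }
}
```

The `proxy_buffering off` line matters for SSE: with buffering on, nginx holds response chunks and the "real-time" stream arrives in bursts.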

How the Agent Works

The system uses an orchestrator pattern with three specialized sub-agents:

*Job Search Agent architecture diagram: orchestrator delegating to three sub-agents*

Every message you send goes through intent detection first. The orchestrator classifies it as one of four types: CV upload, search jobs, refine search, or general chat. Based on the intent, it either delegates to a sub-agent or responds directly.

The two-phase search flow is where this gets practical. Phase 1 (quick search) casts a wide net and returns up to 15 jobs with match scores. You then pick the ones that look interesting, and Phase 2 (detail scrape) goes deeper on just those jobs using Firecrawl to pull out salary ranges, full requirements, and benefits that aren’t always visible in search snippets.
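The two phases reduce to this shape: a cheap broad pass, a user selection, and an expensive narrow pass. The `search` and `scrape` callables below are stand-ins for the Tavily and Firecrawl clients; names and fields are illustrative.

```python
def quick_search(query: str, search) -> list[dict]:
    """Phase 1: cast a wide net, return up to 15 scored jobs."""
    return search(query)[:15]

def detail_scrape(jobs: list[dict], picked_ids: set, scrape) -> dict:
    """Phase 2: deep-scrape only the postings the user selected,
    paying the expensive per-page cost just for those."""
    return {
        job["id"]: scrape(job["url"])
        for job in jobs
        if job["id"] in picked_ids
    }
```

Splitting the flow this way keeps the expensive scraping proportional to user interest: fifteen snippets are cheap, but full-page scrapes only happen for the two or three jobs actually worth reading.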

Prerequisites

Before you begin, ensure you have the following:

  • Python 3.11 or higher
  • Node.js 22 or higher
  • Docker (for PostgreSQL and Redis, or full deployment)
  • uv package manager (recommended)
  • API keys for:
    • DeepSeek API (required) - LLM provider
    • Tavily API (required) - Primary web search
    • Brave Search API (optional) - Backup search
    • Firecrawl API (optional) - Deep page scraping

Installation & Setup

1. Clone the repository

   ```bash
   git clone https://github.com/Rahat-Kabir/job-search-agent.git
   cd job-search-agent
   ```

2. Configure environment

   ```bash
   cp .env.example .env
   # Edit .env with your API keys
   ```

3. Docker deployment (recommended)

   ```bash
   docker compose up --build -d
   ```

   Open http://localhost; nginx serves the frontend and proxies API calls to the backend.

4. Local development (alternative)

   ```bash
   # Start Postgres + Redis
   docker compose up db redis -d

   # Backend
   uv sync
   uv run uvicorn backend.api.app:app --reload --host 127.0.0.1 --port 8020

   # Frontend (new terminal)
   cd frontend && npm install && npm run dev
   ```

Important Notes

  • API Keys: At minimum, you need DeepSeek and Tavily keys. The other search APIs are optional but improve search coverage and result quality
  • Graceful Degradation: The agent works with whatever search APIs are available
  • Human-in-the-Loop: The agent always asks before making external calls, keeping you in control
  • Checkpointing: LangGraph checkpoints to PostgreSQL, so agent state survives restarts
  • Token Overhead: Sub-agent context passing adds ~30-50% token overhead, mitigated by trimming and compact outputs
  • No Auth: Currently uses X-User-ID header from CV upload for user identification
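The token-overhead mitigation mentioned above boils down to trimming older messages out of the context window. A minimal sketch, assuming a crude ~4-characters-per-token estimate (the real agent's trimming strategy and tokenizer may differ):

```python
def trim_history(messages: list[str], token_budget: int) -> list[str]:
    """Keep the most recent messages whose rough token cost fits the
    budget, dropping the oldest first. Uses a ~4 chars/token estimate."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):         # newest first
        cost = max(1, len(msg) // 4)
        if used + cost > token_budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))            # restore chronological order
```

Combined with asking sub-agents for compact outputs (structured summaries instead of raw tool dumps), this keeps the delegation overhead bounded.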

Contributing

This project is open-source and welcomes contributions! Feel free to fork, improve, and submit pull requests. Whether it’s adding new search providers, improving the scoring algorithm, or enhancing the UI, all contributions are appreciated.

License

This project is licensed under the MIT License and is open-sourced for educational and professional use.