Build a Realtime Local PDF Chatbot with Blazor + Ollama (No Cloud Required)

Create a realtime chatbot that answers questions from your own PDFs—without the cloud. This Blazor Server app uses Ollama for local AI, LiteDB for vector storage, and PdfPig for text extraction. 100% offline, free, and privacy-friendly. Full GitHub repo and setup guide included.

Cloud AI tools are powerful — but sometimes, you don’t want to upload your private files.

That’s why I created a Blazor Server app that lets you chat with your PDFs offline using Ollama. It’s fast, private, and free.

📂 GitHub Repo: https://github.com/DheerGupta35959/BlazorPdfChat


Key Features

  • 📄 Upload multiple PDFs
  • 🧠 Automatic text extraction & chunking
  • 💾 Local vector storage with LiteDB
  • 🔍 AI answers only from your documents
  • 💬 Realtime streaming chat with token-by-token updates

Architecture

Blazor Server (UI) → ASP.NET Core API → LiteDB Vector Store
                         ↕
                   Ollama REST API
                         ↕
                    Local AI Models

Tech Stack

  • .NET 8 / Blazor Server
  • PdfPig → Extract PDF text
  • LiteDB → Store embeddings & chunks
  • Ollama → Embeddings (nomic-embed-text) + Chat (llama3)
  • SignalR → Stream AI responses
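
To make the streaming piece concrete: Ollama's chat endpoint returns one JSON object per line, each carrying a token fragment in `message.content`, and the UI appends fragments as they arrive. Here's a minimal sketch of extracting a fragment from one streamed line using `System.Text.Json` (the helper name `ExtractToken` is illustrative, not the repo's actual code):

```csharp
using System.Text.Json;

static class StreamParser
{
    // Pull the token fragment out of one line of Ollama's streaming
    // /api/chat response, e.g. {"message":{"content":"Hel"},"done":false}.
    // Returns "" for lines without content (such as the final "done" line).
    public static string ExtractToken(string jsonLine)
    {
        using var doc = JsonDocument.Parse(jsonLine);
        var root = doc.RootElement;
        if (root.TryGetProperty("message", out var msg) &&
            msg.TryGetProperty("content", out var content))
            return content.GetString() ?? "";
        return "";
    }
}
```

Each extracted fragment can then be pushed to the browser over the SignalR connection, which is what makes the chat feel token-by-token rather than waiting for the full answer.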

How It Works

1. Ingest PDFs

  • PDFs are parsed into plain text
  • Text is split into manageable chunks
  • Each chunk is converted into embeddings via Ollama
  • Chunks + embeddings are stored locally in LiteDB
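
The chunking step above can be sketched as a simple sliding window over the extracted text. The chunk size and overlap below are assumptions for illustration, not the repo's actual settings:

```csharp
using System;
using System.Collections.Generic;

static class Ingest
{
    // Split extracted PDF text into fixed-size chunks with a small
    // overlap, so sentences cut at a boundary still appear intact in
    // the neighboring chunk. Requires chunkSize > overlap.
    public static List<string> ChunkText(string text, int chunkSize = 500, int overlap = 50)
    {
        var chunks = new List<string>();
        for (int start = 0; start < text.Length; start += chunkSize - overlap)
        {
            int length = Math.Min(chunkSize, text.Length - start);
            chunks.Add(text.Substring(start, length));
            if (start + length >= text.Length) break;
        }
        return chunks;
    }
}
```

Each chunk is then sent to Ollama's embeddings endpoint, and the resulting vector is stored alongside the chunk text in LiteDB.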

2. Chat

  • User enters a question in the Blazor chat
  • The app searches LiteDB for the most relevant chunks using cosine similarity
  • Context + question are sent to Ollama’s chat model
  • Tokens are streamed back in real-time
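
The similarity search in step two boils down to one function: score every stored chunk's embedding against the query embedding and take the top matches. A minimal cosine-similarity sketch (assuming both vectors have the same length, as embeddings from the same model do):

```csharp
using System;

static class Retrieval
{
    // Cosine similarity between two embedding vectors: the dot product
    // normalized by both magnitudes. Returns a value near 1.0 for
    // similar directions and near 0.0 for unrelated ones.
    public static double CosineSimilarity(float[] a, float[] b)
    {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        // Small epsilon guards against division by zero for empty vectors.
        return dot / (Math.Sqrt(normA) * Math.Sqrt(normB) + 1e-10);
    }
}
```

The top-scoring chunks become the context that gets prepended to the user's question before it goes to the chat model, which is what keeps answers grounded in the uploaded documents.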

Setup Guide

  1. Install Ollama: https://ollama.ai
  2. Run Ollama in the background:

    ollama serve
    
  3. Pull models:

    ollama pull llama3
    ollama pull nomic-embed-text
    
  4. Clone the repo:

    git clone https://github.com/DheerGupta35959/BlazorPdfChat.git
    cd BlazorPdfChat
    
  5. Run the app:

    dotnet run
    
  6. Open http://localhost:5000 in your browser
  7. Upload PDFs → Start chatting

Why This Matters

  • 🚫 No API costs
  • 🔒 Works completely offline
  • 📈 Perfect for internal docs, contracts, and research papers

Conclusion

AI doesn’t have to mean giving up privacy or control.
With Blazor + Ollama, you can have the power of AI on your own terms.

📂 GitHub Repo: https://github.com/DheerGupta35959/BlazorPdfChat

Aug 15, 2025
By Dheer Gupta