Create a real-time chatbot that answers questions from your own PDFs, without the cloud. This Blazor Server app uses Ollama for local AI, LiteDB for vector storage, and PdfPig for text extraction. It is 100% offline, free, and privacy-friendly. A full GitHub repo and setup guide are included.
Cloud AI tools are powerful — but sometimes, you don’t want to upload your private files.
That’s why I created a Blazor Server app that lets you chat with your PDFs offline using Ollama. It’s fast, private, and free.
📂 GitHub Repo: https://github.com/DheerGupta35959/BlazorPdfChat
Blazor Server (UI) → ASP.NET Core API → LiteDB Vector Store
↕
Ollama REST API
↕
Local AI Models
Embeddings (nomic-embed-text) + Chat (llama3)

Run Ollama in the background:
ollama serve
Pull models:
ollama pull llama3
ollama pull nomic-embed-text
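Under the hood, the app extracts text from each PDF with PdfPig, splits it into overlapping chunks, embeds every chunk with nomic-embed-text, and stores the vectors in LiteDB. The chunking step can be sketched roughly like this (the function name, chunk size, and overlap are illustrative assumptions, not the repo's actual code):

```python
def chunk_text(text: str, max_words: int = 200, overlap: int = 40) -> list[str]:
    """Split extracted PDF text into overlapping word-based chunks for embedding."""
    words = text.split()
    step = max_words - overlap  # how far the window advances each iteration
    chunks = []
    for start in range(0, len(words), step):
        window = words[start:start + max_words]
        if window:
            chunks.append(" ".join(window))
        if start + max_words >= len(words):
            break  # last window already reached the end of the text
    return chunks
```

The overlap means a sentence that straddles a chunk boundary is still retrievable from at least one chunk.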
Clone the repo:
git clone https://github.com/DheerGupta35959/BlazorPdfChat.git
cd BlazorPdfChat
Run the app:
dotnet run
Then open http://localhost:5000 in your browser.

AI doesn’t have to mean giving up privacy or control.
With Blazor + Ollama, you can have the power of AI on your own terms.
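At question time, the pipeline embeds your query with the same embedding model, ranks the stored chunk vectors by cosine similarity, and passes the top matches to llama3 as context. A minimal sketch of that ranking step, using toy 2-D vectors (the real app keeps its embeddings in LiteDB, and these helper names are illustrative, not the repo's actual code):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors; 0.0 if either is all zeros."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec: list[float], stored: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    """stored: (chunk_text, vector) pairs; returns the k best-matching chunks."""
    ranked = sorted(stored, key=lambda cv: cosine(query_vec, cv[1]), reverse=True)
    return [chunk for chunk, _ in ranked[:k]]
```

For example, with chunks embedded as `[("a", [1.0, 0.0]), ("b", [0.0, 1.0]), ("c", [0.7, 0.7])]`, a query vector of `[1.0, 0.0]` ranks chunk "a" first, then "c". Real embeddings from nomic-embed-text have hundreds of dimensions, but the ranking logic is the same.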
📂 GitHub Repo: https://github.com/DheerGupta35959/BlazorPdfChat