The offline-rag-system is a prototype built for the Smart India Hackathon 2025 that lets you manage and retrieve information from a wide range of document types. Whether it's PDFs, DOC/DOCX files, images, or audio, the system processes everything offline after an initial setup.
Unified Knowledge Base: Easily drag and drop all your documents, screenshots, and audio recordings into one simple store.
Offline-First Pipeline: Utilize advanced models for text/image embeddings and audio transcription. Everything runs without an internet connection after the first model download.
Cross-Modal Retrieval: Search with plain text queries to find relevant pages in documents, audio segments with precise timestamps, and image metadata.
Grounded Answers: Receive thorough summaries with clear citations, including filename, page, timestamp, and chunk ID.
Local LLM Option: Enhance your experience by integrating a llama.cpp GGUF checkpoint for more in-depth answer synthesis.
Modern UI: Enjoy a sleek chat interface powered by Next.js 15, with MMR, similarity, and multi-query search modes (a minimal retrieval sketch follows this feature list).
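The feature list above does not prescribe a specific stack, so the following is only a hedged sketch of what offline retrieval with MMR over citation-tagged chunks can look like in Python. The sentence-transformers model, the chunk schema, and the in-memory store are illustrative assumptions, not the project's actual implementation.

```python
# A minimal, assumption-laden sketch of offline retrieval with MMR.
# Model name, chunk schema, and scoring weights are placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

# Each chunk carries the citation metadata the features mention:
# filename, page, timestamp, and chunk ID (values are illustrative).
chunks = [
    {"id": "doc1-p3-c2", "source": "report.pdf", "page": 3, "timestamp": None,
     "text": "Quarterly revenue grew 12% year over year."},
    {"id": "audio1-t90", "source": "meeting.wav", "page": None, "timestamp": "00:01:30",
     "text": "We agreed to ship the prototype before the deadline."},
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # cached locally after the first download
doc_vecs = model.encode([c["text"] for c in chunks], normalize_embeddings=True)

def mmr_search(query, k=2, lambda_mult=0.7):
    """Maximal Marginal Relevance: trade off query relevance against
    redundancy among chunks that were already selected."""
    q = model.encode([query], normalize_embeddings=True)[0]
    sims = doc_vecs @ q  # cosine similarity, since vectors are unit-normalized
    selected, candidates = [], list(range(len(chunks)))
    while candidates and len(selected) < k:
        def score(i):
            redundancy = max((doc_vecs[i] @ doc_vecs[j] for j in selected), default=0.0)
            return lambda_mult * sims[i] - (1 - lambda_mult) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return [chunks[i] for i in selected]

for hit in mmr_search("revenue growth"):
    print(f'{hit["source"]} (page {hit["page"]}, t={hit["timestamp"]}, chunk {hit["id"]}): {hit["text"]}')
```

Plain similarity search is the same scoring with lambda_mult set to 1.0, which is why MMR and similarity are often exposed side by side as search modes.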
To run the offline-rag-system, ensure your computer meets these basic requirements:
To get started with the offline-rag-system, follow these simple steps:
Visit the Release Page: Go to our Releases page to access the latest version. Here is the link: Download Here.
Choose Your Version: You will see a list of available downloads. Select the version of the application suited for your operating system.
Download the Package: Click on the appropriate link for your system. The file will start downloading automatically.
Install the Application: On Windows, locate the downloaded .exe file, double-click it, and follow the on-screen instructions. On macOS, open the .dmg file and drag the application to your Applications folder. On Linux, make the installer executable with chmod +x YourDownloadedFile.run, then run ./YourDownloadedFile.run.
Run the Application: After installation, launch the application from your applications list or desktop shortcut.
Initial Setup: Upon first launch, the application will guide you through the setup process. Follow the prompts to download the models needed for offline use (the sketch below gives a rough idea of what this step does).
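For a rough idea of what the first-run model download amounts to, here is a hedged sketch that assumes a sentence-transformers embedding model and a Whisper-family transcription model via faster-whisper. The model names are placeholders, sample.wav is a hypothetical local file, and the application's actual backend may use different libraries.

```python
# Pull models once while online so later runs work fully offline.
# The model choices below are assumptions, not the project's confirmed stack.
from sentence_transformers import SentenceTransformer
from faster_whisper import WhisperModel

# First use downloads weights into the local cache; later runs reuse them.
text_encoder = SentenceTransformer("all-MiniLM-L6-v2")
speech_to_text = WhisperModel("base", device="cpu", compute_type="int8")

# Quick offline smoke test: embed a sentence and transcribe a local file.
vec = text_encoder.encode("offline sanity check")
print("embedding dimension:", len(vec))

segments, _info = speech_to_text.transcribe("sample.wav")  # hypothetical file
for seg in segments:
    print(f"[{seg.start:.1f}s - {seg.end:.1f}s] {seg.text}")
```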
The application boasts a user-friendly interface. Here's what to expect once you open it:
Search Bar: At the top, you will find the search bar where you can type in your query.
Results Pane: The area below the search bar will display all relevant results based on your query.
Citation Control: For every retrieved answer, you will find citation details clearly listed next to your results, ensuring you can reference original sources easily.
If you want to enhance your experience even further, consider these advanced options:
Custom Checkpoint Integration: If you have a llama.cpp GGUF checkpoint, you can plug it into the system for richer answer generation. The application will guide you through this during setup (see the sketch after this list for a rough idea).
Adjust Search Parameters: Use the settings to tweak how results are ranked. You can prioritize similarity, recency, or other factors depending on your needs.
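To show what the GGUF option can amount to in practice, the sketch below uses the llama-cpp-python bindings to synthesize an answer from citation-tagged context. The model path, prompt format, and context strings are placeholders rather than the application's confirmed behavior.

```python
# A hedged sketch of local answer synthesis with a llama.cpp GGUF checkpoint.
# The path below is a placeholder; point it at your own downloaded model.
from llama_cpp import Llama

llm = Llama(model_path="models/your-model.gguf", n_ctx=4096)

# Retrieved chunks keep their citation tags inside the prompt, so the
# generated answer can reference filename, page, and timestamp directly.
context = (
    "[report.pdf, page 3, chunk doc1-p3-c2] Quarterly revenue grew 12% year over year.\n"
    "[meeting.wav, 00:01:30, chunk audio1-t90] We agreed to ship the prototype before the deadline.\n"
)
prompt = (
    "Answer the question using only the context below and cite sources in brackets.\n\n"
    f"Context:\n{context}\nQuestion: How did revenue change?\nAnswer:"
)

out = llm(prompt, max_tokens=200, stop=["Question:"])
print(out["choices"][0]["text"].strip())
```

Keeping the retrieved chunks and their citation tags in the prompt is what lets the synthesized answer ground its claims in the indexed documents.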
If you encounter any issues while setting up or using the application, consider these suggestions:
Check System Requirements: Ensure your system meets all required specifications.
Reinstall the Application: If you run into bugs, deleting and reinstalling the application often resolves them.
Seek Help: If problems persist, refer to the Issues section on our GitHub repository for community support.
You can provide feedback or report issues on the GitHub Issues page for the project.
Currently, this software is designed for educational and personal use only. Check the license for more information.
While there are no strict limits, we recommend keeping individual file sizes below 500 MB for optimal performance.
To download the application, visit our Releases page: Download Here.
Enjoy exploring the rich capabilities of the offline-rag-system!