Cursor Chat History
Enables similarity search over vectorized Cursor IDE chat history using LanceDB and a local LLM.
About
Vectorizes and indexes chat history from the Cursor IDE to enable similarity search via a self-hosted API. It extracts user prompts, generates embeddings with a local Ollama instance, and stores them in a LanceDB vector database. A FastAPI application exposes the API endpoints, letting users query the vectorized chat history for Retrieval Augmented Generation (RAG) and other LLM-based analysis.
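The indexing half of that pipeline (embed each prompt via Ollama, then insert it into LanceDB) can be sketched roughly as follows. This is illustrative, not the project's actual code: the Ollama endpoint and `nomic-embed-text` model come from the description above, while the table name, store path, and helper names are assumptions.

```python
# Sketch of the indexing pipeline: Cursor prompts -> Ollama embeddings -> LanceDB.
# Assumes a local Ollama server on its default port; all names are illustrative.
import requests

OLLAMA_URL = "http://localhost:11434/api/embeddings"
MODEL = "nomic-embed-text"


def parse_embedding(payload: dict) -> list[float]:
    """Pull the vector out of an Ollama embeddings response."""
    return payload["embedding"]


def embed(text: str) -> list[float]:
    """Request an embedding for one prompt from the local Ollama instance."""
    resp = requests.post(OLLAMA_URL, json={"model": MODEL, "prompt": text})
    resp.raise_for_status()
    return parse_embedding(resp.json())


def build_rows(prompts: list[str]) -> list[dict]:
    """Pair each extracted prompt with its embedding, ready for insertion."""
    return [{"text": p, "vector": embed(p)} for p in prompts]


if __name__ == "__main__":
    import lancedb

    db = lancedb.connect("./chat_index")  # on-disk vector store (path assumed)
    db.create_table("prompts", data=build_rows(["example extracted prompt"]))
```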
Key Features
- Extracts prompts from Cursor IDE's local chat history
- Provides health check and search API endpoints
- Includes a Dockerized FastAPI application for searching
- Generates text embeddings using a local Ollama instance (nomic-embed-text)
- Stores prompts and embeddings in a LanceDB vector database
Use Cases
- Retrieval Augmented Generation (RAG) for LLMs
- Analyzing chat history to identify patterns and improve code
- Searching past coding conversations and solutions
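For the RAG use case, search hits can be folded into a context block for a follow-up LLM call. A small sketch, assuming the search API returns rows shaped like `{"text": ...}` (an assumption based on the indexing schema, not a documented response format):

```python
# Illustrative RAG glue: turn retrieved past prompts into LLM context.
def build_rag_prompt(question: str, hits: list[dict]) -> str:
    """Prepend retrieved chat-history prompts as context for a new question."""
    context = "\n".join(f"- {hit['text']}" for hit in hits)
    return f"Relevant past prompts:\n{context}\n\nQuestion: {question}"
```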