A Model Context Protocol server that integrates self-hosted LLMs served by Ollama with a Supabase database for data persistence and retrieval.
This tool is a Model Context Protocol (MCP) server that bridges large language models (LLMs) and structured databases. It enables interaction with self-hosted models such as Llama 2 and CodeLlama through Ollama, while relying on Supabase for all data operations: storage, retrieval, and querying. Developers can use its API and integrated tools to build AI applications that combine dynamic text generation with reliable data management, all within a containerized, easily deployable environment.
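To make the generation-plus-persistence flow concrete, here is a minimal sketch of the two halves the server ties together: calling Ollama's HTTP API for a completion and writing the result to Supabase through its PostgREST endpoint. The table name (`completions`), the environment-variable names, and the column names are illustrative assumptions, not part of this project's actual schema or configuration.

```python
import json
import os
import urllib.request


def build_generate_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}


def ollama_generate(prompt: str, model: str = "llama2",
                    host: str = "http://localhost:11434") -> str:
    """Send a prompt to a local Ollama instance and return the generated text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


def save_to_supabase(row: dict, table: str = "completions") -> None:
    """Insert a row via Supabase's PostgREST interface.

    Assumes SUPABASE_URL and SUPABASE_KEY are set in the environment;
    the "completions" table is a hypothetical example.
    """
    url = f"{os.environ['SUPABASE_URL']}/rest/v1/{table}"
    key = os.environ["SUPABASE_KEY"]
    req = urllib.request.Request(
        url,
        data=json.dumps(row).encode(),
        headers={
            "Content-Type": "application/json",
            "apikey": key,
            "Authorization": f"Bearer {key}",
        },
        method="POST",
    )
    urllib.request.urlopen(req).close()


# Example usage (requires a running Ollama instance and Supabase project):
#   answer = ollama_generate("Summarize MCP in one sentence.")
#   save_to_supabase({"prompt": "Summarize MCP in one sentence.",
#                     "response": answer})
```

In the actual server, calls like these would be exposed as MCP tools rather than invoked directly, but the request/response shapes are the same.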