Provides LLMs with cached Google Drive data via an OpenAI-compatible proxy for efficient interaction.
This Python application serves as a backend for Large Language Models (LLMs), giving them access to structured data stored in a Google Drive Excel file. To improve efficiency and work around Google Drive API limitations, it maintains a SQLite-based cache that is refreshed daily and on every server restart. APScheduler manages the background synchronization tasks, and `mcpo` exposes the underlying MCP server as an OpenAI-compatible proxy so it can plug directly into LLM frontends such as OpenWebUI.
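The sketch below illustrates how the cache refresh could be wired together with APScheduler, pandas, and `sqlite3`: one refresh on startup plus a daily cron job that rebuilds the cache table from the spreadsheet. The names used here (`refresh_cache`, `download_excel_from_drive`, `cache.db`, the `drive_data` table, the 03:00 schedule) are illustrative assumptions, not the project's actual API.

```python
import sqlite3

import pandas as pd
from apscheduler.schedulers.background import BackgroundScheduler

DB_PATH = "cache.db"        # assumed name of the local SQLite cache
TABLE_NAME = "drive_data"   # assumed cache table name


def download_excel_from_drive() -> str:
    """Hypothetical helper: fetch the Excel file via the Google Drive API
    and return a local path. The real project supplies its own credentials
    and file ID here."""
    return "drive_data.xlsx"


def refresh_cache() -> None:
    """Re-read the spreadsheet and rebuild the SQLite cache table."""
    df = pd.read_excel(download_excel_from_drive())
    with sqlite3.connect(DB_PATH) as conn:
        df.to_sql(TABLE_NAME, conn, if_exists="replace", index=False)


if __name__ == "__main__":
    refresh_cache()                                   # refresh once at startup
    scheduler = BackgroundScheduler()
    scheduler.add_job(refresh_cache, "cron", hour=3)  # and again daily at 03:00
    scheduler.start()
    # In the real application the MCP server keeps the process alive;
    # a standalone run would need its own loop or scheduler.shutdown().
```

With the cache in place, the MCP server that exposes the query tools is then run behind `mcpo`, which publishes it over HTTP as an OpenAI-compatible endpoint that frontends like OpenWebUI can register as an external tool server.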