Server
Provides a robust backend for AI agents, delivering custom instruction files and executable scripts.
About
This Python-based MCP (Model Context Protocol) server, built with FastAPI and Uvicorn, acts as a central hub for managing and serving custom AI agent instructions and bash scripts. It maintains an organized 'agents-library' containing AGENTS.md files with agent rules, prompts, and contextual information, alongside executable bash scripts. AI command-line interfaces such as `gemini-cli` can retrieve specific agent instructions and invoke tools through the server, giving agents the context they need to understand a project and perform actions.
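As a rough sketch of what such a server might look like, the FastAPI app below exposes one route that returns an agent's AGENTS.md file and one that executes a library bash script. The endpoint paths, the `agents-library` layout, and the 60-second timeout are illustrative assumptions, not details taken from the project itself:

```python
# Minimal sketch of a FastAPI server for agent instructions and scripts.
# Endpoint names, paths, and directory layout are illustrative assumptions.
import subprocess
from pathlib import Path

from fastapi import FastAPI, HTTPException

app = FastAPI(title="Agent instruction server")
LIBRARY = Path("agents-library")  # assumed location of AGENTS.md files and scripts


@app.get("/agents/{name}")
def get_agent_instructions(name: str) -> dict:
    """Return the AGENTS.md content for a named agent."""
    path = LIBRARY / name / "AGENTS.md"
    if not path.is_file():
        raise HTTPException(status_code=404, detail=f"No AGENTS.md for '{name}'")
    return {"agent": name, "instructions": path.read_text()}


@app.post("/scripts/{name}/run")
def run_script(name: str) -> dict:
    """Execute a bash script from the library and return its output."""
    script = LIBRARY / "scripts" / f"{name}.sh"
    if not script.is_file():
        raise HTTPException(status_code=404, detail=f"No script named '{name}'")
    result = subprocess.run(
        ["bash", str(script)], capture_output=True, text=True, timeout=60
    )
    return {"exit_code": result.returncode, "stdout": result.stdout, "stderr": result.stderr}
```

Started with Uvicorn (for example `uvicorn main:app --host 0.0.0.0 --port 8000`), the same app can be packaged into a Docker image for the containerized setup listed under Key Features.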
Key Features
- Serves custom AGENTS.md instruction files for AI agents
- Supports containerization with Docker and Docker Compose
- Hosts and executes custom bash scripts via API
- Integrates with `gemini-cli` for AI agent context and tool invocation (see the client sketch after this list)
- Built with FastAPI for high-performance API delivery
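To show how a CLI wrapper or agent runner might consume these endpoints, the client below fetches an agent's instructions and triggers a script over HTTP. The base URL, routes, agent name, and script name mirror the hypothetical server sketch above and are assumptions rather than documented API details:

```python
# Hypothetical client for the endpoints sketched above; URL and routes are assumptions.
import json
from urllib import request

BASE_URL = "http://localhost:8000"  # assumed default Uvicorn address


def fetch_instructions(agent: str) -> str:
    """Download the AGENTS.md content for an agent before starting a session."""
    with request.urlopen(f"{BASE_URL}/agents/{agent}") as resp:
        return json.load(resp)["instructions"]


def run_tool(script: str) -> dict:
    """Ask the server to execute a named bash script and return its output."""
    req = request.Request(f"{BASE_URL}/scripts/{script}/run", method="POST")
    with request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(fetch_instructions("code-reviewer"))  # hypothetical agent name
    print(run_tool("lint-project"))             # hypothetical script name
```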
Use Cases
- Enabling consistent AI agent behavior across development and deployment environments
- Providing dynamic, up-to-date agent prompts and scripts to AI-powered CLIs
- Centralizing and managing AI agent instructions and contextual rules