Provides a foundational example for integrating custom tools with the Gemini CLI using the Model Context Protocol (MCP).
Serves as a practical demonstration for developers who want to extend the Gemini conversational AI environment with custom functionality. The project walks through the fundamental steps of building an MCP server in Python that exposes a simple 'greet' tool, then shows how to configure the Gemini CLI to discover and invoke that tool, illustrating how external services and functions can extend Gemini's capabilities.
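On the CLI side, the Gemini CLI discovers MCP servers through an `mcpServers` entry in its settings file (project-level `.gemini/settings.json` or the user-level equivalent). A hedged sketch follows; the server key `greeter` and the script path `server.py` are placeholder assumptions:

```json
{
  "mcpServers": {
    "greeter": {
      "command": "python",
      "args": ["server.py"]
    }
  }
}
```

With this in place, the CLI launches the server as a subprocess on startup, lists its tools, and lets the model call `greet` during a conversation.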