This tool is an MCP (Model Context Protocol) server that provides up-to-date token pricing for a wide range of Large Language Models. It lets users query, compare, and estimate costs for over 60 models from 15 leading providers, including OpenAI, Anthropic, Google, and Meta, and integrates directly into AI assistants such as Claude Desktop or Cursor. Built by the TokenCost team, it tracks pricing accuracy, reflecting new models, price changes, and deprecations promptly.
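Integration with an MCP-capable assistant typically means registering the server in the client's configuration. The sketch below shows the general shape of a Claude Desktop `mcpServers` entry; the command and arguments here are hypothetical placeholders, not this project's documented launch command:

```json
{
  "mcpServers": {
    "token-pricing": {
      "command": "npx",
      "args": ["-y", "token-pricing-mcp-server"]
    }
  }
}
```

Once registered, the assistant can call the server's pricing tools directly during a conversation.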
Key Features
1. Get pricing for specific LLM models
2. Side-by-side pricing comparison for multiple models
3. Estimate costs based on given token counts
4. Find the cheapest models with customizable filters
5. List all available models and providers with pricing ranges