
LocaLLama

Created by Heratiki

Dynamically routes coding tasks between local LLMs, free APIs, and paid APIs to optimize costs.

About

LocaLLama intelligently manages the routing of coding tasks across various LLM resources to minimize expenses. By leveraging a decision engine, it compares the cost and quality trade-offs between local LLMs (like LM Studio or Ollama), free APIs, and paid APIs, offloading tasks to the most efficient option. It integrates with tools like Roo Code and Cline.Bot, enabling cost-effective coding assistance through dynamic routing and configurable thresholds.
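The routing idea described above can be sketched as a small decision function. This is a minimal illustration, not LocaLLama's actual implementation: the backend names, quality scores, and threshold parameters are all hypothetical, standing in for the tiers (local LLM, free API, paid API) and configurable thresholds the description mentions.

```python
from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    cost_per_1k_tokens: float  # USD; 0.0 for local and free tiers
    quality_score: float       # 0..1, e.g. derived from benchmark results

# Hypothetical tiers: a local model, a free API, and a paid API.
BACKENDS = [
    Backend("ollama-local", 0.0, 0.60),
    Backend("openrouter-free", 0.0, 0.72),
    Backend("openrouter-paid", 0.002, 0.90),
]

def route(task_tokens: int, min_quality: float, max_cost: float) -> Backend:
    """Pick the cheapest backend that meets the quality threshold
    without exceeding the per-task cost budget."""
    for b in sorted(BACKENDS, key=lambda b: b.cost_per_1k_tokens):
        cost = b.cost_per_1k_tokens * task_tokens / 1000
        if b.quality_score >= min_quality and cost <= max_cost:
            return b
    # If no backend satisfies both thresholds, fall back to highest quality.
    return max(BACKENDS, key=lambda b: b.quality_score)
```

With these example numbers, a routine task with a modest quality threshold resolves to a free tier, while raising `min_quality` past every free option pushes the task to the paid API.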

Key Features

  • API integration with local LLMs and OpenRouter
  • Cost and token monitoring module
  • Fallback and error handling mechanisms
  • Decision engine with configurable thresholds
  • Benchmarking system for performance comparison
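The fallback mechanism listed above can be illustrated with a simple chain that tries backends in priority order and moves on when one fails. This is a hedged sketch, assuming a hypothetical `call(backend, prompt)` adapter; it is not LocaLLama's actual error-handling code.

```python
def complete_with_fallback(prompt, backends, call):
    """Try each backend in order; on failure, fall through to the next.

    `call(backend, prompt)` is a hypothetical adapter that returns a
    completion string or raises an exception when the backend fails.
    """
    last_err = None
    for backend in backends:
        try:
            return call(backend, prompt)
        except Exception as err:
            last_err = err  # remember the failure and try the next tier
    raise RuntimeError("all backends failed") from last_err
```

The same pattern lets a local model be attempted first for cost reasons, with cloud APIs as a safety net when it errors or is unavailable.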

Use Cases

  • Integrate with existing coding tools like Roo Code and Cline.Bot
  • Optimize usage of local and cloud-based LLMs
  • Reduce costs associated with LLM-powered coding assistance