BerriAI-litellm

Python SDK and proxy to unify multiple LLM APIs

Description

BerriAI-litellm is a Python SDK and proxy server for working with large language models through a single, OpenAI-compatible API format. It’s designed for teams that use multiple text-generation platforms and want a consistent interface for integration and operations.

What it does
  • Connects to 100+ LLM APIs using an OpenAI-style request/response format
  • Lets you call models from Bedrock, Azure, OpenAI, and VertexAI through one interface
  • Translates provider-specific calls into a unified format to reduce integration complexity
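The unified, OpenAI-style call shape described above can be sketched with the Python SDK. The model name and prompt below are illustrative assumptions, and a real call needs provider API keys set in the environment:

```python
# Sketch of litellm's unified interface. The model name and prompt are
# illustrative; a valid provider key (e.g. OPENAI_API_KEY) must be set
# in the environment for a real call to succeed.
messages = [{"role": "user", "content": "Say hello in one word."}]

def ask(model: str):
    # Import kept inside the helper so the sketch reads standalone;
    # the same call shape works across providers litellm supports.
    from litellm import completion
    return completion(model=model, messages=messages)

if __name__ == "__main__":
    # Responses follow the OpenAI schema regardless of provider.
    resp = ask("gpt-4o-mini")  # or an Azure/Bedrock model string
    print(resp.choices[0].message.content)
```

Because the response follows the OpenAI schema, code written against one provider can be pointed at another by changing only the model string.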
Operations and controls
  • Load balancing across configured providers
  • Budget tracking to monitor usage and spend
  • Request limiting to control traffic and protect resources
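The controls above are typically expressed in the proxy's YAML config. The sketch below shows the general shape; the model aliases, rate limits, and budget values are illustrative assumptions, and exact field names should be checked against the litellm proxy documentation:

```yaml
# Illustrative proxy config: two deployments behind one alias,
# per-deployment rate limits, and a spend budget.
model_list:
  - model_name: gpt-4o                 # alias clients request
    litellm_params:
      model: azure/gpt-4o              # Azure-hosted deployment
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
      rpm: 60                          # requests per minute for this deployment
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o             # second deployment of the same alias
      api_key: os.environ/OPENAI_API_KEY
      rpm: 60

router_settings:
  routing_strategy: simple-shuffle     # load-balance across the two deployments

litellm_settings:
  max_budget: 100                      # spend cap (USD)
  budget_duration: 30d                 # budget reset window
```

Giving both deployments the same `model_name` is what lets the proxy load-balance them: clients request one alias and the router picks a backing deployment.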
Setup notes

Install the SDK, configure the proxy server, then add the LLM providers you need. Initial setup assumes some familiarity with proxy configuration, and overall reliability depends on the upstream model providers you connect.
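The setup flow above can be sketched as a few commands; the config path and port are illustrative, and the final request assumes a model alias named `gpt-4o` is defined in the config:

```shell
# Install the SDK with the proxy extras
pip install 'litellm[proxy]'

# Start the proxy against a config file (path is illustrative)
litellm --config config.yaml --port 4000

# Any OpenAI-compatible client can then target the proxy
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "hi"}]}'
```

Because the proxy speaks the OpenAI wire format, existing OpenAI client libraries can be repointed at it by changing only the base URL.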
