OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch). Lightweight, extensible Python/FastAPI—use as library or standalone service.
Updated Nov 23, 2025 - Python
Comprehensive guide to running Kuzco inference and training LLM models on CPU with Docker, including commit deployment.
🔗 Simplify API interactions with this OpenAI-compatible proxy designed for seamless integration in Python projects.
API Backend of my submission for the CapitalOne Yelp Challenge
First milestone of a secure HTTP proxy for using internal/open APIs inside the network, adding Bearer authentication. It acts as a drop-in in front of existing services: it forwards any route to the destination, preserves headers and body, supports streaming, and exposes health and metrics endpoints.