# openai-compatible

Here are 152 public repositories matching this topic...

⚡ Python-free Rust inference server — OpenAI-API compatible. GGUF + SafeTensors, hot model swap, auto-discovery, single binary. FREE now, FREE forever.

  • Updated Jan 16, 2026
  • Rust

A high-performance API server that provides OpenAI-compatible endpoints for MLX models. Built in Python on the FastAPI framework, it offers an efficient, scalable, and user-friendly way to run MLX-based vision and language models locally.

  • Updated Mar 6, 2026
  • Python
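The common thread in these projects is the OpenAI wire format: any "OpenAI-compatible" server accepts the same JSON at `POST /v1/chat/completions`, so existing clients only need their base URL changed. A minimal sketch of that request body, assuming a hypothetical local server at `localhost:8080` and a placeholder model name:

```python
import json

# The base URL and model name are placeholders, not real endpoints;
# each server documents its own host, port, and model identifiers.
BASE_URL = "http://localhost:8080/v1"

# A minimal chat-completion request in the OpenAI wire format.
payload = {
    "model": "local-model",  # server-specific model identifier
    "messages": [
        {"role": "user", "content": "Hello!"}
    ],
    "stream": False,  # set True for incremental token streaming
}

# This JSON body would be POSTed to f"{BASE_URL}/chat/completions".
body = json.dumps(payload)
print(body)
```

Because the format is shared, official and third-party OpenAI client libraries usually work against these servers unchanged once pointed at the alternate base URL.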

Ollama Client – Chat with Local LLMs Inside Your Browser. A lightweight, privacy-first Chrome extension for chatting with local LLMs via Ollama, LM Studio, and llama.cpp. Supports streaming, stop/regenerate, RAG, and easy model switching, all without cloud APIs or data leaks.

  • Updated Feb 8, 2026
  • TypeScript

AI amnesia, solved: a 5-layer persistent memory system for local AI assistants. Built by a non-coder running a live business; 353 sessions over 8 months with one nuclear reset, and still running.

  • Updated Mar 6, 2026
  • Python
AI-Worker-Proxy

OpenAI-compatible AI proxy: Anthropic Claude, Google Gemini, GPT-5, Cloudflare AI. Free hosting, automatic failover, token rotation. Deploy in 1 minute.

  • Updated Mar 4, 2026
  • TypeScript
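The "automatic failover" a proxy like this describes can be reduced to a simple pattern: try each upstream provider in order and return the first successful response. A minimal sketch, with stand-in callables instead of real HTTP backends (the backend names and error handling here are illustrative assumptions, not this project's actual code):

```python
def first_success(backends, request):
    """Try each backend in order; return the first successful response.

    'backends' are callables standing in for upstream providers.
    A real proxy would match on HTTP status codes (429, 5xx) and
    rotate API tokens between attempts.
    """
    last_err = None
    for call in backends:
        try:
            return call(request)
        except Exception as err:
            last_err = err  # remember the failure, move to the next backend
    raise RuntimeError("all backends failed") from last_err


# Demo: the first stand-in backend fails, the second answers.
def flaky(_req):
    raise ConnectionError("upstream rate-limited")

def healthy(req):
    return {"echo": req}

result = first_success([flaky, healthy], {"prompt": "hi"})
print(result)
```

Keeping the failover loop separate from the backend callables makes it easy to add providers or reorder priorities without touching the retry logic.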
