Take control of your AI development experience. Switch between models across 85%+ of OpenAI-compatible APIs, maintain complete data privacy, and reduce costs with local deployment.
Download from the Mac App Store
Compatible with 85%+ of OpenAI-compatible APIs. Switch between models with just a few clicks.
Local deployment ensures your data never leaves your machine. Complete control over your AI interactions.
Choose cost-effective models or run open-source alternatives locally. No vendor lock-in.
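To make the compatibility claim concrete, here is a minimal sketch of the request shape that OpenAI-compatible backends share; the proxy address, port, and model name below are illustrative placeholders, not the app's actual defaults.

```python
# Minimal sketch of the OpenAI-compatible chat request shape most backends share.
# The base URL, port, API key, and model name are placeholders for illustration only.
import requests

PROXY_URL = "http://localhost:8080/v1/chat/completions"  # hypothetical proxy endpoint

payload = {
    "model": "gpt-4",  # swap this string to target a different backend model
    "messages": [
        {"role": "user", "content": "Summarize what this proxy does."}
    ],
}

response = requests.post(
    PROXY_URL,
    json=payload,
    headers={"Authorization": "Bearer YOUR_API_KEY"},  # key for whichever upstream you chose
    timeout=30,
)
print(response.json()["choices"][0]["message"]["content"])
```

Because every supported backend accepts this same shape, switching models comes down to changing the model field or the upstream endpoint rather than rewriting client code.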
Search "Claude Code Proxy" or use the direct link below
Follow the simple setup wizard to configure your first AI model
Begin using Claude Code with your preferred AI model instantly
Claude Code Proxy is available on the Mac App Store with a one-time purchase. There are no subscription fees - you only pay for the AI models you choose to use.
We support 85%+ of OpenAI-compatible APIs, including Claude, GPT-4, local Llama models, and custom endpoints. You can switch between models seamlessly.
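As a rough illustration of what seamless switching can look like in client code (the base URL, API key handling, and model identifiers below are assumptions for this sketch, not the proxy's documented configuration):

```python
# Sketch: any OpenAI-compatible client pointed at a proxy, switching models
# by changing only the model string. Address and model names are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # hypothetical proxy address
    api_key="YOUR_API_KEY",               # forwarded to whichever upstream you selected
)

for model in ("gpt-4", "claude-3-5-sonnet", "llama3"):  # example identifiers only
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(model, "->", reply.choices[0].message.content)
```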
Claude Code Proxy can redirect requests to locally running model servers such as Ollama, ensuring your data never leaves your machine while maintaining full functionality.
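For the local path, here is a minimal sketch assuming Ollama is installed, serving on its default port, and has a model pulled; Ollama exposes an OpenAI-compatible endpoint on localhost, so the same request format works without any traffic leaving the machine:

```python
# Sketch: the same OpenAI-compatible call aimed at a local Ollama server.
# Everything stays on localhost; no data is sent to a third-party API.
from openai import OpenAI

local = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # Ollama ignores the key, but the client requires one
)

reply = local.chat.completions.create(
    model="llama3",  # any model you have pulled locally, e.g. via `ollama pull llama3`
    messages=[{"role": "user", "content": "Explain this project's build step briefly."}],
)
print(reply.choices[0].message.content)
```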
Absolutely. With local deployment options, your code and conversations never leave your machine. We follow privacy-first principles in all our design decisions.