Provider Overview

ProxyAI connects to different AI services (providers) to power its features. Your choice of provider determines which Large Language Model (LLM) works behind the scenes in your IDE.

ProxyAI Cloud

Our cloud service offers the simplest way to get started. You get access to a curated selection of powerful AI models, including some exclusive options. Setup is minimal: sign up and add an API key.

Other Cloud Providers

Connect directly to major AI platforms like OpenAI, Anthropic, or Google. This gives you flexibility to use specific models from these services with your own API keys and accounts.

Local Models (Ollama, Llama.cpp)

Run LLMs directly on your machine using tools like Ollama or Llama.cpp. Your code and prompts never leave your computer, giving you complete privacy and control. This works well for offline use or sensitive data, but requires more setup and a capable computer.
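
For example, a quick way to confirm a local model is reachable is to send a request to Ollama's HTTP API. The sketch below is illustrative only: it assumes Ollama is running on its default port (11434) and that the llama3 model has already been pulled; swap in whichever model you have installed.

```python
import requests

# Minimal sketch: ask a locally running Ollama server for a completion.
# Assumes Ollama is listening on its default port (11434) and that the
# "llama3" model has already been pulled with `ollama pull llama3`.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Explain what a provider is in one sentence.",
        "stream": False,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```

If this prints a completion, the same local endpoint can be used as a provider from within the IDE, and nothing is sent to an external service.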

Custom OpenAI Compatible

Connect to services that implement the OpenAI API. This works with alternative cloud providers such as Groq, Anyscale, and Together AI, or with private LLM deployments that follow the same API structure.
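
As an illustration, "OpenAI compatible" means the service can be reached with the standard OpenAI client simply by pointing it at a different base URL. In the sketch below, the base URL, API key, and model name are placeholders; substitute the values your provider supplies.

```python
from openai import OpenAI

# Minimal sketch: reach an OpenAI-compatible service by pointing the standard
# OpenAI Python client at a different base URL. The base_url, api_key, and
# model name are placeholders; use the values your provider gives you.
client = OpenAI(
    base_url="https://api.example-provider.com/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",
)

completion = client.chat.completions.create(
    model="provider-model-name",  # placeholder model id
    messages=[{"role": "user", "content": "Say hello from a custom provider."}],
)
print(completion.choices[0].message.content)
```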

Find detailed setup instructions for each provider type in the following sections.