Last updated: September 11th, 2024

If you’re trying to build a rough intuition of how Cody works from the client to the backend, this page is for you. Let’s start with a diagram showing all the different components that make up “Cody”.

cody-architecture.svg

💡 Here’s the Excalidraw file if you want to make changes to the diagram:

cody-architecture.excalidraw

The diagram above is a bit overwhelming, so let’s break it down into smaller pieces, starting with the Cody clients.

What are the Cody clients?

Cody clients are our user-facing Cody products. These are artifacts that we publish to various distribution channels, allowing our customers and users to install and interact with Cody on their computers.

| Client | Programming Language | Runtime | Distribution channel | Status |
|---|---|---|---|---|
| VS Code | TypeScript | Node.js | VS Code Marketplace, Open VSX | GA |
| JetBrains | Kotlin | JVM | JetBrains Marketplace | GA |
| Cody Web | TypeScript | Browsers | Sourcegraph Code Search | GA |
| Cody CLI | TypeScript | Node.js | npm | Experimental |
| Visual Studio | C# | CLR | Visual Studio Marketplace | EAP |
| Eclipse | Java | JVM | GitHub Pages (Eclipse Site URL) | EAP |
| VS Code (Web) | TypeScript | Browsers | VS Code Marketplace | ? |

What features do the Cody clients support?

Our official docs have a detailed overview of which features are supported by which clients. Simplified, Cody features fall into three rough categories: Chat, Edit, and Autocomplete. Each of these product categories helps our users solve problems with varying degrees of complexity.

CleanShot 2024-09-11 at 09.36.29@2x.png
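If it helps to have a concrete handle on that split, here it is as a TypeScript type. This is purely illustrative; it’s not a type from any Cody codebase:

```typescript
// Illustrative only: the three rough product categories described above,
// not a real type from the Cody clients.
type CodyFeature = "chat" | "edit" | "autocomplete";
```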

What are the LLM providers?

LLM providers are remote services that run language models on large GPU clusters. Unlike traditional web services that run on CPUs, language models require specialized GPU hardware that is more complicated to deploy and operate. Sourcegraph does not have the expertise to operate GPU clusters (yet), so we lean on external providers like Anthropic and OpenAI to provide this service.

| LLM Provider | Protocol | Billing |
|---|---|---|
| OpenAI | OpenAI | BYOK, Cody Gateway |
| Anthropic | Anthropic | BYOK, Cody Gateway |
| Google Gemini | Gemini | BYOK, Cody Gateway |
| Fireworks | OpenAI-ish | Cody Gateway |
| Azure OpenAI | OpenAI | BYOK |
| AWS Bedrock | Anthropic | BYOK |
| Google Vertex | Anthropic | BYOK |
| Self-hosted (OpenAI-compatible) | OpenAI | BYOK |
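To make the Protocol column concrete, here is a rough sketch of what a single chat turn looks like over the two most common wire protocols. The endpoint paths and headers follow the public OpenAI and Anthropic API docs; the API keys and model names are placeholders, and the Gemini protocol is omitted for brevity:

```typescript
// Sketch of the two dominant wire protocols from the table above.
// API keys and model names are placeholders.

// OpenAI protocol (also spoken, roughly, by Fireworks, Azure OpenAI, and
// OpenAI-compatible self-hosted servers):
const openaiResponse = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Explain this function." }],
  }),
});

// Anthropic protocol (also used, modulo auth details, when talking to
// AWS Bedrock and Google Vertex per the table above):
const anthropicResponse = await fetch("https://api.anthropic.com/v1/messages", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "x-api-key": process.env.ANTHROPIC_API_KEY ?? "",
    "anthropic-version": "2023-06-01",
  },
  body: JSON.stringify({
    model: "claude-3-5-sonnet-20240620",
    max_tokens: 1024, // required by the Anthropic protocol
    messages: [{ role: "user", content: "Explain this function." }],
  }),
});
```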

BYOK (bring your own key) is exclusively available to Cody Enterprise users. For Cody Free/Pro users, all requests must be routed through Cody Gateway.

What is Cody Gateway?

We have a dedicated page for Cody Gateway. In short, Cody Gateway solves a billing problem. Without Cody Gateway, every Cody customer would need to provide their own API keys for LLM providers like Anthropic and OpenAI. With Cody Gateway, Sourcegraph foots the bill for LLM spend, and customers pay a single Sourcegraph bill to access Cody features. Cody Gateway also handles functionality related to LLM access, such as rate limiting and abuse monitoring.
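As a mental model (emphatically not the real implementation), you can picture Cody Gateway as a thin authenticating proxy: identify the caller, enforce limits, then forward the request upstream on Sourcegraph’s own keys. Every name in this sketch is made up:

```typescript
// Mental model only: a minimal sketch of a gateway-style billing proxy.
// None of these names come from the actual Cody Gateway codebase.
interface Actor {
  id: string; // a Sourcegraph Enterprise instance or a Free/Pro user
}

declare function authenticate(token: string | null): Promise<Actor | null>;
declare function checkRateLimit(actor: Actor): Promise<boolean>;

async function handleGatewayRequest(req: Request): Promise<Response> {
  // 1. Identify who is calling before spending any money.
  const actor = await authenticate(req.headers.get("Authorization"));
  if (!actor) {
    return new Response("unauthorized", { status: 401 });
  }

  // 2. Enforce rate limits and abuse checks on Sourcegraph's side.
  if (!(await checkRateLimit(actor))) {
    return new Response("rate limited", { status: 429 });
  }

  // 3. Forward to the upstream provider using Sourcegraph's own API key,
  //    so the customer never needs a key of their own.
  return fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": process.env.SOURCEGRAPH_ANTHROPIC_KEY ?? "",
      "anthropic-version": "2023-06-01",
    },
    body: await req.text(),
  });
}
```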

Why does PLG Autocomplete go straight to Cody Gateway?

Our recommended Enterprise option (Cody Gateway) requires several network hops to serve an LLM Chat request: Client → Sourcegraph Instance → Cody Gateway → OpenAI/Anthropic/Fireworks. For typical Chat usage, these network hops add relatively little overhead because Chat uses large models like Sonnet 3.5 that have slow response times anyway.
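A back-of-envelope comparison shows why those hops matter so much more for Autocomplete than for Chat. All of the numbers here are invented for illustration, not measurements:

```typescript
// Back-of-envelope only: every number below is an assumption for
// illustration, not a measurement.
const hopOverheadMs = 50; // assumed cost of the extra hop through the Sourcegraph instance
const chatResponseMs = 5_000; // a large model streaming a full Chat answer
const autocompleteBudgetMs = 400; // a plausible end-to-end latency budget for completions

console.log(`Chat overhead: ${((hopOverheadMs / chatResponseMs) * 100).toFixed(1)}%`); // 1.0%
console.log(`Autocomplete overhead: ${((hopOverheadMs / autocompleteBudgetMs) * 100).toFixed(1)}%`); // 12.5%
```

Even under generous assumptions, an extra hop is rounding error for a multi-second Chat response but a double-digit share of an Autocomplete latency budget, which is why PLG Autocomplete skips the Sourcegraph instance and talks to Cody Gateway directly.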