---
parent: Connecting to LLMs
nav_order: 200
---

# Anthropic

To work with Anthropic's models, you need to provide your
[Anthropic API key](https://docs.anthropic.com/claude/reference/getting-started-with-the-api)
either in the `ANTHROPIC_API_KEY` environment variable or
via the `--anthropic-api-key` command line switch.

First, install aider:

{% include install.md %}

Then configure your API key:

```
export ANTHROPIC_API_KEY=<key> # Mac/Linux
setx   ANTHROPIC_API_KEY <key> # Windows, restart shell after setx
```

Start working with aider and Anthropic on your codebase:

```bash
# Change directory into your codebase
cd /to/your/project

# Aider uses Claude 3.7 Sonnet by default
aider

# List models available from Anthropic
aider --list-models anthropic/
```

{: .tip }
Anthropic has very low rate limits.
You can access all the Anthropic models via
[OpenRouter](openrouter.md)
or [Google Vertex AI](vertex.md)
with more generous rate limits.

You can use `aider --model <model-name>` to use any other Anthropic model.
For example, if you want to use a specific version of Opus
you could run `aider --model claude-3-opus-20240229`.

## Thinking tokens

Aider can work with Sonnet 3.7's new thinking tokens, but does not ask Sonnet
to use thinking tokens by default.

Enabling thinking currently requires manual configuration.
You need to add the following to your `.aider.model.settings.yml`
[model settings file](/docs/config/adv-model-settings.html#model-settings).
Adjust the `budget_tokens` value to change the target number of thinking tokens.

```yaml
- name: anthropic/claude-3-7-sonnet-20250219
  edit_format: diff
  weak_model_name: anthropic/claude-3-5-haiku-20241022
  use_repo_map: true
  examples_as_sys_msg: true
  use_temperature: false
  extra_params:
    extra_headers:
      anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
    max_tokens: 64000
    thinking:
      type: enabled
      budget_tokens: 32000 # Adjust this number
  cache_control: true
  editor_model_name: anthropic/claude-3-7-sonnet-20250219
  editor_edit_format: editor-diff
```

More streamlined support will be coming soon.
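
Until then, here is a minimal sketch of launching aider against that configuration, assuming you saved the YAML above as `.aider.model.settings.yml` in one of the locations aider checks (your home directory, the root of your git repo, or the directory where you launch aider):

```bash
# Launch aider with the thinking-enabled Sonnet model;
# the settings in .aider.model.settings.yml are picked up automatically
aider --model anthropic/claude-3-7-sonnet-20250219
```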