LLM Models
Configure and customize LLM models in Marvin
Marvin uses Pydantic AI as its underlying LLM engine, providing a clean, type-safe interface for working with various LLM providers.
The Default Model
By default, Marvin uses OpenAI’s GPT-4o model. To use this default configuration, you only need to set your OpenAI API key:
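For example, a minimal sketch that supplies the key from Python before calling Marvin (the key is normally set in your shell or a .env file; the top-level `marvin.run` helper is assumed here):

```python
import os

import marvin

# Normally set in your shell or a .env file; shown inline for completeness
os.environ["OPENAI_API_KEY"] = "sk-..."

# With no other configuration, this call runs against the default GPT-4o model
print(marvin.run("Write a haiku about type safety"))
```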
Configuring Models
Using String Identifiers
The simplest way to specify a model is with a string identifier in the format `provider:model_name`:
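For instance, a sketch that selects a specific OpenAI model by string identifier (the `model` keyword on `marvin.Agent` is assumed here):

```python
import marvin

# The "provider:model_name" format: here, OpenAI's gpt-4o-mini
agent = marvin.Agent(
    name="Writer",
    model="openai:gpt-4o-mini",
)

print(agent.run("Summarize the benefits of string model identifiers in one sentence."))
```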
Model Settings
You can customize model behavior using the `model_settings` parameter when creating an agent:
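A sketch of per-agent settings, assuming `model_settings` accepts a dictionary of Pydantic AI `ModelSettings` keys such as `temperature` and `max_tokens`:

```python
import marvin

agent = marvin.Agent(
    name="Careful Writer",
    model="openai:gpt-4o",
    # Keys follow Pydantic AI's ModelSettings; a lower temperature gives more
    # deterministic output, max_tokens caps the length of each response
    model_settings={"temperature": 0.2, "max_tokens": 512},
)
```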
Changing the Default Model
You can change the default model for all agents in your application:
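One way to do this is through Marvin's defaults object; the sketch below assumes it exposes a writable `model` attribute:

```python
import marvin

# Every agent that doesn't specify its own model now uses this one
marvin.defaults.model = "openai:gpt-4o-mini"

marvin.run("Say hello")  # runs against gpt-4o-mini
```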
Environment Variables
You can also configure the default model and other settings using environment variables:
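The exact variable names depend on your Marvin version; this sketch assumes a `MARVIN_AGENT_MODEL` variable controls the default model, alongside the provider's own API key variable:

```python
import os

# Hypothetical settings names; check your version's settings reference
os.environ["MARVIN_AGENT_MODEL"] = "openai:gpt-4o-mini"
os.environ["OPENAI_API_KEY"] = "sk-..."

# Import after setting the variables so they are picked up when settings load
import marvin

marvin.run("Confirm you are configured")
```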
Model Providers
Marvin supports any model provider that is compatible with Pydantic AI. Common providers include:
- OpenAI
- Anthropic
- Azure OpenAI
Each provider may require its own API key and configuration. Refer to the provider’s documentation for specific setup instructions.
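As an illustration, the same string-identifier pattern selects different providers, assuming the corresponding API keys are set in the environment (the model names below are examples only):

```python
import marvin

openai_agent = marvin.Agent(name="GPT Agent", model="openai:gpt-4o")
claude_agent = marvin.Agent(name="Claude Agent", model="anthropic:claude-3-5-sonnet-latest")

# Azure OpenAI typically needs extra configuration (endpoint, deployment name)
# beyond a string identifier; see the Pydantic AI docs for the details.
```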
Passing Configuration to Models
For more control, you can pass Pydantic AI model instances directly when creating an agent.
For example, to use Anthropic’s Claude 3.5 Sonnet model with a custom `httpx` client:
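A sketch under the assumption of a recent Pydantic AI release, where HTTP client customization is routed through a provider object (older releases accepted an `http_client` argument on the model directly):

```python
import httpx
import marvin
from pydantic_ai.models.anthropic import AnthropicModel
from pydantic_ai.providers.anthropic import AnthropicProvider

# A custom httpx client, e.g. with a longer timeout or proxy settings
http_client = httpx.AsyncClient(timeout=httpx.Timeout(30.0))

# Recent Pydantic AI releases route client customization through a provider
# object; older releases accepted http_client on the model itself.
model = AnthropicModel(
    "claude-3-5-sonnet-latest",
    provider=AnthropicProvider(http_client=http_client),
)

agent = marvin.Agent(name="Claude Agent", model=model)
print(agent.run("Briefly explain why a custom HTTP client can be useful."))
```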