# Using MCP Servers
Connect Marvin agents to Model Context Protocol (MCP) servers for extended capabilities.
MCP support is experimental and subject to change. Open an issue on GitHub if you’d like to see a feature added or have any feedback.
Marvin agents can connect to Model Context Protocol (MCP) servers, allowing them to access a wide array of external tools and data sources.
## What is MCP?
MCP provides a standardized way for AI agents to interface with tools, resources, prompts, and more. If a thing in the world has an MCP server, you can trivially write an agent that interacts with it (on the terms of the MCP server implementation).
## Connecting an Agent to an MCP Server
To give your agent access to an MCP server, simply pass `MCPServer` instances to the `mcp_servers` argument when creating your `Agent`.
Marvin uses `pydantic-ai` under the hood for MCP server interactions. The most common server type you’ll likely use is `MCPServerStdio`, which communicates with an MCP server running as a subprocess via its standard input/output.
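In outline, that looks like the following. This is a minimal sketch: the server command and its arguments are placeholders for whatever MCP server you actually want to run.

```python
import marvin
from pydantic_ai.mcp import MCPServerStdio

# Describe how to launch the MCP server as a subprocess speaking stdio.
# "some-mcp-server" and its arguments are placeholders.
server = MCPServerStdio(command="some-mcp-server", args=["--some-flag"])

# Hand the server(s) to the agent via the `mcp_servers` argument.
agent = marvin.Agent(
    name="MCP-enabled Agent",
    mcp_servers=[server],
)
```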
### Example: Using a Python Interpreter via MCP
Let’s say you want your agent to be able to run Python code. There are MCP servers that provide a Python interpreter as a tool. The `jsr:@pydantic/mcp-run-python` package is one such server that can be run with Deno.
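A sketch of what that could look like is below. The Deno flags and the `agent.run` call are illustrative; check the mcp-run-python documentation for the currently recommended invocation.

```python
import marvin
from pydantic_ai.mcp import MCPServerStdio

# Run the Python-interpreter MCP server as a Deno subprocess over stdio.
run_python_server = MCPServerStdio(
    command="deno",
    args=[
        "run",
        "-A",  # broad permissions for brevity; narrow these in real use
        "jsr:@pydantic/mcp-run-python",
        "stdio",
    ],
)

agent = marvin.Agent(
    name="Python Runner",
    instructions="Use the Python interpreter tool to compute exact answers.",
    mcp_servers=[run_python_server],
)

result = agent.run("Use Python to compute the 10th Fibonacci number.")
print(result)
```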
Make sure the command specified in `MCPServerStdio` (e.g., `"deno"` or `"uvx"`) is available in your system’s PATH, or provide the full path to the executable.
### Example: Using a Git Tool Server via MCP
Another common use case is interacting with Git repositories. The `mcp-server-git` package (runnable with `uvx`, part of `uv`) provides Git tools over MCP.
## How it Works
tldr: It’s basically just function calling (for now)
When an agent configured with `mcp_servers` runs:
- Marvin discovers the tools provided by each active MCP server.
- These discovered tools (e.g., `run_python_code` from the Python server, or `git_log` from the Git server) are made available to the LLM just like standard Python tools.
- If the LLM decides to use an MCP tool, Marvin’s engine handles the communication with the specified `MCPServer` instance.
- The result from the MCP server is processed and returned to the LLM, which then formulates the final response.
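Because MCP tools are surfaced alongside ordinary Python tools, you can mix the two on a single agent. Here is a sketch, assuming the `tools` argument accepts plain functions as it does elsewhere in Marvin:

```python
import marvin
from pydantic_ai.mcp import MCPServerStdio


def word_count(text: str) -> int:
    """A regular Python tool, offered to the LLM alongside MCP tools."""
    return len(text.split())


agent = marvin.Agent(
    name="Mixed-Tool Agent",
    tools=[word_count],
    mcp_servers=[MCPServerStdio(command="uvx", args=["mcp-server-git"])],
)

# The LLM can call either `word_count` or any tool discovered from the Git
# MCP server (e.g. `git_log`); Marvin routes each call to the right place.
print(agent.run("How many words are in the latest commit message?"))
```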
## Exploring Further
For a more comprehensive example demonstrating multiple MCP servers and tools, check out the example script in the Marvin repository.
This example showcases an agent that uses both a Python interpreter and Git tools via MCP to perform a multi-step task. You can run it locally to see MCP integration in action.
This setup allows Marvin agents to flexibly leverage a growing ecosystem of MCP-compatible tools and services.