Working with Multiple MCP Servers 🪐🪐

One MCP server is great, but real-world workflows often need several specialised back-ends – think scraping the web, querying a database, and editing files, all in one agent. Good news: MCP-Use was built for exactly this.

This tutorial shows you how to configure multiple servers, route tool calls, and keep resource usage in check.


1. Why Multi-Server?

| Use case | Required servers |
| --- | --- |
| Find Airbnb listings and screenshot the top result | `airbnb`, `playwright` |
| Generate a 3D model, then upload it via S3 | `blender`, `aws` |
| Crawl news sites, store results in Postgres, run SQL analysis | `playwright`, `filesystem`, `database` |

2. Configuration File

Create `multi_server_config.json`:

{ "mcpServers": { "airbnb": { "command": "npx", "args": ["-y", "@openbnb/mcp-server-airbnb", "--ignore-robots-txt"] }, "playwright": { "command": "npx", "args": ["@playwright/mcp@latest"], "env": {"DISPLAY": ":1"} } } }

Pro-tip: You can mix local processes and remote HTTP/WS endpoints in the same file. See the docs for field definitions.


3. Simple Agent that Uses Both Servers

```python
import asyncio

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI

from mcp_use import MCPClient, MCPAgent

load_dotenv()


async def main():
    # One client, two servers (airbnb + playwright) from the config file.
    client = MCPClient.from_config_file("multi_server_config.json")

    agent = MCPAgent(
        llm=ChatOpenAI(model="gpt-4o"),
        client=client,
        max_steps=40,
        verbose=True,
    )

    result = await agent.run(
        "Find a beachfront Airbnb in Barcelona for 2 nights next month, "
        "then open the first listing in a browser and screenshot it."
    )
    print("Final answer: ", result)

    # Close WebSocket sessions and terminate server subprocesses.
    await client.close_all_sessions()


if __name__ == "__main__":
    asyncio.run(main())
```

By default the agent has access to all tools provided by any server. The LLM decides which to call.


4. Explicit Server Routing

If you want to force a query to a particular server:

```python
await agent.run("Show me Airbnb listings in Tokyo", server_name="airbnb")
```

This is handy for multi-turn applications where you maintain conversation state yourself.
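For example, a two-turn flow might pin each turn to the server it needs (a sketch – the prompts, and the idea of feeding one answer into the next prompt, are illustrative rather than part of the MCP-Use API):

```python
# Turn 1: only the Airbnb server handles this query.
listings = await agent.run(
    "Show me beachfront Airbnb listings in Tokyo",
    server_name="airbnb",
)

# Turn 2: pass the previous answer along and route to the browser server.
screenshot_summary = await agent.run(
    f"Open the first of these listings and screenshot it:\n{listings}",
    server_name="playwright",
)
```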


5. Automatic Server Manager 🤖

For bigger configs it's better to let MCP-Use handle routing:

```python
agent = MCPAgent(
    …,
    use_server_manager=True  # 🔥 auto-detects based on tool name
)
```

The agent examines every tool call returned by the LLM, picks the correct server, and connects on demand. Idle servers are disconnected to save RAM.
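Spelled out with the same constructor arguments as in Section 3, enabling the manager is a one-line change (a sketch):

```python
agent = MCPAgent(
    llm=ChatOpenAI(model="gpt-4o"),
    client=client,
    max_steps=40,
    use_server_manager=True,  # pick the right server per tool call, connect on demand
)
```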


6. Performance Tips

  1. Set a sensible `max_steps` – each step might spin up a new process.
  2. Reuse sessions – share a single `MCPClient` across multiple agents if you're running them in the same process.
  3. Limit tools – pass `allowed_tools` or `disallowed_tools` to the agent to reduce confusion (see the sketch below).
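Tips 2 and 3 combined might look like this (a sketch – the tool names passed to `disallowed_tools` are hypothetical; use the names your servers actually expose):

```python
# One shared client, two specialised agents in the same process.
client = MCPClient.from_config_file("multi_server_config.json")
llm = ChatOpenAI(model="gpt-4o")

airbnb_agent = MCPAgent(
    llm=llm,
    client=client,
    disallowed_tools=["browser_navigate", "browser_take_screenshot"],  # hypothetical tool names
)
browser_agent = MCPAgent(
    llm=llm,
    client=client,
    disallowed_tools=["airbnb_search"],  # hypothetical tool name
)
```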

7. Cleanup Checklist ✅

```python
try:
    …
finally:
    await client.close_all_sessions()
```

This closes WebSocket connections and kills subprocesses. Forgetting it can leave orphaned browser processes eating memory.
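Applied to the agent from Section 3, the full pattern looks like this (a sketch):

```python
async def main():
    client = MCPClient.from_config_file("multi_server_config.json")
    agent = MCPAgent(llm=ChatOpenAI(model="gpt-4o"), client=client, max_steps=40)
    try:
        result = await agent.run(
            "Find a beachfront Airbnb in Barcelona for 2 nights next month"
        )
        print(result)
    finally:
        # Runs even if agent.run raises: closes WebSockets and kills subprocesses.
        await client.close_all_sessions()
```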


Congratulations – you now command an army of MCP servers! Combine them creatively and let your LLMs do the heavy lifting.