Quick Start with MCP-Use šŸš€

Welcome to the MCP-Use universe – the easiest way to plug any LLM into powerful, real-world tools! In this short guide you'll go from zero to a running agent in under five minutes.

This tutorial is the perfect place to start if you've just installed MCP-Use or if you want a refresher on the basics. When you're done, jump to the other tutorials in this folder for deeper dives.

0. Why MCP before MCP-Use?

Before diving into code, let's understand the playing field.

MCP (Model Context Protocol) is an open standard that lets an AI model call external tools in a safe, typed way – similar to how your smartphone apps request permissions. A tool can be anything:

  • open a web page and click a button
  • run a shell command
  • query a SQL database
  • generate a Blender mesh

MCP defines a common language so the model (host) says "I want to call browser.search with query=best ramen berlin" and an MCP server executes it, returning JSON results.
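On the wire this exchange is JSON-RPC. A simplified sketch of what the host's tool-call request might look like for the example above (field names follow the MCP spec, but treat the exact shape as illustrative rather than normative):

```python
import json

# Illustrative sketch of the JSON-RPC message a host sends to an MCP server
# for the browser.search example above (simplified; real messages carry more fields).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "browser.search",
        "arguments": {"query": "best ramen berlin"},
    },
}
print(json.dumps(request, indent=2))
```

The server executes the named tool with those arguments and replies with a JSON-RPC result the model can read.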

šŸ‘‰ In short: MCP turns any black-box LLM into a programmable agent that can actually get stuff done.

Real-world Stories

| Persona | Goal | How MCP helps |
| --- | --- | --- |
| āœˆļø Travel hacker | Compare Airbnb prices, then check Skyscanner flights, stash results in Notion. | Servers: airbnb, playwright, filesystem |
| šŸ“° Research analyst | Daily crawl of news sites, summarize top stories, dump into Slack. | Servers: browser, filesystem, custom slack |
| šŸ› ļø 3D artist | Procedurally create Blender scenes with an LLM, render thumbnails. | Server: blender |

Enter MCP-Use šŸ› ļø

MCP-Use is the Python glue that makes all of the above dead-simple:

  • Spin up or connect to many MCP servers āœ”ļø
  • Wrap them in an ergonomic async client āœ”ļø
  • Feed tool schemas to your favourite LangChain LLM āœ”ļø
  • Provide guard-rails like max_steps, allowed tools, automatic server routing āœ”ļø

If MCP is the protocol, MCP-Use is the power socket adapter.


1. Prerequisites

| Requirement | Why you need it |
| --- | --- |
| Python 3.11+ | MCP-Use targets modern Python for async/await bliss. |
| LLM provider keys | Any model that supports tool/function calling via LangChain chat models works. Examples: OPENAI_API_KEY, ANTHROPIC_API_KEY. |
| Node 18+ (optional but recommended) | Many official MCP servers are distributed as npx <package>, so having Node installed unlocks instant servers like Playwright or Airbnb. |
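If you want to confirm the interpreter you're about to use is recent enough, here's a quick standard-library check (nothing MCP-specific):

```python
import sys

# Check whether this interpreter meets MCP-Use's stated minimum (3.11).
ok = sys.version_info[:2] >= (3, 11)
print("Python OK" if ok else "Please upgrade to Python 3.11+")
```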

2. Installation

```shell
# 1. Core library
pip install mcp-use

# 2. Pick an LLM provider – here we use OpenAI
pip install langchain-openai

# 3. (Optional) Playwright browser MCP server
npm install -g @playwright/mcp  # or just use npx in the config later
```

Add your API keys to a .env file in your project root:

```
OPENAI_API_KEY="sk-…"
```

MCP-Use will automatically pick them up via python-dotenv.
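Under the hood, python-dotenv just parses KEY=value lines from that file into the process environment. A minimal sketch of that parsing (simplified for illustration, not the library's actual code):

```python
def parse_env_line(line: str):
    """Split one KEY="value" line the way a .env loader does (simplified sketch)."""
    key, _, value = line.partition("=")  # split on the first '=' only
    return key.strip(), value.strip().strip('"')

print(parse_env_line('OPENAI_API_KEY="sk-test"'))
```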


3. Your First Agent – "Find me coffee ā˜•"

Create a new file called hello_mcp.py and paste the following code:

```python
import asyncio

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

load_dotenv()  # šŸ”‘ Load API keys

# 1ļøāƒ£ Describe which MCP servers you want. Here we spin up Playwright in a headless browser.
CONFIG = {
    "mcpServers": {
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
            "env": {"DISPLAY": ":1"},  # required if you run inside Xvfb / CI
        }
    }
}


async def main():
    client = MCPClient.from_dict(CONFIG)
    llm = ChatOpenAI(model="gpt-4o")

    # 2ļøāƒ£ Wire the LLM to the client
    agent = MCPAgent(llm=llm, client=client, max_steps=20)

    # 3ļøāƒ£ Ask something that requires real web browsing
    result = await agent.run("Find the best specialty coffee in Berlin using Google Search")
    print("\nšŸ”„ Result:", result)

    # 4ļøāƒ£ Always clean up running MCP sessions
    await client.close_all_sessions()


if __name__ == "__main__":
    asyncio.run(main())
```

Run it:

```shell
python hello_mcp.py
```

If everything is set up correctly you'll watch MCP-Use boot the Playwright server, let the LLM pick browser actions, and finally print a human-readable answer.


4. How it Works (TL;DR)

  1. MCPClient starts the external server in a separate process.
  2. The server exposes tools (e.g. browser.search, browser.click).
  3. MCPAgent sends the available tools to your LLM.
  4. The LLM decides which tool to call, returns a JSON tool invocation, and MCP-Use executes it.
  5. Steps 3-4 repeat until the agent decides it's "done" or hits max_steps.
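The loop above can be sketched in plain Python. This is a toy model of the idea, not MCP-Use's actual internals – fake_llm, the tool registry, and the answer string are all invented for illustration:

```python
# Toy sketch of the agent loop described above (illustrative, not real MCP-Use code).

def fake_llm(history, tools):
    """Stand-in for the LLM: call the search tool once, then finish."""
    if not any(step[0] == "tool" for step in history):
        return {"tool": "browser.search", "args": {"query": "best coffee berlin"}}
    return {"done": True, "answer": "The Barn, Berlin"}

def run_agent(llm, tools, task, max_steps=5):
    history = [("task", task)]
    for _ in range(max_steps):                      # step 5: loop bounded by max_steps
        decision = llm(history, list(tools))        # step 3: LLM sees available tools
        if decision.get("done"):                    # step 5: agent decides it's "done"
            return decision["answer"]
        result = tools[decision["tool"]](**decision["args"])  # step 4: execute the call
        history.append(("tool", result))            # feed the result back to the LLM
    return "max_steps reached"

tools = {"browser.search": lambda query: f"results for {query!r}"}
print(run_agent(fake_llm, tools, "find coffee"))  # → The Barn, Berlin
```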

5. Next Steps ā–¶ļø

  • Build your own agent – check out first-agent.md for a fully-commented walkthrough.
  • Multiple servers – want to combine Airbnb + Browser + Filesystem? Hop over to multi-server.md.
  • Debugging – learn pro tips for logging and tracing in debugging.md.

Happy hacking – and don't forget to ⭐ the project on GitHub if this saved you time!
(GitHub Repo)