Overview
mcp-use is an MCP server framework and an SDK for building MCP clients and AI agents that connect to MCP servers.
| Section | Description |
|---|---|
| MCP Server | Set up MCP servers standalone or in an existing Express server |
| MCP Agent | Install and initialize an AI Agent with an MCP Client |
MCP Server
mcp-use is the complete MCP server framework for TypeScript.
It combines the official Model Context Protocol SDK with Express.js and React to enable both MCP protocol communication and HTTP endpoints for UI widgets and custom routes.
You can expose UI components to chat clients such as ChatGPT, Claude, and other MCP-UI compatible clients.
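As a rough sketch of the protocol side, here is what registering a tool looks like with the official @modelcontextprotocol/sdk that mcp-use builds on. The scaffolded project described below wires the server and its HTTP transport up for you (possibly through mcp-use's own helpers), so treat this only as an illustration of the underlying SDK; the server name, version, and tool are made up.
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js'
import { z } from 'zod'
// Illustrative server metadata
const server = new McpServer({ name: 'my-mcp-server', version: '1.0.0' })
// Register a simple tool that connected MCP clients can call
server.tool(
  'add',
  { a: z.number(), b: z.number() },
  async ({ a, b }) => ({
    content: [{ type: 'text', text: String(a + b) }]
  })
)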
Installation
The fastest way to scaffold a new MCP server is to use the create-mcp-use-app command.
npx create-mcp-use-app@latest my-mcp-server
cd my-mcp-server
npm run dev
This command will create a new MCP server with:
- A complete TypeScript MCP server project structure.
- Example MCP Tools and Resources to get you started.
- Example UI widget React components in the resources/ folder, exposed as tools and resources in both the ChatGPT Apps SDK and MCP-UI formats.
- Pre-configured build tools and dev server.
- All necessary dependencies installed.
- MCP Inspector to test your server.
Project Structure
After creation, your project will have this structure:
my-mcp-server/
├── resources/
│ └── component.ts # MCP-UI / OpenAI Apps SDK example
├── index.ts # MCP server entry point
├── package.json
├── tsconfig.json
└── README.md
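The exact contents of resources/component.ts depend on the template, but conceptually a UI widget is exposed as an MCP resource whose payload is HTML that compatible clients can render. Below is a minimal sketch using the official SDK directly, with a made-up ui:// URI and inline HTML standing in for the scaffolded React component:
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js'
const server = new McpServer({ name: 'my-mcp-server', version: '1.0.0' })
// Expose an HTML snippet as a resource; MCP-UI compatible clients can
// render text/html resources like this one as widgets.
server.resource('greeting-widget', 'ui://widgets/greeting', async uri => ({
  contents: [
    {
      uri: uri.href,
      mimeType: 'text/html',
      text: '<h1>Hello from an MCP widget!</h1>'
    }
  ]
}))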
Running Your MCP Server
Commands:
npm run dev # start the development server
npm run build # build the server
npm run start # start the production server
When you run your MCP server, it will be available at:
- MCP Endpoint: http://localhost:3000/mcp - For MCP client connections
- MCP Inspector: http://localhost:3000/inspector - Launched automatically so you can test your MCP server
Deploy Your MCP Server
You can deploy your MCP server on any platform: build it with npm run build and start the production server with npm run start.
Alternatively, you can deploy it on mcp-use Cloud.
Next Steps
- Core features: Learn how to create MCP tools, prompts and resources.
- UI Widgets: Expose UI components to chat clients compatible with ChatGPT Apps SDK and MCP-UI.
- Configuration: Advanced configuration and deployment options.
- Deploy Your Server: Deploy to production with one command.
MCP Agent
Installing LangChain Providers
mcp-use works with various LLM providers through LangChain. You’ll need to install the appropriate LangChain provider package for your chosen LLM:
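For example, install only the providers you plan to use; the package names below correspond to the API keys listed under Environment Setup:
npm install @langchain/openai       # OpenAI models
npm install @langchain/anthropic    # Anthropic (Claude) models
npm install @langchain/groq         # Groq models
npm install @langchain/google-genai # Google Gemini models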
Tool Calling Required: Only models with tool calling capabilities can be used with mcp-use. Make sure your chosen model supports function calling or tool use.
Environment Setup
Set up your environment variables in a .env file for secure API key management:
# LLM Provider Keys (set the ones you want to use)
OPENAI_API_KEY=your_api_key_here
ANTHROPIC_API_KEY=your_api_key_here
GROQ_API_KEY=your_api_key_here
GOOGLE_API_KEY=your_api_key_here
Your First Agent
Here’s a simple example to get you started:
import { ChatOpenAI } from '@langchain/openai'
import { config } from 'dotenv'
import { MCPAgent, MCPClient } from 'mcp-use'
async function main() {
  // Load environment variables
  config()

  // Create configuration object
  const configuration = {
    mcpServers: {
      playwright: {
        command: 'npx',
        args: ['@playwright/mcp@latest'],
        env: {
          DISPLAY: ':1'
        }
      }
    }
  }

  // Create MCPClient from configuration object
  const client = new MCPClient(configuration)

  // Create LLM
  const llm = new ChatOpenAI({ model: 'gpt-4o' })

  // Create agent with the client
  const agent = new MCPAgent({
    llm,
    client,
    maxSteps: 30
  })

  // Run the query
  const result = await agent.run(
    'Find the best restaurant in San Francisco USING GOOGLE SEARCH'
  )
  console.log(`\nResult: ${result}`)

  // Clean up
  await client.closeAllSessions()
}

main().catch(console.error)
Configuration Options
You can also load servers configuration from a config file:
import { loadConfigFile } from 'mcp-use'
const config = await loadConfigFile("browser_mcp.json")
const client = new MCPClient(config)
Example configuration file (browser_mcp.json):
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"],
      "env": {
        "DISPLAY": ":1"
      }
    }
  }
}
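A multi-server setup simply adds more entries under mcpServers. As an illustrative sketch, the filesystem server and its allowed directory below are placeholders you would adapt:
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}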
For multi-server setups, tool restrictions, and advanced configuration options, see the Configuration Overview.
Available MCP Servers
mcp-use supports any MCP server. Check out the Awesome MCP Servers list for available options.
Streaming Agent Output
Stream agent responses as they’re generated:
// Stream intermediate steps
for await (const step of agent.stream("your query here")) {
  console.log(`Tool: ${step.action.tool}`)
  console.log(`Result: ${step.observation}`)
}

// Or stream token-level events
for await (const event of agent.streamEvents("your query here")) {
  if (event.event === 'on_chat_model_stream') {
    process.stdout.write(event.data?.chunk?.text || '')
  }
}
Next Steps
Need Help? Join our community discussions on GitHub or check out the comprehensive examples in our repository!