Requirements
Your chosen LLM must support:
- Tool calling (also known as function calling): required for MCP tool execution
- Structured output: for type-safe responses (optional but recommended)
- Streaming: for real-time response streaming (optional)
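Tool-calling support can be checked up front. A minimal sketch, assuming the LangChain JS convention that chat models capable of tool calling expose a `bindTools` method; the startup guard itself is illustrative and not part of mcp-use:

```typescript
import { ChatOpenAI } from '@langchain/openai'

const llm = new ChatOpenAI({ model: 'gpt-4o' })

// LangChain chat models that support tool calling implement bindTools();
// failing fast here avoids confusing errors when the agent tries to call MCP tools.
if (typeof llm.bindTools !== 'function') {
  throw new Error('Selected model does not support tool calling and cannot execute MCP tools')
}
```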
Connect agents to OpenAI, Anthropic, Google, Groq, and more.
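Every example below passes a `client` to the agent. This is an MCPClient built from your MCP server configuration; a minimal sketch, assuming the `MCPClient.fromDict` helper and an illustrative filesystem server:

```typescript
import { MCPClient } from 'mcp-use'

// Illustrative configuration: replace the server name, command, and args with your own.
const client = MCPClient.fromDict({
  mcpServers: {
    filesystem: {
      command: 'npx',
      args: ['-y', '@modelcontextprotocol/server-filesystem', '/tmp']
    }
  }
})
```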
```typescript
import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'

// Initialize OpenAI model
const llm = new ChatOpenAI({
  model: 'gpt-4o',
  temperature: 0.7,
  apiKey: process.env.OPENAI_API_KEY // Optional: defaults to the OPENAI_API_KEY env var
})

// Create agent with the MCPClient configured for your servers
const agent = new MCPAgent({ llm, client })
```
```typescript
import { ChatAnthropic } from '@langchain/anthropic'
import { MCPAgent, MCPClient } from 'mcp-use'

// Initialize Claude model
const llm = new ChatAnthropic({
  model: 'claude-3-5-sonnet-20241022',
  temperature: 0.7,
  apiKey: process.env.ANTHROPIC_API_KEY // Optional: defaults to the ANTHROPIC_API_KEY env var
})

// Create agent with the MCPClient configured for your servers
const agent = new MCPAgent({ llm, client })
```
```typescript
import { ChatGoogleGenerativeAI } from '@langchain/google-genai'
import { MCPAgent, MCPClient } from 'mcp-use'

// Initialize Gemini model
const llm = new ChatGoogleGenerativeAI({
  model: 'gemini-pro',
  temperature: 0.7,
  apiKey: process.env.GOOGLE_API_KEY // Optional: defaults to the GOOGLE_API_KEY env var
})

// Create agent with the MCPClient configured for your servers
const agent = new MCPAgent({ llm, client })
```
```typescript
import { ChatGroq } from '@langchain/groq'
import { MCPAgent, MCPClient } from 'mcp-use'

// Initialize Groq model
const llm = new ChatGroq({
  model: 'llama-3.1-70b-versatile',
  temperature: 0.7,
  apiKey: process.env.GROQ_API_KEY // Optional: defaults to the GROQ_API_KEY env var
})

// Create agent with the MCPClient configured for your servers
const agent = new MCPAgent({ llm, client })
```
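Regardless of the provider, the resulting agent is used the same way. A short sketch; the query text is illustrative:

```typescript
// Ask the agent to complete a task; it will call MCP tools as needed.
const result = await agent.run('List the files available through the connected MCP server')
console.log(result)
```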