The MCPAgent works with any modern LLM provider through LangChain’s unified interface. Connect to OpenAI, Anthropic, Google, Groq, or any other LangChain-compatible provider that supports tool calling.

Requirements

Your chosen LLM must support:
  • Tool calling: Also known as function calling; required for the agent to execute MCP tools
  • Structured output: For type-safe responses (optional but recommended)
  • Streaming: For real-time response streaming (optional)
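
You can sanity-check these capabilities at runtime before wiring up an agent. A minimal sketch, assuming a LangChain JS chat model (the `bindTools` check is an informal heuristic, and the zod schema is just an example):

import { ChatOpenAI } from '@langchain/openai'
import { z } from 'zod'

const llm = new ChatOpenAI({ model: 'gpt-4o' })

// Tool calling: models that support it expose bindTools(),
// which MCPAgent relies on to pass MCP tools to the model
if (typeof llm.bindTools !== 'function') {
  throw new Error('Selected model does not support tool calling')
}

// Structured output (optional): responses shaped by a schema
const typed = llm.withStructuredOutput(z.object({ answer: z.string() }))

// Streaming (optional): consume tokens as they arrive
for await (const chunk of llm.stream('Say hi')) {
  process.stdout.write(chunk.content as string)
}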

Provider Integration Examples

OpenAI

import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'

// Initialize OpenAI model
const llm = new ChatOpenAI({
  model: 'gpt-4o',
  temperature: 0.7,
  apiKey: process.env.OPENAI_API_KEY  // Optional; defaults to the OPENAI_API_KEY env var
})

// Connect to your MCP servers (Playwright shown as an example)
const client = MCPClient.fromDict({
  mcpServers: {
    playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
  }
})

// Create agent
const agent = new MCPAgent({ llm, client })
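
Once created, the agent is used the same way regardless of provider. A minimal usage sketch (the query is just an example):

// Ask the agent to do something; it plans and executes MCP tool calls
const result = await agent.run('Summarize the open tabs in the browser')
console.log(result)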

Anthropic

import { ChatAnthropic } from '@langchain/anthropic'
import { MCPAgent, MCPClient } from 'mcp-use'

// Initialize Claude model
const llm = new ChatAnthropic({
  model: 'claude-3-5-sonnet-20241022',
  temperature: 0.7,
  apiKey: process.env.ANTHROPIC_API_KEY  // Optional; defaults to the ANTHROPIC_API_KEY env var
})

// Create agent (reuse an MCPClient set up as in the OpenAI example)
const agent = new MCPAgent({ llm, client })

Google Gemini

import { ChatGoogleGenerativeAI } from '@langchain/google-genai'
import { MCPAgent, MCPClient } from 'mcp-use'

// Initialize Gemini model
const llm = new ChatGoogleGenerativeAI({
  model: 'gemini-pro',
  temperature: 0.7,
  apiKey: process.env.GOOGLE_API_KEY  // Optional; defaults to the GOOGLE_API_KEY env var
})

// Create agent (reuse an MCPClient set up as in the OpenAI example)
const agent = new MCPAgent({ llm, client })

Groq

import { ChatGroq } from '@langchain/groq'
import { MCPAgent, MCPClient } from 'mcp-use'

// Initialize Groq model
const llm = new ChatGroq({
  model: 'llama-3.1-70b-versatile',
  temperature: 0.7,
  apiKey: process.env.GROQ_API_KEY  // Optional; defaults to the GROQ_API_KEY env var
})

// Create agent (reuse an MCPClient set up as in the OpenAI example)
const agent = new MCPAgent({ llm, client })

For more LLM providers and detailed integration examples, visit the LangChain Chat Models documentation.