Observability gives you visibility into your agent's behavior in production, making it easier to debug issues, optimize performance, and understand how your agents use tools and interact with LLMs.

What Gets Traced

When observability is enabled, mcp-use automatically captures:
  • Full execution traces: Complete agent workflow from start to finish
  • LLM calls: Model usage, prompts, completions, and token counts
  • Tool execution: Which tools were called, with what parameters, and their results
  • Performance metrics: Execution times for each step
  • Errors and exceptions: Full context when things go wrong
  • Conversation flow: Multi-turn conversation tracking
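To make the list above concrete, a single trace can be pictured as a nested record like the following. The field names here are hypothetical, chosen for illustration only; they are not the actual mcp-use or Langfuse schema:

```typescript
// Hypothetical trace shape, for illustration only --
// not the actual mcp-use or Langfuse schema.
interface ToolCall {
  name: string
  params: Record<string, unknown>
  result: unknown
  durationMs: number
}

interface LlmCall {
  model: string
  promptTokens: number
  completionTokens: number
  durationMs: number
}

interface AgentTrace {
  traceId: string
  steps: Array<{ llmCall?: LlmCall; toolCall?: ToolCall }>
  error?: { message: string; stack?: string }
  totalDurationMs: number
}

const example: AgentTrace = {
  traceId: 'trace-001',
  steps: [
    { llmCall: { model: 'gpt-4', promptTokens: 412, completionTokens: 58, durationMs: 900 } },
    { toolCall: { name: 'read_file', params: { path: '/data/sales.csv' }, result: 'ok', durationMs: 35 } }
  ],
  totalDurationMs: 935
}
```

Each step in the workflow contributes either an LLM call or a tool call, and errors carry their full context alongside the steps that led to them.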

Langfuse Integration

Langfuse is an open-source LLM observability platform. mcp-use integrates with it automatically: once your Langfuse credentials are set in the environment, agent runs are traced without any extra code.

Set Up Langfuse

Set Environment Variables

Copy your API keys from your Langfuse project settings and export them:

export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
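If you self-host Langfuse rather than using Langfuse Cloud, you also need to point the SDK at your instance. The Langfuse JS SDK reads a base-URL variable for this, shown here as LANGFUSE_BASEURL; verify the exact name against your SDK version's documentation:

```shell
# Optional: only needed for a self-hosted Langfuse instance.
# LANGFUSE_BASEURL is the base-URL variable used by the Langfuse JS SDK;
# check your SDK version's docs to confirm the exact name.
export LANGFUSE_BASEURL="https://your-langfuse-instance.example.com"
```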

Start Using

// Langfuse automatically initializes when mcp-use is imported
import { MCPAgent, MCPClient } from 'mcp-use'
import { ChatOpenAI } from '@langchain/openai'
import { config } from 'dotenv'

config() // Load Langfuse environment variables

const client = new MCPClient({
  mcpServers: {
    filesystem: {
      command: 'npx',
      args: ['-y', '@modelcontextprotocol/server-filesystem', '/path/to/allowed/files']
    }
  }
})

const llm = new ChatOpenAI({ model: 'gpt-4' })
const agent = new MCPAgent({
  llm,
  client,
  maxSteps: 30
})

// All agent runs are automatically traced!
const result = await agent.run("Analyze the sales data")

Advanced Configuration

Custom Metadata and Tags

You can add custom metadata and tags to your traces for better organization and filtering:

import { MCPAgent, MCPClient } from 'mcp-use'

const agent = new MCPAgent({
  llm,
  client,
  maxSteps: 30
})

// Set metadata that will be attached to all traces
agent.setMetadata({
  agent_id: 'customer-support-agent-01',
  version: 'v2.0.0',
  environment: 'production',
  customer_id: 'cust_12345'
})

// Set tags for filtering and grouping
agent.setTags(['customer-support', 'high-priority', 'beta-feature'])

// Run your agent - metadata and tags are automatically included
const result = await agent.run("Process customer request")
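Conceptually, metadata and tags set on the agent are merged into every trace payload it emits. The sketch below illustrates that merging behavior with a hypothetical helper; it is not the mcp-use implementation, just a self-contained model of the semantics:

```typescript
// Hypothetical sketch of how per-agent metadata and tags are attached
// to each trace payload -- not the actual mcp-use implementation.
type Metadata = Record<string, string>

interface TracePayload {
  name: string
  metadata: Metadata
  tags: string[]
}

function buildTracePayload(
  run: { name: string },
  agentMetadata: Metadata,
  agentTags: string[]
): TracePayload {
  // Copy the agent-level attributes so every run carries them.
  return { name: run.name, metadata: { ...agentMetadata }, tags: [...agentTags] }
}

const payload = buildTracePayload(
  { name: 'Process customer request' },
  { agent_id: 'customer-support-agent-01', environment: 'production' },
  ['customer-support', 'high-priority']
)
```

In Langfuse you can then filter traces by any of these tags or metadata fields, e.g. all production runs of a given agent version.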

Custom Callbacks

You can provide custom Langfuse callback handlers or other LangChain callbacks:

import { CallbackHandler } from 'langfuse-langchain'
import { MCPAgent } from 'mcp-use'

// Create a custom Langfuse handler
const customHandler = new CallbackHandler({
  publicKey: 'pk-lf-custom',
  secretKey: 'sk-lf-custom',
  baseUrl: 'https://custom-langfuse.com'
})

const agent = new MCPAgent({
  llm,
  client,
  callbacks: [customHandler] // Use custom callbacks instead of auto-detected ones
})