
vLLM Benchmark

eliovp-bv

4 stars · 0 tools · 6/26/2025

Benchmarks vLLM deployments by measuring throughput, latency, and token generation speed through natural language test configuration
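The metrics named above can be illustrated with a minimal sketch of the kind of measurement such a benchmark automates. This assumes an OpenAI-compatible vLLM endpoint at `http://localhost:8000` and a placeholder model name; neither the URL, the model, nor the function names are taken from this repository.

```python
import json
import time
import urllib.request

# Assumed local vLLM endpoint (vLLM serves an OpenAI-compatible API);
# adjust host/port to your deployment.
VLLM_URL = "http://localhost:8000/v1/completions"


def tokens_per_second(tokens: int, seconds: float) -> float:
    """Token generation speed: generated tokens over wall-clock time."""
    return tokens / seconds if seconds > 0 else 0.0


def measure_completion(prompt: str, max_tokens: int = 128) -> dict:
    """Time one completion request and derive latency and throughput."""
    payload = json.dumps({
        "model": "your-model",  # placeholder, not from this repository
        "prompt": prompt,
        "max_tokens": max_tokens,
    }).encode()
    req = urllib.request.Request(
        VLLM_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    start = time.perf_counter()
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    elapsed = time.perf_counter() - start
    generated = body["usage"]["completion_tokens"]
    return {
        "latency_s": elapsed,
        "tokens": generated,
        "tokens_per_s": tokens_per_second(generated, elapsed),
    }
```

A real benchmark run would repeat `measure_completion` across many prompts and concurrency levels and aggregate the results; this sketch only shows the per-request arithmetic.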

Other

Quick Stats

GitHub Stars: 4
Tools Available: N/A
Status: approved

Repository Information

Tools (0)

No tools are documented for this server.

Installation

  1. Clone the repository from GitHub
  2. Install dependencies (e.g. npm install for a Node-based server, or pip install -r requirements.txt for a Python-based one)
  3. Configure environment variables (if required)
  4. Add the server to your MCP configuration
  5. Restart your MCP client

📖 See the repository's README for detailed installation instructions and configuration examples.
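Step 4 above can be sketched as a Claude Desktop-style `mcpServers` JSON entry. The server name, launch command, script path, and environment variable below are illustrative placeholders, not taken from this repository; check its README for the actual values.

```json
{
  "mcpServers": {
    "vllm-benchmark": {
      "command": "python",
      "args": ["/path/to/vllm-benchmark/server.py"],
      "env": {
        "VLLM_ENDPOINT": "http://localhost:8000"
      }
    }
  }
}
```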