vLLM Benchmark
eliovp-bv
4 stars
0 tools
6/26/2025
Benchmarks vLLM deployments by measuring throughput, latency, and token generation speed through natural language test configuration
Other
Quick Stats
- GitHub Stars: 4
- Tools Available: N/A
- Status: approved
Repository Information
Repository URL:
Documentation:
Tools (0)
No tools are documented for this server.
Installation
- Clone the repository from GitHub
- Install dependencies (usually `npm install`)
- Configure environment variables (if required)
- Add the server to your MCP configuration
- Restart your MCP client
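The configuration step above usually means adding an entry to your MCP client's config file (e.g. `claude_desktop_config.json` for Claude Desktop). A minimal sketch is shown below; the server name, command, script path, and environment variable are all assumptions, since this listing documents no tools or configuration, so check the repository's README for the actual values:

```json
{
  "mcpServers": {
    "vllm-benchmark": {
      "command": "node",
      "args": ["/path/to/vllm-benchmark/index.js"],
      "env": {
        "VLLM_ENDPOINT": "http://localhost:8000"
      }
    }
  }
}
```

After saving the config, restart your MCP client so it picks up the new server entry.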
📖 See the repository's README for detailed installation instructions and configuration examples.