Docker image for installing and running tools for LLM agents (MCP, OpenAPI, UVX, NPX, Python)
- Python / Node.js runtime - includes `python`, `node`, `uvx`, `npx`
- Extra packages for managing MCP/OpenAPI tools and connections:
  - `mcpo` - MCP to OpenAPI bridge
  - `supergateway` - MCP STDIO/SSE bridge
  - `@modelcontextprotocol/inspector` - debugging tool for MCP
- Utils: `curl`, `jq`, `git`
- Unified cache at `/app/cache` for all tools
```bash
# Launch MCP tools in stdio mode
docker run ghcr.io/av/tools uvx mcp-server-time
```
```bash
# Bridge from MCP to OpenAPI
docker run -p 8000:8000 ghcr.io/av/tools uvx mcpo -- uvx mcp-server-time --local-timezone=America/New_York
# http://0.0.0.0:8000/docs -> see endpoint documentation
```
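Once the bridge is up, each MCP tool becomes a plain HTTP endpoint. A sketch of calling one of them - the endpoint name `get_current_time` is taken from `mcp-server-time`'s tool listing and may differ for other servers; check `/docs` for the exact paths:

```shell
# Call a bridged tool endpoint (endpoint name assumed from the server's tool list)
curl -X POST http://localhost:8000/get_current_time \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'
```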
```bash
# Run MCP inspector
docker run -p 6274:6274 -p 6277:6277 ghcr.io/av/tools npx @modelcontextprotocol/inspector
```
```bash
# Persist the cache volume for quick restarts
# -v cache:/app/cache - named docker volume
# -v /path/to/my/cache:/app/cache - cache on the host
docker run -v cache:/app/cache ghcr.io/av/tools uvx mcp-server-time
```
The same setup with Docker Compose:
```yaml
services:
  time:
    image: ghcr.io/av/tools
    command: uvx mcp-server-time
    volumes:
      - cache:/app/cache

  fetch:
    image: ghcr.io/av/tools
    command: uvx mcpo -- uvx mcp-server-fetch
    ports:
      - 7133:8000
    volumes:
      - cache:/app/cache

# Named volumes must be declared at the top level
volumes:
  cache:
```
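After `docker compose up`, the bridged `fetch` service is reachable on the mapped host port. A sketch of listing the endpoints mcpo generated - this assumes mcpo serves a standard OpenAPI schema at `/openapi.json`, as its `/docs` page suggests:

```shell
# List the generated endpoints for mcp-server-fetch (7133 is the host port from the compose file)
curl -s http://localhost:7133/openapi.json | jq '.paths | keys'
```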
Check out Harbor for a complete dockerized LLM environment.