jaspertvdm/mcp-server-ollama-bridge
Bridge to a local Ollama LLM server. Run Llama, Mistral, Qwen, and other local models through MCP.
Data & APIs | Claude Desktop | Cursor | Windsurf
Installation
Install command:
$ npx mcp-server-ollama-bridge
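To use the bridge from an MCP client such as Claude Desktop, the server is typically registered in the client's configuration file. A minimal sketch, assuming the standard MCP client config format; the `"ollama-bridge"` key is an arbitrary name, and the `OLLAMA_HOST` entry (Ollama's default endpoint is `http://127.0.0.1:11434`) is an assumption about how this particular server locates Ollama:

```json
{
  "mcpServers": {
    "ollama-bridge": {
      "command": "npx",
      "args": ["mcp-server-ollama-bridge"],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434"
      }
    }
  }
}
```

Note that Ollama itself must be running locally with at least one model pulled (e.g. `ollama pull mistral`) before the bridge can serve requests.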
Test History
No test history yet. This tool is queued for testing.
Tool Info
GitHub
View on GitHub
Author
jaspertvdm
GitHub Stars
⭐ 1
Last Tested
Apr 14, 2026, 06:31 PM
Status
PASSING