by Ollama Community
Local LLM execution through Ollama. Run Llama, Mistral, and other open-source models locally with MCP integration. Privacy-focused with no API costs.
npx -y @ollama/mcp-server

{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": [
        "-y",
        "@ollama/mcp-server"
      ]
    }
  }
}

Config file location: ~/Library/Application Support/Claude/claude_desktop_config.json
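Before Claude Desktop launches the MCP server, Ollama itself must be installed and serving at least one model. A minimal setup sketch, assuming the Ollama CLI is on your PATH; "llama3.2" is an example model name, substitute any model from the Ollama library:

```shell
# Check that the Ollama CLI is installed before pulling a model.
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2   # download the model weights locally
  ollama list            # confirm the model appears in the local library
else
  echo "Ollama is not installed; download it from https://ollama.com" >&2
fi
```

Once a model is pulled, the MCP server can route requests to it locally, so no API key or network access to a hosted model is needed.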