
Ollama (Local LLMs)

by Ollama Community

GitHub · 1.6k stars

Local LLM execution through Ollama. Run Llama, Mistral, and other open-source models locally with MCP integration. Privacy-focused with no API costs.

CAPABILITIES

  • Local model execution
  • Llama and Mistral support
  • No API costs
  • Privacy-focused
  • Custom model loading
  • On-premise deployment
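
Local model execution and custom model loading (above) build on Ollama's standard CLI workflow: any model pulled or created locally is what the server can serve. A minimal sketch, assuming Ollama is already installed; the model names are examples:

Terminal
ollama pull llama3.2
ollama pull mistral
ollama list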

INSTALLATION

Terminal
npx -y @ollama/mcp-server
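
Run by hand, this command appears to hang: the server speaks JSON-RPC over stdio and waits for a client such as Claude Desktop. Before launching it, you can confirm the local Ollama daemon it depends on is reachable (default port 11434) by querying its API:

Terminal
curl http://localhost:11434/api/tags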

CLAUDE DESKTOP CONFIG

claude_desktop_config.json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": [
        "npx", "-y",
        ""@ollama/mcp-server"
      ]
    }
  }
}

Config file location (macOS): ~/Library/Application Support/Claude/claude_desktop_config.json
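
To exercise the server outside Claude Desktop, the official MCP TypeScript SDK can spawn the same command and list whatever tools it exposes. A minimal sketch, assuming @modelcontextprotocol/sdk is installed; the file name is illustrative and the exact tool names depend on the server version:

smoke-test.ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server the same way Claude Desktop does (see config above).
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@ollama/mcp-server"],
});

const client = new Client({ name: "ollama-mcp-check", version: "0.0.1" });
await client.connect(transport);

// A successful listTools() call proves the MCP handshake works;
// the tool names printed here vary by server version.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

await client.close();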

QUICK FACTS

Category: AI & LLM
Difficulty: Advanced
Maintained By: Ollama Community
NPM Package: @ollama/mcp-server
