
local-llm-obsidian-knowledge-base

A template repository that includes a dev container for running a local LLM alongside an integrated knowledge base. Add a git repository using `git subtree` or `git submodule`, and update it through an MCP client/server relationship, e.g. a VS Code extension such as `Cline` paired with the `Filesystem` or `Obsidian-MCP` MCP server.

Catalog only · STDIO

Overview

local-llm-obsidian-knowledge-base is a template repository designed for running a local Large Language Model (LLM) along with an integrated knowledge base.

To use this project, clone the repository and set up the development container. You can add a git repository using git subtree or git submodule, and manage updates through an MCP Client/Server relationship using a VS Code extension like Cline and the Filesystem/Obsidian-MCP MCP server.
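The submodule/subtree workflow described above can be sketched as follows. This is a minimal example, not part of the repository itself: the vault URL (`https://example.com/you/obsidian-vault.git`) and the `knowledge-base` directory name are placeholders for your own repository and preferred path.

```shell
# Option A: track the vault as a submodule (pins a specific commit,
# updated explicitly). Replace the URL with your own vault repository.
git submodule add https://example.com/you/obsidian-vault.git knowledge-base
git commit -m "Add knowledge base as submodule"

# Later, pull upstream changes into the submodule:
git submodule update --remote knowledge-base

# Option B: merge the vault's history in with subtree
# (no .gitmodules file; contents live directly in this repo).
git subtree add --prefix=knowledge-base \
    https://example.com/you/obsidian-vault.git main --squash

# Later, pull upstream changes:
git subtree pull --prefix=knowledge-base \
    https://example.com/you/obsidian-vault.git main --squash
```

Submodules keep the vault's history separate and pin it to an exact commit; subtrees copy the content into this repository so collaborators need no extra clone step. Either way, the MCP server then reads and writes notes inside `knowledge-base/`.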

  • Template repository for local LLM deployment
  • Integration with Obsidian for knowledge management
  • Support for git submodules and subtrees for version control
  1. Running a local instance of a Large Language Model for personal projects.
  2. Managing and updating knowledge bases using version control.
  3. Enhancing productivity with integrated development tools in VS Code.

Add to your AI client

Use these steps to connect local-llm-obsidian-knowledge-base in Cursor, Claude, VS Code, and other MCP-compatible apps. The same JSON configuration is used in each client; only the file location and top-level key differ.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "local-llm-obsidian-knowledge-base-psfw": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-local-llm-obsidian-knowledge-base-psfw"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "local-llm-obsidian-knowledge-base-psfw": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-local-llm-obsidian-knowledge-base-psfw"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "local-llm-obsidian-knowledge-base-psfw": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-local-llm-obsidian-knowledge-base-psfw"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "local-llm-obsidian-knowledge-base-psfw": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-local-llm-obsidian-knowledge-base-psfw"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "local-llm-obsidian-knowledge-base-psfw": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-local-llm-obsidian-knowledge-base-psfw"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "local-llm-obsidian-knowledge-base-psfw": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-local-llm-obsidian-knowledge-base-psfw"
      ]
    }
  }
}

FAQ

Can I use this project for any LLM?

Yes! This template can be adapted to work with various LLMs to suit your requirements.

Is there any support for beginners?

The repository includes documentation to help beginners set up and use the project effectively.

How do I contribute to this project?

You can contribute by submitting issues or pull requests on the GitHub repository.