
Todo Assistant with AI and Google Calendar Integration


AI-powered todo assistant with Google Calendar integration using OpenAI's API and Model Context Protocol (MCP) support for natural language task management.

Catalog only · STDIO

Overview

OpenAI Todo Assistant Server is a lightweight Express server that uses OpenAI's Assistants API to manage a todo list.

To use the server: clone the repository, install the dependencies, add your OpenAI API key to a .env file, build the TypeScript code, and start the server. By default it listens at http://localhost:3000.
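The setup steps above can be sketched as shell commands. The repository URL, .env key name, and npm script names are assumptions, not confirmed by this page, so check the project README for the exact values.

```shell
# Clone the repository (URL is an assumption; substitute the actual repo URL)
git clone https://github.com/mertadali/mcp-server-openai-chat.git
cd mcp-server-openai-chat

# Install dependencies
npm install

# Provide your OpenAI API key via a .env file (key name assumed)
echo "OPENAI_API_KEY=sk-..." > .env

# Compile the TypeScript sources and start the server
npm run build
npm start

# The server should now respond at http://localhost:3000
```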

Features:

  • Integration with OpenAI's Assistants API for intelligent task management.
  • API endpoints for creating threads, sending messages, and retrieving chat history.
  • Easy setup and deployment using TypeScript and Express.

Use cases:

  1. Managing personal todo lists with AI assistance.
  2. Creating collaborative task management systems.
  3. Integrating with other applications for enhanced productivity.

Add to your AI client

Use these steps to connect Todo Assistant with AI and Google Calendar Integration in Cursor, Claude Desktop, Claude Code, VS Code, Windsurf, Cline, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-server-openai-chat-mertadali": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-chat-mertadali"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-server-openai-chat-mertadali": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-chat-mertadali"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-server-openai-chat-mertadali": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-chat-mertadali"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-server-openai-chat-mertadali": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-chat-mertadali"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-server-openai-chat-mertadali": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-chat-mertadali"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-server-openai-chat-mertadali": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-chat-mertadali"
      ]
    }
  }
}

FAQ

How do I set up the server?

Clone the repository, install dependencies, set up your OpenAI API key, build the code, and start the server.

What API endpoints are available?

The server provides endpoints for creating threads, sending messages, and retrieving chat history.
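A hedged sketch of how those endpoints might be called with curl. The exact paths, the thread ID format, and the request payload shape are assumptions for illustration, not confirmed by this page.

```shell
# Create a new conversation thread (path is an assumption)
curl -X POST http://localhost:3000/threads

# Send a message to a thread (thread ID and payload shape are assumptions)
curl -X POST http://localhost:3000/threads/thread_123/messages \
  -H "Content-Type: application/json" \
  -d '{"message": "Add buy milk to my todo list"}'

# Retrieve the chat history for a thread
curl http://localhost:3000/threads/thread_123/messages
```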

Is there any cost associated with using the OpenAI API?

Yes. OpenAI API usage is billed according to OpenAI's pricing model, typically per token.