MCP

What MCP is, how it helps AI tools understand Dreapex TMM, and how to add the Dreapex TMM MCP server in mainstream AI tools.

This page explains how to connect the Dreapex TMM MCP endpoint to AI tools so they can use the current Dreapex TMM documentation while answering questions.

The MCP URL is:

https://tmm-docs.dreapex.com/mcp

What MCP is

MCP stands for Model Context Protocol.

It is an open protocol that standardizes how AI applications connect to external services such as documentation. After you connect an MCP server to an AI tool, the model can consult that documentation while generating responses instead of relying only on what it memorized during training.

For Dreapex TMM, that means the AI can answer with documentation-backed details about:

  • how to build a multilayer stack;
  • how to configure wavelength, angle, and polarization;
  • how to interpret result pages such as reflectance, color, ellipsometry, and depth distribution;
  • how Dreapex TMM case studies are modeled.

MCP's role in AI understanding of Dreapex TMM

Thin-film simulation questions are usually specific. A useful answer often depends on the exact Dreapex TMM workflow, not on generic optics knowledge.

Once the Dreapex TMM MCP is connected:

  • the AI can consult the current documentation while answering;
  • the model can decide by itself when the Dreapex TMM docs are relevant;
  • the answer is more likely to match the real product workflow and terminology.

This is the main benefit: the AI becomes more grounded in the actual Dreapex TMM docs.

MCP URL

The Dreapex TMM MCP server is available at:

https://tmm-docs.dreapex.com/mcp

Use this exact URL when adding the server in an AI tool.

Add the Dreapex TMM MCP in AI tools

Claude Code

Add the server using the CLI:

claude mcp add --transport http dreapex-tmm-docs https://tmm-docs.dreapex.com/mcp
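To confirm the server was registered, you can inspect the configuration from the same CLI (recent versions of Claude Code provide `list` and `get` subcommands under `claude mcp`):

```shell
# List all configured MCP servers; the new entry should appear
claude mcp list

# Show the stored configuration for this server by name
claude mcp get dreapex-tmm-docs
```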

Cursor

If this page offers a one-click Install in Cursor button, use it.

Or manually create or update .cursor/mcp.json in your project root:

{
  "mcpServers": {
    "dreapex-tmm-docs": {
      "type": "http",
      "url": "https://tmm-docs.dreapex.com/mcp"
    }
  }
}
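If you prefer to script this edit, the sketch below merges the server into an existing mcp.json without clobbering other entries. It uses only the Python standard library; the file path and server name simply mirror the example above.

```python
import json
from pathlib import Path

def add_dreapex_server(config_path: str = ".cursor/mcp.json") -> dict:
    """Merge the Dreapex TMM docs server into mcp.json, keeping existing entries."""
    path = Path(config_path)
    # Load the existing config if present; start from an empty one otherwise
    config = json.loads(path.read_text()) if path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["dreapex-tmm-docs"] = {
        "type": "http",
        "url": "https://tmm-docs.dreapex.com/mcp",
    }
    # Ensure the .cursor directory exists, then write the merged config back
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config, indent=2) + "\n")
    return config
```

Restart Cursor after writing the file so it picks up the new server.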

Visual Studio Code

Ensure the GitHub Copilot and GitHub Copilot Chat extensions are installed.

If this page offers a one-click Install in VS Code button, use it.

Or manually create or update the .vscode/mcp.json file:

{
  "servers": {
    "dreapex-tmm-docs": {
      "type": "http",
      "url": "https://tmm-docs.dreapex.com/mcp"
    }
  }
}

Windsurf

  1. Open Windsurf and navigate to Settings > Windsurf Settings > Cascade.
  2. Click the Manage MCPs button, then select the View raw config option.
  3. Add the following configuration:
{
  "mcpServers": {
    "dreapex-tmm-docs": {
      "type": "http",
      "url": "https://tmm-docs.dreapex.com/mcp"
    }
  }
}

This file is typically located in your home directory at:

~/.codeium/windsurf/mcp_config.json

Zed

  1. Open Zed and go to Settings > Open Settings.
  2. Navigate to the JSON settings file.
  3. Add the following context server configuration:
{
  "context_servers": {
    "dreapex-tmm-docs": {
      "source": "custom",
      "command": "npx",
      "args": ["mcp-remote", "https://tmm-docs.dreapex.com/mcp"],
      "env": {}
    }
  }
}
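Note that Zed launches the server as a local command: mcp-remote is a small bridge that proxies the remote HTTP endpoint over stdio. Assuming Node.js and npx are installed, you can sanity-check the bridge outside Zed:

```shell
# Run the stdio-to-HTTP bridge manually; on success it connects to the
# endpoint and waits for JSON-RPC messages on stdin instead of exiting
# with an error (press Ctrl+C to stop it)
npx mcp-remote https://tmm-docs.dreapex.com/mcp
```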

This file is typically located in your home directory at:

~/.config/zed/settings.json

Confirming MCP is working

After setup, ask a Dreapex TMM-specific question, for example:

  • How do I build a DBR structure in Dreapex TMM?
  • How do I locate a Tamm plasmon resonance?
  • Which result page should I use to inspect electric-field localization?

In many AI tools, you do not need to explicitly tell the model to use MCP. Once the server is connected, the model can decide to use the Dreapex TMM documentation when it determines that the docs are relevant.

However, if the answer still looks generic or you suspect the AI is not using the documentation, it is useful to say so directly in your prompt, for example:

Please use the Dreapex TMM Docs MCP to answer this question.

That is not always required, but it is a practical way to force a docs-grounded answer when needed.

Troubleshooting

If the setup does not work as expected, check these items first:

  1. Make sure the URL is exactly https://tmm-docs.dreapex.com/mcp.
  2. Make sure the server is configured as an http MCP server.
  3. Restart the AI tool after changing the configuration.
  4. Confirm the MCP server is shown as enabled in the AI tool's MCP settings.
  5. Ask a clearly Dreapex TMM-specific question so the model has a strong reason to consult the docs.
  6. If the answer still looks generic, explicitly say: Please use the Dreapex TMM Docs MCP to answer this question.
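As a lower-level connectivity check, the endpoint speaks JSON-RPC 2.0 over HTTP. The sketch below builds a standard MCP initialize request; POSTing it to the URL and receiving any JSON-RPC response (even an error) confirms the endpoint is reachable. The protocolVersion and clientInfo values here are illustrative, not required by Dreapex TMM.

```python
import json

MCP_URL = "https://tmm-docs.dreapex.com/mcp"

def build_initialize_request(request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 'initialize' request as defined by MCP."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Illustrative version string; your client may advertise another
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "manual-check", "version": "0.0.1"},
        },
    }

if __name__ == "__main__":
    # POST this payload to MCP_URL with Content-Type: application/json
    # and Accept: application/json, text/event-stream (per the MCP
    # Streamable HTTP transport) using curl or an HTTP client.
    print(json.dumps(build_initialize_request(), indent=2))
```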
Copyright © 2026 Dreapex