
What This Enables

Hosted MCP lets an AI client connect to Lamina without asking you to paste an API key into tool arguments. The client discovers Lamina’s OAuth metadata, sends you to Lamina to approve access for a workspace, receives an access token, and then calls the remote MCP endpoint with Authorization: Bearer <token>.

Remote MCP endpoint:
https://app.uselamina.com/mcp/agent
Hosted install surface:
https://app.uselamina.com/mcp/install
Use the install page when you want copyable endpoint values, OAuth metadata links, and setup patterns for remote OAuth-capable MCP clients or local stdio fallback clients. Prefer the hosted endpoint whenever the client supports remote Streamable HTTP MCP with OAuth. Use the local stdio fallback when a client only supports local servers or cannot complete OAuth.

Supported Clients

The compatibility matrix below covers the popular AI tools that can connect to Lamina MCP. Additional clients can use the same hosted endpoint if they support remote Streamable HTTP MCP with OAuth, or the local stdio fallback if they only support local MCP servers.

Quick Compatibility Matrix

| Client | Recommended Lamina setup | Notes |
| --- | --- | --- |
| ChatGPT | Hosted remote MCP | Remote MCP only. Full write-tool support depends on plan, workspace settings, and developer mode availability. |
| Codex CLI | Hosted remote MCP | Works through the Codex MCP CLI/config flow. |
| VS Code with Copilot | Hosted remote MCP | Use type: "http" in mcp.json; VS Code handles authentication-capable MCP servers. |
| Claude Code | Hosted remote MCP | Add HTTP transport, then use /mcp to authenticate. |
| Claude.ai and Claude Desktop | Hosted remote MCP when available; local stdio fallback otherwise | Use hosted custom connectors when the Claude plan/client supports them. |
| Cursor | Hosted remote MCP | Cursor supports Streamable HTTP and OAuth-capable MCP servers. |
| Windsurf | Hosted remote MCP | Add a Streamable HTTP server and refresh Cascade tools. |
| Continue | Local stdio fallback first | Use remote streamable-http only if your Continue build can complete OAuth for remote MCP. |
| Cline | Hosted remote MCP when OAuth is available; local stdio otherwise | Cline supports HTTP and stdio MCP configuration. |
| Zed | Hosted remote MCP on recent builds; local stdio fallback | Recent Zed builds support OAuth for remote MCP servers. |
| Devin | Hosted remote MCP | Add Lamina as a custom HTTP MCP server in Devin’s MCP Marketplace. |
| Raycast | Hosted remote MCP | Add Lamina through Raycast’s MCP server install flow. |
| Goose | Hosted remote MCP when OAuth is available; local command-line extension otherwise | Add Lamina as a remote Streamable HTTP extension or a command-line extension. |
| Gemini Code Assist | Hosted remote MCP through Gemini settings | Use native HTTP support when available or mcp-remote fallback. |
| Gemini CLI | Hosted remote MCP when OAuth is available; local stdio otherwise | Gemini CLI supports HTTP, SSE, and stdio MCP servers. |

Setup

Connect your AI client to Lamina MCP and authorize access to your workspace.

Install with add-mcp

Install Lamina MCP for supported coding agents with the community add-mcp installer:
npx add-mcp https://app.uselamina.com/mcp/agent
Add -y to skip confirmation prompts. Add -g to install globally across projects:
npx add-mcp -g https://app.uselamina.com/mcp/agent
If add-mcp does not detect your client or cannot complete OAuth, use the manual client-specific instructions below.

ChatGPT

ChatGPT custom MCP connectors are remote-only. They are controlled by ChatGPT plan and workspace settings, and full MCP write actions may require developer mode or workspace admin approval.
  1. In ChatGPT, open Settings.
  2. Open Apps or Connectors, then enable Developer mode if your plan/workspace requires it.
  3. Create a custom app/connector for a remote MCP server.
  4. Use this server URL:
https://app.uselamina.com/mcp/agent
  5. Complete Lamina OAuth, choose the workspace, and test with lamina_discover.
If ChatGPT reports that the connector is unsupported, verify that the workspace supports full MCP connectors and write tools. Some ChatGPT connector surfaces are read/fetch-oriented and may not expose Lamina’s creative write tools.

Codex CLI

Add Lamina as a remote MCP server:
codex mcp add lamina --url https://app.uselamina.com/mcp/agent
codex mcp list
Codex shares MCP configuration between the CLI and IDE extension. If your Codex build prompts for browser authentication, complete Lamina OAuth and choose the workspace. You can also configure the hosted server directly in ~/.codex/config.toml:
[mcp_servers.lamina]
url = "https://app.uselamina.com/mcp/agent"

VS Code with Copilot

Create .vscode/mcp.json for a workspace install:
{
  "servers": {
    "lamina": {
      "type": "http",
      "url": "https://app.uselamina.com/mcp/agent"
    }
  }
}
Then restart the MCP server from the MCP server list or reload VS Code. When prompted, trust the server and complete OAuth. In Copilot Chat Agent mode, use the tools picker to enable Lamina tools. For a local fallback, use stdio with an input variable:
{
  "inputs": [
    {
      "id": "lamina-api-key",
      "type": "promptString",
      "description": "Lamina API key",
      "password": true
    }
  ],
  "servers": {
    "lamina-local": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@uselamina/mcp"],
      "env": {
        "LAMINA_API_KEY": "${input:lamina-api-key}"
      }
    }
  }
}

Devin

Devin supports custom MCP servers through the MCP Marketplace.
  1. Open Devin Settings > MCP Marketplace.
  2. Select Add Your Own.
  3. Choose HTTP as the transport.
  4. Set the server URL:
https://app.uselamina.com/mcp/agent
  5. Save the server, then complete Lamina OAuth when Devin prompts for authorization.
If Devin asks for headers, leave them empty for the hosted OAuth flow. Lamina issues workspace-scoped tokens after OAuth consent.

Raycast

Raycast supports installing MCP servers from its MCP server management commands.
  1. Open Raycast and run the MCP Install Server command.
  2. Enter these details:
Name: Lamina
Transport: HTTP
URL: https://app.uselamina.com/mcp/agent
  3. Click Install and authenticate with Lamina if prompted.
If your Raycast build only supports local stdio MCP servers, use:
Name: Lamina
Command: npx
Arguments: -y @uselamina/mcp
Environment: LAMINA_API_KEY=YOUR_LAMINA_API_KEY

Claude Code

Add the hosted HTTP server:
claude mcp add --transport http lamina --scope user https://app.uselamina.com/mcp/agent
Inside Claude Code, run:
/mcp
Choose Lamina and authenticate in the browser. Verify the install:
claude mcp list
claude mcp get lamina
For local stdio fallback:
claude mcp add --transport stdio lamina-local \
  --env LAMINA_API_KEY=YOUR_LAMINA_API_KEY \
  -- npx -y @uselamina/mcp

Claude.ai and Claude Desktop

Claude Desktop primarily uses local MCP servers and Desktop Extensions. Until Lamina ships a DXT package, configure the local stdio package in claude_desktop_config.json:
{
  "mcpServers": {
    "lamina": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@uselamina/mcp"],
      "env": {
        "LAMINA_API_KEY": "YOUR_LAMINA_API_KEY"
      }
    }
  }
}
Restart Claude Desktop after saving the file. If your Claude client exposes a remote MCP connector surface, use the hosted endpoint instead:
https://app.uselamina.com/mcp/agent

Cursor

Create a project config at .cursor/mcp.json or a global config at ~/.cursor/mcp.json:
{
  "mcpServers": {
    "lamina": {
      "url": "https://app.uselamina.com/mcp/agent"
    }
  }
}
In Cursor, open MCP settings, refresh servers, and authenticate if prompted. For Cursor CLI:
cursor-agent mcp list
cursor-agent mcp login lamina
cursor-agent mcp list-tools lamina
For local stdio fallback:
{
  "mcpServers": {
    "lamina-local": {
      "command": "npx",
      "args": ["-y", "@uselamina/mcp"],
      "env": {
        "LAMINA_API_KEY": "YOUR_LAMINA_API_KEY"
      }
    }
  }
}

Windsurf

In Windsurf, open Settings > Tools > Windsurf Settings > Add Server, or edit the raw ~/.codeium/windsurf/mcp_config.json file:
{
  "mcpServers": {
    "lamina": {
      "serverUrl": "https://app.uselamina.com/mcp/agent"
    }
  }
}
Use url instead of serverUrl if your Windsurf build expects that key. Press refresh in Cascade after adding the server, then authenticate if prompted. Local fallback:
{
  "mcpServers": {
    "lamina-local": {
      "command": "npx",
      "args": ["-y", "@uselamina/mcp"],
      "env": {
        "LAMINA_API_KEY": "YOUR_LAMINA_API_KEY"
      }
    }
  }
}

Continue

Continue MCP servers run in agent mode. Create .continue/mcpServers/lamina.yaml:
name: Lamina MCP
version: 0.0.1
schema: v1
mcpServers:
  - name: Lamina
    type: streamable-http
    url: https://app.uselamina.com/mcp/agent
Use this hosted config only if your Continue build supports OAuth for remote MCP. Otherwise, use the local stdio fallback:
name: Lamina MCP
version: 0.0.1
schema: v1
mcpServers:
  - name: Lamina
    type: stdio
    command: npx
    args:
      - "-y"
      - "@uselamina/mcp"
    env:
      LAMINA_API_KEY: YOUR_LAMINA_API_KEY
Restart Continue and switch to agent mode before testing.

Cline

For hosted HTTP MCP:
cline mcp add lamina https://app.uselamina.com/mcp/agent --type http
If your Cline build cannot complete remote OAuth, edit ~/.cline/data/settings/cline_mcp_settings.json for local stdio:
{
  "mcpServers": {
    "lamina": {
      "command": "npx",
      "args": ["-y", "@uselamina/mcp"],
      "env": {
        "LAMINA_API_KEY": "YOUR_LAMINA_API_KEY"
      },
      "disabled": false
    }
  }
}

Zed

In Zed, open settings and add a custom context server:
{
  "context_servers": {
    "lamina": {
      "url": "https://app.uselamina.com/mcp/agent"
    }
  }
}
Recent Zed builds show an authentication action when a remote MCP server requires OAuth. If your build does not, use the local stdio fallback:
{
  "context_servers": {
    "lamina-local": {
      "command": "npx",
      "args": ["-y", "@uselamina/mcp"],
      "env": {
        "LAMINA_API_KEY": "YOUR_LAMINA_API_KEY"
      }
    }
  }
}
Open the Agent Panel settings and verify the Lamina server status indicator is active.

Goose

In Goose Desktop, open Extensions, choose Add custom extension, and add a Remote Extension (Streamable HTTP):
Name: Lamina
Endpoint URL: https://app.uselamina.com/mcp/agent
In Goose CLI:
goose configure
Choose Add Extension, then Remote Extension (Streamable HTTP), and enter the Lamina endpoint. If your Goose build cannot complete OAuth for remote MCP, add a command-line extension instead:
Name: Lamina
Command: npx
Arguments: -y @uselamina/mcp
Environment: LAMINA_API_KEY=YOUR_LAMINA_API_KEY

Gemini Code Assist

Gemini Code Assist shares MCP configuration with Gemini’s MCP settings. Use native HTTP support when your IDE build supports OAuth-capable remote MCP servers:
{
  "mcpServers": {
    "lamina": {
      "httpUrl": "https://app.uselamina.com/mcp/agent"
    }
  }
}
If your build does not complete OAuth for remote HTTP MCP, use mcp-remote:
{
  "mcpServers": {
    "lamina": {
      "command": "npx",
      "args": ["mcp-remote", "https://app.uselamina.com/mcp/agent"]
    }
  }
}
Restart the IDE after updating ~/.gemini/settings.json, then authenticate when prompted.

Gemini CLI

For hosted HTTP MCP:
gemini mcp add --transport http --scope user lamina https://app.uselamina.com/mcp/agent
gemini mcp list
You can also configure ~/.gemini/settings.json:
{
  "mcpServers": {
    "lamina": {
      "httpUrl": "https://app.uselamina.com/mcp/agent"
    }
  }
}
If your Gemini CLI build cannot complete OAuth for remote HTTP MCP, use local stdio:
gemini mcp add --scope user \
  --env LAMINA_API_KEY=YOUR_LAMINA_API_KEY \
  lamina-local npx -y @uselamina/mcp
Avoid underscores in Gemini MCP server aliases: Gemini’s policy engine treats underscores as separators when parsing tool names, so an alias containing them can be misparsed.

OAuth Discovery

Lamina exposes standard MCP OAuth metadata for HTTP clients:
GET /.well-known/oauth-protected-resource/mcp/agent
GET /.well-known/oauth-authorization-server
The protected-resource response identifies the MCP resource:
{
  "resource": "https://app.uselamina.com/mcp/agent",
  "authorization_servers": ["https://app.uselamina.com"],
  "bearer_methods_supported": ["header"],
  "scopes_supported": [
    "lamina:creative:read",
    "lamina:creative:write",
    "lamina:brand:read"
  ]
}
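The well-known path above follows the standard protected-resource metadata convention: the client inserts the well-known segment between the resource’s origin and its path. A minimal sketch of that derivation (the helper name is illustrative, not part of Lamina’s API):

```python
from urllib.parse import urlparse

def protected_resource_metadata_url(resource: str) -> str:
    """Derive the well-known protected-resource metadata URL for an
    MCP resource, per the convention shown above: the well-known
    segment sits between the origin and the resource path."""
    parts = urlparse(resource)
    return (
        f"{parts.scheme}://{parts.netloc}"
        f"/.well-known/oauth-protected-resource{parts.path}"
    )

print(protected_resource_metadata_url("https://app.uselamina.com/mcp/agent"))
# → https://app.uselamina.com/.well-known/oauth-protected-resource/mcp/agent
```

This is why the Lamina resource at /mcp/agent maps to /.well-known/oauth-protected-resource/mcp/agent rather than a root-level well-known path.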

Install Flow

  1. Register the MCP client
     Clients that support dynamic registration call POST /mcp/oauth/register with client_name and redirect_uris. Public clients use token_endpoint_auth_method: "none".
  2. Start authorization
     The client opens /mcp/oauth/authorize with response_type=code, client_id, redirect_uri, scope, state, resource, code_challenge, and code_challenge_method=S256.
  3. Approve workspace access
     Lamina shows a protected consent page. Choose the workspace the agent should use, then approve the requested scopes.
  4. Exchange the code
     The client posts to /mcp/oauth/token with grant_type=authorization_code, the authorization code, the same redirect_uri, client_id, resource, and the PKCE code_verifier.
  5. Call MCP
     The client calls /mcp/agent with Authorization: Bearer <access_token>.
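Steps 2 and 4 hinge on PKCE. A sketch of how a client might generate the verifier/challenge pair and assemble the authorization URL, using the parameter names from the flow above (the helper names and example client_id/redirect_uri are illustrative; a client may request only a subset of the scopes):

```python
import base64
import hashlib
import secrets
from urllib.parse import urlencode

def make_pkce_pair() -> tuple[str, str]:
    # code_verifier: high-entropy URL-safe string (RFC 7636)
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # code_challenge for S256: base64url(SHA-256(verifier)), unpadded
    digest = hashlib.sha256(verifier.encode()).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

def build_authorize_url(client_id: str, redirect_uri: str, state: str, challenge: str) -> str:
    # Parameters match step 2 of the install flow above.
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "lamina:creative:read lamina:creative:write lamina:brand:read",
        "state": state,
        "resource": "https://app.uselamina.com/mcp/agent",
        "code_challenge": challenge,
        "code_challenge_method": "S256",
    }
    return "https://app.uselamina.com/mcp/oauth/authorize?" + urlencode(params)

verifier, challenge = make_pkce_pair()
url = build_authorize_url("my-client-id", "http://127.0.0.1:8765/callback", "xyz", challenge)
```

The client keeps the verifier private and sends only the challenge in step 2; the verifier is revealed in step 4 so the token endpoint can confirm both requests came from the same client.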

Scopes

  • lamina:creative:read allows discovery and run status.
  • lamina:creative:write allows creating and batching creative runs.
  • lamina:brand:read allows reading brand memory and guidance.
If a token lacks a required scope, Lamina returns an insufficient-scope challenge so the client can request step-up authorization.
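The step-up signal conventionally arrives as a Bearer WWW-Authenticate challenge (RFC 6750). A hedged sketch of extracting the missing scope from such a header — the exact header Lamina emits is not shown above, so this assumes the standard challenge format:

```python
import re

def parse_bearer_challenge(header: str) -> dict[str, str]:
    """Extract key="value" parameters from a Bearer WWW-Authenticate
    header, e.g. error and scope from an insufficient_scope challenge."""
    return dict(re.findall(r'(\w+)="([^"]*)"', header))

challenge = 'Bearer error="insufficient_scope", scope="lamina:creative:write"'
params = parse_bearer_challenge(challenge)
if params.get("error") == "insufficient_scope":
    # Re-run authorization requesting the missing scope.
    needed_scope = params["scope"]
```

A client that sees insufficient_scope would restart the authorization flow with the requested scope added, then retry the tool call with the new token.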

First Creative Run

After install, the agent should call:
{
  "tool": "lamina_create",
  "arguments": {
    "brief": "Create a cinematic launch image for a premium running shoe",
    "platform": "instagram",
    "modality": "image"
  }
}
The response includes a runId. If the run is still processing, call:
{
  "tool": "lamina_status",
  "arguments": {
    "runId": "run-id-from-create",
    "wait": true
  }
}
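On the wire, MCP clients wrap tool invocations like the ones above in JSON-RPC tools/call requests; the tool/arguments snippets map into the params object. A sketch of that envelope (the method name is standard MCP; the request id is arbitrary):

```python
import json

def tools_call(name: str, arguments: dict, request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 envelope an MCP client sends for a tool call."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# The lamina_create example above, as it would travel to /mcp/agent:
body = tools_call("lamina_create", {
    "brief": "Create a cinematic launch image for a premium running shoe",
    "platform": "instagram",
    "modality": "image",
})
```

Most clients in this guide build this envelope for you; it is shown only to clarify how the tool snippets relate to the HTTP requests hitting the hosted endpoint.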

Local Alternative

If your MCP client cannot install remote OAuth MCP servers, use the local stdio package:
npx -y @uselamina/mcp
Set LAMINA_API_KEY or sign in through the Lamina CLI. Local stdio MCP remains useful for coding agents, CI sandboxes, and environments that do not support remote MCP OAuth yet.