
Nx MCP Server Reference

The Nx MCP server is a Model Context Protocol implementation that gives LLMs deep access to your monorepo's structure: project relationships, file mappings, runnable tasks, ownership info, tech stacks, Nx generators, and Nx documentation.

For an overview of how Nx enhances AI assistants and the use cases this enables, see the Enhance Your LLM guide.

There are a few ways to set up the Nx MCP server:

mcp.json
{
  "servers": {
    "nx-mcp": {
      "type": "stdio",
      "command": "npx",
      "args": ["nx-mcp@latest"]
    }
  }
}

If you're using Cursor, VS Code, or JetBrains IDEs, install the Nx Console extension which automatically manages the MCP server for you.

Claude Code:
claude mcp add nx-mcp npx nx-mcp@latest

VS Code (CLI):
code --add-mcp '{"name":"nx-mcp","command":"npx","args":["nx-mcp"]}'

Alternatively, configure it in your VS Code settings or use the Nx Console extension for automatic setup.

VS Code and Cursor:

  1. Install Nx Console from the marketplace
  2. You'll receive a notification to "Improve Copilot/AI agent with Nx-specific context"
  3. Click "Yes" to configure the MCP server

If you miss the notification, run the nx.configureMcpServer command from the command palette (Ctrl/Cmd + Shift + P).

JetBrains IDEs:

  1. Install Nx Console from the marketplace
  2. You'll receive a notification to "Improve Copilot/AI agent with Nx-specific context"
  3. Click "Yes" to configure the MCP server

If you miss the notification, run the Nx: Setup MCP Server action from the Find Action dialog (Ctrl/Cmd + Shift + A).

In Warp, go to Settings -> AI -> Manage MCP Servers -> + Add to add an MCP server. Alternatively, use the /add-mcp slash command in the Warp Agent prompt.

{
  "nx-mcp": {
    "command": "npx",
    "args": ["nx-mcp@latest"]
  }
}

Configure the MCP server in your Claude Desktop settings:

claude_desktop_config.json
{
  "mcpServers": {
    "nx-mcp": {
      "command": "npx",
      "args": ["nx-mcp@latest"]
    }
  }
}

The nx-mcp command accepts the following options:

| Option | Alias | Description |
| --- | --- | --- |
| `[workspacePath]` | `-w` | Path to the Nx workspace root. Defaults to the current working directory if not provided. |
| `--transport <type>` | | Transport protocol to use: `stdio` (default), `sse`, or `http` |
| `--port <port>` | `-p` | Port to use for the HTTP/SSE server (default: 9921). Only valid with `--transport sse` or `--transport http`. |
| `--tools <patterns...>` | `-t` | Filter which tools are enabled. Accepts glob patterns including negation (e.g., `"*"`, `"!nx_docs"`, `"cloud_*"`) |
| `--disableTelemetry` | | Disable sending of telemetry data |
| `--debugLogs` | | Enable debug logging |
| `--help` | | Display help information |

If you want to host the server instead of communicating via stdio, use the --transport and --port flags:

npx nx-mcp@latest --transport sse --port 9921

The HTTP transport supports multiple concurrent connections, allowing different clients to connect simultaneously with independent sessions.
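Once the server is running over HTTP, clients communicate with it using ordinary MCP JSON-RPC messages. As a sketch, each client opens its own session with an initialize request like the following (the protocolVersion and clientInfo values are illustrative, and the exact endpoint path is determined by the MCP streamable HTTP convention rather than documented here):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```

Because each connecting client performs its own handshake, sessions remain independent of one another.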

You can limit which tools are available using the --tools option with glob patterns:

# Enable all tools (default)
npx nx-mcp@latest --tools "*"
# Disable specific tools
npx nx-mcp@latest --tools "*" "!nx_docs"
# Enable only cloud analytics tools
npx nx-mcp@latest --tools "cloud_*"
# Enable workspace tools only
npx nx-mcp@latest --tools "nx_workspace" "nx_project_details" "nx_docs"

This is useful when you want to restrict the LLM's capabilities or reduce noise in the tool list.
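To verify which tools survived the filter, an MCP client can issue a standard tools/list request (part of the MCP protocol itself, not specific to nx-mcp); the response will contain only the enabled tools:

```json
{ "jsonrpc": "2.0", "id": 2, "method": "tools/list" }
```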

Workspace tools:

| Tool | Description |
| --- | --- |
| `nx_workspace` | Returns a readable representation of the project graph and `nx.json` configuration. Also returns any project graph errors if present. |
| `nx_workspace_path` | Returns the path to the Nx workspace root |
| `nx_project_details` | Returns complete project configuration in JSON format for a specific project, including targets, dependencies, and metadata |
| `nx_docs` | Returns documentation sections relevant to user queries about Nx configuration and best practices |
| `nx_available_plugins` | Lists available Nx plugins from the core team and local workspace plugins |
Generator tools:

| Tool | Description |
| --- | --- |
| `nx_generators` | Returns a list of available code generators in your workspace |
| `nx_generator_schema` | Returns the detailed JSON schema for a specific Nx generator, including all available options |
| `nx_run_generator` | Opens the Nx Console Generate UI with prefilled options (requires a running IDE instance with Nx Console) |
Running task tools:

| Tool | Description |
| --- | --- |
| `nx_current_running_tasks_details` | Lists currently running Nx TUI processes and their task statuses |
| `nx_current_running_task_output` | Returns terminal output for specific running tasks |
Visualization tools:

| Tool | Description |
| --- | --- |
| `nx_visualize_graph` | Opens interactive project or task graph visualizations (requires a running IDE instance with Nx Console) |

These tools are only available when connected to an Nx Cloud-enabled workspace. They provide analytics and insights into your CI/CD data:

| Tool | Description |
| --- | --- |
| `cloud_analytics_pipeline_executions_search` | Analyzes historical pipeline execution data to identify trends and patterns in CI/CD workflows |
| `cloud_analytics_pipeline_execution_details` | Retrieves detailed data for a specific pipeline execution to investigate performance bottlenecks |
| `cloud_analytics_runs_search` | Analyzes historical run data to track performance trends and team productivity patterns |
| `cloud_analytics_run_details` | Retrieves detailed data for a specific run to investigate command execution performance |
| `cloud_analytics_tasks_search` | Analyzes aggregated task performance statistics including success rates and cache hit rates |
| `cloud_analytics_task_executions_search` | Analyzes individual task execution data to investigate performance trends over time |

When connected to an Nx Cloud-enabled workspace, the Nx MCP server automatically exposes recent CI Pipeline Executions (CIPEs) as MCP resources. Resources appear in your AI tool's resource picker, allowing the LLM to access detailed information about CI runs including:

  • Failed tasks and their error messages
  • Terminal output from task executions
  • Affected files in the pipeline run
  • Build timing and performance data
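Clients discover these resources through the standard MCP resources/list and resources/read methods. As a sketch, listing the available CIPE resources looks like this (the read request's URI is hypothetical; actual CIPE resource URIs are defined by the server and returned by resources/list):

```json
{ "jsonrpc": "2.0", "id": 3, "method": "resources/list" }
```

```json
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "resources/read",
  "params": { "uri": "nx-cloud://cipe/example-id" }
}
```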

Ask your AI assistant about your workspace structure:

What is the structure of this workspace?
How are the projects organized?

The nx_workspace tool provides:

  • All applications and libraries in your workspace
  • Project categorization through tags
  • Project and technology types (e.g., feature, UI, data-access)
  • Project ownership and team responsibilities

To understand a specific project's configuration:

Show me the details of the my-app project
What targets are available for my-lib?

The nx_project_details tool returns:

  • Complete target configuration
  • Project dependencies
  • Source and output paths
  • Tags and metadata
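Under the hood, the assistant fetches this with an MCP tools/call request. A sketch of such a call (the argument name shown is an assumption; consult the tool's schema from tools/list for the exact shape):

```json
{
  "jsonrpc": "2.0",
  "id": 5,
  "method": "tools/call",
  "params": {
    "name": "nx_project_details",
    "arguments": { "projectName": "my-app" }
  }
}
```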

Ask your AI assistant to scaffold new code:

Create a new React library in the packages/shared folder

The AI will use nx_generators to find available generators and nx_generator_schema to understand the options, then can invoke nx_run_generator to open the Generate UI with preset values.
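The schema-lookup step of that flow corresponds to a tools/call request along these lines (the argument name is an assumption, and the generator name is illustrative; check the tool schema via tools/list for the exact shape):

```json
{
  "jsonrpc": "2.0",
  "id": 6,
  "method": "tools/call",
  "params": {
    "name": "nx_generator_schema",
    "arguments": { "generatorName": "@nx/react:library" }
  }
}
```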

Get accurate guidance without hallucinations:

How do I configure Nx release for conventional commits?
What are the caching options for tasks?

The nx_docs tool retrieves relevant, up-to-date documentation based on your query.

When a CI build fails, your AI assistant can:

What failed in my last CI run?
Help me fix the build error

Using the Nx Cloud tools, the AI can:

  1. Access detailed information about the failed build
  2. Retrieve terminal output from failed tasks
  3. Understand what changed and suggest fixes