February 11, 2026

Introducing the Crusoe Cloud MCP server

We're releasing the Crusoe Cloud MCP server, connecting AI coding assistants like Claude Code and Cursor directly to your infrastructure with filtered responses and composite tools designed for how AI agents actually work.

Tim Harding
Senior Software Engineer

Claude Code and tools like Cursor are changing how developers work. A Google engineer recently described how Claude Code generated in one hour what her team had been working on for a year. Developers are using these AI coding assistants to build applications, refactor codebases, and automate complex workflows, all through natural language.

These tools work best when they can actually see and interact with the systems they're managing. That's where Model Context Protocol (MCP) comes in. MCP is an open standard that connects AI tools to external services through a standardized interface.

Today, we're releasing the Crusoe Cloud MCP server, which connects MCP-compatible tools like Claude Code and Cursor directly to your Crusoe Cloud infrastructure.

What it does

The Crusoe Cloud MCP server connects AI coding assistants directly to your Crusoe Cloud infrastructure. But it's not just a wrapper around the CLI or API. It's purpose-built for how AI agents actually consume information.

Every tool returns filtered, structured responses that give the AI exactly what it needs without flooding its context window with noise. When you ask about your VMs, you get names, states, IPs, and locations, not raw API payloads full of internal identifiers and empty fields. This matters because LLM context windows are finite and expensive: less noise means better reasoning and faster results.

The server also includes tools that go beyond what the CLI or console can do in a single operation, composing multiple API calls into higher-level views that would otherwise require manual scripting.

How MCP works

Model Context Protocol is an open standard developed by Anthropic that defines how AI applications securely connect to external data sources and tools. The MCP server acts as a bridge between your AI assistant and Crusoe Cloud's API, handling authentication, executing operations, and returning optimized responses that preserve actionable information while minimizing context usage.
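To make the bridge concrete, here is a sketch of the JSON-RPC messages MCP clients and servers exchange for a tool call. The message envelope follows the MCP specification; the tool name is from this post, but the argument and result payloads are invented for illustration.

```python
import json

# The client sends a JSON-RPC "tools/call" request naming the tool
# and its arguments (the "cluster_id" argument is hypothetical).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "cluster_health_check",
        "arguments": {"cluster_id": "my-cluster"},
    },
}

# The server runs the underlying Crusoe Cloud API calls and returns
# a filtered summary as the tool result (payload invented here).
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [
            {
                "type": "text",
                "text": json.dumps({"healthy_nodes": 62, "problem_nodes": 2}),
            }
        ]
    },
}

print(request["method"])  # tools/call
```

The assistant never talks to the Crusoe API directly; it only sees the structured result content the server chooses to return.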

The setup

Install the MCP server using npm. For Claude Code:

claude mcp add crusoe -- npx -y @crusoeai/cloud-mcp

For other MCP clients, add this configuration:

{
  "mcpServers": {
    "crusoe": {
      "command": "npx",
      "args": ["-y", "@crusoeai/cloud-mcp"]
    }
  }
}

The server uses your existing Crusoe Cloud credentials for authentication.

Use cases

The MCP server supports infrastructure workflows you're already running, just faster and through natural language:

Infrastructure discovery: Ask "what's running in my project?" and get_resource_relationships maps your entire infrastructure (VMs, disks, clusters, networks, load balancers, and how they connect) in a single call.

Cluster health monitoring: Tell your agent to "check my cluster health" and cluster_health_check returns per-pod health summaries with problem nodes flagged, organized by InfiniBand pod for locality-aware debugging.

Capacity planning: Check available instance types and specifications across regions, cross-referenced against your current quota and usage.

Troubleshooting: Start from the resource graph to identify relationships (which disk is attached to which VM, which VMs back a load balancer), then drill into specific resources for details.

These are read operations that give you visibility into your infrastructure. The server translates natural language queries into the right API calls and formats responses for your AI assistant to act on.

Designed for AI

Most cloud MCP servers wrap each CLI command or API endpoint as a tool. The AI calls the tool, gets back the same JSON a human would see, and figures it out from there.

We took a different approach. The Crusoe Cloud MCP server is built around two principles:

Filtered responses, not raw API dumps. Every tool returns a purpose-built summary type that strips away internal metadata, empty fields, and implementation details. For example, when listing VMs, the server returns a VMSummary with name, state, type, location, and IP addresses — not the full API object with dozens of fields the AI will never use. This keeps the AI's context window focused on actionable information.
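The filtering idea can be sketched in a few lines. The field names below are illustrative, not the actual Crusoe API schema: the point is that the summary keeps what the agent acts on and drops empty fields and internal identifiers.

```python
# Invented example of a raw VM payload, including typical noise.
RAW_VM = {
    "id": "vm-uuid-example",       # internal identifier the agent won't use
    "name": "training-node-01",
    "state": "RUNNING",
    "type": "a100-80gb.8x",
    "location": "us-east1-a",
    "network_interfaces": [
        {"public_ipv4": {"address": "203.0.113.7"},
         "private_ipv4": {"address": "10.0.0.7"}}
    ],
    "reservation_id": "",          # noise: empty field
    "internal_revision": 42,       # noise: implementation detail
}

SUMMARY_FIELDS = ("name", "state", "type", "location")

def vm_summary(raw: dict) -> dict:
    """Reduce a raw VM payload to the fields an agent actually uses."""
    summary = {k: raw[k] for k in SUMMARY_FIELDS if raw.get(k)}
    nics = raw.get("network_interfaces", [])
    if nics:
        summary["public_ip"] = nics[0]["public_ipv4"]["address"]
        summary["private_ip"] = nics[0]["private_ipv4"]["address"]
    return summary

print(vm_summary(RAW_VM))
```

A few dozen VMs summarized this way fit comfortably in context; the same list as raw payloads might not.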

Composite tools that do what the CLI can't. Two tools in particular illustrate this:

get_resource_relationships builds a graph of every resource in your project and the relationships between them: VMs attached to disks, subnets in networks, load balancers routing to instances, node pools containing VMs. It fetches 11 resource types in parallel, resolves cross-references (like matching load balancer backend IPs to VM private IPs), and returns a nodes-and-edges graph. This doesn't exist in the CLI or any single API call. The AI can use it to understand your infrastructure topology in one shot, then drill into specific resources as needed.
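A minimal sketch of the nodes-and-edges idea, with invented data: resources become nodes, and cross-references, like a load balancer backend IP matching a VM's private IP, become edges.

```python
# Invented resources for illustration.
vms = [{"name": "web-1", "private_ip": "10.0.0.5"},
       {"name": "web-2", "private_ip": "10.0.0.6"}]
disks = [{"name": "data-1", "attached_to": "web-1"}]
load_balancers = [{"name": "lb-1", "backend_ips": ["10.0.0.5", "10.0.0.6"]}]

nodes, edges = [], []
for vm in vms:
    nodes.append({"id": vm["name"], "kind": "vm"})
for disk in disks:
    nodes.append({"id": disk["name"], "kind": "disk"})
    edges.append({"from": disk["name"], "to": disk["attached_to"],
                  "rel": "attached_to"})

# Resolve the backend-IP -> VM cross-reference.
ip_to_vm = {vm["private_ip"]: vm["name"] for vm in vms}
for lb in load_balancers:
    nodes.append({"id": lb["name"], "kind": "load_balancer"})
    for ip in lb["backend_ips"]:
        if ip in ip_to_vm:
            edges.append({"from": lb["name"], "to": ip_to_vm[ip],
                          "rel": "routes_to"})

print(len(nodes), len(edges))  # 4 3
```

The real tool does this across 11 resource types fetched in parallel, but the output shape is the same: a graph the agent can traverse instead of a pile of disconnected lists.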

cluster_health_check provides real-time node-level health and utilization metrics for Kubernetes clusters, organized by InfiniBand pod for network locality awareness. It queries the observability API with PromQL, groups nodes by their physical pod placement, computes per-pod and cluster-level summaries, and flags problem nodes: unhealthy nodes for health checks, overloaded nodes for CPU, underutilized nodes for GPU. The response is pre-analyzed: instead of raw metric time series, the AI gets a structured summary with the worst offenders highlighted. This kind of operational intelligence doesn't exist in the CLI at all.
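The per-pod summarization step can be sketched like this. The node data, field names, and thresholds are all invented for illustration; the real tool derives them from PromQL query results.

```python
# Invented node metrics, already grouped-by-node from PromQL results.
nodes = [
    {"name": "node-a", "pod": "ib-pod-1", "healthy": True,  "cpu": 0.62, "gpu_util": 0.91},
    {"name": "node-b", "pod": "ib-pod-1", "healthy": False, "cpu": 0.10, "gpu_util": 0.00},
    {"name": "node-c", "pod": "ib-pod-2", "healthy": True,  "cpu": 0.97, "gpu_util": 0.12},
]

CPU_OVERLOADED = 0.90   # assumed thresholds
GPU_UNDERUSED = 0.20

report = {}
for n in nodes:
    pod = report.setdefault(n["pod"], {"nodes": 0, "unhealthy": [],
                                       "cpu_overloaded": [],
                                       "gpu_underutilized": []})
    pod["nodes"] += 1
    if not n["healthy"]:
        pod["unhealthy"].append(n["name"])       # failed health checks
    if n["cpu"] >= CPU_OVERLOADED:
        pod["cpu_overloaded"].append(n["name"])  # CPU hot spots
    if n["healthy"] and n["gpu_util"] <= GPU_UNDERUSED:
        pod["gpu_underutilized"].append(n["name"])  # idle GPUs

print(report["ib-pod-2"]["cpu_overloaded"])  # ['node-c']
```

The agent receives the report, not the raw time series, so "which pod has problems?" is answerable without any further computation.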

Why we built it this way

MCP servers for cloud infrastructure are becoming standard across providers; AWS, Google Cloud, and others have released their own implementations. What sets the Crusoe Cloud MCP server apart is treating the AI as a first-class consumer with different needs than a human at a terminal. Response filtering prevents context stuffing, composite tools provide views that don't exist in the CLI, and rate limiting and caching are designed for the burst-heavy access patterns of AI agents, which tend to make many rapid calls when exploring infrastructure.
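The caching point is worth illustrating. When an agent fires the same read several times in a burst, a short-lived cache lets repeat calls return instantly without re-hitting the API. This is a generic TTL-cache sketch, not the server's actual implementation.

```python
import time

class TTLCache:
    """Serve repeated reads from cache within a freshness window."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expiry_timestamp, value)

    def get_or_fetch(self, key, fetch):
        now = time.monotonic()
        hit = self._store.get(key)
        if hit and hit[0] > now:
            return hit[1]                 # fresh: no API call
        value = fetch()                   # stale or missing: fetch once
        self._store[key] = (now + self.ttl, value)
        return value

calls = 0
def fetch_vms():
    """Stand-in for a real API call; counts how often it runs."""
    global calls
    calls += 1
    return ["training-node-01"]

cache = TTLCache(ttl_seconds=30)
for _ in range(5):                        # a burst of identical reads
    vms = cache.get_or_fetch("list_vms", fetch_vms)
print(calls)  # 1
```

Five identical reads cost one API call, which matters when an exploring agent re-lists the same resources while reasoning through a task.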

Get started

Infrastructure management doesn't have to be slow. With the Crusoe Cloud MCP server, you get the visibility and control you need through natural language, without the overhead of raw API responses or manual scripting. Connect your AI coding assistant, ask what you need to know, and get back to building.

Ready to connect Claude Code or Cursor to Crusoe Cloud? Visit crusoe.ai/docs for setup instructions, or contact support@crusoe.ai with questions.
