heroku-mcp-server

Official · Local · Signed · GitHub Repo

Overview

The heroku-mcp-server is a Model Context Protocol (MCP) server that enables AI assistants and agents to interact directly with Heroku applications and platform resources through a structured, AI-friendly interface. It allows AI-driven workflows to inspect apps, manage deployments, review configuration, and reason about runtime state without switching tools or manually using the Heroku CLI or dashboard.

This server is well suited to application operations, debugging, deployment workflows, and platform management for teams that run workloads on Heroku.

Transport

stdio
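
A minimal sketch of how an MCP client can launch this server over stdio, using the TypeScript MCP SDK. The npm package name (@heroku/mcp-server) and the HEROKU_API_KEY environment variable are assumptions based on common conventions for this server; confirm the exact values in the GitHub repo before use.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import {
      StdioClientTransport,
      getDefaultEnvironment,
    } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Launch the server as a child process and talk to it over stdio.
    // Package and env var names below are assumptions; see the GitHub repo.
    const transport = new StdioClientTransport({
      command: "npx",
      args: ["-y", "@heroku/mcp-server"],
      env: {
        ...getDefaultEnvironment(), // keep PATH and friends so npx can run
        HEROKU_API_KEY: process.env.HEROKU_API_KEY ?? "",
      },
    });

    const client = new Client(
      { name: "heroku-assistant", version: "0.1.0" },
      { capabilities: {} },
    );
    await client.connect(transport);

    // Discover the tools the server advertises (list_apps, ps_list, ...).
    const { tools } = await client.listTools();
    console.log(tools.map((t) => t.name));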

Tools

  • list_apps
  • get_app_info
  • create_app
  • rename_app
  • transfer_app
  • deploy_to_heroku
  • deploy_one_off_dyno
  • ps_list
  • ps_scale
  • ps_restart
  • list_addons
  • get_addon_info
  • create_addon
  • maintenance_on
  • maintenance_off
  • get_app_logs
  • pipelines_create
  • pipelines_promote
  • pipelines_list
  • pipelines_info
  • list_teams
  • list_private_spaces
  • pg_psql
  • pg_info
  • pg_ps
  • pg_locks
  • pg_outliers
  • pg_credentials
  • pg_kill
  • pg_maintenance
  • pg_backups
  • pg_upgrade
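
Once connected, each entry above is invoked through the standard MCP tool-call mechanism. The sketch below, continuing from the connection example in the Transport section, calls list_apps and pg_psql; the argument names app and command are illustrative guesses rather than the server's documented schema, so check each tool's inputSchema from listTools before relying on them.

    // List the apps visible to the authenticated account.
    const apps = await client.callTool({ name: "list_apps", arguments: {} });
    console.log(apps.content);

    // Run an ad-hoc query against an app's Postgres add-on.
    // "app" and "command" are assumed parameter names, for illustration only.
    const rows = await client.callTool({
      name: "pg_psql",
      arguments: { app: "my-production-app", command: "SELECT now();" },
    });
    console.log(rows.content);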

Key Capabilities

  • Application discovery — Explore Heroku apps and their associated resources programmatically.
  • Configuration insight — Inspect environment variables, add-ons, and app metadata for debugging or auditing.
  • Operational awareness — Reason about dynos, process types, and runtime state.
  • Deployment and lifecycle workflows — Support AI-driven workflows around app updates and operational actions.
  • Developer productivity — Enable assistants to answer questions like “what’s running?” or “how is this app configured?”

How It Works

The heroku-mcp-server runs as an MCP service that connects to Heroku using an authenticated API token. AI clients communicate with the server over the MCP protocol to request application context or perform platform actions as part of broader reasoning workflows.

The server mediates all interaction with Heroku’s APIs, handling authentication, request execution, and response normalization. Results are returned in structured formats that AI assistants can reason over directly, while ensuring that all access respects Heroku’s permission model.
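
Concretely, tool results arrive as MCP content blocks rather than raw CLI output. Below is a small sketch of how a client might extract that text, assuming the server returns text content (whether a given tool's text is plain or JSON-encoded is tool-specific); the app argument is again an illustrative assumption.

    // Pull the text out of an MCP tool result's content blocks.
    function resultText(result: { content?: unknown }): string {
      const blocks = Array.isArray(result.content) ? result.content : [];
      return blocks
        .filter(
          (b): b is { type: "text"; text: string } =>
            typeof b === "object" &&
            b !== null &&
            (b as { type?: unknown }).type === "text" &&
            typeof (b as { text?: unknown }).text === "string",
        )
        .map((b) => b.text)
        .join("\n");
    }

    const logs = await client.callTool({
      name: "get_app_logs",
      arguments: { app: "my-production-app" }, // assumed parameter name
    });
    console.log(resultText(logs));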

By abstracting Heroku’s platform APIs behind MCP, the server allows AI-driven workflows — such as application inspection, configuration review, and operational analysis — to happen conversationally and programmatically inside a single environment.