---
title: Connect Gemini to Grafana via MCP
description: Integrate Gemini CLI with Grafana via Docker MCP Toolkit for natural language observability.
summary: Learn how to leverage the Model Context Protocol (MCP) to interact with Grafana dashboards and datasources directly from your terminal.
keywords: mcp, grafana, docker, gemini, devops
---
This guide shows how to connect Gemini CLI to a Grafana instance using the Docker MCP Toolkit. Before you begin, make sure you have:
- Gemini CLI installed and authenticated.
- Docker Desktop with the MCP Toolkit extension enabled.
- An active Grafana instance.
The MCP server requires a Service Account Token to interact with the Grafana API. Service Account Tokens are preferred over personal API keys because they can be revoked independently without affecting user access, and permissions can be scoped more narrowly.
- Navigate to Administration > Users and access > Service accounts in your Grafana dashboard.
- Create a new Service Account (e.g., gemini-mcp-connector).
- Assign the Viewer role (or Editor if you require alert management capabilities).
- Generate a new token. Copy the token immediately—you won't be able to view it again.
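Before wiring the token into the MCP Toolkit, it can be worth confirming it works against the Grafana HTTP API. The sketch below builds an authenticated request to Grafana's /api/org endpoint, which requires a valid token; the URL and token shown in the usage comment are placeholders, not values from this guide.

```python
import urllib.request


def grafana_auth_request(base_url, token, endpoint="/api/org"):
    """Build an authenticated request against the Grafana HTTP API.

    Service account tokens are sent as a Bearer token in the
    Authorization header; a 200 response from /api/org confirms
    the token is accepted.
    """
    return urllib.request.Request(
        base_url.rstrip("/") + endpoint,
        headers={"Authorization": f"Bearer {token}"},
    )


# Example usage (hypothetical URL and token — substitute your own):
# with urllib.request.urlopen(grafana_auth_request(
#         "https://grafana.example.com", "glsa_...")) as resp:
#     print(resp.status)  # 200 means the token is valid
```

A 401 or 403 response at this point usually means the token was mistyped or the service account lacks the role you expected.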
The Docker MCP Toolkit provides a pre-configured Grafana catalog item. This connects the LLM to the Grafana API.
- Open the MCP Toolkit in Docker Desktop.
- Locate Grafana in the Catalog and add it to your active servers.
- In the Configuration view, define the following:
- Grafana URL: The URL of your Grafana instance.
- Service Account Token: The token generated in the previous step.
To register the Docker MCP gateway within Gemini, update your global configuration file located at ~/.gemini/settings.json.
Ensure the mcpServers object includes the following entry:
```json
{
  "mcpServers": {
    "MCP_DOCKER": {
      "command": "docker",
      "args": [
        "mcp",
        "gateway",
        "run"
      ]
    }
  }
}
```
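If settings.json already exists with other options, you will want to merge this entry rather than overwrite the file. Here is a minimal sketch of that merge in Python; the function name and the temporary-path usage are illustrative, not part of Gemini CLI.

```python
import json
from pathlib import Path


def register_mcp_docker(settings_path):
    """Add (or update) the MCP_DOCKER gateway entry in Gemini's
    settings.json, preserving any existing configuration keys
    and any other MCP servers already registered."""
    path = Path(settings_path)
    settings = json.loads(path.read_text()) if path.exists() else {}
    settings.setdefault("mcpServers", {})["MCP_DOCKER"] = {
        "command": "docker",
        "args": ["mcp", "gateway", "run"],
    }
    path.write_text(json.dumps(settings, indent=2) + "\n")
    return settings


# Typical invocation against the real config file:
# register_mcp_docker(Path.home() / ".gemini" / "settings.json")
```

Because the function round-trips the existing JSON, previously configured servers under mcpServers are left untouched.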
Restart your Gemini CLI session to load the new configuration. Verify the status of the MCP tools by running:
```
> /mcp list
```
A successful connection shows MCP_DOCKER as Ready, exposing more than 60 tools for data fetching, dashboard searching, and alert inspection.
Try a simple prompt first: "List all Prometheus and Loki datasources."
Next, ask Gemini to investigate the Nginx logs on an edge device. Once it identifies Loki as the active datasource, it parses your intent and translates it into a precise LogQL query: {device_name="edge-device-01"} |= "nginx". This query targets the logs of that specific device, extracting raw OpenTelemetry (OTel) data that includes pod UIDs and container metadata. Instead of the user writing complex syntax, the prompt acts as the bridge to pull structured data from the containerized environment.
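For readers new to LogQL, that query splits into two parts (the inline annotations here are explanatory only; LogQL itself has no comment syntax):

```
{device_name="edge-device-01"}   <- stream selector: label matchers choosing the log stream
|= "nginx"                        <- line filter: keep only lines containing the string "nginx"
```

More filters and parsers can be chained after the selector, but this selector-plus-filter shape is the core pattern.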
In the final step, Gemini performs reasoning over the raw telemetry. After filtering through hundreds of lines to confirm the existence of Nginx logs, Gemini extracts a specific node_filesystem_device_error buried within the stream. By surfacing this critical event, it alerts the DevOps engineer to a volume mounting issue on the edge node, transforming raw data into an actionable incident report.
Other useful prompts include:
- How many dashboards do we have?
- Tell me the summary of X dashboard.
Imagine you get a page that an application is slow. You could:
- Use list_alert_rules to see which alert is firing.
- Use search_dashboards to find the relevant application dashboard.
- Use get_panel_image on a key panel to see the performance spike visually.
- Use query_loki_logs to search for "error" or "timeout" messages during the time of the spike.
- If you find the root cause, use create_incident to start the formal response and add_activity_to_incident to log your findings.
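As a session, that triage flow might look something like the following. The phrasing is illustrative — Gemini chooses which MCP tool to invoke based on your intent, so you describe the goal rather than name the tool:

```
> Which alert rules are currently firing?
> Find the dashboard for the application behind that alert.
> Show me an image of its latency panel.
> Search the Loki logs for "error" or "timeout" during the spike window.
> Create an incident for this and add our findings to its activity log.
```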
- Learn about Advanced LogQL queries
- Set up Team-wide MCP configurations
- Explore Grafana alerting with MCP
- Get help in the Docker Community Forums
Need help setting up your Docker MCP environment or customizing your Gemini prompts? Visit the Docker Community Forums or see the MCP Troubleshooting Guide.