From the course: Prometheus and Grafana: Visualizing Application Performance
Lab 2: Upgrading the assistant with MCP
- [Instructor] Now, let's use the Grafana MCP server to provide context and access to our dashboards without writing specialized RAG code. MCP allows AI assistants to interact directly with Grafana, making their decisions smarter. Grafana has an official MCP server implementation that is open source and available on GitHub. It is still in early development, but we can use it locally to explore the power of MCP. It already provides several tools, such as search_dashboards, get_dashboard_by_uid, query_prometheus, and so on. When we configure the MCP server with Grafana credentials, it will be able to run commands that interact with our local Grafana instance on port 9090. The first thing we'll do is clone the repository. We'll then build the server command and remember its location. Next, we can configure our backend server to use MCP instead. We have changed the USE_MCP environment variable to true, and for the MCP server config, we have set the Grafana URL and the full path to the MCP server…
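The clone, build, and configure steps described above could be sketched roughly as follows. This assumes the official grafana/mcp-grafana repository (which is written in Go); the GRAFANA_URL and GRAFANA_API_KEY variable names follow that project's README, while USE_MCP and the server-path variable are the course backend's own settings, and the URL, token, and path values shown are placeholders for your environment:

```shell
# Clone Grafana's official MCP server (open source, early development)
git clone https://github.com/grafana/mcp-grafana.git
cd mcp-grafana

# Build the server binary and remember its full path for the backend config
go build -o mcp-grafana ./cmd/mcp-grafana
echo "MCP server built at: $(pwd)/mcp-grafana"

# Configure the backend to use MCP instead of the RAG code path
export USE_MCP=true

# Grafana credentials for the MCP server (adjust URL and token to your instance)
export GRAFANA_URL=http://localhost:3000
export GRAFANA_API_KEY=<your-service-account-token>
export MCP_SERVER_PATH="$(pwd)/mcp-grafana"
```

With these variables set, the backend can launch the MCP server binary and let the assistant call its tools (search_dashboards, query_prometheus, and the rest) against the local Grafana instance.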
Contents
- Introduction to generative AI (3m 41s)
- How AI can help solve challenges in Grafana (3m 14s)
- Grafana and LLMs: Current features (3m 51s)
- The role of context awareness: RAG and MCPs (4m 14s)
- Lab 1: An LLM assistant for troubleshooting (6m 50s)
- Lab 2: Upgrading the assistant with MCP (5m 38s)