From the course: Prometheus and Grafana: Visualizing Application Performance
Lab 1: An LLM assistant for troubleshooting
- [Instructor] In the following labs, we'll explore a custom LLM plugin that connects Grafana dashboards with AI capabilities. This course is currently the only place you can find in-depth material on this, so take your time to understand it. If you can, follow along using the code in the repository. Before we begin, make sure you have cloned the course repository from GitHub.

Our custom plugin has two parts: a Grafana panel plugin that provides a text box to interact with an LLM, and a backend server that communicates with OpenRouter's LLM API.

First, let's set up the environment by configuring the required environment variables. (keyboard clicking) These are the required environment variables. Make sure that USE_MCP is set to false for this first demo. You will also need an OpenAI API key for the next demo, which you can get at the URL shown on the screen as well.

Next, it's time to start the backend server. (keyboard clicking) First, we're creating a virtual env. (keyboard…
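To make the two-part architecture concrete, here is a minimal sketch of what the backend's request to OpenRouter might look like. OpenRouter exposes an OpenAI-compatible chat completions endpoint; the function name, model choice, and system prompt below are illustrative assumptions, not the course's actual code.

```python
# Hypothetical sketch of the backend building a request for OpenRouter's
# OpenAI-compatible chat completions API. Function and prompt wording are
# assumptions for illustration; the endpoint URL follows OpenRouter's docs.
import json
import os

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(question: str, model: str = "openai/gpt-4o-mini"):
    """Build the headers and JSON body for a troubleshooting question
    typed into the Grafana panel's text box."""
    headers = {
        "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You help troubleshoot Grafana dashboards."},
            {"role": "user", "content": question},
        ],
    }
    return headers, json.dumps(body)
```

The panel plugin would POST the user's question to the backend, which sends a payload like this to `OPENROUTER_URL` and relays the model's reply back to the panel.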
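The setup steps narrated above can be sketched roughly as follows. The variable names and file names (`OPENROUTER_API_KEY`, `requirements.txt`, `server.py`) are assumptions; check the course repository for the exact ones used in the lab.

```shell
# Environment configuration for the first demo.
# Variable names are assumptions — see the course repository for the real ones.
export USE_MCP=false                    # MCP stays disabled for this first demo
export OPENROUTER_API_KEY="sk-or-..."   # placeholder; get a real key from openrouter.ai

# Create and activate a virtual environment for the backend server.
python3 -m venv .venv
. .venv/bin/activate

# Then install dependencies and start the server
# (file names below are assumptions):
# pip install -r requirements.txt
# python server.py
```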
Contents
- Introduction to generative AI (3m 41s)
- How AI can help solve challenges in Grafana (3m 14s)
- Grafana and LLMs: Current features (3m 51s)
- The role of context awareness: RAG and MCPs (4m 14s)
- Lab 1: An LLM assistant for troubleshooting (6m 50s)
- Lab 2: Upgrading the assistant with MCP (5m 38s)