From the course: Build with AI: SQL Agents with Large Language Models
Working with LLM APIs
- [Instructor] Are you ready to generate your very first SQL query with an LLM? In this video, we'll explore how to work with the OpenAI API Python SDK. The goal is simple: send a question or a prompt to an LLM and then parse its response. We'll be working inside this Jupyter notebook called 02_04, and you can find it under the Chapter 2 folder. Before we dive into the code, let's use the following diagram to illustrate the OpenAI API SDK workflow. We start by defining the client object, providing the API base URL and the API key. Next, we call the chat completion method, where you must set the model name and the prompt you want to send to the LLM. Optionally, you can modify some of the default argument values, such as the temperature and max tokens. Lastly, you execute the method, send the prompt to the relevant API, and get back a response. One quick tip: some arguments can change between model versions…
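The workflow described above can be sketched in Python roughly as follows. This is a minimal illustration, not the notebook's exact code: the model name, prompt text, and argument values are assumptions, and the actual API call is guarded behind an environment variable since it requires the `openai` package and a valid key.

```python
import os

# Hypothetical prompt asking the LLM to generate a SQL query
prompt = "Write a SQL query that returns the top 5 customers by total order amount."

# Arguments for the chat completion method, assembled as a dict.
# "model" and "messages" are required; "temperature" and "max_tokens"
# are optional overrides of the default argument values.
request_kwargs = {
    "model": "gpt-4o-mini",  # assumed model name; use the one from the course
    "messages": [{"role": "user", "content": prompt}],
    "temperature": 0,        # optional: lower values give more deterministic SQL
    "max_tokens": 300,       # optional: cap the length of the response
}

# The actual call needs the openai package and an API key, so it is
# guarded here; in the notebook you would run it directly.
if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI

    # Define the client object with the API base URL and the API key
    client = OpenAI(
        base_url="https://api.openai.com/v1",  # OpenAI's default base URL
        api_key=os.environ["OPENAI_API_KEY"],
    )

    # Execute the method: send the prompt and get back a response
    response = client.chat.completions.create(**request_kwargs)

    # Parse the response: the generated text lives in the first choice
    print(response.choices[0].message.content)
```

Pointing `base_url` at a different provider's OpenAI-compatible endpoint is also common; only the URL and key change, while the rest of the workflow stays the same.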