Asking Chat Models Questions
00:00 Let’s start with the first conceptual building block of LangChain: chat models. Chat models are a representation of API calls to an LLM provider, or an interface to the API of a local LLM.
00:15
In this course you’ll use ChatOpenAI, which wraps OpenAI LLMs such as GPT. You need to import it from the associated library that you’ve installed and then instantiate it.
00:26 And when you instantiate one of these chat models, you generally need to pass the model as a string. In this course, you’ll be using “gpt-4o” because, at the time of recording, that’s a relatively cheap but capable LLM that’s available.
00:39 And you also set the temperature to zero so that you get mostly reproducible results. That instantiates a chat model, and it’s all you need to start making API requests to an LLM provider.
00:53
Now with minimal code, you can already start asking questions in your REPL. You’ll just need to make sure that your environment has access to the API key, which you do by importing dotenv and then loading the environment variables.
01:08
And then, of course, importing the class we just discussed and instantiating it. Then you can use the .invoke() method on one of these chat models and pass it a string: a question, or any sort of prompt you want to send.
01:23
And LangChain is going to make an API request and then return the result to you. Let’s go ahead and give this a try in the REPL. You need to import dotenv and, from langchain_openai, import ChatOpenAI.
01:43
You need to load the environment variables, which you can do by using the load_dotenv() function.
01:51 And you see it returns True, so you’re set with that. Now you can go ahead and instantiate the chat model. If something didn’t work with the API key, you’ll get an error when you try to instantiate the class that represents your chat model.
02:06
In this case, I’m going to say model="gpt-4o",
02:13
and we’ll set the temperature to 0.
02:18
Alright, no errors, which means it had access to the API key, and now I’m ready to make API calls: chat_model.invoke().
02:28 I’m going to ask a healthcare-related question: “What is blood pressure?”
02:36 Now you can see the API request is happening, so you need to wait for a moment, and here is the response.
02:43
It’s maybe a little bit hard to see where the actual text response is, but you can see that this is a new class, an AIMessage, and it has an attribute called .content, which is a string containing the actual text.
02:55 You can see it takes up most of this response in this case. But there are also some additional keyword arguments that give you information about how many tokens were used, etc.
03:05
So the model name is in here somewhere, so this might also differ depending on which LLM you’re interfacing with. But you can always go ahead and access .content.
03:15 So let me first do this call once again and assign it to a variable so we can inspect it a little more.
03:25
So one more time, waiting for the response. There is the response; .content now gives me the string. And you can also print that, of course, and then you get a response that looks similar to what you might expect when you ask such a question in an online interface or in an app that interfaces with that LLM.
03:43 Alright, you can see it also has formatting here. This is Markdown formatting, with nice spacing, etc., so you could render this and it would look good. Here, you just have the raw text output.
03:55
But this is just plainly sending a question directly to the invoke() method on one of these chat models. And it took maybe five lines of code to get to the point where you can interface with an LLM like that using LangChain.
04:11 So I think that’s already pretty cool, but of course there’s more. And you already saw a pointer in the first response that we looked at here. You saw that what gets returned is an AI message.
04:23 So there’s a certain label and a specific class associated with it. And let’s look at those a bit more in the next lesson.