3 The OpenAI Python library

 

This chapter covers

  • Installing the OpenAI library
  • Invoking GPT using Python
  • Configuration parameters

In the last chapter, we used GPT models via OpenAI's web interface. This works well as long as we just want to have a conversation or classify and summarize single reviews. However, imagine classifying hundreds of reviews: entering each one manually into the web interface becomes tedious, to say the least. We may also want to use a language model in combination with other tools. For instance, we might want GPT to translate natural language questions into formal queries and then execute those queries in the corresponding tool, without manually copying queries back and forth between interfaces. In all of these scenarios, we need a different interface.

In this chapter, we'll discuss OpenAI's Python library, which lets you call OpenAI's language models directly from Python and integrate those calls as sub-functions in your own code. We will use this library in most of the following chapters, so it makes sense to at least skim this chapter before proceeding. While this chapter focuses on OpenAI's Python library, the libraries offered by other providers of language models (including the likes of Anthropic, Cohere, or Google) are quite similar.
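As a preview of what this looks like in practice, the sketch below assembles a chat request and sends it to a model via the `openai` package. The helper names (`build_messages`, `classify_review`) and the model name are illustrative assumptions, not part of the library itself; running the full call requires `pip install openai` and an `OPENAI_API_KEY` environment variable.

```python
# A minimal sketch of calling an OpenAI chat model from Python.
# Assumptions: the `openai` package is installed and the OPENAI_API_KEY
# environment variable holds a valid key. Helper names are illustrative.

def build_messages(instruction, review):
    """Assemble the list of chat messages the API expects."""
    return [
        {"role": "system", "content": instruction},
        {"role": "user", "content": review},
    ]

def classify_review(review):
    """Ask a chat model to classify a single review (requires network access)."""
    from openai import OpenAI  # imported here so the sketch reads without the package
    client = OpenAI()  # picks up OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # an example model name; any available chat model works
        messages=build_messages(
            "Classify the following review as positive or negative.", review
        ),
    )
    return response.choices[0].message.content
```

With such a function in place, classifying hundreds of reviews becomes a simple loop over `classify_review`, rather than hundreds of manual interactions with the web interface.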

3.1 Prerequisites

 
 

3.2 Installing OpenAI’s Python library

 
 

3.3 Listing available models

 
 

3.4 Chat completion

 
 
 

3.5 Customizing model behavior

 
 
 
 

3.5.1 Configuring termination conditions

 
 

3.5.2 Configuring output generation

 
 
 
 

3.5.3 Configuring randomization

 
 

3.5.4 Customization example

 
 

3.5.5 Further parameters

 
 
 

3.6 Summary

 
 
 
 