Appendix A. Trying out LangChain


In this appendix, you will get your first hands-on experience with LangChain by performing simple sentence completions and experimenting with basic prompt engineering in a Jupyter Notebook.

A.1 Trying out LangChain in a Jupyter Notebook environment

We’ll begin with a straightforward example: completing a sentence using an OpenAI model and refining its output through prompt engineering. OpenAI’s models are a convenient starting point because they are accessible through a public REST API and require no local infrastructure setup.

If you prefer, you can use an open-source LLM inference engine such as Ollama. However, since many readers may not yet have run an open-source LLM locally, we will cover that setup in detail in Appendix E. For now, we will keep things simple and use the OpenAI REST API.

Before you proceed, ensure that you have Python 3.11 or later installed on your local machine and that the following prerequisites are met:

  1. You already have, or can generate, an OpenAI API key (see the setup sketch after this list).
  2. You know how to set up a Python Jupyter Notebook environment.
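
If you are starting from a clean environment, the sketch below shows one way to satisfy both prerequisites in a notebook: install the LangChain packages and make your OpenAI API key available as an environment variable. The package names (langchain, langchain-openai) reflect current LangChain releases and are an assumption here; adjust them to match the versions you are using.

# In a notebook cell, install the packages first (names assumed; versions may vary):
# %pip install -U langchain langchain-openai

import os
from getpass import getpass

# Make the OpenAI API key available to LangChain without hard-coding it.
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass("Enter your OpenAI API key: ")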

A.1.1 Sentence completion example
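
As a first exercise, you can ask an OpenAI chat model to complete a sentence through LangChain's OpenAI integration. The sketch below is a minimal illustration, not the book's exact listing; the model name gpt-4o-mini and the example sentence are assumptions, so substitute any OpenAI model you have access to.

from langchain_openai import ChatOpenAI

# Wrap an OpenAI chat model; the model name here is an assumption.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)

# Pass a plain string prompt; LangChain converts it into a chat message.
response = llm.invoke("Complete this sentence: The best way to learn LangChain is")
print(response.content)

Running the cell should print a short continuation of the sentence; raising or lowering temperature makes the completion more or less varied.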

A.1.2 Prompt engineering examples
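
Prompt engineering in LangChain usually starts with prompt templates. The following sketch, a hypothetical example rather than the book's own listing, shows how a ChatPromptTemplate with a system message can steer the tone and length of the completion; the instructions in the system message are assumptions you can freely change.

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)  # model name assumed

# A reusable template: the system message constrains style, and the human
# message carries a {sentence} placeholder filled in at invocation time.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise technical writer. Answer in one short sentence."),
    ("human", "Complete this sentence: {sentence}"),
])

# Fill in the template and send the resulting messages to the model.
messages = prompt.format_messages(sentence="The best way to learn LangChain is")
response = llm.invoke(messages)
print(response.content)

Editing the system message and re-running the cell is a quick way to see how small prompt changes affect the output.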

A.1.3 Creating chains and executing them with LCEL
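
LCEL (LangChain Expression Language) lets you compose a prompt template, a model, and an output parser into a single chain with the | operator and then execute it with invoke(). The sketch below is a minimal example under the same assumptions as the previous listings (model name, example sentence).

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Compose prompt -> model -> output parser into one runnable chain with LCEL.
prompt = ChatPromptTemplate.from_template("Complete this sentence: {sentence}")
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)  # model name assumed
chain = prompt | llm | StrOutputParser()

# Invoking the chain formats the prompt, calls the model, and parses the reply
# into a plain string.
result = chain.invoke({"sentence": "The best way to learn LangChain is"})
print(result)

The same chain object also supports batch() and stream(), so once a chain works with invoke() you can reuse it unchanged for batches of inputs or token-by-token streaming.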