3 Using Hugging Face Transformers and Pipelines for NLP Tasks
This chapter covers
- The Transformer architecture
- Using the Hugging Face Transformers library
- Using the pipeline function in the Transformers library
- Performing NLP tasks using the Transformers library
You have already had a glimpse of the Hugging Face Transformers library and used one of its pre-trained models to perform object detection. Now, we will first go behind the scenes and learn what powers the Transformers package: the Transformer architecture and the various components that make it work. The aim of this book is not to dive into the detailed workings of the Transformer model, but a brief discussion will give you a basic understanding of how things work.
Next, we will use the pipeline() function that ships with the transformers package to perform various NLP tasks, such as text classification, text generation, text summarization, and more.
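As a quick preview, here is a minimal sketch of what a pipeline() call looks like for one such task, sentiment analysis. The exact default model that gets downloaded and the scores it returns may differ depending on the version of the library you have installed.

```python
from transformers import pipeline

# Create a sentiment-analysis pipeline; when no model is specified,
# the library downloads a default pre-trained model the first time.
classifier = pipeline("sentiment-analysis")

# Run inference on a sample sentence
result = classifier("I love using the Transformers library!")
print(result)
# Example output (scores will vary by model/version):
# [{'label': 'POSITIVE', 'score': 0.9998...}]
```

We will walk through this pattern, and the other tasks it supports, in detail later in the chapter.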