1 Introduction to Hugging Face

 

This chapter covers

  • What Hugging Face is known for
  • The Hugging Face Transformers library
  • Exploring the various models hosted by Hugging Face
  • The Gradio library

Hugging Face is an AI community that promotes the building, training, and deployment of open-source machine learning models. It hosts state-of-the-art models designed for different problem domains, such as natural language processing (NLP), computer vision, and audio tasks. Besides providing tools for machine learning, Hugging Face also provides a platform for hosting pre-trained models and datasets. With interest in AI at an all-time high, Hugging Face sits right at the epicenter of the AI revolution:

  • AI is unleashing a new wave of applications that capitalize on the vast amounts of data now available.
  • A lot of complementary technologies are being developed, such as prototyping tools for LLM-based applications.
  • Instead of focusing on the fundamentals (such as building neural networks from scratch or learning machine learning algorithms), developers can now focus on building AI-based apps that solve their problems immediately. AI is now a tool developers can use directly rather than something they must build themselves from scratch (see the short sketch after this list).
  • Hugging Face’s philosophy is to promote open-source contributions, and it has become the hub of open-source models for NLP, computer vision, and other fields where AI plays a vital role.
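
To make this concrete, here is a minimal sketch of how a developer might use a pre-trained model directly through the Hugging Face Transformers library (introduced in section 1.1). This is only an illustration: it assumes the transformers package is installed, and the pipeline downloads a default sentiment-analysis model from the Hugging Face Hub the first time it runs.

  # Minimal sketch: sentiment analysis with a pre-trained model.
  # Assumes the transformers package is installed (pip install transformers).
  from transformers import pipeline

  # pipeline() fetches a default pre-trained sentiment-analysis model
  # from the Hugging Face Hub on first use.
  classifier = pipeline("sentiment-analysis")

  result = classifier("Hugging Face makes machine learning accessible.")
  print(result)  # e.g., [{'label': 'POSITIVE', 'score': 0.99...}]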

1.1 Hugging Face Transformers Library

1.2 Hugging Face Models

1.3 Hugging Face Gradio Python Library

1.4 Understanding the Hugging Face Mental Model

1.4.1 Step 1: The User Need

1.4.2 Step 2: Model Hub Discovery

1.4.3 Step 3: The Model Card Bridge

1.4.4 Step 4: Two Execution Paths

1.4.5 Step 5: Results Delivered

1.5 Summary