Chapter 1

1 Introducing Hugging Face


This chapter covers

  • What Hugging Face is known for
  • The Hugging Face Transformers library
  • The various models hosted by Hugging Face
  • The Gradio library

Hugging Face is an AI community that promotes the building, training, and deployment of open source machine learning models. It hosts state-of-the-art models for different problem domains, such as natural language processing (NLP), computer vision, and audio tasks. Besides providing tools for machine learning, Hugging Face offers a platform for hosting pretrained models and datasets. With interest in AI at an all-time high, Hugging Face sits at the epicenter of the AI revolution because

  • It unleashes a new wave of applications that capitalize on the large amount of data available.
  • Many complementary technologies are being developed, such as prototyping tools for large language model (LLM)–based applications.
  • Instead of focusing on the fundamentals (such as building neural networks from scratch or learning machine learning algorithms), developers can immediately build AI-based apps to solve their problems. AI is now a tool that developers can use directly rather than something they must build themselves.
  • Hugging Face’s philosophy is to promote open source contributions. It is the hub of open source models for NLP, computer vision, and other fields in which AI plays vital roles.
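The workflow these bullets describe, picking up a pretrained model and applying it directly instead of building one from scratch, can be sketched with the Transformers `pipeline` API. This is a minimal example, assuming the `transformers` library (and a backend such as PyTorch) is installed; the checkpoint name is a commonly used English sentiment-analysis model hosted on the Hugging Face Hub, chosen here for illustration:

```python
from transformers import pipeline

# Load a pretrained sentiment-analysis model from the Hugging Face Hub.
# The model weights are downloaded and cached on first use.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Run inference on a sentence; no training or neural-network code required.
result = classifier("Hugging Face makes machine learning accessible.")
print(result)  # a list with one dict containing a label and a confidence score
```

The `pipeline` function hides tokenization, model loading, and post-processing behind a single call, which is exactly why developers can treat the model as a ready-made tool.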

1.1 Hugging Face Transformers library

1.2 Hugging Face models

1.3 Hugging Face Gradio Python library

1.4 Understanding the Hugging Face mental model

1.4.1 Step 1: User need

1.4.2 Step 2: Model Hub discovery

1.4.3 Step 3: Model card

1.4.4 Step 4: Two execution paths

1.4.5 Step 5: Results delivered

Summary