1 The mechanics of learning

 

This chapter covers

  • Using Google Colab for coding
  • Introducing PyTorch, a tensor-based API for deep learning
  • Running faster code with PyTorch’s GPU acceleration
  • Understanding automatic differentiation as the basis of learning
  • Using the Dataset interface to prepare data

Deep learning, also called neural networks or artificial neural networks, has led to dramatic advances in machine learning quality, accuracy, and usability. Technology that seemed impossible 10 years ago is now widely deployed or at least technically feasible. Digital assistants like Cortana, Google Assistant, Alexa, and Siri are ubiquitous and can react to natural spoken language. Self-driving cars have been racking up millions of miles on the road as they are refined for eventual deployment. We can finally catalog and calculate just how much of the internet is made of cat photos. Deep learning has been instrumental to the success of all these use cases and many more.
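
To give a concrete taste of the topics in the list above before the chapter covers them in depth, here is a minimal sketch, assuming PyTorch is installed (as it is by default in Colab). It creates a tensor, moves the work to a GPU when one is available, and uses automatic differentiation to compute a derivative. This is an illustrative preview, not the chapter's own code.

import torch

# Use the GPU when one is available (e.g., in Colab); otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# A small tensor of inputs. requires_grad=True asks PyTorch's autograd
# machinery to track operations on x so derivatives can be computed later.
x = torch.linspace(-2.0, 2.0, steps=5, device=device, requires_grad=True)

# A simple scalar function of x: f(x) = sum(x^2).
f = (x ** 2).sum()

# Automatic differentiation: backward() fills x.grad with df/dx = 2*x.
f.backward()
print(x.grad)  # df/dx evaluated at each x: [-4., -2., 0., 2., 4.]

Each piece of this sketch, from tensors and GPU placement to gradients, is unpacked in the sections that follow.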

1.1 Getting started with Colab

1.2 The world as tensors

1.2.1  PyTorch GPU acceleration

1.3 Automatic differentiation

1.3.1  Using derivatives to minimize losses

1.3.2  Calculating a derivative with automatic differentiation

1.3.3  Putting it together: Minimizing a function with derivatives

1.4 Optimizing parameters

1.5 Loading dataset objects

1.5.1  Creating a training and testing split

Exercises

Summary
