2 Vectors, matrices, and tensors in machine learning

 

This chapter covers

  • Vectors and matrices and their role in data science
  • Working with eigenvalues and eigenvectors
  • Finding the axes of a hyper-ellipse

At its core, machine learning, and indeed all computer software, is about number crunching. We input a set of numbers into the machine and get back a different set of numbers as output. However, these numbers cannot be handled haphazardly: they must be organized into meaningful objects that go into and come out of the machine. This is where vectors and matrices come in. These are concepts that mathematicians have been using for centuries; we are simply reusing them in machine learning.

In this chapter, we will study vectors and matrices, primarily from a machine learning point of view. Starting from the basics, we will quickly graduate to advanced concepts, restricting ourselves to topics relevant to machine learning.

We provide Jupyter Notebook-based Python implementations for most of the concepts discussed in this and other chapters. Complete, fully functional code that can be downloaded and executed (after installing Python and Jupyter Notebook) can be found at http://mng.bz/KMQ4. The code relevant to this chapter can be found at http://mng.bz/d4nz.

2.1 Vectors and their role in machine learning
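
In machine learning, a vector is most often an ordered list of numbers describing a single data point, with each position holding one feature. As a minimal illustrative sketch (the feature names and values below are invented for the example), a house listing could be encoded as a PyTorch vector like this:

import torch

# One house described by three illustrative features:
# [area in square feet, number of bedrooms, age in years]
house = torch.tensor([1500.0, 3.0, 12.0])

print(house)        # tensor([1500., 3., 12.])
print(house.shape)  # torch.Size([3]) -- a vector with 3 components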

 
 
 

2.1.1 The geometric view of vectors and its significance in machine learning

 

2.2 PyTorch code for vector manipulations

 
 

2.2.1 PyTorch code for the introduction to vectors
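
As a minimal sketch of the kind of code this section walks through (the component values are arbitrary), a vector is created in PyTorch as a one-dimensional tensor and inspected like this:

import torch

# A vector is a 1D tensor, created here from a Python list
v = torch.tensor([2.0, -1.0, 3.5, 0.0])

print(v.shape)  # torch.Size([4]) -- the number of components
print(v.dtype)  # torch.float32
print(v[0])     # tensor(2.) -- indexing a single component
print(v[1:3])   # slicing out the second and third components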

 
 

2.3 Matrices and their role in machine learning
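
A matrix is a rectangular grid of numbers. The most common convention in machine learning is to stack one feature vector per row, so a dataset of m samples with n features each becomes an m x n matrix. A small illustrative sketch (the numbers are arbitrary):

import torch

# Three samples (rows), each with three features (columns)
X = torch.tensor([[1500.0, 3.0, 12.0],
                  [2100.0, 4.0,  5.0],
                  [ 900.0, 2.0, 40.0]])

print(X.shape)  # torch.Size([3, 3]) -- (num_samples, num_features)
print(X[1])     # the feature vector of the second sample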

 

2.3.1 Matrix representation of digital images
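
A grayscale digital image is naturally a matrix: one row per row of pixels, one column per column of pixels, and each entry an intensity value. As a toy sketch (the pixel values are invented), a tiny 3 x 4 image might be represented like this:

import torch

# A 3 x 4 grayscale "image": 0 = black, 255 = white
img = torch.tensor([[  0,  64, 128, 255],
                    [ 32,  96, 160, 224],
                    [  0,   0, 255, 255]], dtype=torch.uint8)

print(img.shape)  # torch.Size([3, 4]) -- (height, width)
print(img[0, 3])  # tensor(255, dtype=torch.uint8) -- the top-right pixel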

 
 

2.4 Python code: Introducing matrices, tensors, and images via PyTorch
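
Color images go one step further: they need one matrix per color channel, which takes us from matrices to higher-order tensors. The following sketch uses randomly generated pixel values rather than a real photograph and assumes the channels-first (channels, height, width) layout that PyTorch conventionally uses:

import torch

# A random "RGB image": 3 channels, each a 224 x 224 matrix
img = torch.rand(3, 224, 224)

print(img.ndim)      # 3 -- a third-order tensor
print(img.shape)     # torch.Size([3, 224, 224]) -- (channels, height, width)
print(img[0].shape)  # torch.Size([224, 224]) -- one channel is an ordinary matrix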

 

2.5 Basic vector and matrix operations in machine learning

 
 
 
 

2.5.1 Matrix and vector transpose
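
Transposing a matrix swaps its rows and columns: the entry in row i, column j of the transpose is the entry in row j, column i of the original,

(A^\top)_{ij} = A_{ji},

so an m x n matrix becomes an n x m matrix, and a column vector becomes a row vector (and vice versa).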

 
 

2.5.2 Dot product of two vectors and its role in machine learning
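
The dot product of two vectors of the same length multiplies them component by component and adds up the results:

\mathbf{a} \cdot \mathbf{b} = a_1 b_1 + a_2 b_2 + \cdots + a_n b_n = \sum_{i=1}^{n} a_i b_i

This weighted-sum pattern appears throughout machine learning; for example, a linear model's prediction is the dot product of a weight vector with a feature vector, plus a bias term.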

 
 

2.5.3 Matrix multiplication and machine learning
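
Matrix multiplication is a grid of dot products: the entry in row i, column j of the product AB is the dot product of row i of A with column j of B,

(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj},

so the inner dimensions must match: an m x n matrix times an n x p matrix yields an m x p matrix. Applying one linear layer of a neural network to a whole batch of inputs is, at its core, exactly such a product.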

 
 
 

2.5.4 Length of a vector (L2 norm): Model error
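
The length, or L2 norm, of a vector is the square root of the sum of the squares of its components:

\|\mathbf{v}\|_2 = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}

If we collect the differences between a model's predictions and the true target values into an error vector, the L2 norm of that vector condenses how wrong the model is into a single number, which is why squared-error losses are built from it.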

 
 
 

2.5.5 Geometric intuitions for vector length
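
In two dimensions the L2 norm is simply the Pythagorean theorem: a vector with components v_1 and v_2 is the hypotenuse of a right triangle whose legs are v_1 and v_2, so its length is

\|\mathbf{v}\|_2 = \sqrt{v_1^2 + v_2^2},

and the same formula extends, one component at a time, to three and higher dimensions.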

 
 

2.5.6 Geometric intuitions for the dot product: Feature similarity
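
Geometrically, the dot product ties together the lengths of two vectors and the angle between them:

\mathbf{a} \cdot \mathbf{b} = \|\mathbf{a}\| \, \|\mathbf{b}\| \cos\theta

Dividing by the two lengths leaves just the cosine of the angle, known as the cosine similarity. It measures how similar two feature vectors are regardless of their magnitudes: close to 1 when they point in nearly the same direction, close to 0 when they are unrelated (perpendicular), and close to -1 when they point in opposite directions.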

 
 
 
 

2.6 Orthogonality of vectors and its physical significance
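
Two nonzero vectors are orthogonal (perpendicular) exactly when their dot product is zero, since the cosine of 90 degrees is zero. A quick numerical check in PyTorch (the two vectors below are arbitrary examples, the second obtained by rotating the first through 90 degrees):

import torch

a = torch.tensor([1.0, 2.0])
b = torch.tensor([-2.0, 1.0])  # a rotated by 90 degrees

print(torch.dot(a, b))  # tensor(0.) -- zero dot product, so a and b are orthogonal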

 
 

2.7 Python code: Basic vector and matrix operations via PyTorch

 

2.7.1 PyTorch code for a matrix transpose
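
A minimal sketch of transposing a matrix in PyTorch (the matrix values are arbitrary):

import torch

A = torch.tensor([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])  # a 2 x 3 matrix

print(A.T)        # the 3 x 2 transpose
print(A.T.shape)  # torch.Size([3, 2])

# Equivalent: torch.transpose(A, 0, 1) swaps dimensions 0 and 1
print(torch.allclose(A.T, torch.transpose(A, 0, 1)))  # True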

 
 
 
 

2.7.2 PyTorch code for a dot product
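
A minimal sketch of computing the dot product of two vectors in PyTorch (the values are arbitrary):

import torch

a = torch.tensor([1.0, 2.0, 3.0])
b = torch.tensor([4.0, 5.0, 6.0])

print(torch.dot(a, b))  # tensor(32.) -- 1*4 + 2*5 + 3*6
print(a @ b)            # the same result via the @ operator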

 
 

2.7.3 PyTorch code for matrix vector multiplication
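
A minimal sketch of multiplying a matrix by a vector in PyTorch (the values are arbitrary); each component of the result is the dot product of one row of the matrix with the vector:

import torch

A = torch.tensor([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])  # a 2 x 3 matrix
x = torch.tensor([1.0, 0.0, -1.0])   # a vector with 3 components

print(torch.mv(A, x))  # tensor([-2., -2.]) -- row-by-row dot products
print(A @ x)           # the same result via the @ operator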

 
 
 
 