
11 Advanced Deep Learning Algorithms

This chapter covers

  • Variational Autoencoder (VAE) for time-series anomaly detection
  • Mixture Density Network (MDN) using amortized Variational Inference (VI)
  • Attention and Transformers
  • Graph Neural Networks
  • ML Research: Deep Learning

In the previous chapter, we looked at fundamental deep learning algorithms for representing numerical, image, and text data. In this chapter, we continue our discussion with advanced deep learning algorithms, selected for their state-of-the-art architectures and wide range of applications. We will look into generative models based on the Variational Autoencoder (VAE) and implement an anomaly detector for time-series data from scratch. We'll continue our journey with an intriguing combination of neural networks and classical Gaussian Mixture Models (GMMs) via amortized Variational Inference (VI) and implement Mixture Density Networks (MDNs). We will then focus on the concept of attention and implement a Transformer architecture from scratch for a classification task. Finally, we'll look into Graph Neural Networks (GNNs) and use one to perform node classification on a citation graph. We will be using the Keras/TensorFlow deep learning library throughout this chapter.

11.1 Autoencoders

11.1.1 VAE Anomaly Detection in Time-Series
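Before the full Keras/TensorFlow implementation, here is a minimal NumPy sketch of the underlying idea: an autoencoder-style detector scores each time-series window by its reconstruction error and flags windows whose error is far above typical. The `reconstruct` function below is a hypothetical stand-in (a simple moving average) for a trained VAE decoder; the names, window size, and 3-sigma threshold are illustrative assumptions, not the chapter's implementation.

```python
import numpy as np

def reconstruct(window):
    # Hypothetical stand-in for a trained VAE's reconstruction:
    # smooth the window with a 3-point moving average.
    kernel = np.ones(3) / 3.0
    return np.convolve(window, kernel, mode="same")

def anomaly_scores(series, window_size):
    # Mean squared reconstruction error for every sliding window.
    scores = []
    for start in range(len(series) - window_size + 1):
        w = series[start:start + window_size]
        scores.append(np.mean((w - reconstruct(w)) ** 2))
    return np.array(scores)

# Synthetic sine wave with noise and one injected spike anomaly.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 200)) + 0.05 * rng.standard_normal(200)
series[120] += 3.0  # the anomaly

scores = anomaly_scores(series, window_size=16)
threshold = scores.mean() + 3 * scores.std()  # illustrative 3-sigma rule
flagged = np.where(scores > threshold)[0]
```

A real VAE replaces the moving average with a learned probabilistic reconstruction, so the score can be a (negative) log-likelihood rather than a plain squared error.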

11.2 Amortized Variational Inference

11.2.1 Mixture Density Networks
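As a preview of the training objective, the following NumPy sketch computes the negative log-likelihood of targets under a Gaussian mixture whose weights, means, and standard deviations would be emitted by the network's output heads. The function name `mdn_nll` and the toy two-component mixture are illustrative assumptions; the chapter implements the loss in Keras/TensorFlow.

```python
import numpy as np

def mdn_nll(pi, mu, sigma, y):
    # pi, mu, sigma: (batch, components) mixture parameters from the network.
    # y: (batch,) scalar targets.
    y = y[:, None]
    # Gaussian density of y under each component.
    comp = np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    # Negative log of the mixture density.
    return -np.log(np.sum(pi * comp, axis=1))

# Toy two-component mixture: modes at -1 and +1.
pi = np.array([[0.5, 0.5]])
mu = np.array([[-1.0, 1.0]])
sigma = np.array([[0.5, 0.5]])

nll_at_mode = mdn_nll(pi, mu, sigma, np.array([1.0]))
nll_between = mdn_nll(pi, mu, sigma, np.array([0.0]))
```

Note that a target sitting on a mode (`y = 1`) receives a lower loss than one between the modes (`y = 0`), which is exactly what lets an MDN represent multimodal targets that a plain squared-error regressor would average away.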

11.3 Attention and Transformers
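The core computation of this section, scaled dot-product attention, fits in a few lines; here is a NumPy sketch (the chapter's Transformer builds this in Keras/TensorFlow, and the batch/sequence shapes below are illustrative).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

# Illustrative shapes: batch of 2, sequence length 4, key dimension 8.
rng = np.random.default_rng(42)
Q = rng.standard_normal((2, 4, 8))
K = rng.standard_normal((2, 4, 8))
V = rng.standard_normal((2, 4, 8))
out, weights = scaled_dot_product_attention(Q, K, V)
```

The `1/sqrt(d_k)` scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into near one-hot regions with vanishing gradients.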

11.4 Graph Neural Networks
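As a preview, one message-passing step of a graph convolutional layer (in the symmetric-normalization style of Kipf and Welling) can be sketched in NumPy as follows; the dense-matrix form and the tiny three-node path graph are illustrative assumptions, while the chapter works with a real citation graph in Keras/TensorFlow.

```python
import numpy as np

def gcn_layer(A, H, W):
    # One graph convolution: H' = relu(D^{-1/2} (A + I) D^{-1/2} H W)
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1)                  # degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg)) # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy path graph 0 - 1 - 2, identity features and weights so the output
# is just the normalized adjacency (with self-loops).
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.eye(3)
W = np.eye(3)
out = gcn_layer(A, H, W)
```

Each node's new representation mixes its own features with its neighbors'; stacking such layers lets information propagate across multiple hops, which is what makes node classification on a citation graph possible.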

11.5 ML Research: Deep Learning

11.6 Exercises

11.7 Summary