Concept: gradient ascent (category: deep learning)

Appears as: gradient ascent in Deep Learning with JavaScript: Neural networks in TensorFlow.js

This is an excerpt from Manning's book Deep Learning with JavaScript: Neural networks in TensorFlow.js.

Figure 7.9. A schematic diagram showing the basic idea behind finding the maximally activating image for a convolutional filter through gradient ascent in input space (panel B), and how that differs from the normal neural-network training process based on gradient descent in weight space (panel A). Note that this figure differs from some of the model diagrams shown previously in that it breaks the weights out from the model, in order to highlight the two sets of quantities that backpropagation can update: the weights and the input.
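
As a rough illustration of the idea in panel B, the sketch below performs gradient ascent in input space with TensorFlow.js. It is a minimal sketch, not the book's code: the `model` variable, `layerName`, `filterIndex`, the assumed input shape of [1, 224, 224, 3], the step size, and the iteration count are all illustrative assumptions.

```js
import * as tf from '@tensorflow/tfjs';

// Gradient ascent in input space: start from a random image and repeatedly
// nudge the IMAGE (not the weights) in the direction that increases the mean
// activation of one convolutional filter.
function maximallyActivatingImage(
    model, layerName, filterIndex, iterations = 40, stepSize = 10) {
  // Auxiliary model that maps the input image to the chosen layer's output.
  const layerOutput = model.getLayer(layerName).output;
  const auxModel = tf.model({inputs: model.inputs, outputs: layerOutput});

  // Loss = mean activation of filter `filterIndex`; its gradient is taken
  // with respect to the input image rather than the weights.
  const lossFn = (inputImage) =>
      auxModel.apply(inputImage).gather([filterIndex], 3).mean();
  const gradFn = tf.grad(lossFn);

  // Assumed input shape [1, 224, 224, 3]; adjust to match your model.
  let image = tf.randomUniform([1, 224, 224, 3], 0, 1);
  for (let i = 0; i < iterations; ++i) {
    const next = tf.tidy(() => {
      const grads = gradFn(image);
      // Normalize the gradient so the step size stays comparable across steps.
      const normalized = grads.div(grads.square().mean().sqrt().add(1e-8));
      // Ascent: ADD the scaled gradient (ordinary training would subtract it).
      return image.add(normalized.mul(stepSize));
    });
    image.dispose();
    image = next;
  }
  return image;
}
```

The key contrast with training is the sign and the target of the update: here the gradient is added to the input to climb the activation, whereas gradient descent subtracts the gradient from the weights to reduce a loss. Normalizing the gradient before each step is a common trick for keeping the update magnitude stable.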