concept linear regression in category Keras

appears as: linear regression

This is an excerpt from Manning's book Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability MEAP V06.

Note that this two-step procedure is not unique to DL; it is also present in standard statistical modeling and machine learning. We agree with Pearl that the underlying principles of fitting are the same for DL, machine learning, and statistics. We're convinced that you can profit a lot from the knowledge gained in the field of statistics over the past centuries. This book acknowledges the heritage of traditional statistics and builds on it. Because of this, you can understand much of DL by looking at something as simple as linear regression, which we introduce in this chapter and use throughout the book as an easy example. You will see in chapter 4 that linear regression is already a probabilistic model, providing more information than just one predicted output value for each sample. You will learn how to pick an appropriate distribution to model the variability of the outcome values. In chapter 5 we show you how to use the TensorFlow Probability framework to fit such a probabilistic DL model. You can then transfer this approach to new situations, allowing you to design and fit appropriate probabilistic DL models that not only provide high-performance predictions but also capture the noise in the data.

  • Linear regression is the mother of all parametric models and is one of the smallest NNs you can think of.
  • Listing 5.2: Using TFP for linear regression with a constant variance
    from tensorflow.keras.layers import Input
    from tensorflow.keras.layers import Dense
    from tensorflow.keras.models import Model
    from tensorflow.keras.optimizers import Adam
    import tensorflow_probability as tfp
    tfd = tfp.distributions
     
    def NLL(y, distr):
      return -distr.log_prob(y) #A
     
    def my_dist(params): #B
      return tfd.Normal(loc=params, scale=1)  # set the sd to the fixed value 1
     
    inputs = Input(shape=(1,))
    params = Dense(1)(inputs) #C
     
    dist = tfp.layers.DistributionLambda(my_dist)(params) #D
    model_sd_1 = Model(inputs=inputs, outputs=dist) #E
    model_sd_1.compile(Adam(), loss=NLL) #F
    Figure 5.4 The synthetic validation data along with the predicted probabilistic model for linear regression, which is the same as a NN without a hidden layer. The mean of the CPD is modeled by the NN, and the standard deviation is assumed to be a constant. The black solid line indicates the predicted mean, and the dashed lines show the positions of the 0.025 and 0.975 quantiles.
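To see why fitting this model with the NLL loss behaves like ordinary least-squares regression, note that for a Normal CPD with the scale fixed at 1 the per-sample NLL is just the squared error (up to constants that do not depend on the prediction). A minimal sketch in plain Python (the function name `nll_normal` is ours, not from the book):

```python
import math

def nll_normal(y, mu, sigma=1.0):
    # Negative log-likelihood of y under Normal(mu, sigma) --
    # the same quantity the NLL loss in listing 5.2 computes per sample.
    return 0.5 * math.log(2 * math.pi * sigma**2) + (y - mu)**2 / (2 * sigma**2)

y, mu = 2.0, 1.5
# With sigma fixed at 1, the NLL differs from 0.5*(y - mu)^2 only by a
# constant, so minimizing it fits the same line as the MSE loss would.
print(nll_normal(y, mu))
print(0.5 * math.log(2 * math.pi) + 0.5 * (y - mu)**2)

# The dashed quantile lines in figure 5.4: for sigma = 1 the 0.025 and
# 0.975 quantiles of the CPD sit at roughly mu - 1.96 and mu + 1.96.
lo, hi = mu - 1.96, mu + 1.96
```

Because the standard deviation is a fixed constant here, the only thing the network learns is the mean of the CPD; the constant-width quantile band in figure 5.4 follows directly from that fixed scale.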