Appendix A

A.1  Glossary

Activation function

The function applied at the last stage of a neural-network layer. For example, a rectified linear unit (ReLU) function may be applied to the result of the matrix multiplication to generate the final output of a dense layer. An activation function can be linear or nonlinear. Nonlinear activation functions can be used to increase the representational power (i.e., capacity) of a neural network. Examples of nonlinear activations include sigmoid, hyperbolic tangent (tanh), and the aforementioned ReLU.
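A minimal sketch (using the standard TensorFlow.js layers and ops APIs) of a dense layer configured with a ReLU activation, and of the same activation applied directly to a tensor:

  import * as tf from '@tensorflow/tfjs';

  // A dense layer whose matrix multiplication (plus bias) is followed
  // by the ReLU activation function.
  const dense = tf.layers.dense({units: 4, activation: 'relu', inputShape: [3]});

  // The activation function can also be applied directly to a tensor.
  const x = tf.tensor1d([-1, 0, 2]);
  tf.relu(x).print();  // [0, 0, 2]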

Area under the curve

Often abbreviated as AUC, a single number that quantifies the shape of a ROC curve. It is defined as the definite integral of the ROC curve from false-positive rate 0 to 1. See ROC curve.
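As an illustrative sketch, the AUC can be approximated with the trapezoidal rule from a list of ROC-curve points; areaUnderCurve and rocPoints below are hypothetical names for this example, not TensorFlow.js APIs:

  // Approximate the AUC by summing trapezoid areas between consecutive
  // ROC points, given as [falsePositiveRate, truePositiveRate] pairs
  // sorted by ascending false-positive rate.
  function areaUnderCurve(rocPoints) {
    let auc = 0;
    for (let i = 1; i < rocPoints.length; ++i) {
      const [fpr0, tpr0] = rocPoints[i - 1];
      const [fpr1, tpr1] = rocPoints[i];
      auc += (fpr1 - fpr0) * (tpr0 + tpr1) / 2;  // Area of one trapezoid.
    }
    return auc;
  }

  areaUnderCurve([[0, 0], [0, 1], [1, 1]]);  // 1 (a perfect classifier)
  areaUnderCurve([[0, 0], [1, 1]]);          // 0.5 (a chance-level, diagonal ROC curve)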

Axis

In the context of TensorFlow.js, when we talk about a tensor, an axis (plural axes) is one of the independent keys indexing into the tensor. For example, a rank-3 tensor has 3 axes; an element of the rank-3 tensor is identified by three integers that correspond to the three axes. Also known as a dimension.
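A short sketch (using standard TensorFlow.js APIs) of how axes appear in practice:

  import * as tf from '@tensorflow/tfjs';

  // A rank-3 tensor has 3 axes; its shape lists the size along each axis.
  const t = tf.zeros([2, 3, 4]);
  console.log(t.rank);   // 3
  console.log(t.shape);  // [2, 3, 4]

  // Many operations accept an axis argument. Summing along axis 0
  // collapses that axis, yielding a tensor of shape [3, 4].
  console.log(t.sum(0).shape);  // [3, 4]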

Balance (dataset)

A property of a dataset with categorical labels: the more nearly equal the numbers of examples from the different categories, the more balanced the dataset is.
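A hypothetical sketch of checking balance by counting the examples in each category (plain JavaScript; the labels array is made up for illustration):

  const labels = ['cat', 'dog', 'cat', 'bird', 'dog', 'cat'];
  const counts = {};
  for (const label of labels) {
    counts[label] = (counts[label] || 0) + 1;
  }
  console.log(counts);  // {cat: 3, dog: 2, bird: 1}: imbalanced toward 'cat'.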

Batch

The set of examples that a model processes together in a single step of training or inference. In TensorFlow.js, the first axis of a model's input tensor is, by convention, the batch axis; its size is the batch size. See also axis.
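A minimal sketch (using standard TensorFlow.js APIs) of the batch convention, in which the first axis of a tensor indexes the examples of a batch:

  import * as tf from '@tensorflow/tfjs';

  // A batch of 8 examples, each a feature vector of length 3.
  const batch = tf.randomNormal([8, 3]);

  // A dense layer transforms each example independently; the batch axis
  // is preserved in the output.
  const dense = tf.layers.dense({units: 4, inputShape: [3]});
  const out = dense.apply(batch);
  console.log(out.shape);  // [8, 4]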

A.2  Comparing the Features of TensorFlow.js to Some Other JavaScript Deep-Learning Libraries

A.3  Installing tfjs-node-gpu and Its Dependencies

A.3.1  Installing tfjs-node-gpu on Linux

A.3.2  Installing tfjs-node-gpu on Windows

A.4  A Quick Tutorial of Tensors and Operations in TensorFlow.js

A.4.1  Tensor Creation and Tensor Axis Conventions

A.4.2  Basic Tensor Operations

A.4.3  Memory Management in TensorFlow.js: tf.dispose() and tf.tidy()

A.4.4  Exercises for Appendix A.4