Activation function
The function at the final stage of a neural-network layer. For example, a rectified linear unit (ReLU) function may be applied to the result of the matrix multiplication to generate the final output of a dense layer. An activation function can be linear or nonlinear. Nonlinear activation functions increase the representational power (i.e., capacity) of a neural network. Examples of nonlinear activations include sigmoid, hyperbolic tangent (tanh), and the aforementioned ReLU.
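As a minimal sketch (assuming the `@tensorflow/tfjs` package), a dense layer can be configured with a ReLU activation, which is applied to the result of the layer's matrix multiplication:

```js
const tf = require('@tensorflow/tfjs');

// A dense layer whose output passes through ReLU after W·x + b.
const dense = tf.layers.dense({
  units: 4,
  activation: 'relu',  // nonlinear activation; omit for a linear layer
  inputShape: [3],
});

const x = tf.ones([1, 3]);  // a single example with 3 features
const y = dense.apply(x);   // shape [1, 4]; negative values clipped to 0
y.print();
```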
Area under the curve
Often abbreviated as AUC, a single number that quantifies the shape of a ROC curve. It is defined as the definite integral under the ROC curve, from a false-positive rate of 0 to a false-positive rate of 1. See ROC curve.
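As an illustrative sketch (plain JavaScript, not a TensorFlow.js API), the definite integral can be approximated by trapezoidal integration over ROC points, assuming parallel arrays of false-positive and true-positive rates sorted by increasing false-positive rate:

```js
// Approximate AUC from ROC points via the trapezoidal rule.
function areaUnderCurve(fpr, tpr) {
  let auc = 0;
  for (let i = 1; i < fpr.length; ++i) {
    // Area of the trapezoid between consecutive ROC points.
    auc += (fpr[i] - fpr[i - 1]) * (tpr[i] + tpr[i - 1]) / 2;
  }
  return auc;
}

// The diagonal ROC curve (random guessing) yields AUC = 0.5;
// a perfect classifier's curve passes through (0, 1), yielding AUC = 1.
console.log(areaUnderCurve([0, 0.5, 1], [0, 0.5, 1]));  // 0.5
```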
Axis
In the context of TensorFlow.js, when we talk about a tensor, an axis (plural: axes) is one of the independent keys that index into the tensor. For example, a rank-3 tensor has 3 axes; an element of the rank-3 tensor is identified by three integers, one for each axis. Also known as a dimension.
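As a minimal sketch (assuming the `@tensorflow/tfjs` package), the rank of a tensor equals its number of axes, and many operations take an `axis` argument to select which axis to operate along:

```js
const tf = require('@tensorflow/tfjs');

const t = tf.zeros([2, 3, 4]);  // a rank-3 tensor: axes of sizes 2, 3, and 4
console.log(t.rank);            // 3: one entry per axis
console.log(t.shape);           // [2, 3, 4]

// Summing along axis 1 collapses that axis:
tf.sum(t, 1).print();           // resulting shape: [2, 4]
```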
Balance (Dataset)
A quality of a dataset with categorical labels. The more equal the numbers of examples from the different categories, the more balanced the dataset is. For example, a two-class dataset with 500 examples of each class is perfectly balanced, whereas one with 990 examples of one class and 10 of the other is highly imbalanced.