# Custom Loss Function

Tensorleap enables you to write custom loss functions to be used within the platform. Once you define your custom loss function and add it with **add_custom_loss**, you can use it within the platform by adding a **CustomLoss** node.

The function has two parameters, **y_true** *(tf.Tensor)* and **y_pred** *(tf.Tensor)*, which correspond to the **CustomLoss** node's *ground truth* and *prediction* input tensors. These are of shape *(batch, dim-1, ..., dim-n)*. The function returns a *tf.Tensor* containing either an aggregated value or a loss calculated for each sample within the batch. During training, this tensor is averaged to produce the overall batch loss.

Example of a custom loss function:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K


def weighted_categorical_crossentropy(y_true: tf.Tensor, y_pred: tf.Tensor) -> tf.Tensor:
    # scale predictions so that the class probabilities of each sample sum to 1
    y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
    # clip to prevent NaN's and Inf's
    y_pred = K.clip(y_pred, K.epsilon(), 1 - K.epsilon())
    # per-class weights; a larger weight increases the penalty for mistakes on that class
    weights = np.array([0.5, 2.1, 3, 4, 4, 4, 4, 4])
    # weighted cross-entropy, summed over the class axis -> one loss value per sample
    loss = y_true * K.log(y_pred) * weights
    loss = -K.sum(loss, -1)
    return loss
```
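To illustrate the per-sample output shape, the loss can be evaluated on a small synthetic batch. This is a sketch assuming TensorFlow is installed; the batch values and shapes are made up for illustration, and the loss definition from above is repeated so the snippet is self-contained:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K


def weighted_categorical_crossentropy(y_true: tf.Tensor, y_pred: tf.Tensor) -> tf.Tensor:
    # same definition as above, repeated for a self-contained example
    y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
    y_pred = K.clip(y_pred, K.epsilon(), 1 - K.epsilon())
    weights = np.array([0.5, 2.1, 3, 4, 4, 4, 4, 4])
    loss = y_true * K.log(y_pred) * weights
    return -K.sum(loss, -1)


# two one-hot ground-truth samples over 8 classes
y_true = tf.constant([[1., 0., 0., 0., 0., 0., 0., 0.],
                      [0., 1., 0., 0., 0., 0., 0., 0.]])
# unnormalized prediction scores; the function renormalizes them
y_pred = tf.constant([[4., 1., 1., 1., 1., 1., 1., 1.],
                      [1., 4., 1., 1., 1., 1., 1., 1.]])

per_sample = weighted_categorical_crossentropy(y_true, y_pred)
print(per_sample.shape)  # one loss value per sample: (2,)
```

Both samples are predicted with the same confidence, but the second sample's loss is larger because class 1 carries a higher weight (2.1) than class 0 (0.5). Tensorleap averages this per-sample tensor during training to obtain the batch loss.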

The **leap_binder** then adds the function above to the list of custom loss functions by calling **add_custom_loss**.
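A minimal registration sketch, assuming the `code_loader` package and its `leap_binder` object as used in Tensorleap integration scripts, and that the loss function defined above is in scope; the display name passed here is an illustrative assumption:

```python
from code_loader import leap_binder

# register the custom loss so it can be selected by a CustomLoss node in the platform;
# the name argument shown here is an assumed, illustrative label
leap_binder.add_custom_loss(weighted_categorical_crossentropy,
                            name='weighted_categorical_crossentropy')
```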