Custom Loss Function

Tensorleap enables you to write custom loss functions for use within the platform. Once you define your custom loss function and register it using add_custom_loss, you can use it in the platform by adding a CustomLoss node.

This function can receive multiple np.ndarray arguments, which are exposed as connections on the CustomLoss node in the UI. Each array has shape (batch, dim_1, ..., dim_n). The function returns an np.ndarray containing the per-sample loss for the batch.

Example of a custom loss function:

from code_loader.contract.datasetclasses import PreprocessResponse
from code_loader.inner_leap_binder.leapbinder_decorators import tensorleap_custom_loss
import numpy as np
...

@tensorleap_custom_loss(name='weighted_ce')
def weighted_categorical_crossentropy(y_true: np.ndarray, y_pred: np.ndarray) -> np.ndarray:
    # Normalize predictions so each sample's probabilities sum to 1
    y_pred = y_pred / np.sum(y_pred, axis=-1, keepdims=True)
    
    # Clip predictions to avoid log(0) and ensure numerical stability
    epsilon = 1e-7  # Similar to K.epsilon()
    y_pred = np.clip(y_pred, epsilon, 1 - epsilon)
    
    # Define per-class weights (one weight per class; this example assumes 8 classes)
    weights = np.array([0.5, 2.1, 3, 4, 4, 4, 4, 4])
    
    # Compute weighted log loss
    loss = y_true * np.log(y_pred) * weights
    loss = -np.sum(loss, axis=-1)
    return loss

The @tensorleap_custom_loss decorator registers the custom loss function with the Tensorleap integration so it can be used from the platform.
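As a quick local sanity check (outside the platform), you can call the function directly with NumPy arrays of shape (batch, num_classes). This assumes the decorator keeps the function directly callable; the shapes and values below are illustrative and not part of the Tensorleap API:

import numpy as np

# Hypothetical batch of 2 samples over 8 classes (matching the weights above)
y_true = np.zeros((2, 8), dtype=np.float32)
y_true[0, 1] = 1.0  # sample 0 belongs to class 1
y_true[1, 6] = 1.0  # sample 1 belongs to class 6

# Unnormalized prediction scores; the loss normalizes them internally
y_pred = np.random.rand(2, 8).astype(np.float32)

loss = weighted_categorical_crossentropy(y_true, y_pred)
print(loss.shape)  # (2,) - one loss value per sample in the batch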
