# Integration test

{% hint style="warning" %}
The integration test is a mandatory part of Tensorleap's Integration and must be implemented for the platform to work properly.
{% endhint %}

### The purpose and structure of a local integration test

Before uploading code and models to the platform, an integration test should be created and run locally. The purpose of this test is to:\
(1) Instruct Tensorleap on which code needs to be executed during model analysis (loss, metrics, visualizers & metadata)\
(2) **Simulate the data flow** that would occur in the Tensorleap platform, ensuring that:

* All of the inputs, outputs, metrics, loss & ground truths:
  * Are parsed successfully
  * Have valid values
* Visualizations are as expected

The integration test is created as a simple Python script (leap\_custom\_test.py) that exercises this flow end to end.

This integration test can be run or debugged locally on your machine in an appropriate Python environment ([see example](#an-integration-test-example)). Since all of Tensorleap's decorators include strong runtime validation of expected types and shapes, running the test locally will quickly surface any existing integration issues.

{% hint style="info" %}
Debugging this file using an IDE will easily point you to integration issues.
{% endhint %}

### The integration test decorators

To run the integration test, two interfaces should be implemented: [tensorleap\_load\_model](#tensorleap_load_model) and [tensorleap\_integration\_test](#tensorleap_integration_test).

#### @tensorleap\_load\_model

{% hint style="warning" %}
Supported model formats are .h5 and .onnx.
{% endhint %}

This decorator wraps a function that loads an .onnx or .h5 model. It should return the model, which is later used in @tensorleap\_integration\_test for inference. The decorator receives a list of [PredictionTypeHandler](https://docs.tensorleap.ai/tensorleap-integration/python-api/code_loader/datasetclasses/predictiontypehandler) whose length should match the number of model outputs (a full description is provided on the [load\_model](https://docs.tensorleap.ai/tensorleap-integration/python-api/code_loader/decorators/tensorleap_load_model) decorator page).

{% tabs %}
{% tab title=".onnx" %}

```python
import os

import onnxruntime
from code_loader.contract.datasetclasses import PredictionTypeHandler
from code_loader.inner_leap_binder.leapbinder_decorators import tensorleap_load_model, tensorleap_integration_test

# Define model outputs
prediction_type1 = PredictionTypeHandler(name='depth', labels=['high', 'low'], channel_dim=1)

# Load model
@tensorleap_load_model([prediction_type1])
def load_model():
    dir_path = os.path.dirname(os.path.abspath(__file__))
    model_path = 'models/GLPN_Kitti.onnx'  # Relative path to .onnx model
    sess = onnxruntime.InferenceSession(os.path.join(dir_path, model_path))
    return sess

# Instruct Tensorleap on how to run inference with the model
@tensorleap_integration_test()
def integration_test(idx, subset):
    sess = load_model()
    # inputs
    x = input_image(idx, subset)
    # model
    input_name_1 = sess.get_inputs()[0].name
    pred = sess.run(None, {input_name_1: x})[0]
    ...
```

{% endtab %}

{% tab title=".h5" %}
For an .h5 model, loading and inference can be done in the following way:

```python
import os

import tensorflow as tf
from code_loader.contract.datasetclasses import PredictionTypeHandler
from code_loader.inner_leap_binder.leapbinder_decorators import tensorleap_load_model, tensorleap_integration_test

from mnist.config import CONFIG  # project config holding the label names

prediction_type1 = PredictionTypeHandler(name='classes', labels=CONFIG['LABELS'])

@tensorleap_load_model([prediction_type1])
def load_model():
    dir_path = os.path.dirname(os.path.abspath(__file__))
    model_path = 'model/model.h5' # Relative path to .h5 model
    cnn = tf.keras.models.load_model(os.path.join(dir_path, model_path))
    return cnn

@tensorleap_integration_test()
def integration_test(idx, subset):
    # Get input and GT
    image = input_encoder(idx, subset)
    ...
    # Load Model and infer
    cnn = load_model()
    y_pred = cnn([image])
```

{% endtab %}
{% endtabs %}

{% hint style="info" %}
For more information on the attributes of the load\_model decorator and interface specifics refer [here](https://docs.tensorleap.ai/tensorleap-integration/python-api/code_loader/decorators/tensorleap_load_model).
{% endhint %}

#### @tensorleap\_integration\_test

This decorator wraps a function that instructs Tensorleap on which code needs to run during model analysis. This function:

1. Receives a [PreprocessResponse](https://docs.tensorleap.ai/tensorleap-integration/python-api/code_loader/datasetclasses/preprocessresponse) and an idx as inputs
2. Calls the @tensorleap\_load\_model wrapped function to load the .onnx or .h5 model
3. Does the following (see the condensed sketch below):
   1. Calls the [Input Encoder](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/input-encoder) to get the input of sample `idx`
   2. Calls the ([decorated](#tensorleap_load_model)) model on the input to get predictions
   3. Calls the [Ground Truth Encoder](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/ground-truth-encoder) to get the ground truth of sample `idx`
   4. Uses the above to call the [loss](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/custom-loss-function), [metrics](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/custom-metrics), and [metadata](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/metadata-function)
   5. Uses the above to call the [visualizers](https://docs.tensorleap.ai/user-interface/project/network/network-mapping/create-a-mapping-deprecated/visualizer-node)
   6. (optional) Uses Tensorleap's built-in methods to render the visualizer results locally and review what will appear in the platform
   7. (optional) Prints any values needed for debugging and integrity checks (metadata, metrics, loss, etc.)
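
As a condensed sketch of this flow, reusing the encoder, loss, and visualizer names from the MNIST example at the end of this page:

```python
@tensorleap_integration_test()
def integration_test(idx, subset):
    image = input_encoder(idx, subset)                     # 3.1 input of sample idx
    cnn = load_model()                                     # 2   the @tensorleap_load_model wrapped loader
    y_pred = cnn([image])                                  # 3.2 model prediction
    gt = gt_encoder(idx, subset)                           # 3.3 ground truth of sample idx
    loss_res = categorical_crossentropy_loss(gt, y_pred)   # 3.4 loss
    img_vis = image_visualizer(image)                      # 3.5 visualizer
    visualize(img_vis)                                     # 3.6 (optional) local visualization
    print(loss_res)                                        # 3.7 (optional) debug printing
```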

{% hint style="warning" %}
Only decorators that are called within the tensorleap\_integration\_test function are utilized in Tensorleap's analysis. Any decorated function that is defined in the code but not called within the integration test will not be executed in the platform.
{% endhint %}
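
For example, reusing the metadata functions from the MNIST example below, only the metadata function that is actually called inside the test will run in the platform:

```python
@tensorleap_integration_test()
def integration_test(idx, subset):
    ...
    m1 = metadata_sample_index(idx, subset)  # called here, so it is registered and will run in the platform
    print(m1)
    # metadata_one_hot_digit is decorated in the project code but never called here,
    # so it will NOT be registered or executed in the platform
```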

To capture the connectivity of the different interfaces used in the integration test, Tensorleap passes pointers that map the connections between the model and the decorators. For example, this is what instructs the platform on what an [image visualizer](https://docs.tensorleap.ai/tensorleap-integration/python-api/code_loader/decorators/tensorleap_custom_visualizer) defined in the script should visualize: the input or the output of the model.
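
For instance, calling a visualizer on the encoder result attaches it to the model input, while calling it on the prediction (sensible for image-like outputs, such as the depth model above) would attach it to the model output. A minimal sketch:

```python
image = input_encoder(idx, subset)
y_pred = cnn([image])

img_vis = image_visualizer(image)     # pointer tracks back to the model INPUT
# out_vis = image_visualizer(y_pred)  # would instead track to the model OUTPUT
```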

To support correct registration and tracking of these pointers, the integration test function should adhere to a specific format:

{% hint style="success" %}

* Only functions decorated with [Tensorleap decorators](https://docs.tensorleap.ai/tensorleap-integration/python-api/code_loader/decorators) should be called within the code. All Python logic (pre- & post-processing, input and output manipulation) should be placed within these functions.
* The model is loaded using the [tensorleap\_load\_model](https://docs.tensorleap.ai/tensorleap-integration/python-api/code_loader/decorators/tensorleap_load_model) decorator.
* Visualizing and viewing outputs can only be done using the built-in `visualize` function and Python's built-in `print` method.
  {% endhint %}

The integration test must not contain any of the following outside of decorated functions:

{% hint style="danger" %}

* Arithmetic on model inputs and outputs
* Usage of external libraries: NumPy, Pandas, etc.
* Indexing of anything other than the model prediction, e.g. metadata\_result\['specific\_key'], input\_result\[3]
* Adding or removing a batch dimension from an input, an output, or the return value of any of the decorators
  {% endhint %}
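
A short sketch of the distinction, using the MNIST names purely for illustration:

```python
# WRONG - raw logic placed directly in the integration test body:
# normalized = image / 255.0             # arithmetic on a model input
# key_val = metric_res['specific_key']   # indexing a decorator's return value

# RIGHT - such logic lives inside Tensorleap-decorated functions,
# and the test body only calls them:
metric_res = metrics(y_pred)  # any arithmetic/indexing happens inside the decorated metric
```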

{% hint style="info" %}
The integration test automatically adds a batch dimension to the result of every call to a ground truth or input encoder, so the inference script does not need to add or remove batch dimensions itself.
{% endhint %}
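
Concretely, assuming the MNIST encoder returns a single 28x28x1 image, the decorated call inside the test already yields a batched array:

```python
image = input_encoder(idx, subset)
print(image.shape)      # (1, 28, 28, 1) - the batch dimension was added automatically
y_pred = cnn([image])   # the model can be called on the encoder result as-is
```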

### Verification Table

After running the integration test script locally, Tensorleap prints a verification table summarizing which decorators were called:

```
Decorator Name                            | Added to integration
----------------------------------------------------------------
tensorleap_integration_test               | ✅
tensorleap_preprocess                     | ✅
tensorleap_input_encoder                  | ✅
tensorleap_gt_encoder                     | ✅
tensorleap_load_model                     | ✅
tensorleap_custom_loss                    | ✅
tensorleap_custom_metric (optional)       | ✅
tensorleap_metadata (optional)            | ✅
tensorleap_custom_visualizer (optional)   | ✅
```

| Symbol | Meaning                                                            |
| ------ | ------------------------------------------------------------------ |
| ✅      | Decorator was called successfully during the integration test      |
| ❌      | Decorator was not called, or the script crashed before reaching it |
| ❔      | Status unknown — decorator was not encountered during this run     |

The table is followed by a message indicating whether the integration is complete:

* **All parts have been successfully set** — every decorator was called; you can push the project to Tensorleap.
* **All mandatory parts have been successfully set** — mandatory decorators passed; the next optional decorator to add is listed.
* **Some mandatory components have not yet been added** — the next recommended mandatory decorator to implement is listed.

{% hint style="info" %}
Only decorators **called within** `@tensorleap_integration_test` appear as ✅. Decorators defined in code but not called inside the integration test will show ❌ and will **not** be executed on the platform.
{% endhint %}

### An integration test example

Many integration tests can be found in the [Leap Hub GitHub space](https://docs.tensorleap.ai/tensorleap-integration/broken-reference).\
Here, we share a basic integration test for the MNIST dataset. For the corresponding full MNIST Tensorleap integration, please check this [repo](https://github.com/Tensorleap-hub/mnist).

```python
import os
from code_loader.contract.datasetclasses import PredictionTypeHandler
from code_loader.plot_functions.visualize import visualize

from mnist.config import CONFIG
from leap_integration import (input_encoder, preprocess_func_leap, gt_encoder,
                              combined_bar, metrics, image_visualizer, categorical_crossentropy_loss,
                              metadata_sample_index, metadata_one_hot_digit, metadata_euclidean_distance_from_class_centroid)
import tensorflow as tf
from code_loader.inner_leap_binder.leapbinder_decorators import tensorleap_load_model, tensorleap_integration_test

prediction_type1 = PredictionTypeHandler(name='classes', labels=CONFIG['LABELS'])

@tensorleap_load_model([prediction_type1])
def load_model():
    dir_path = os.path.dirname(os.path.abspath(__file__))
    model_path = 'model/model.h5'
    cnn = tf.keras.models.load_model(os.path.join(dir_path, model_path))
    return cnn


@tensorleap_integration_test()
def integration_test(idx, subset):
    # Get input and GT
    image = input_encoder(idx, subset)
    gt = gt_encoder(idx, subset)

    # Load Model and infer
    cnn = load_model()
    y_pred = cnn([image])

    # Visualize the inputs and outputs of the model
    horizontal_bar_vis = combined_bar(y_pred, gt)
    img_vis = image_visualizer(image)

    visualize(img_vis)
    visualize(horizontal_bar_vis)

    # Compute metrics and loss
    metric_res = metrics(y_pred)
    loss_res = categorical_crossentropy_loss(gt, y_pred)
    print(metric_res)
    print(loss_res)

    # Compute metadata
    m1 = metadata_sample_index(idx, subset)
    m2 = metadata_one_hot_digit(idx, subset)
    m3 = metadata_euclidean_distance_from_class_centroid(idx, subset)
    print(m1)
    print(m2)
    print(m3)
    # any values needed can be returned here


if __name__ == '__main__':
    num_samples_to_test = 3
    train, val = preprocess_func_leap()
    for i in range(num_samples_to_test):
        integration_test(i, train)
        integration_test(i, val)
```
