Tensorleap Integration
Integrating your own model and dataset into the platform and reviewing the development process
This page outlines the different elements of a Tensorleap integration and reviews the integration flow and development cycle within the platform.
Prerequisites
Access to Tensorleap and an installed and authenticated CLI.
A .onnx or .h5 model file.
A valid code integration that instructs Tensorleap how to load the dataset and how to parse and visualize its different elements.
Tensorleap configuration YAML files:
A leap.yaml file that contains the setup of your integration.
(Optional) A leap_mapping.yaml file that instructs the platform which visualizers, metrics, and losses should be connected to which predictions, inputs, or GTs.
(Optional) A requirements.txt file, if non-default requirements are needed.
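For orientation only, a minimal leap.yaml might look roughly like the sketch below. Every key name shown here is an illustrative assumption, not the documented schema; consult the Tensorleap configuration reference for the actual fields.

```yaml
# Illustrative sketch only — key names are assumptions, not the
# documented leap.yaml schema; see the Tensorleap reference docs.
project: my_leap_project      # hypothetical project identifier
entry: leap_binder.py         # hypothetical pointer to the integration script
```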
Expected Folder Structure
The expected file structure for a Tensorleap code integration is the following:
my_leap_project/
├── ...
├── leap_binder.py
├── leap.yaml
├── (optional) leap_mapping.yaml
└── (optional) requirements.txt
Here, leap_binder.py contains the integration script.
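As a rough illustration of what an integration script contains, the sketch below defines a preprocessing step, input and GT encoders, and a basic visualizer. All function names, signatures, and the placeholder dataset are illustrative assumptions, not the actual Tensorleap SDK API; the real script registers its functions through the SDK's binding mechanism.

```python
# Hypothetical sketch of a leap_binder.py integration script.
# Function names and signatures are illustrative assumptions,
# not the actual Tensorleap SDK API.
import numpy as np

def preprocess():
    """Load the dataset and split it into subsets (e.g. train/val)."""
    data = np.random.rand(100, 28, 28, 1)    # placeholder images
    labels = np.random.randint(0, 10, 100)   # placeholder ground truth
    return {"train": (data[:80], labels[:80]),
            "val": (data[80:], labels[80:])}

def input_encoder(idx, subset):
    """Return a single model-ready input sample as float32."""
    images, _ = subset
    return images[idx].astype(np.float32)

def gt_encoder(idx, subset):
    """Return the one-hot ground truth vector for a sample."""
    _, labels = subset
    one_hot = np.zeros(10, dtype=np.float32)
    one_hot[labels[idx]] = 1.0
    return one_hot

def input_visualizer(sample):
    """Render an input sample as a 2-D image for the platform UI."""
    return sample.squeeze(-1)
```

The encoders are indexed per sample so the platform can fetch, visualize, and analyze individual samples on demand rather than loading the whole dataset at once.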
The Tensorleap integration flow
Start from an existing repo in our Leap Hub, or use the CLI to add template Tensorleap integration files to your existing repository.
Make sure your data is accessible to the Tensorleap Server.
Create a basic integration script. This can include an input encoder, a GT encoder, a loss, and a basic visualizer for the input, prediction, and GT.
Verify the validity of your integration locally using an integration test.
Use the CLI to push the code and models to the platform.
If leap_mapping.yaml was not pushed to the server, create a new mapping file.
Validate the assets after they are uploaded to the server.
Use the platform to evaluate and process the data with your model.
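The local verification step above can be sketched as a plain Python check that runs the dataset encoders over a few samples and validates their outputs before anything is pushed to the server. The harness below is a hypothetical illustration, not the Tensorleap SDK's test runner; it assumes encoders shaped like those a typical integration script defines.

```python
# Hypothetical local integration test: iterate a few samples through the
# encoders and assert basic output properties before pushing to the
# platform. All helper names are illustrative, not the Tensorleap SDK.
import numpy as np

def check_integration(preprocess, input_encoder, gt_encoder, n_samples=5):
    """Run the encoders on a handful of samples and validate outputs."""
    report = []
    subsets = preprocess()
    for name, subset in subsets.items():
        for idx in range(n_samples):
            x = input_encoder(idx, subset)
            y = gt_encoder(idx, subset)
            assert isinstance(x, np.ndarray) and x.dtype == np.float32, \
                f"{name}[{idx}]: input must be a float32 array"
            assert y.ndim == 1, f"{name}[{idx}]: GT must be a 1-D vector"
        report.append((name, n_samples))
    return report
```

Catching shape and dtype mismatches locally is much cheaper than discovering them after the code and model have been uploaded.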