Quickstart using Leap Hub

How to quickly load an off-the-shelf model & dataset using Leap Hub

Most of our off-the-shelf integrations assume internet access or a pre-existing dataset. We recommend going over the README of each repo to understand its requirements.

In this section, we will go over the steps needed to push an existing hub project into your local Tensorleap installation. At the end of this section you will have a code integration (----LINK NEEDED----) and a model within the Tensorleap platform.

Choosing a hub project

To choose a hub project, we recommend visiting our GitHub space and reviewing the different repositories. Each repository is a Tensorleap integration for a given model and dataset.

  • If it's your first time using Tensorleap, we recommend using the MNIST use-case. It does not require downloading any datasets, and thus enables a straightforward integration.

  • Otherwise, we recommend browsing the hub for a repo whose data and model resemble your organization's use-case.

The next sections will assume MNIST was chosen for the integration. If you chose any other project, additional steps such as downloading a dataset or a model might be needed. Please follow the relevant README to fill in the missing details.

Setting up the repo

  1. Clone the repo onto the computer on which the CLI is installed.

  2. Create a virtual environment for the repo and install the repo's requirements.

Our repos usually include a pyproject.toml Poetry file, which specifies the expected Python version and dependencies. We recommend combining pyenv to set the local Python interpreter with Poetry to manage the dependencies.
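The two steps above can be sketched as follows. The repo URL and Python version below are placeholders; check your chosen project's README and its pyproject.toml for the actual values.

```shell
# Hypothetical values; replace with your chosen hub repo's URL and the
# Python version pinned in its pyproject.toml.
REPO_URL="https://github.com/tensorleap/example-hub-repo"
PYTHON_VERSION="3.10"

git clone "$REPO_URL" hub-repo                    # step 1: clone the repo
cd hub-repo

pyenv install --skip-existing "$PYTHON_VERSION"   # get the expected interpreter
pyenv local "$PYTHON_VERSION"                     # pin it for this directory
poetry env use "$(pyenv which python)"            # point Poetry at it
poetry install                                    # step 2: install dependencies
```

Using `pyenv local` writes a `.python-version` file in the repo, so the correct interpreter is picked up automatically whenever you work inside that directory.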

Integration Test

From within the repo, run leap_custom_test.py and ensure the script terminates successfully.
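For example, using the Poetry environment set up in the previous section (the script name comes from the hub repo layout):

```shell
# Run the repo's integration test inside its Poetry environment.
poetry run python leap_custom_test.py
STATUS=$?   # a zero exit status means the script terminated successfully
```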

Integrating the codebase and model into the platform

To integrate a Tensorleap Project into the platform, from within the repo run:

leap projects push model/model.h5

You will then be prompted to provide:

  • Model Name

  • Project Name (choose create new)

  • Code Integration Name (choose create new)

The CLI will now upload the model and the codebase to the server. The first run after installing or upgrading Tensorleap might take up to 10 minutes to complete; subsequent uploads finish much quicker.
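For reference, a complete push from the MNIST repo might look like this. The model path matches the command above; the prompt answers shown in the comments are hypothetical examples, not required names.

```shell
MODEL_PATH="model/model.h5"   # the MNIST repo's bundled model; other repos may differ
leap projects push "$MODEL_PATH"
# Example interactive answers (hypothetical):
#   Model Name:             mnist-demo
#   Project Name:           create new -> "MNIST Quickstart"
#   Code Integration Name:  create new -> "mnist-integration"
```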

A valid upload shows the following logs in the terminal:

...
INFO Code parsed successfully
...    
INFO Successfully imported model
...
INFO mapping was applied successfully with no validation errors

Once these appear within the terminal, the model and dataset are integrated into the platform.

Data validation and Evaluation in the platform

To review your project and run the data through the uploaded model:

  1. Log in to your Tensorleap app

  2. Click on the project you've just uploaded

Selecting a Project
  3. Click the Code Integration tab, then click "Validate Assets" to make sure all of the assets were uploaded successfully to the server. Wait for the "Validate Assets" button to turn green, signifying a successful integration.

Validating the integration of the Tensorleap assets after uploading to the server
  4. Click the Evaluate button to open the Evaluation panel. Click Evaluate again to start the evaluation process and run inference on your data using the uploaded model.

Evaluating the data
  5. Wait for the evaluation process to complete. To follow its progress, open the "Runs and Processes" panel.

The Runs and Processes Tab

Once the evaluation is complete, the integration is finished. You can now open the Dashboard tab and start analyzing the model and the dataset:

Switching to the Dashboard Tab
