Quickstart using Leap Hub

How to quickly load an off-the-shelf model & dataset using Leap Hub

Most of the off-the-shelf integrations we provide assume internet access or a pre-existing dataset. We recommend going over the README of each repo to understand its requirements.

In this section we will go over the steps needed to push an existing hub project into your local Tensorleap installation. At the end of this section you will have a model and data integrated in the Tensorleap platform.

Choosing a hub project

To choose a hub project, we recommend visiting our GitHub space and reviewing the different repositories. Each repository is a Tensorleap integration for a given model and dataset.

  • If it's your first time using Tensorleap, we recommend using the MNIST use-case. It does not require downloading any datasets, and thus enables a straightforward integration.

  • Otherwise, we recommend browsing the hub for a repo whose data and model are similar to your organization's use-case.

The next sections will assume MNIST was chosen for the integration. If you chose a different project, additional steps such as downloading a dataset or a model might be needed. Please follow the relevant README to fill in the missing details.

Setting up the repo

  1. Clone the repo into the computer on which the CLI was installed.

  2. Create a virtual environment for the repo and install the repo's requirements.

Our repos usually include a pyproject.toml Poetry file, which specifies the expected Python version and dependencies. We recommend a combination of pyenv to set the local Python interpreter and Poetry to manage the dependencies.
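
A typical setup might look like the following minimal sketch. The repository URL and Python version here are placeholders; use the actual repo URL and the version listed in the chosen repo's pyproject.toml.

```bash
# Clone the chosen hub repo (URL is a placeholder)
git clone https://github.com/tensorleap/<hub-repo>.git
cd <hub-repo>

# Set the local Python interpreter to the version expected by pyproject.toml
# (3.9.16 is only an example)
pyenv install --skip-existing 3.9.16
pyenv local 3.9.16

# Point Poetry at that interpreter and install the dependencies
poetry env use "$(pyenv which python)"
poetry install
```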

Integration Test

From within the repo, run leap_integration.py and ensure the script terminates successfully.
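
Assuming the Poetry environment from the previous step, the test can be run like this (the script name matches the MNIST repo and may differ in other repos):

```bash
# Run the integration test inside the Poetry-managed environment
poetry run python leap_integration.py
```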

Integrating the codebase and model to the platform

To upload your model and code, run the push command from the repository root:
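
A typical invocation looks like the sketch below; the exact subcommand and arguments are an assumption here, so verify them with `leap --help` or the repo's README.

```bash
# Upload the model and codebase from the repository root
# (subcommand and arguments are an assumption; check `leap --help`)
leap projects push <path-to-model-file>
```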

You will then be prompted to provide:

  • Project name (choose the "create new" option)

  • Choose model path

  • Choose model name

The CLI will now upload the model and the codebase to the server. The first run after installing or upgrading Tensorleap might take up to 10 minutes to complete due to initial setup; subsequent uploads finish much faster.

A valid upload shows the following logs in the terminal:

Once these appear in the terminal, the model and dataset are integrated into the platform.

Data validation and evaluation in the platform

To review your project and run the data through the uploaded model:

  1. Log in to your Tensorleap app

  2. Click on the project you've just uploaded

Selecting a Project
  3. Click on the Evaluate button to open the Evaluation panel. Click Evaluate again to start the evaluation process and infer your data using the uploaded model.

Evaluating the data
  4. Wait for the evaluation process to complete. To follow the progress of the evaluation, you can open the "Runs and Processes" panel.

The Runs and Processes Tab

Once the evaluation is complete, the integration is finished. We can now go to the Dashboard tab and start analyzing the model and the dataset:

Switching to the Dashboard Tab
