# CLI Asset Upload

{% hint style="success" %}
Prerequisites for uploading assets to the Tensorleap platform: <br>

* Having a Tensorleap server [installed](https://docs.tensorleap.ai/getting-started/tensorleap-setup/installation#tensorleap-server), with access to the dataset.
* Having a CLI [installed](https://docs.tensorleap.ai/getting-started/tensorleap-setup/installation#tensorleap-cli-installation) & [authenticated](https://docs.tensorleap.ai/getting-started/tensorleap-setup/cli-authentication), that is able to [access the Tensorleap server](https://docs.tensorleap.ai/getting-started/tensorleap-setup/installation#where-should-you-install-the-server-and-client) via port 4589.
* Access to a valid [integration script](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code) that passed the [integration test](https://docs.tensorleap.ai/tensorleap-integration/integration-test), and/or access to a .onnx or .h5 model to upload to the platform
  {% endhint %}

The Tensorleap CLI is used to upload codebases and models to the Tensorleap platform.

Assets can be uploaded to the platform in three ways:

* [Uploading code that accesses local assets or publicly available assets](#uploading-code-only)
* [Uploading code that requires authentication to a cloud or service](#uploading-a-codebase-that-requires-a-secret)
* [Uploading both code and a model](#uploading-both-a-code-and-model)

### Uploading a Model-Code Pair

To upload a model and a codebase, run the following from the root of the repo (where leap.yaml is located):\
`leap push`

This first tries to [upload your code](#uploading-code-only) and then tries to [upload the model](#uploading-a-model).

This expects the [leap\_integration\_test](https://docs.tensorleap.ai/tensorleap-integration/integration-test) interface to be implemented so that the platform knows how to connect your code and model.

### Overriding Existing Code

To override the code in a model-code pair on the platform, run `leap push` from the root of the repo (where [leap.yaml](https://docs.tensorleap.ai/tensorleap-integration/leap.yaml) is located), and instead of creating a new version, choose the one you already created.

{% hint style="danger" %}
If you override existing code, the only change supported without re-running evaluation is a change to the visualization code. If you change any metric, metadata, or data-loading code, you will need to re-evaluate.
{% endhint %}

The first action on a code push with a requirements.txt upload is the creation of a virtual environment. This may take some time on first creation, but subsequent uploads with the same requirements will reuse the existing virtual environment.
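Since the environment is keyed to the requirements, pinning exact versions helps subsequent uploads hit the cached environment. A minimal requirements.txt might look like the following (the packages and versions are illustrative, not Tensorleap requirements):

```
numpy==1.26.4
pillow==10.3.0
```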

Next, the code is uploaded to the platform. The server then tries to run the [preprocess function](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/preprocess-function), get an [input](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/input-encoder), get a [GT](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/ground-truth-encoder), and run all of the [metadata](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code/metadata-function) functions for the first index of the dataset.
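The first-sample check described above can be sketched roughly as follows. All function names here (`preprocess`, `input_encoder`, `gt_encoder`, `metadata_label`) are illustrative placeholders standing in for your integration script's functions, not the actual Tensorleap SDK API:

```python
# Rough sketch of the server-side sanity check on upload: run the full
# pipeline on the first sample only. All names below are placeholders.

def preprocess():
    """Stand-in for your preprocess function: returns the dataset samples."""
    return [{"image": [0.0, 0.1, 0.2, 0.3], "label": 1}]

def input_encoder(sample):
    """Stand-in for your input encoder: returns the model input."""
    return sample["image"]

def gt_encoder(sample):
    """Stand-in for your ground-truth encoder: returns the GT."""
    return sample["label"]

def metadata_label(sample):
    """Stand-in for one of your metadata functions."""
    return {"label": sample["label"]}

# Emulate the check: fetch the first index and run every stage on it.
first = preprocess()[0]
model_input = input_encoder(first)
ground_truth = gt_encoder(first)
metadata = metadata_label(first)
print("Code parsed successfully")  # analogous to the CLI's INFO line
```

If any stage raises an exception on the first sample, the upload fails with that error, which is why a passing local integration test usually means a clean upload.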

If this runs successfully, the CLI should show:

```
INFO Code parsed successfully
```

Otherwise, the CLI prints the stack trace of the error it encountered. For a more complete log, log in to the platform and review the Runs & Processes menu.

{% hint style="danger" %}
Common code integration errors may occur when:

* [Integration](https://docs.tensorleap.ai/tensorleap-integration/writing-integration-code) interface contract (e.g. shapes, types) was not kept. Please ensure you can run a [local integration test](https://docs.tensorleap.ai/tensorleap-integration/integration-test) successfully.
* Some of the files were not included in the leap.yaml.
* An error occurred while accessing a specific path. Please ensure that all data the integration code reads from storage (including configs, labels, or any other assets) is located in a folder that was mounted to the Tensorleap server [on installation](https://docs.tensorleap.ai/getting-started/tensorleap-setup/installation#tensorleap-server). The **leap server info** command lists the mounted folders under the `datasetvolumes` property.
  {% endhint %}

### Uploading a Codebase That Requires a Secret

To access cloud storage, or some other service that requires a secret within your code, you can utilize the [Secret Manager](https://docs.tensorleap.ai/user-interface/secrets-management). After uploading a secret to the platform, associate it with your current integration by running:

```
leap secrets set
```

This interactively lets you choose which secret you would like to use with this code. Once set, the secret can be accessed within the code via an environment variable:

```python
import os
auth_secret_string = os.environ['AUTH_SECRET']
```

{% hint style="info" %}
This environment variable is set automatically once the codebase is uploaded & utilized within the Tensorleap server. For the [local integration test](https://docs.tensorleap.ai/tensorleap-integration/integration-test) we recommend setting an environment variable with the same name in your local environment for consistency.
{% endhint %}
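For example, before running the local integration test you might export a variable with the same name in your shell (the value below is a placeholder, and `AUTH_SECRET` matches the example above):

```shell
# Placeholder value for local runs only; the platform injects the real
# secret automatically after `leap secrets set`.
export AUTH_SECRET="dummy-local-value"

# Run your local integration test script in this same shell session so
# the code reads the variable just as it would on the server.
echo "AUTH_SECRET is set: ${AUTH_SECRET:+yes}"
```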

If [leap.yaml](https://docs.tensorleap.ai/tensorleap-integration/leap.yaml) fields are invalid, or if this is the first integration of the code and some fields are missing, the CLI will interactively ask you to choose an existing project or create a new one. It will also interactively ask for a model name.
