Versions

Save versions and models, run analyses and view their results, and export models out of Tensorleap

The Versions view is where you track all the saved versions and models of your project. From here, you can save versions and export models.

Versions Layout Overview

The Versions view lists all versions of your project. Expanding each version shows a list of its related models.

To view the analysis results for a model, toggle the buttons to the left of the model. The stars to the right of the buttons light up, indicating that the analysis is running.

Things to keep in mind when working with versions and models:

  • You can save a version of a network after creating a new one or editing an existing one (see Save a Version).

  • Models are saved under versions. You need to expand a version to see the models under that version.

  • You can export a model out of Tensorleap into a common standard format (see Export a Model).

The demo video below shows the results of the analysis performed on a model.

Save a Version

Once you complete laying out a new network or making changes to an existing one, you can save it as a new version.

To save a network:

  1. On the Versions view, enter a revision name.

  2. Click Save to add the new version to the list in the Versions view.

By default, when you create a new project, Tensorleap also creates the first version within the master branch. If you want to override this initial version, mark the Override Current Version checkbox when saving a network. If not selected, a new version is added.

Export a Model

When training a network, you will be required to name the model (see Evaluate / Train Model). The model appears under the source network, where you can export it out of Tensorleap into a common standard format.

To export a trained model out of Tensorleap:

  1. With the Versions view open, search for the model under your version.

  2. In the Export Model window, select the format in which the model will be saved.

  3. The export job appears in the list to the right with its status set to Pending. A notification message also appears briefly on your screen.

  4. Once Tensorleap completes compiling the file, the status is set to Finished.

If the status is still shown as Pending after some time, you may need to refresh the page to see it change to Finished.

Export Formats

JSON_TF2 - JSON format (TensorFlow 2)

The JSON format serializes the model layers, their properties and connectivity. It does not hold the state (weights) of the model.

Below is a code snippet for loading the exported JSON file:

import tensorflow as tf

# Read the exported architecture and rebuild the model (weights are not included)
with open('exported_model.json') as f:
    json_data = f.read()
model = tf.keras.models.model_from_json(json_data)

More info at https://www.tensorflow.org/api_docs/python/tf/keras/models/model_from_json.
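
Since the JSON export holds only the architecture, the rebuilt model starts with freshly initialized weights. If you also have a separate weights file for the same architecture, it can be loaded afterwards; the weights file name below is hypothetical and only illustrates the Keras call:

import tensorflow as tf

# Rebuild the architecture from the exported JSON (no weights included)
with open('exported_model.json') as f:
    model = tf.keras.models.model_from_json(f.read())

# Hypothetical weights file - the JSON export itself does not contain weights
model.load_weights('model_weights.h5')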

H5_TF2 - H5 format (TensorFlow 2)

The H5 format serializes the model architecture and its state (weights) into a single .h5 file.

Below is a code snippet for loading the exported h5 file:

import tensorflow as tf

# Load the full model (architecture and weights) from the exported H5 file
model = tf.keras.models.load_model('exported_model.h5')

More info at https://www.tensorflow.org/tutorials/keras/save_and_load.
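
Once loaded, the result is a regular Keras model, so you can inspect it or run predictions directly, for example:

import tensorflow as tf

model = tf.keras.models.load_model('exported_model.h5')

# Print the layer structure to confirm the model loaded as expected
model.summary()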

ONNX format (PyTorch)

The ONNX format is commonly used with PyTorch for serializing the model's layers and state (weights).

Below is a code snippet for loading the exported onnx file in PyTorch:

import onnx

# Load the ONNX model
model = onnx.load("exported_model.onnx")

More info at https://pytorch.org/docs/stable/onnx.html.
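
The snippet above loads the serialized graph but does not execute it. To run inference on the exported model, one option is the onnxruntime package. The sketch below assumes onnxruntime is installed and that the model takes a single input; the input shape shown is a placeholder to replace with your model's actual input shape.

import numpy as np
import onnxruntime as ort

# Create an inference session directly from the exported file
session = ort.InferenceSession("exported_model.onnx")

# Query the input name instead of hardcoding it
input_name = session.get_inputs()[0].name

# Placeholder input shape - replace with your model's real input shape
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run the model; passing None returns all outputs
outputs = session.run(None, {input_name: dummy_input})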

SavedModel_TF2 - SavedModel serialization (TensorFlow 2)

This format uses the TensorFlow 2 SavedModel format and exports a folder with files containing the serialized model layers and state.

The serialized data is stored to a folder with this directory structure:

assets/ (folder)
variables/ (folder)
saved_model.pb

When downloading the SavedModel format from Tensorleap, the exported folder is packaged as a compressed .tar.gz archive.

To load the exported model, you must first extract the .tar.gz archive into a folder. One way to do this is with tar:

tar -xf exported_model.tar.gz
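
If you prefer to stay in Python, the same extraction can be done with the standard-library tarfile module:

import tarfile

# Extract the exported archive into a local folder
with tarfile.open('exported_model.tar.gz') as archive:
    archive.extractall('extracted_model')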

Below is a code snippet to load the model from the extracted folder:

import tensorflow as tf

# Load the model from the extracted SavedModel folder
extracted_model_path = '/path/to/extracted_model'
model = tf.keras.models.load_model(extracted_model_path)

More info at https://www.tensorflow.org/api_docs/python/tf/keras/models/save_model.
