Merlin Models Example Notebooks#

The example notebooks demonstrate how to use Merlin Models with TensorFlow on a variety of datasets.

Inventory#

Running the Example Notebooks#

You can run the examples with Docker containers that are available from the NVIDIA GPU Cloud (NGC) catalog. Browse the catalog of containers at http://ngc.nvidia.com/catalog/containers.

Most example notebooks demonstrate how to use Merlin Models with TensorFlow. The following container can train a model and perform inference, and supports all the notebooks:

  • merlin-tensorflow (contains Merlin Core, Merlin Models, Merlin Systems, NVTabular, TensorFlow, and Triton Inference Server)

Alternatively, you can install Merlin Models from source and other required libraries to run the notebooks on your host by following the instructions in the README from the GitHub repository.
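As a concrete sketch of the steps below, pulling and starting the merlin-tensorflow container might look like the following. The registry path and the `latest` tag are assumptions; verify the exact image path and available tags in the NGC catalog before pulling.

```shell
# Pull the merlin-tensorflow image from NGC.
# The registry path and tag are assumptions; check the NGC catalog
# for the current image path and released tags.
docker pull nvcr.io/nvidia/merlin/merlin-tensorflow:latest

# Start the container, substituting the pulled image for the
# <docker container> placeholder in the generic command in step 1.
docker run --gpus all --rm -it \
  -p 8888:8888 -p 8797:8787 -p 8796:8786 --ipc=host \
  nvcr.io/nvidia/merlin/merlin-tensorflow:latest /bin/bash
```

Port 8888 is forwarded for JupyterLab; the 8797 and 8796 mappings expose the Dask dashboard and scheduler ports used by some notebooks.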

To run the example notebooks using Docker containers, perform the following steps:

  1. Pull and start the container by running the following command:

    docker run --gpus all --rm -it \
      -p 8888:8888 -p 8797:8787 -p 8796:8786 --ipc=host \
      <docker container> /bin/bash
    

The container opens a shell when the run command completes. Your shell prompt should look similar to the following example:

    root@2efa5b50b909:
    
  2. Start the JupyterLab server by running the following command:

    jupyter-lab --allow-root --ip='0.0.0.0'
    

View the messages in your terminal to identify the URL for JupyterLab. The terminal output includes lines similar to the following example:

    Or copy and paste one of these URLs:
    http://2efa5b50b909:8888/lab?token=9b537d1fda9e4e9cadc673ba2a472e247deee69a6229ff8d
    or http://127.0.0.1:8888/lab?token=9b537d1fda9e4e9cadc673ba2a472e247deee69a6229ff8d
    
  3. Open a browser and go to the 127.0.0.1 URL shown in the JupyterLab messages.

  4. After you log in to JupyterLab, navigate to the /models/examples directory to try out the example notebooks.