Merlin TensorFlow Inference Support Matrix
This container enables you to deploy NVTabular workflows and TensorFlow models to the Triton Inference Server for production.
TensorFlow itself is not included in the container. Instead, the container provides the TensorFlow backend that Triton Inference Server uses to execute TensorFlow models.
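As a minimal sketch of how the container is typically used, the commands below pull the image from NGC and launch Triton Inference Server against a local model repository. The image tag and the host path `/path/to/models` are illustrative assumptions; substitute the release tag and model repository you actually use.

```shell
# Pull the Merlin TensorFlow inference container from NGC
# (the 22.04 tag is an example; pick the release you need).
docker pull nvcr.io/nvidia/merlin/merlin-tensorflow-inference:22.04

# Start the container and launch Triton, mounting a host directory
# that contains exported NVTabular workflows and TensorFlow models.
# /path/to/models is a placeholder for your model repository.
docker run --gpus=all --rm -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/models:/models \
  nvcr.io/nvidia/merlin/merlin-tensorflow-inference:22.04 \
  tritonserver --model-repository=/models
```

Ports 8000, 8001, and 8002 are Triton's default HTTP, gRPC, and metrics endpoints, respectively.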
22.xx Container Images
| Container Release | Release 22.04 | Release 22.03 |
|---|---|---|
| **DGX** | | |
| DGX System | | |
| Operating System | Ubuntu 20.04.4 LTS | Ubuntu 20.04.3 LTS |
| **NVIDIA Certified Systems** | | |
| NVIDIA Driver | NVIDIA Driver version 465.19.01 or later is required. However, if you're running on Data Center GPUs (formerly Tesla) such as T4, you can use any of the following NVIDIA Driver versions. Note: The CUDA Driver Compatibility Package does not support all drivers. | NVIDIA Driver version 465.19.01 or later is required. However, if you're running on Data Center GPUs (formerly Tesla) such as T4, you can use any of the following NVIDIA Driver versions. Note: The CUDA Driver Compatibility Package does not support all drivers. |
| GPU Model | | |
| **Base Container Image** | | |
| Container Operating System | Ubuntu 20.04.4 LTS | Ubuntu 20.04.3 LTS |
| Base Container | Triton version 22.03 | Triton version 22.02 |
| CUDA | 11.6.1.005 | 11.6.0.021 |
| RMM | 21.12.00 | 21.12.00 |
| cuDF | 22.2.0 | 21.12.02 |
| cuDNN | 8.3.3.40+cuda11.5 | 8.3.2.44+cuda11.5 |
| Merlin Core | 0.2.0 | v0.1.1+3.gee1d59d |
| Merlin Models | 0.3.0 | Not applicable |
| Merlin Systems | 0.1.0 | Not applicable |
| NVTabular | 1.0.0 | 0.11.0 |
| Transformers4Rec | 0.1.7 | 0.1.6 |
| HugeCTR | Not applicable | Not applicable |
| SM | Not applicable | Not applicable |
| TensorFlow | Not applicable | Not applicable |
| Triton Inference Server | 2.20.0 | 2.19.0 |
| Size | 9.43 GB | 9.2 GB |