merlin.systems.dag.Ensemble#

class merlin.systems.dag.Ensemble(ops, schema, label_columns=None)[source]#

Bases: object

Class that represents an entire ensemble consisting of multiple models that run sequentially in Triton Inference Server, initiated by an inference request.

__init__(ops, schema, label_columns=None)[source]#

Construct a systems ensemble.

Parameters:
  • ops (InferenceNode) – An inference node that represents the chain of operators for the ensemble.

  • schema (Schema) – The schema of the input data.

  • label_columns (List[str], optional) – List of strings representing label columns, by default None
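
For orientation, a minimal construction sketch in the usual Merlin Systems style, assuming a fitted NVTabular workflow (`workflow`) and a trained TensorFlow model (`model`), both hypothetical; the `>>` operator chains operators into the inference graph that `ops` represents.

```python
from merlin.systems.dag import Ensemble
from merlin.systems.dag.ops.workflow import TransformWorkflow
from merlin.systems.dag.ops.tensorflow import PredictTensorflow

# `workflow` (a fitted NVTabular Workflow) and `model` (a trained
# TensorFlow model) are assumed to exist; they are placeholders here.
ops = (
    workflow.input_schema.column_names
    >> TransformWorkflow(workflow)
    >> PredictTensorflow(model)
)

# The ensemble pairs the operator graph with the schema of its inputs.
ensemble = Ensemble(ops, workflow.input_schema)
```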

Methods

__init__(ops, schema[, label_columns])

Construct a systems ensemble.

export(export_path[, runtime])

Write out an ensemble model configuration directory.

load(path)

Load a saved ensemble object from disk.

save(path)

Save this ensemble to disk.

transform(transformable[, runtime])

Delegate transformation of input data to the executor with the necessary ensemble graph.

Attributes

property input_schema#

The schema of the ensemble graph's inputs.

property output_schema#

The schema of the ensemble graph's outputs.
transform(transformable: Transformable, runtime=None)[source]#

Delegate transformation of input data to the executor with the necessary ensemble graph. This traverses each node of the graph and transforms the input according to each node's operator.

Parameters:
  • transformable (Transformable) – Input data to the graph (DataFrame or DictArray).

  • runtime (Runtime, optional) – The graph runtime to use to transform the inputs, by default None

Returns:

transformed data

Return type:

Transformable
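
As an illustration, a sketch of an in-process call, assuming `ensemble` was constructed as in the example above and that the placeholder column names match its input schema:

```python
import pandas as pd

# Column names are hypothetical; real inputs must match ensemble.input_schema.
request = pd.DataFrame({"user_id": [1, 2, 3], "item_id": [10, 20, 30]})

# With runtime=None, the default executor walks the graph node by node.
response = ensemble.transform(request)
print(response)
```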

save(path)[source]#

Save this ensemble to disk.

Parameters:

path (str) – The path to save the ensemble to
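
A brief sketch, assuming the `ensemble` instance from above and a writable, hypothetical directory:

```python
# Persists the ensemble's graph and schemas under the given directory.
ensemble.save("/tmp/ensemble_artifacts")
```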

classmethod load(path) → Ensemble[source]#

Load a saved ensemble object from disk.

Parameters:

path (str) – The path to load the ensemble from

Returns:

The ensemble loaded from disk

Return type:

Ensemble
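
A round-trip sketch that restores the ensemble saved in the previous example (the path is hypothetical):

```python
from merlin.systems.dag import Ensemble

# Rebuild the ensemble object from the saved artifacts.
restored = Ensemble.load("/tmp/ensemble_artifacts")
print(restored.input_schema)
```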

export(export_path, runtime=None, **kwargs)[source]#

Write out an ensemble model configuration directory. The exported ensemble is designed for use with Triton Inference Server.

Parameters:
  • export_path (str) – The path to write the ensemble configuration directory to.

  • runtime (Runtime, optional) – The graph runtime to use for the export, by default None
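
A sketch of exporting for serving, assuming the `ensemble` from above and a hypothetical local model repository path:

```python
# Writes a Triton model repository layout under /models.
ensemble.export("/models")
```

The exported directory can then be served by Triton, for example with `tritonserver --model-repository=/models`.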