merlin.systems.dag.ops.softmax_sampling.SoftmaxSampling#

class merlin.systems.dag.ops.softmax_sampling.SoftmaxSampling(relevance_col, temperature=20.0, topk=10, _input_col=None)[source]#

Bases: InferenceOperator

Given input columns of candidate IDs and predicted relevance scores, this operator sorts the candidates in descending order of relevance, using softmax-weighted sampling to introduce controlled randomness into the ordering.

__init__(relevance_col, temperature=20.0, topk=10, _input_col=None)[source]#

Create a SoftmaxSampling Pipelineable Inference Operator.

Parameters:
  • relevance_col (string) – The column used to determine the sorting order.

  • temperature (float, optional) – Value used to adjust the weights used in sorting, by default 20.0

  • topk (int, optional) – The maximum number of results to return as output, by default 10

  • _input_col (string, optional) – The column whose values will be sorted, by default None.
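
A minimal usage sketch in a Merlin Systems ensemble graph (the column names "movie_ids" and "relevance_score" and the upstream retrieval node are illustrative, not part of this API):

    from merlin.systems.dag.ops.softmax_sampling import SoftmaxSampling

    # Re-rank the candidate ids by their predicted relevance and keep the 10 highest.
    # `retrieval` is assumed to be an upstream node that produces both columns.
    ordering = retrieval["movie_ids"] >> SoftmaxSampling(
        relevance_col="relevance_score",
        temperature=20.0,
        topk=10,
        _input_col="movie_ids",
    )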

Methods

__init__(relevance_col[, temperature, topk, ...])

Create a SoftmaxSampling Pipelineable Inference Operator.

column_mapping(col_selector)

Compute which output columns depend on which input columns

compute_column_schema(col_name, input_schema)

compute_input_schema(root_schema, ...)

compute_output_schema(input_schema, col_selector)

Describe the operator's outputs

compute_selector(input_schema, selector[, ...])

Provides a hook method for sub-classes to override to implement custom column selection logic.

create_node(selector)

Create a node for this operator from the given column selector.

export(path, input_schema, output_schema[, ...])

Write out a Triton model config directory

from_config(config, **kwargs)

Load operator and properties from Triton config

from_model_registry(registry, **kwargs)

Loads the InferenceOperator from the provided ModelRegistry.

from_path(path, **kwargs)

Loads the InferenceOperator from the path where it was exported after training.

load_artifacts(artifact_path)

Hook method that provides a way to load saved artifacts for the operator

output_column_names(col_selector)

Given a set of column names, returns the names of the transformed columns this operator will produce

save_artifacts([artifact_path])

Save artifacts required to reload the operator state from disk

transform(col_selector, transformable)

Transform the dataframe by applying this operator to the set of input columns

validate_schemas(parents_schema, ...[, ...])

Provides a hook method that sub-classes can override to implement schema validation logic.

Attributes

dependencies

dynamic_dtypes

export_name

Provides a clear common English identifier for this operator.

exportable_backends

is_subgraph

label

output_dtype

output_properties

output_tags

scalar_shape

supported_formats

supports

Returns what kind of data representation this operator supports

classmethod from_config(config, **kwargs) → SoftmaxSampling[source]#

Load operator and properties from Triton config

property dependencies#
export(path: str, input_schema: Schema, output_schema: Schema, params: Optional[dict] = None, node_id: Optional[int] = None, version: int = 1, backend: str = 'ensemble')[source]#

Write out a Triton model config directory
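
A hedged sketch of calling export directly (in practice the ensemble export machinery usually invokes this for you; the path, schemas, and node_id below are illustrative):

    # Illustrative only: input_schema and output_schema would normally come from
    # the neighbouring operators in the ensemble graph.
    op = SoftmaxSampling(relevance_col="relevance_score", topk=10)
    op.export(
        "/tmp/ensemble_models",   # directory that receives the Triton model config
        input_schema,
        output_schema,
        node_id=2,                # position of this operator within the ensemble
    )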

compute_input_schema(root_schema: Schema, parents_schema: Schema, deps_schema: Schema, selector: ColumnSelector) → Schema[source]#
compute_output_schema(input_schema: Schema, col_selector: ColumnSelector, prev_output_schema: Optional[Schema] = None) → Schema[source]#

Describe the operator’s outputs

transform(col_selector: ColumnSelector, transformable: Transformable) → Transformable[source]#

Transform the dataframe by applying this operator to the set of input columns
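
Conceptually, the transform re-orders candidates by softmax-weighted sampling of their relevance scores; a standalone sketch of the general idea (not the library's exact implementation; all names below are illustrative):

    import numpy as np

    def softmax_sample_topk(ids, relevance, temperature=20.0, topk=10):
        """Return a top-k ordering biased toward high-relevance ids."""
        # Exponentiate the temperature-scaled scores, then draw random keys so the
        # ordering is stochastic but favours high-relevance items (weighted
        # sampling in the spirit of Efraimidis-Spirakis).
        weights = np.exp(temperature * relevance)
        keys = np.random.uniform(0.0, 1.0, size=weights.shape) ** (1.0 / weights)
        order = np.argsort(-keys)[:topk]
        return ids[order]

    ids = np.array([101, 102, 103, 104, 105])
    relevance = np.array([0.9, 0.2, 0.5, 0.8, 0.1])
    print(softmax_sample_topk(ids, relevance, temperature=20.0, topk=3))

Higher temperature values amplify differences between scores, so the highest-relevance candidates almost always come first; lower values flatten the weights and make the ordering closer to a uniform shuffle.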