transformers4rec.tf.block package
Submodules
transformers4rec.tf.block.base module
- class transformers4rec.tf.block.base.Block(*args, **kwargs)
  Bases: transformers4rec.config.schema.SchemaMixin, keras.engine.base_layer.Layer
- class transformers4rec.tf.block.base.SequentialBlock(*args, **kwargs)
  Bases: transformers4rec.tf.block.base.Block
  The SequentialBlock represents a sequence of Keras layers. It is a Keras layer that can be used in place of tf.keras.layers.Sequential, which is actually a Keras Model. In contrast to Keras Sequential, this layer can be used as a pure layer in tf.functions and when exporting SavedModels, without having to pre-declare input and output shapes. In turn, this layer is usable as a preprocessing layer for TF Agents Networks, and can be exported via PolicySaver.

  Usage:

      c = SequentialBlock([layer1, layer2, layer3])
      output = c(inputs)
      # Equivalent to: output = layer3(layer2(layer1(inputs)))
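  A minimal sketch of a fuller use with plain Keras layers; the layer sizes and input shape here are illustrative, not part of the API:

      import tensorflow as tf
      from transformers4rec.tf.block.base import SequentialBlock

      # Chain two standard Keras layers; unlike tf.keras.layers.Sequential,
      # no input/output shapes need to be pre-declared.
      block = SequentialBlock([
          tf.keras.layers.Dense(64, activation="relu"),
          tf.keras.layers.Dense(32),
      ])

      inputs = tf.random.uniform((8, 16))  # illustrative: batch of 8, 16 features
      outputs = block(inputs)              # same as dense2(dense1(inputs))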
  - property inputs
  - property trainable_weights
  - property non_trainable_weights
  - property trainable
  - property losses
  - property regularizers
transformers4rec.tf.block.dlrm module
- class transformers4rec.tf.block.dlrm.ExpandDimsAndToTabular(*args, **kwargs)
  Bases: keras.layers.core.lambda_layer.Lambda
- class transformers4rec.tf.block.dlrm.DLRMBlock(*args, **kwargs)
  Bases: transformers4rec.tf.block.base.Block
  - classmethod from_schema(schema: merlin_standard_lib.schema.schema.Schema, bottom_mlp: Union[keras.engine.base_layer.Layer, transformers4rec.tf.block.base.Block], top_mlp: Optional[Union[keras.engine.base_layer.Layer, transformers4rec.tf.block.base.Block]] = None, **kwargs)
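    A minimal sketch of calling from_schema, assuming a merlin_standard_lib Schema named `schema` is already loaded; the MLP sizes are illustrative:

        import tensorflow as tf
        from transformers4rec.tf.block.dlrm import DLRMBlock

        # `schema` is assumed to describe the continuous and categorical
        # input features of the dataset.
        dlrm_block = DLRMBlock.from_schema(
            schema=schema,
            bottom_mlp=tf.keras.layers.Dense(64, activation="relu"),
            top_mlp=tf.keras.layers.Dense(32, activation="relu"),  # optional
        )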
transformers4rec.tf.block.mlp module
transformers4rec.tf.block.transformer module
- class transformers4rec.tf.block.transformer.TransformerPrepare(*args, **kwargs)
  Bases: keras.engine.base_layer.Layer
- class transformers4rec.tf.block.transformer.TransformerBlock(*args, **kwargs)
  Bases: transformers4rec.tf.block.base.Block
  Class to support HF Transformers for session-based and sequential recommendation models.

  - Parameters:
    - transformer (TransformerBody) – The T4RecConfig, the pre-trained HF model, or the custom Keras layer TF*MainLayer that defines the specific transformer architecture.
    - masking – Needed when masking is applied on the inputs.
  - TRANSFORMER_TO_PREPARE: Dict[Type[transformers.modeling_tf_utils.TFPreTrainedModel], Type[transformers4rec.tf.block.transformer.TransformerPrepare]] = {}
  - transformer: transformers.modeling_tf_utils.TFPreTrainedModel
  - prepare_module: Optional[transformers4rec.tf.block.transformer.TransformerPrepare]
  - classmethod from_registry(transformer: str, d_model: int, n_head: int, n_layer: int, total_seq_length: int, masking: Optional[transformers4rec.tf.masking.MaskSequence] = None)
    Load the HF transformer architecture based on its name.
    - Parameters:
      - transformer (str) – Name of the Transformer to use. Possible values are: ["reformer", "gpt2", "longformer", "electra", "albert", "xlnet"]
      - d_model (int) – Size of the hidden states for Transformers
      - n_head (int) – Number of attention heads for Transformers
      - n_layer (int) – Number of layers for Transformers
      - total_seq_length (int) – The maximum sequence length
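    A minimal sketch of building a block from the registry; the architecture name comes from the list above and the hyperparameter values are illustrative:

        from transformers4rec.tf.block.transformer import TransformerBlock

        # Build an XLNet-based block from the registry by name.
        xlnet_block = TransformerBlock.from_registry(
            transformer="xlnet",
            d_model=64,
            n_head=4,
            n_layer=2,
            total_seq_length=20,
        )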