SparseOperationKit Optimizers

SOK provides several GPU-optimized optimizers.

Adam optimizer

class sparse_operation_kit.tf.keras.optimizers.adam.Adam(*args, **kwargs)

Abbreviated as sok.tf.keras.optimizers.Adam.

The unique and unsorted_segment_sum operations are replaced with GPU implementations.
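To make clear what is being accelerated, here is a minimal pure-Python sketch of the semantics of unsorted_segment_sum (a reference implementation for illustration only, not SOK's GPU kernel): it accumulates values that share a segment id, which is how gradients for duplicate embedding indices are combined into one update per unique index.

```python
def unsorted_segment_sum(data, segment_ids, num_segments):
    """Sum each value in data into the output slot given by its segment id."""
    out = [0.0] * num_segments
    for value, seg in zip(data, segment_ids):
        out[seg] += value
    return out

# Gradients 1.0 and 3.0 belong to index 0; 2.0 and 4.0 belong to index 1.
unsorted_segment_sum([1.0, 2.0, 3.0, 4.0], [0, 1, 0, 1], 2)  # → [4.0, 6.0]
```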

With TF versions <= 2.4, this optimizer can be used to obtain a performance gain; with TF versions >= 2.5, its performance should be identical to that of tf.keras.optimizers.Adam.

All arguments are identical to those of tf.keras.optimizers.Adam; see https://tensorflow.google.cn/api_docs/python/tf/keras/optimizers/Adam for documentation.
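Since the hyperparameters mirror tf.keras.optimizers.Adam, the per-step update this optimizer computes can be sketched in plain Python (a minimal reference implementation of the standard Adam update rule for illustration; default values below match the Keras defaults, and this is not SOK's actual GPU code):

```python
import math

def adam_step(theta, g, m, v, t,
              lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-7):
    """Apply one Adam update to parameter theta given gradient g at step t."""
    m = beta_1 * m + (1.0 - beta_1) * g        # first-moment (mean) estimate
    v = beta_2 * v + (1.0 - beta_2) * g * g    # second-moment (variance) estimate
    m_hat = m / (1.0 - beta_1 ** t)            # bias-corrected first moment
    v_hat = v / (1.0 - beta_2 ** t)            # bias-corrected second moment
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + epsilon)
    return theta, m, v

# Three steps with a constant gradient of 1.0: theta moves by about lr per step.
theta, m, v = 0.0, 0.0, 0.0
for t in range(1, 4):
    theta, m, v = adam_step(theta, 1.0, m, v, t)
```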

Local update Adam optimizer