deepof.model_utils module

Utility functions for both training autoencoder models in deepof.models and tuning hyperparameters with deepof.hypermodels.

deepof.model_utils.cluster_frequencies_regularizer(...)

Compute the KL divergence between the cluster assignment distribution and a uniform prior across clusters.
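As a rough illustration only (the deepof implementation operates on TensorFlow tensors inside the model graph), the divergence between the mean cluster-assignment distribution and a uniform prior can be sketched in NumPy; the function name `uniform_kl_sketch` is illustrative, not part of the deepof API:

```python
import numpy as np

def uniform_kl_sketch(soft_counts: np.ndarray) -> float:
    """KL(q || uniform), where q is the mean cluster-assignment
    distribution across a batch of soft counts (rows sum to 1)."""
    q = soft_counts.mean(axis=0)          # average assignment per cluster
    k = q.shape[0]
    p = np.full(k, 1.0 / k)               # uniform prior over clusters
    eps = 1e-12                           # numerical stability
    return float(np.sum(q * (np.log(q + eps) - np.log(p))))

# A perfectly uniform assignment incurs (numerically) zero penalty,
# while collapsing onto a subset of clusters is penalized.
print(uniform_kl_sketch(np.full((4, 5), 0.2)))
print(uniform_kl_sketch(np.eye(5)[:4]))
```

The regularizer therefore pushes the model toward using all clusters rather than collapsing onto a few.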

deepof.model_utils.compute_kmeans_loss(...)

Add a penalty to the singular values of the Gram matrix of the latent means.
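One way to realize such a penalty, sketched here in NumPy under the assumption that the penalty measures the spread of the normalized singular values (the exact functional form in deepof may differ):

```python
import numpy as np

def kmeans_loss_sketch(latent_means: np.ndarray) -> float:
    """Penalize degenerate latent spaces via the singular values of the
    Gram matrix of a (batch, latent_dim) matrix of latent means."""
    gram = latent_means @ latent_means.T
    s = np.linalg.svd(gram, compute_uv=False)
    s = s / (s.sum() + 1e-12)      # normalize the singular-value spectrum
    return float(np.var(s))        # assumed penalty: spectrum spread

# Well-spread latents incur no penalty; rank-deficient latents do.
print(kmeans_loss_sketch(np.eye(3)))
print(kmeans_loss_sketch(np.ones((3, 3))))
```

Intuitively, a collapsed (low-rank) latent space concentrates the spectrum on a few singular values, which this kind of penalty discourages.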

deepof.model_utils.compute_shannon_entropy(tensor)

Compute Shannon entropy for a given tensor.
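A minimal NumPy sketch of the underlying computation (the deepof version works on TensorFlow tensors; `shannon_entropy_sketch` is an illustrative name):

```python
import numpy as np

def shannon_entropy_sketch(p: np.ndarray) -> float:
    """Shannon entropy (in nats) of a probability vector."""
    p = p[p > 0]                   # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log(p)))

print(shannon_entropy_sketch(np.array([0.5, 0.5])))  # → log(2) ≈ 0.693
print(shannon_entropy_sketch(np.array([1.0, 0.0])))  # → 0.0
```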

deepof.model_utils.create_look_ahead_mask(size)

Create a lower-triangular mask in which each subsequent row contains one more one, from left to right, than the previous.
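The shape of such a mask is easy to see in NumPy (conventions for whether ones mean "keep" or "mask" vary between implementations, so treat this as a sketch):

```python
import numpy as np

def look_ahead_mask_sketch(size: int) -> np.ndarray:
    """Lower-triangular mask: row i has ones in columns 0..i, so
    position i may attend only to positions up to and including i."""
    return np.tril(np.ones((size, size)))

print(look_ahead_mask_sketch(3))
# [[1. 0. 0.]
#  [1. 1. 0.]
#  [1. 1. 1.]]
```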

deepof.model_utils.create_masks(inp)

Create all masks necessary to pass an input sequence through the transformer architecture.

deepof.model_utils.create_padding_mask(seq)

Create a padding mask, with zeros where data is missing, and ones where data is available.
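A NumPy sketch of the idea, assuming (as is common) that missing data is encoded as zeros in the input sequence:

```python
import numpy as np

def padding_mask_sketch(seq: np.ndarray) -> np.ndarray:
    """Ones where data is available (non-zero), zeros where missing."""
    return (seq != 0).astype(float)

print(padding_mask_sketch(np.array([5, 3, 0, 0])))  # → [1. 1. 0. 0.]
```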

deepof.model_utils.dcl_loss(history, future, ...)

Compute the DCL loss function, as described in the paper "Debiased Contrastive Learning" (https://github.com/chingyaoc/DCL/).

deepof.model_utils.embedding_model_fitting(...)

Train the specified embedding model on the preprocessed data.

deepof.model_utils.embedding_per_video(...)

Use a previously trained model to produce embeddings, soft_counts and breaks per experiment in table_dict format.

deepof.model_utils.fc_loss(history, future, ...)

Compute the FC loss function, as described in the paper "Fully-Contrastive Learning of Visual Representations" (https://arxiv.org/abs/2004.11362).

deepof.model_utils.find_learning_rate(model, ...)

Train the provided model for an epoch with an exponentially increasing learning rate.
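The schedule behind such a learning-rate range test can be sketched as follows; `lr_range_sketch` is an illustrative helper, not the deepof API, which grows the rate via the ExponentialLearningRate callback during actual training:

```python
import numpy as np

def lr_range_sketch(min_lr: float, max_lr: float, num_steps: int) -> np.ndarray:
    """Exponentially spaced learning rates from min_lr to max_lr,
    one per training step, as used in a learning-rate range test."""
    factor = (max_lr / min_lr) ** (1.0 / (num_steps - 1))
    return min_lr * factor ** np.arange(num_steps)

rates = lr_range_sketch(1e-5, 1.0, 6)
print(rates)
```

Plotting the loss recorded at each of these rates (see plot_lr_vs_loss below) is the usual way to pick a good learning rate.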

deepof.model_utils.get_angles(pos, i, d_model)

Auxiliary function for positional encoding computation.

deepof.model_utils.get_callbacks(...[, ...])

Generate callbacks used for model training.

deepof.model_utils.get_hard_counts(soft_counts)

Compute hard counts per cluster in a differentiable way.
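For intuition, here are the non-differentiable hard counts alongside one common differentiable surrogate (summing soft probabilities per cluster); this is a sketch, not deepof's exact relaxation:

```python
import numpy as np

def hard_counts_sketch(soft_counts: np.ndarray) -> np.ndarray:
    """Non-differentiable reference: samples per cluster under argmax."""
    k = soft_counts.shape[1]
    return np.bincount(soft_counts.argmax(axis=1), minlength=k)

def soft_surrogate(soft_counts: np.ndarray) -> np.ndarray:
    """Differentiable surrogate: sum soft probabilities per cluster."""
    return soft_counts.sum(axis=0)

soft = np.array([[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]])
print(hard_counts_sketch(soft))  # → [2 1]
print(soft_surrogate(soft))      # → [1.7 1.3]
```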

deepof.model_utils.get_k_nearest_neighbors(...)

Retrieve indices of the k nearest neighbors in tensor to the vector with the specified index.
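A minimal NumPy sketch of the retrieval, assuming Euclidean distance and excluding the query point itself (`k_nearest_neighbors_sketch` is an illustrative name):

```python
import numpy as np

def k_nearest_neighbors_sketch(tensor: np.ndarray, k: int, index: int) -> np.ndarray:
    """Indices of the k rows of `tensor` closest to the row at `index`."""
    dists = np.linalg.norm(tensor - tensor[index], axis=1)
    dists[index] = np.inf                  # exclude the query point
    return np.argsort(dists)[:k]

points = np.array([[0.0, 0.0], [1.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
print(k_nearest_neighbors_sketch(points, 2, 0))  # → [2 1]
```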

deepof.model_utils.get_recurrent_block(x, ...)

Build a recurrent embedding block, using a 1D convolution followed by two bidirectional GRU layers.

deepof.model_utils.hard_loss(history, ...[, ...])

Compute the Hard loss function, as described in the paper "Contrastive Learning with Hard Negative Samples" (https://arxiv.org/abs/2011.03343).

deepof.model_utils.log_hyperparameters()

Log hyperparameters in TensorBoard.

deepof.model_utils.nce_loss(history, future, ...)

Compute the NCE loss function, as described in the paper "A Simple Framework for Contrastive Learning of Visual Representations" (https://arxiv.org/abs/2002.05709).

deepof.model_utils.plot_lr_vs_loss(rates, losses)

Plot learning rate versus the loss function of the model.

deepof.model_utils.positional_encoding(...)

Compute positional encodings, as in https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf.
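The standard sinusoidal scheme from that paper (sines on even dimensions, cosines on odd ones) can be sketched in NumPy; this also illustrates the angle computation that get_angles performs:

```python
import numpy as np

def positional_encoding_sketch(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encodings from 'Attention Is All You Need'."""
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model)[None, :]            # (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angles[:, 0::2])     # even dimensions
    enc[:, 1::2] = np.cos(angles[:, 1::2])     # odd dimensions
    return enc

pe = positional_encoding_sketch(4, 8)
print(pe.shape)  # → (4, 8)
```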

deepof.model_utils.select_contrastive_loss(...)

Select and apply the contrastive loss function to be used in the contrastive embedding models.

deepof.model_utils.tune_search(...[, ...])

Define the search space using keras-tuner with Hyperband or Bayesian optimization.

deepof.model_utils.ClusterControl(*args, ...)

Identity layer.

deepof.model_utils.CustomStopper(...)

Custom early stopping callback.

deepof.model_utils.ExponentialLearningRate(factor)

Callback that grows the learning rate exponentially during training.

deepof.model_utils.ProbabilisticDecoder(...)

Map the reconstruction output of a given decoder to a multivariate normal distribution.

deepof.model_utils.TransformerDecoder(*args, ...)

Transformer decoder.

deepof.model_utils.TransformerDecoderLayer(...)

Transformer decoder layer.

deepof.model_utils.TransformerEncoder(*args, ...)

Transformer encoder.

deepof.model_utils.TransformerEncoderLayer(...)

Transformer encoder layer.