deepof.model_utils

Utility functions for both training autoencoder models in deepof.models and tuning hyperparameters with deepof.hypermodels.

Functions

cluster_frequencies_regularizer(soft_counts, k)

Compute the KL divergence between the cluster assignment distribution and a uniform prior across clusters.
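
A minimal sketch of the computation in TensorFlow, assuming soft_counts is a (batch, k) matrix of soft cluster assignments (the uniform_kl_regularizer name and exact signature are illustrative, not deepof's API):

    import tensorflow as tf

    def uniform_kl_regularizer(soft_counts, k):
        # Average the soft assignments over the batch to get empirical
        # cluster frequencies, then compute KL(freqs || uniform(1/k)):
        # KL = sum_i freqs_i * log(freqs_i * k)
        freqs = tf.reduce_mean(soft_counts, axis=0)  # shape: (k,)
        return tf.reduce_sum(freqs * tf.math.log(freqs * k + 1e-8))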

compute_kmeans_loss(latent_means[, weight, ...])

Add a penalty to the singular values of the Gram matrix of the latent means.
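
The penalty can be sketched as follows; the specific functional form applied to the singular values here (variance of the normalized spectrum) is an assumption for illustration only:

    import tensorflow as tf

    def kmeans_singular_value_penalty(latent_means, weight=1.0):
        # Gram matrix of the latent means: (batch, batch).
        gram = tf.matmul(latent_means, latent_means, transpose_b=True)
        # Singular values only; no singular vectors are needed.
        s = tf.linalg.svd(gram, compute_uv=False)
        # Illustrative penalty: discourage a degenerate, low-rank latent
        # space by penalizing an uneven singular-value spectrum.
        s = s / (tf.reduce_sum(s) + 1e-8)
        return weight * tf.math.reduce_variance(s)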

compute_shannon_entropy(tensor)

Compute Shannon entropy for a given tensor.
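
For reference, the textbook definition, assuming the input holds nonnegative weights that are normalized into a distribution first (deepof may instead expect pre-normalized probabilities):

    import tensorflow as tf

    def shannon_entropy(tensor):
        # H(p) = -sum_i p_i * log(p_i), with a small epsilon for stability.
        p = tensor / (tf.reduce_sum(tensor) + 1e-8)
        return -tf.reduce_sum(p * tf.math.log(p + 1e-8))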

create_look_ahead_mask(size)

Create a triangular matrix containing an increasing number of ones from left to right on each subsequent row.
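
This is the standard lower-triangular causal mask; a short TensorFlow sketch (note that some implementations use the complement, with ones marking the masked positions):

    import tensorflow as tf

    def look_ahead_mask(size):
        # Row i contains i + 1 leading ones, so each time step can attend
        # to itself and to earlier steps only.
        return tf.linalg.band_part(tf.ones((size, size)), -1, 0)

    look_ahead_mask(3)
    # [[1., 0., 0.],
    #  [1., 1., 0.],
    #  [1., 1., 1.]]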

create_masks(inp)

Given an input sequence, create all masks needed to pass it through the transformer architecture.

create_padding_mask(seq)

Create a padding mask, with zeros where data is missing, and ones where data is available.
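
A sketch under the assumption that missing time steps are zero-padded scalars; for sequences of feature vectors the comparison would be reduced over the feature axis:

    import tensorflow as tf

    def padding_mask(seq):
        # 1 where data is available, 0 where it is missing, matching the
        # convention described above.
        return tf.cast(tf.math.not_equal(seq, 0), tf.float32)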

dcl_loss(history, future, similarity[, ...])

Compute the DCL loss function, as described in the paper "Debiased Contrastive Learning" (reference implementation: https://github.com/chingyaoc/DCL/).

embedding_model_fitting(preprocessed_object, ...)

Train the specified embedding model on the preprocessed data.

embedding_per_video(coordinates, ...[, ...])

Use a previously trained model to produce embeddings, soft_counts and breaks per experiment in table_dict format.

fc_loss(history, future, similarity[, ...])

Compute the FC loss function, based on the contrastive objective described in the paper "Supervised Contrastive Learning" (https://arxiv.org/abs/2004.11362).

find_learning_rate(model, data[, epochs, ...])

Train the provided model for an epoch with an exponentially increasing learning rate.

get_angles(pos, i, d_model)

Auxiliary function for positional encoding computation.

get_callbacks(embedding_model, encoder_type)

Generate callbacks used for model training.

get_hard_counts(soft_counts)

Compute hard counts per cluster in a differentiable way.
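
One standard way to achieve this is the straight-through estimator, sketched below; whether deepof uses this exact trick is an assumption:

    import tensorflow as tf

    def hard_counts_straight_through(soft_counts):
        # soft_counts: (batch, k) soft cluster assignments.
        k = soft_counts.shape[-1]
        hard = tf.one_hot(tf.argmax(soft_counts, axis=-1), depth=k)
        # Forward pass uses the hard one-hot assignments; gradients flow
        # through the soft assignments instead.
        assignments = hard + soft_counts - tf.stop_gradient(soft_counts)
        return tf.reduce_sum(assignments, axis=0)  # counts per cluster, (k,)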

get_k_nearest_neighbors(tensor, k, index)

Retrieve indices of the k nearest neighbors in tensor to the vector with the specified index.
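
A brute-force sketch using pairwise squared Euclidean distances and tf.math.top_k (the function name is illustrative):

    import tensorflow as tf

    def k_nearest_neighbors(tensor, k, index):
        # Squared distances from the selected row to every row.
        distances = tf.reduce_sum(tf.square(tensor - tensor[index]), axis=1)
        # top_k on the negated distances returns the k + 1 smallest ones;
        # the extra slot absorbs the query itself (distance zero).
        _, indices = tf.math.top_k(-distances, k=k + 1)
        return tf.boolean_mask(indices, indices != index)[:k]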

get_recurrent_block(x, latent_dim, ...)

Build a recurrent embedding block, using a 1D convolution followed by two bidirectional GRU layers.
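
The block can be sketched with standard Keras layers; the filter count and kernel size below are placeholders, not deepof's actual hyperparameters:

    import tensorflow as tf

    def recurrent_block(x, latent_dim):
        # 1D convolution extracts local temporal features.
        x = tf.keras.layers.Conv1D(
            filters=64, kernel_size=5, padding="same", activation="relu"
        )(x)
        # The first bidirectional GRU returns full sequences so that the
        # second one has a sequence to consume.
        x = tf.keras.layers.Bidirectional(
            tf.keras.layers.GRU(latent_dim, return_sequences=True)
        )(x)
        x = tf.keras.layers.Bidirectional(tf.keras.layers.GRU(latent_dim))(x)
        return x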

hard_loss(history, future, similarity, ...)

Compute the Hard loss function, as described in the paper "Contrastive Learning with Hard Negative Samples" (https://arxiv.org/abs/2011.03343).

log_hyperparameters()

Log hyperparameters in TensorBoard.

nce_loss(history, future, similarity[, ...])

Compute the NCE loss function, as described in the paper "A Simple Framework for Contrastive Learning of Visual Representations" (https://arxiv.org/abs/2002.05709).
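
The core of the (Info)NCE objective for paired history/future embeddings can be sketched as follows, assuming both inputs are (batch, dim) matrices and row i of each forms a positive pair:

    import tensorflow as tf

    def nce_loss_sketch(history, future, temperature=0.1):
        # L2-normalize both views so dot products are cosine similarities.
        h = tf.math.l2_normalize(history, axis=1)
        f = tf.math.l2_normalize(future, axis=1)
        logits = tf.matmul(h, f, transpose_b=True) / temperature  # (batch, batch)
        # The i-th history embedding should match the i-th future embedding;
        # all other rows in the batch act as negatives.
        labels = tf.range(tf.shape(logits)[0])
        return tf.reduce_mean(
            tf.keras.losses.sparse_categorical_crossentropy(
                labels, logits, from_logits=True
            )
        )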

plot_lr_vs_loss(rates, losses)

Plot learning rate versus the loss function of the model.

positional_encoding(position, d_model)

Compute positional encodings, as in "Attention Is All You Need" (https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf).
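
The standard computation (also covering the get_angles helper listed above) follows the original paper; deepof's version may differ in shape conventions:

    import numpy as np
    import tensorflow as tf

    def get_angles(pos, i, d_model):
        # Each dimension pair shares a frequency of 1 / 10000^(2i / d_model).
        return pos / np.power(10000, (2 * (i // 2)) / np.float32(d_model))

    def positional_encoding(position, d_model):
        angles = get_angles(
            np.arange(position)[:, np.newaxis],
            np.arange(d_model)[np.newaxis, :],
            d_model,
        )
        angles[:, 0::2] = np.sin(angles[:, 0::2])  # sine on even indices
        angles[:, 1::2] = np.cos(angles[:, 1::2])  # cosine on odd indices
        return tf.cast(angles[np.newaxis, ...], tf.float32)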

select_contrastive_loss(history, future, ...)

Select and apply the contrastive loss function to be used in the contrastive embedding models.

tune_search(preprocessed_object, ...[, ...])

Define the search space using keras-tuner and Hyperband or Bayesian optimization.
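
A minimal keras-tuner sketch with a Hyperband search; the model and search space below are hypothetical stand-ins for deepof's hypermodels:

    import keras_tuner as kt
    import tensorflow as tf

    def build_model(hp):
        # Hypothetical search space: latent dimensionality and learning rate.
        latent_dim = hp.Int("latent_dim", min_value=4, max_value=16, step=4)
        model = tf.keras.Sequential(
            [tf.keras.layers.Dense(latent_dim, activation="relu"),
             tf.keras.layers.Dense(1)]
        )
        model.compile(
            optimizer=tf.keras.optimizers.Adam(
                hp.Float("lr", min_value=1e-4, max_value=1e-2, sampling="log")
            ),
            loss="mse",
        )
        return model

    tuner = kt.Hyperband(build_model, objective="val_loss", max_epochs=30)
    # tuner.search(x_train, y_train, validation_split=0.2)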

Classes

ClusterControl(*args, **kwargs)

Identity layer.

CustomStopper(start_epoch, *args, **kwargs)

Custom early stopping callback.
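
A common way to implement such a callback (an assumption about deepof's internals) is to subclass Keras' EarlyStopping and ignore a warm-up period:

    import tensorflow as tf

    class CustomStopper(tf.keras.callbacks.EarlyStopping):
        # Early stopping that only monitors epochs after a warm-up period.
        def __init__(self, start_epoch, *args, **kwargs):
            super().__init__(*args, **kwargs)
            self.start_epoch = start_epoch

        def on_epoch_end(self, epoch, logs=None):
            # Skip the noisy first epochs before tracking improvement.
            if epoch >= self.start_epoch:
                super().on_epoch_end(epoch, logs)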

ExponentialLearningRate(factor)

Simple class that grows the learning rate exponentially during training.
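
A sketch of the idea, in the spirit of the classic learning-rate range test used by find_learning_rate above (exact attribute names in deepof may differ):

    import tensorflow as tf

    class ExponentialLearningRate(tf.keras.callbacks.Callback):
        # Multiply the learning rate by a fixed factor after every batch,
        # recording (rate, loss) pairs for later inspection, e.g. with
        # plot_lr_vs_loss.
        def __init__(self, factor):
            super().__init__()
            self.factor = factor
            self.rates, self.losses = [], []

        def on_batch_end(self, batch, logs=None):
            lr = tf.keras.backend.get_value(self.model.optimizer.learning_rate)
            self.rates.append(lr)
            self.losses.append(logs["loss"])
            tf.keras.backend.set_value(
                self.model.optimizer.learning_rate, lr * self.factor
            )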

ProbabilisticDecoder(*args, **kwargs)

Map the reconstruction output of a given decoder to a multivariate normal distribution.
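
With TensorFlow Probability, such a head can be sketched in a few lines; pairing it with a negative log-likelihood loss turns reconstruction into maximum-likelihood training (illustrative, not deepof's exact architecture):

    import tensorflow as tf
    import tensorflow_probability as tfp

    def probabilistic_head(event_size):
        # Dense layer producing the distribution parameters, followed by a
        # layer that turns them into an independent normal over the outputs.
        return tf.keras.Sequential([
            tf.keras.layers.Dense(
                tfp.layers.IndependentNormal.params_size(event_size)
            ),
            tfp.layers.IndependentNormal(event_size),
        ])

    # Train by maximizing likelihood: loss = lambda y, rv: -rv.log_prob(y)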

TransformerDecoder(*args, **kwargs)

Transformer decoder.

TransformerDecoderLayer(*args, **kwargs)

Transformer decoder layer.

TransformerEncoder(*args, **kwargs)

Transformer encoder.

TransformerEncoderLayer(*args, **kwargs)

Transformer encoder layer.