deepof.model_utils module
Utility functions for both training autoencoder models in deepof.models and tuning hyperparameters with deepof.hypermodels.

Compute the KL divergence between the cluster assignment distribution and a uniform prior across clusters.

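A minimal numpy sketch of this divergence (the function name and batch layout are my own, not the library's API): with a uniform prior over K clusters, KL(q ∥ U) reduces to Σₖ qₖ log(qₖ · K).

```python
import numpy as np

def kl_to_uniform(q, eps=1e-8):
    """KL(q || U) for a batch of cluster assignment distributions.

    q: array of shape (batch, n_clusters); rows sum to 1.
    With a uniform prior U_k = 1/K, KL(q || U) = sum_k q_k * log(q_k * K).
    """
    q = np.clip(q, eps, 1.0)  # avoid log(0)
    k = q.shape[-1]
    return np.sum(q * np.log(q * k), axis=-1)

# A uniform assignment has zero divergence from the uniform prior:
print(kl_to_uniform(np.array([[0.25, 0.25, 0.25, 0.25]])))  # → [0.]
```

The divergence grows as assignments concentrate on a single cluster, which is why it is useful as a regularizer against cluster collapse.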
Add a penalty to the singular values of the Gram matrix of the latent means.

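One plausible form of such a penalty, sketched in numpy (the exact functional form used by the library may differ; the function name is my own): penalizing the variance of the Gram matrix's singular values is zero when cluster means are mutually orthogonal with equal norms, and grows as they become collinear.

```python
import numpy as np

def gram_singular_value_penalty(means):
    """Illustrative penalty on the singular values of the Gram matrix.

    means: (n_clusters, latent_dim). G = M M^T; its singular values
    describe how spread out the cluster means are. A variance penalty
    discourages redundant (collinear) cluster means.
    """
    gram = means @ means.T
    s = np.linalg.svd(gram, compute_uv=False)
    return np.var(s)
```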
Compute Shannon entropy for a given tensor.

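The computation, sketched in numpy (function name is my own):

```python
import numpy as np

def shannon_entropy(p, eps=1e-8):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
    p = np.clip(p, eps, 1.0)  # avoid log(0)
    return -np.sum(p * np.log(p), axis=-1)

# The uniform distribution over 4 outcomes attains the maximum, log(4):
shannon_entropy(np.ones(4) / 4)
```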
Create a triangular matrix containing an increasing amount of ones from left to right on each subsequent row.

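This is the standard causal (look-ahead) mask pattern; a one-line numpy sketch (function name is my own):

```python
import numpy as np

def look_ahead_ones(n):
    """n x n lower-triangular matrix of ones.

    Row i has ones in columns 0..i, so each subsequent row gains one
    more 1 from the left. Used as a causal mask in transformer attention,
    where position i may only attend to positions 0..i.
    """
    return np.tril(np.ones((n, n)))
```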
Given an input sequence, create all masks needed to pass it through the transformer architecture.

Create a padding mask, with zeros where data is missing, and ones where data is available.

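A numpy sketch of one common convention (assumed here, not necessarily the library's: missing timesteps encoded as all-zero feature vectors):

```python
import numpy as np

def padding_mask(seq):
    """1.0 where a timestep carries data, 0.0 where it is padding.

    seq: (batch, time, features). A timestep is treated as missing when
    its feature vector is all zeros (an assumption of this sketch).
    """
    return (np.abs(seq).sum(axis=-1) != 0).astype(np.float32)
```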
|
Compute the DCL loss function, as described in the paper "Debiased Contrastive Learning" (https://github.com/chingyaoc/DCL/). |
Train the specified embedding model on the preprocessed data.
|
Use a previously trained model to produce embeddings, soft_counts and breaks per experiment in table_dict format. |
|
|
Compute the FC loss function, as described in the paper "Supervised Contrastive Learning" (https://arxiv.org/abs/2004.11362).
|
Train the provided model for an epoch with an exponentially increasing learning rate. |
|
Auxiliary function for positional encoding computation. |
|
Generate callbacks used for model training. |
|
Compute hard counts per cluster in a differentiable way. |
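The forward computation can be sketched in numpy as counting argmax assignments (function name is my own). In a differentiable setting such as TensorFlow, gradients would flow through the soft assignments via a straight-through estimator, i.e. `hard + soft - stop_gradient(soft)`; this sketch shows the forward pass only.

```python
import numpy as np

def hard_counts(soft_counts):
    """Per-cluster hard counts from soft assignments.

    soft_counts: (n_samples, n_clusters) soft assignment matrix.
    Returns a length-n_clusters vector of argmax assignment counts.
    """
    k = soft_counts.shape[1]
    one_hot = np.eye(k)[np.argmax(soft_counts, axis=1)]  # hard assignments
    return one_hot.sum(axis=0)
```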
Retrieve indices of the k nearest neighbors in tensor to the vector with the specified index. |
|
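A numpy sketch of the neighbor lookup (function name is my own; the query point excludes itself):

```python
import numpy as np

def knn_indices(points, index, k):
    """Indices of the k nearest Euclidean neighbors of points[index]."""
    dists = np.linalg.norm(points - points[index], axis=1)
    dists[index] = np.inf  # exclude the query point itself
    return np.argsort(dists)[:k]
```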
Build a recurrent embedding block, using a 1D convolution followed by two bidirectional GRU layers. |
|
|
Compute the Hard loss function, as described in the paper "Contrastive Learning with Hard Negative Samples" (https://arxiv.org/abs/2011.03343). |
Log hyperparameters in tensorboard. |
|
|
Compute the NCE loss function, as described in the paper "A Simple Framework for Contrastive Learning of Visual Representations" (https://arxiv.org/abs/2002.05709). |
|
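A reduced numpy sketch of the InfoNCE objective behind this loss (the full SimCLR version uses all 2n samples as negatives; this simplified form, with my own function name, contrasts each row of one view against all rows of the other):

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """Simplified InfoNCE: positives are matching rows of the two views.

    z1, z2: (n, d) embeddings of two augmented views of the same batch.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)  # cosine similarity
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature              # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positive pairs sit on the diagonal; maximize their log-probability.
    return -np.mean(np.diag(log_probs))
```

When the two views embed identically and the rows are well separated, the loss approaches zero.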
Plot learning rate versus the loss function of the model. |
Compute positional encodings, as in https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf. |
|
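The sinusoidal encodings from that paper ("Attention Is All You Need") can be sketched as:

```python
import numpy as np

def positional_encoding(length, depth):
    """Sinusoidal positional encodings.

    PE[pos, 2i]   = sin(pos / 10000^(2i/depth))
    PE[pos, 2i+1] = cos(pos / 10000^(2i/depth))
    """
    pos = np.arange(length)[:, None]
    i = np.arange(depth)[None, :]
    # Each even/odd dimension pair shares one angular frequency.
    angle = pos / np.power(10000.0, (2 * (i // 2)) / depth)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))
```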
Select and apply the contrastive loss function used in the contrastive embedding models.
|
|
Define the search space using keras-tuner, with Hyperband or Bayesian optimization.
|
Identity layer. |
Custom early stopping callback. |
|
Simple class that allows the learning rate to grow exponentially during training.
|
Map the reconstruction output of a given decoder to a multivariate normal distribution. |
|
|
Transformer decoder. |
Transformer decoder layer. |
|
|
Transformer encoder. |
Transformer encoder layer. |