Author: Awais Farooq

  • LearningRateScheduler

    Learning rate scheduler. At the beginning of every epoch, this callback gets the updated learning rate value from the schedule function provided at __init__, called with the current epoch and current learning rate, and applies the updated learning rate to the optimizer. A minimal sketch follows.
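
    A minimal sketch of the schedule-function contract, assuming the standalone keras package and a compiled model named model (not shown); the halve-every-10-epochs policy is an arbitrary choice for illustration:

    ```python
    import keras

    def scheduler(epoch, lr):
        # Called at the start of each epoch with the epoch index and the
        # optimizer's current learning rate; the return value is applied.
        if epoch > 0 and epoch % 10 == 0:
            return lr * 0.5  # halve the rate every 10 epochs (illustrative)
        return lr

    lr_callback = keras.callbacks.LearningRateScheduler(scheduler, verbose=1)
    # model.fit(x_train, y_train, epochs=30, callbacks=[lr_callback])
    ```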

  • EarlyStopping

    Stop training when a monitored metric has stopped improving. Assuming the goal of training is to minimize the loss, the metric to be monitored would be 'loss', and mode would be 'min'. A model.fit() training loop will check at the end of every epoch whether the loss is no longer decreasing, considering min_delta and patience if applicable. Once it's found…
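
    A sketch of the minimize-the-loss setup described above; the min_delta and patience values are arbitrary:

    ```python
    import keras

    early_stop = keras.callbacks.EarlyStopping(
        monitor="val_loss",         # metric to watch
        mode="min",                 # stop when it stops decreasing
        min_delta=1e-3,             # smaller changes don't count as improvement
        patience=5,                 # allow 5 epochs without improvement
        restore_best_weights=True,  # roll back to the best epoch seen
    )
    # model.fit(x_train, y_train, validation_split=0.2, epochs=100,
    #           callbacks=[early_stop])
    ```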

  • TensorBoard

    Enable visualizations for TensorBoard. TensorBoard is a visualization tool provided with TensorFlow, and a TensorFlow installation is required to use this callback. This callback logs events for TensorBoard. When used in model.evaluate() or regular validation, in addition to epoch summaries, there will be a summary that records evaluation metrics vs. model.optimizer.iterations. The metric names will be prepended with evaluation,…
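
    A minimal sketch, assuming logs are written under ./logs (an arbitrary path):

    ```python
    import keras

    tb = keras.callbacks.TensorBoard(
        log_dir="./logs",  # where event files are written
        histogram_freq=1,  # record weight histograms every epoch
    )
    # model.fit(x_train, y_train, validation_split=0.2, epochs=10, callbacks=[tb])
    # Then inspect the runs with: tensorboard --logdir ./logs
    ```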

  • BackupAndRestore

    Callback to back up and restore the training state. The BackupAndRestore callback is intended to recover training from an interruption in the middle of a Model.fit execution by backing up the training state in a temporary checkpoint file at the end of each epoch. Each backup overwrites the previously written checkpoint file, so at any given…
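
    A sketch of the recovery workflow, assuming /tmp/backup as an arbitrary backup location:

    ```python
    import keras

    backup = keras.callbacks.BackupAndRestore(backup_dir="/tmp/backup")
    # First run: may be interrupted partway through.
    # model.fit(x_train, y_train, epochs=20, callbacks=[backup])
    # Re-running the same fit() call with the same callback restores the
    # state from /tmp/backup and resumes after the last completed epoch.
    ```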

  • ModelCheckpoint

    Callback to save the Keras model or model weights at some frequency. The ModelCheckpoint callback is used in conjunction with training via model.fit() to save a model or its weights (in a checkpoint file) at some interval, so the model or weights can be loaded later to continue training from the saved state. A few options this callback provides…
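
    A sketch of the common keep-only-the-best configuration; the file path and monitored metric are illustrative choices:

    ```python
    import keras

    checkpoint = keras.callbacks.ModelCheckpoint(
        filepath="checkpoints/best.weights.h5",  # Keras 3 expects the .weights.h5
                                                 # suffix when saving weights only
        monitor="val_loss",
        mode="min",
        save_best_only=True,     # overwrite only when val_loss improves
        save_weights_only=True,  # save weights rather than the full model
    )
    # model.fit(x_train, y_train, validation_split=0.2, epochs=50,
    #           callbacks=[checkpoint])
    # Later: model.load_weights("checkpoints/best.weights.h5")
    ```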

  • Base Callback class

    Base class used to build new callbacks. Callbacks can be passed to Keras methods such as fit(), evaluate(), and predict() in order to hook into the various stages of the model training, evaluation, and inference lifecycle. To create a custom callback, subclass keras.callbacks.Callback and override the method associated with the stage of interest, as sketched below. If you want to use Callback objects in a…
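
    A minimal custom-callback sketch; LossLogger is a hypothetical name, and on_epoch_end is one of the stage hooks the base class exposes:

    ```python
    import keras

    class LossLogger(keras.callbacks.Callback):
        """Hypothetical callback that prints the training loss after each epoch."""

        def on_epoch_end(self, epoch, logs=None):
            loss = (logs or {}).get("loss")
            print(f"End of epoch {epoch}: loss = {loss}")

    # model.fit(x_train, y_train, epochs=5, callbacks=[LossLogger()])
    ```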

  • Layer weight constraints

    Usage of constraints. Classes from the keras.constraints module allow setting constraints (e.g. non-negativity) on model parameters during training. They are per-variable projection functions applied to the target variable after each gradient update (when using fit()). The exact API will depend on the layer, but the layers Dense, Conv1D, Conv2D, and Conv3D have a unified API. These layers expose two keyword arguments, kernel_constraint and bias_constraint. Available weight constraints…
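
    A sketch of the two keyword arguments on a Dense layer; the particular constraints are arbitrary examples:

    ```python
    import keras
    from keras import constraints, layers

    layer = layers.Dense(
        64,
        kernel_constraint=constraints.NonNeg(),            # keep weights non-negative
        bias_constraint=constraints.MaxNorm(max_value=2),  # cap the bias norm at 2
    )
    ```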

  • Layer weight regularizers

    Regularizers allow you to apply penalties on layer parameters or layer activity during optimization. These penalties are summed into the loss function that the network optimizes. Regularization penalties are applied on a per-layer basis. The exact API will depend on the layer, but many layers (e.g. Dense, Conv1D, Conv2D and Conv3D) have a unified API. These layers expose 3 keyword…
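
    A sketch of the unified regularizer API on a Dense layer; the penalty coefficients are arbitrary:

    ```python
    import keras
    from keras import layers, regularizers

    layer = layers.Dense(
        64,
        kernel_regularizer=regularizers.L2(1e-4),                  # penalty on the weights
        bias_regularizer=regularizers.L1(1e-5),                    # penalty on the bias
        activity_regularizer=regularizers.L1L2(l1=1e-5, l2=1e-4),  # penalty on the output
    )
    ```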

  • Layer weight initializers

    Usage of initializers. Initializers define the way to set the initial random weights of Keras layers. The keyword arguments used for passing initializers to layers depend on the layer; usually, they are simply kernel_initializer and bias_initializer. All built-in initializers can also be passed via their string identifier. Available initializers: the following built-in initializers are available as part of…
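
    A sketch of both spellings, object and string identifier; the chosen initializers are arbitrary:

    ```python
    import keras
    from keras import initializers, layers

    # Passing initializer objects...
    layer = layers.Dense(
        64,
        kernel_initializer=initializers.GlorotUniform(),
        bias_initializer=initializers.Zeros(),
    )

    # ...or the equivalent string identifiers.
    layer = layers.Dense(64, kernel_initializer="glorot_uniform",
                         bias_initializer="zeros")
    ```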

  • Layer activation functions

    Usage of activations. Activations can either be used through an Activation layer or through the activation argument supported by all forward layers; the two forms are equivalent, as sketched below. All built-in activations may also be passed via their string identifier. Available activations: the relu function applies the rectified linear unit activation function. With default values, this returns the standard ReLU activation, max(x, 0), the element-wise…
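
    A sketch of the three equivalent spellings mentioned above, plus relu applied directly to values:

    ```python
    import numpy as np
    import keras
    from keras import activations, layers

    # 1. The activation argument on a forward layer...
    dense = layers.Dense(64, activation=activations.relu)
    # 2. ...equivalent to a separate Activation layer...
    stacked = [layers.Dense(64), layers.Activation(activations.relu)]
    # 3. ...or the string identifier.
    by_name = layers.Dense(64, activation="relu")

    # relu is an ordinary function: max(x, 0) element-wise.
    print(activations.relu(np.array([-3.0, -1.0, 0.0, 2.0])))  # -> values [0. 0. 0. 2.]
    ```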