Apache Singa
A General Distributed Deep Learning Library
Apply constraints to parameters (gradients).
#include <optimizer.h>
Public Member Functions

Constraint(const ConstraintConf &conf)
Constraint(const string &type, float threshold)

void Setup(const ConstraintConf &conf)
void Setup(const string &conf_str)

void Apply(int epoch, const Tensor &value, Tensor &grad, int step = -1)
    Apply the constraint to a single parameter object, e.g., W or b; for example, clip a gradient if it is too large w.r.t. the threshold. See https://www.reddit.com/r/MachineLearning/comments/31b6x8/gradient_clipping_rnns/.

void Apply(int epoch, const vector<Tensor> &values, const vector<Tensor> &grads, int step = -1)
    Apply the constraint to multiple parameter objects together.
Apply constraints to parameters (gradients).

For example, restrict the norm of parameter gradients to stay within a threshold. See http://keras.io/constraints/. TODO(wangwei): implement a sub-class for each type of constraint.
void singa::Constraint::Apply(int epoch, const vector<Tensor> &values, const vector<Tensor> &grads, int step = -1)
Apply the constraint to multiple parameter objects together.

See https://github.com/Lasagne/Lasagne/blob/master/lasagne/updates.py for reference implementations.
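One natural reading of the multi-parameter overload, in the spirit of Lasagne's `total_norm_constraint`, is clipping by the *global* norm: compute the L2 norm over all gradients jointly and scale every gradient by the same factor. The sketch below again uses `std::vector<float>` as a stand-in for SINGA's `Tensor`, and `ApplyGlobalNormConstraint` is an assumed name for illustration:

```cpp
#include <cmath>
#include <vector>

// Hypothetical stand-in for singa::Tensor.
using Grad = std::vector<float>;

// Clip a group of gradients by their global L2 norm: if
// sqrt(sum_i ||g_i||^2) exceeds `threshold`, every gradient is
// multiplied by the same factor, preserving their relative directions.
void ApplyGlobalNormConstraint(float threshold, std::vector<Grad>& grads) {
  float sq = 0.0f;
  for (const Grad& g : grads)
    for (float v : g) sq += v * v;
  float total = std::sqrt(sq);
  if (total > threshold) {
    float scale = threshold / total;
    for (Grad& g : grads)
      for (float& v : g) v *= scale;
  }
}
```

Scaling all parameters together, rather than clipping each one independently, keeps the overall update direction intact, which is why frameworks commonly offer both per-parameter and global-norm variants.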