Package org.nd4j.linalg.learning.config
Class AdaGrad
- java.lang.Object
  - org.nd4j.linalg.learning.config.AdaGrad
- All Implemented Interfaces:
  Serializable, Cloneable, IUpdater
public class AdaGrad extends Object implements IUpdater
- See Also:
  - Serialized Form
Nested Class Summary
Nested Classes:
- static class AdaGrad.Builder
-
Field Summary
Fields:
- static double DEFAULT_ADAGRAD_EPSILON
- static double DEFAULT_ADAGRAD_LEARNING_RATE
-
Method Summary
All Methods (Instance Methods, Concrete Methods):
- AdaGrad clone()
  Clone the updater.
- double getLearningRate(int iteration, int epoch)
  Get the learning rate - if any - for the updater, at the specified iteration and epoch.
- boolean hasLearningRate()
- GradientUpdater instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays)
- GradientUpdater instantiate(INDArray viewArray, boolean initializeViewArray)
  Create a new gradient updater.
- void setLrAndSchedule(double lr, ISchedule lrSchedule)
  Set the learning rate and schedule.
- long stateSize(long numParams)
  Determine the updater state size for the given number of parameters.
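The values this class configures drive the AdaGrad rule: each parameter keeps a running sum of its squared gradients, and the effective step size shrinks as that sum grows. A minimal plain-Java sketch of one update step, independent of ND4J (all names and values here are illustrative, not the library's internals):

```java
public class AdaGradSketch {
    // Apply one AdaGrad step in place: accumulate squared gradients into
    // `state`, then scale the raw gradient by lr / (sqrt(state) + eps).
    static double[] step(double[] params, double[] state, double[] grad,
                         double lr, double eps) {
        for (int i = 0; i < params.length; i++) {
            state[i] += grad[i] * grad[i];
            params[i] -= lr * grad[i] / (Math.sqrt(state[i]) + eps);
        }
        return params;
    }

    public static void main(String[] args) {
        double[] params = {0.5, -0.3};
        double[] state  = new double[params.length]; // accumulated squared gradients
        double[] grad   = {0.2, -0.1};
        step(params, state, grad, 0.1, 1e-6);
        System.out.println(params[0] + " " + params[1]);
    }
}
```

Note how the per-parameter accumulator means frequently-updated parameters get progressively smaller steps, which is the behavior the epsilon and learning-rate fields above parameterize.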
-
-
-
Field Detail
-
DEFAULT_ADAGRAD_LEARNING_RATE
public static final double DEFAULT_ADAGRAD_LEARNING_RATE
- See Also:
  - Constant Field Values
-
DEFAULT_ADAGRAD_EPSILON
public static final double DEFAULT_ADAGRAD_EPSILON
- See Also:
  - Constant Field Values
-
-
Method Detail
-
stateSize
public long stateSize(long numParams)
Description copied from interface: IUpdater
Determine the updater state size for the given number of parameters. Usually an integer multiple (0, 1, or 2) of the number of parameters in a layer.
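For AdaGrad itself, the state is one accumulated squared-gradient value per parameter, so (assuming that layout) the multiple is 1 and the state size equals the parameter count. A trivial sketch of that relationship (not the ND4J source):

```java
public class StateSizeSketch {
    // AdaGrad keeps one historical-gradient accumulator per parameter,
    // so its updater state size equals the parameter count (multiple = 1).
    // By contrast, a stateless updater (e.g. plain SGD) would return 0,
    // and one with two accumulators per parameter would return 2 * numParams.
    static long adaGradStateSize(long numParams) {
        return numParams;
    }

    public static void main(String[] args) {
        System.out.println(adaGradStateSize(1000)); // one accumulator per parameter
    }
}
```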
-
instantiate
public GradientUpdater instantiate(INDArray viewArray, boolean initializeViewArray)
Description copied from interface: IUpdater
Create a new gradient updater.
- Specified by:
  instantiate in interface IUpdater
- Parameters:
  viewArray - The updater state view array
  initializeViewArray - If true: initialise the updater state
- Returns:
-
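The viewArray parameter reflects ND4J's flattened-state layout: the state for all of a network's updaters lives in one flat buffer, and each updater operates on its own segment ("view") of it. A rough plain-Java sketch of the idea; the offsets and the helper method are hypothetical, not ND4J API:

```java
import java.util.Arrays;

public class ViewArraySketch {
    // Zero this updater's segment of the shared flat state buffer,
    // mimicking what initializeViewArray == true asks for: a fresh
    // (all-zero) set of AdaGrad accumulators for one layer.
    static double[] initSegment(double[] flatState, int offset, int length) {
        Arrays.fill(flatState, offset, offset + length, 0.0);
        return flatState;
    }

    public static void main(String[] args) {
        double[] flatState = new double[10];
        Arrays.fill(flatState, 7.0);   // pretend stale values from elsewhere
        initSegment(flatState, 4, 3);  // this updater's view: offset 4, length 3
        System.out.println(flatState[3] + " " + flatState[4]);
    }
}
```

Keeping all updater state in one contiguous buffer is a common design for cheap serialization and copying; each updater only ever touches its own segment.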
instantiate
public GradientUpdater instantiate(Map<String,INDArray> updaterState, boolean initializeStateArrays)
- Specified by:
  instantiate in interface IUpdater
-
getLearningRate
public double getLearningRate(int iteration, int epoch)
Description copied from interface: IUpdater
Get the learning rate - if any - for the updater, at the specified iteration and epoch. Note that if no learning rate is applicable (AdaDelta, NoOp updaters etc.) then Double.NaN should be returned.
- Specified by:
  getLearningRate in interface IUpdater
- Parameters:
  iteration - Iteration at which to get the learning rate
  epoch - Epoch at which to get the learning rate
- Returns:
  Learning rate, or Double.NaN if no learning rate is applicable for this updater
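On the caller side, the Double.NaN convention is handled with Double.isNaN, since NaN never compares equal to anything, including itself. A small illustrative sketch (not part of the ND4J API):

```java
public class LrConvention {
    // Mirrors the IUpdater convention: Double.NaN signals
    // "no learning rate applicable" (e.g. AdaDelta, NoOp).
    // Note: lr == Double.NaN is always false; Double.isNaN is required.
    static String describe(double lr) {
        return Double.isNaN(lr) ? "no learning rate" : "lr=" + lr;
    }

    public static void main(String[] args) {
        System.out.println(describe(0.01));
        System.out.println(describe(Double.NaN));
    }
}
```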
-
hasLearningRate
public boolean hasLearningRate()
- Specified by:
  hasLearningRate in interface IUpdater
- Returns:
  True if the updater has a learning rate hyperparameter, false otherwise
-
setLrAndSchedule
public void setLrAndSchedule(double lr, ISchedule lrSchedule)
Description copied from interface: IUpdater
Set the learning rate and schedule. Note: may throw an exception if IUpdater.hasLearningRate() returns false.
- Specified by:
  setLrAndSchedule in interface IUpdater
- Parameters:
  lr - Learning rate to set (typically not used if LR schedule is non-null)
  lrSchedule - Learning rate schedule to set (may be null)
-
-