Uses of Interface org.nd4j.weightinit.WeightInitScheme
Packages that use WeightInitScheme:
- org.nd4j.autodiff.samediff
- org.nd4j.weightinit
- org.nd4j.weightinit.impl
Uses of WeightInitScheme in org.nd4j.autodiff.samediff
Methods in org.nd4j.autodiff.samediff with parameters of type WeightInitScheme:

- SDVariable SameDiff.var(@NonNull String name, @NonNull VariableType variableType, WeightInitScheme weightInitScheme, DataType dataType, long... shape)
  Variable initialization with a specified WeightInitScheme. This method creates a VARIABLE type SDVariable - i.e., it must be floating point, and is a trainable parameter.
- SDVariable SameDiff.var(@NonNull String name, @NonNull LongShapeDescriptor shape, WeightInitScheme weightInitScheme)
  Creates an SDVariable with the given shape and name. The underlying array will be initialized using the specified weight initialization scheme. This is a VARIABLE type SDVariable - i.e., it must be floating point, and is a trainable parameter.
- SDVariable SameDiff.var(@NonNull String name, @NonNull WeightInitScheme weightInitScheme, @lombok.NonNull long... shape)
  Variable initialization with a specified WeightInitScheme.
- SDVariable SameDiff.var(@NonNull String name, @NonNull WeightInitScheme weightInitScheme, @NonNull DataType dataType, @lombok.NonNull long... shape)
  Variable initialization with a specified WeightInitScheme. This method creates a VARIABLE type SDVariable - i.e., it must be floating point, and is a trainable parameter.
- SDVariable SameDiff.var(WeightInitScheme weightInitScheme, DataType dataType, long... shape)
  Creates an SDVariable with the specified shape and a generated name.
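As a brief illustration of the var(...) overloads above, the following sketch creates a trainable variable backed by a weight-init scheme. It assumes the ND4J/SameDiff dependency is on the classpath, and that XavierInitScheme exposes an (order, fanIn, fanOut) constructor as in recent ND4J versions; treat it as an indicative sketch, not authoritative API usage.

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.weightinit.impl.XavierInitScheme;

public class VarWithWeightInit {
    public static void main(String[] args) {
        SameDiff sd = SameDiff.create();

        // var(name, weightInitScheme, dataType, shape...): creates a trainable
        // VARIABLE-type SDVariable whose backing array is filled by the scheme.
        // 'c' is the array ordering; 784 and 10 serve as fanIn/fanOut here
        // (assumed constructor arguments for XavierInitScheme).
        SDVariable w = sd.var("w", new XavierInitScheme('c', 784, 10),
                DataType.FLOAT, 784, 10);

        System.out.println(w.name());
    }
}
```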
Uses of WeightInitScheme in org.nd4j.weightinit
Classes in org.nd4j.weightinit that implement WeightInitScheme:
- class BaseWeightInitScheme
Uses of WeightInitScheme in org.nd4j.weightinit.impl
Classes in org.nd4j.weightinit.impl that implement WeightInitScheme:

- class ConstantInitScheme - Initialize the weight to a constant value.
- class DistributionInitScheme - Initialize the weights based on a given Distribution.
- class IdentityInitScheme - Initialize the weight as an identity matrix.
- class LecunUniformInitScheme - Initialize the weight to: randn(shape) // N(0, 2/nIn)
- class NDArraySupplierInitScheme
- class OneInitScheme - Initialize the weight to one.
- class ReluInitScheme - Initialize the weight to: randn(shape) // N(0, 2/nIn)
- class ReluUniformInitScheme - Initialize the weight to: U(-sqrt(6/fanIn), sqrt(6/fanIn))
- class SigmoidUniformInitScheme - Initialize the weight to: range = 4.0 * Math.sqrt(6.0 / (fanIn + fanOut)); U(-range, range)
- class UniformInitScheme - Initialize the weight to: range = 1.0 / Math.sqrt(fanIn); U(-range, range)
- class VarScalingNormalFanAvgInitScheme - Initialize the weight to: U / sqrt((fanIn + fanOut) / 2)
- class VarScalingNormalFanInInitScheme - Initialize the weight to: U / fanIn
- class VarScalingNormalFanOutInitScheme - Initialize the weight to: U / sqrt(fanOut)
- class VarScalingNormalUniformFanInInitScheme - Initialize the weight to: range = 3.0 / Math.sqrt(fanIn); U(-range, range)
- class VarScalingNormalUniformFanOutInitScheme - Initialize the weight to: randn(shape) // N(0, 2/nIn)
- class VarScalingUniformFanAvgInitScheme - Initialize the weight to: range = 3.0 / Math.sqrt((fanIn + fanOut) / 2); U(-range, range)
- class XavierFanInInitScheme - Initialize the weight to: randn(shape) // N(0, 2/nIn)
- class XavierInitScheme - Initialize the weight to: range = U * sqrt(2.0 / (fanIn + fanOut))
- class XavierUniformInitScheme
- class ZeroInitScheme - Initialize the weight to zero.
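The fan-based range formulas listed above are plain arithmetic, so they can be checked outside ND4J. The following standalone sketch (class and method names are my own, not part of ND4J) computes the uniform sampling ranges for a few of the schemes; weights would then be drawn from U(-range, range).

```java
// Standalone illustration of the fan-based range formulas listed above.
// Class and method names are hypothetical, not ND4J API.
public class WeightInitRanges {

    // SigmoidUniformInitScheme: range = 4.0 * sqrt(6.0 / (fanIn + fanOut))
    static double sigmoidUniformRange(long fanIn, long fanOut) {
        return 4.0 * Math.sqrt(6.0 / (fanIn + fanOut));
    }

    // UniformInitScheme: range = 1.0 / sqrt(fanIn)
    static double uniformRange(long fanIn) {
        return 1.0 / Math.sqrt(fanIn);
    }

    // VarScalingUniformFanAvgInitScheme: range = 3.0 / sqrt((fanIn + fanOut) / 2)
    static double fanAvgUniformRange(long fanIn, long fanOut) {
        return 3.0 / Math.sqrt((fanIn + fanOut) / 2.0);
    }

    public static void main(String[] args) {
        // Example layer: 256 inputs, 128 outputs.
        System.out.println("sigmoid uniform: " + sigmoidUniformRange(256, 128)); // 0.5
        System.out.println("uniform:         " + uniformRange(256));             // 0.0625
        System.out.println("fan-avg uniform: " + fanAvgUniformRange(256, 128));
    }
}
```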