public static class Neurons.Softmax extends Neurons.Output
Nested classes inherited from class Neurons:
Neurons.ExpRectifier, Neurons.ExpRectifierDropout, Neurons.Input, Neurons.Linear, Neurons.Maxout, Neurons.MaxoutDropout, Neurons.Output, Neurons.Rectifier, Neurons.RectifierDropout, Neurons.Softmax, Neurons.Tanh, Neurons.TanhDropout
| Constructor and Description |
| --- |
| Neurons.Softmax(int units) |
| Modifier and Type | Method and Description |
| --- | --- |
| protected void | fprop(long seed, boolean training, int n): Forward propagation |
| protected void | setOutputLayerGradient(double target, int mb, int n): Part of backpropagation for classification. Updates every weight as w += -rate * dE/dw, computing dE/dw via the chain rule dE/dw = dE/dy * dy/dnet * dnet/dw, where net = sum(xi*wi) + b and y is the activation function. |
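To make the chain rule above concrete for a softmax output layer, a short derivation is sketched below. It assumes the loss E is the cross-entropy E = -log(y_t) for the true class t and that y is the softmax activation of this layer; the loss function is not stated on this page, so that part is an assumption.

```latex
% Assumed: y_i = e^{net_i} / \sum_k e^{net_k} (softmax), E = -\log y_t (cross-entropy).
\frac{\partial E}{\partial net_i}
  = \sum_j \frac{\partial E}{\partial y_j}\,\frac{\partial y_j}{\partial net_i}
  = y_i - \mathbf{1}[i = t]
\qquad\Longrightarrow\qquad
\frac{\partial E}{\partial w_{ij}}
  = \left(y_i - \mathbf{1}[i = t]\right) x_j,
\qquad
w_{ij} \leftarrow w_{ij} - \text{rate}\cdot\frac{\partial E}{\partial w_{ij}}
```

Under these assumptions, the product dE/dy * dy/dnet collapses to y_i - 1[i = t], which is the usual softmax-with-cross-entropy result.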
Methods inherited from class Neurons.Output:
bprop

Methods inherited from class Neurons:
autoEncoderGradient, bpropOutputLayer, init, momentum, momentum, rate, toString
protected void fprop(long seed, boolean training, int n)

Forward propagation

fprop in class Neurons
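As a rough illustration of what forward propagation means for a softmax layer, the plain-Java sketch below computes net_i = sum_j(x_j * w_ij) + b_i followed by a numerically stable softmax. The class, method name, and array-based signature are assumptions made for readability; they are not H2O's internal API, which works on the layer's internal storage and a mini-batch index.

```java
import java.util.Arrays;

// Illustrative sketch only, not H2O's implementation of Neurons.Softmax.fprop.
public class SoftmaxSketch {
  // net_i = sum_j(x_j * w[i][j]) + b[i]; y = softmax(net), using
  // max-subtraction so Math.exp does not overflow for large activations.
  static double[] softmaxForward(double[] x, double[][] w, double[] b) {
    int units = b.length;
    double[] net = new double[units];
    for (int i = 0; i < units; i++) {
      double s = b[i];
      for (int j = 0; j < x.length; j++) s += x[j] * w[i][j];
      net[i] = s;
    }
    double max = Double.NEGATIVE_INFINITY;
    for (double v : net) max = Math.max(max, v);
    double sum = 0;
    double[] y = new double[units];
    for (int i = 0; i < units; i++) { y[i] = Math.exp(net[i] - max); sum += y[i]; }
    for (int i = 0; i < units; i++) y[i] /= sum;
    return y;
  }

  public static void main(String[] args) {
    double[] x = {0.5, -1.0};
    double[][] w = {{0.1, 0.2}, {0.3, -0.4}, {-0.5, 0.6}};
    double[] b = {0.0, 0.1, -0.1};
    // Prints three class probabilities that sum to 1.
    System.out.println(Arrays.toString(softmaxForward(x, w, b)));
  }
}
```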
protected void setOutputLayerGradient(double target, int mb, int n)

Part of backpropagation for classification

setOutputLayerGradient in class Neurons

Parameters:
target - actual class label (integer)
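A minimal sketch of how an integer class label is typically turned into the output-layer error signal for a softmax classifier with cross-entropy loss (the loss is an assumption, and the names outputLayerGradient, activations, and errorSignal are illustrative rather than H2O's actual methods or fields):

```java
// Illustrative only: output-layer error for softmax + cross-entropy.
// "activations" holds the softmax outputs y_i for one training row,
// "errorSignal" receives dE/dnet_i, and "target" is the integer class label,
// matching the meaning documented for the target parameter above.
static void outputLayerGradient(double[] activations, double[] errorSignal, int target) {
  for (int i = 0; i < activations.length; i++) {
    double onehot = (i == target) ? 1.0 : 0.0;
    // For softmax with cross-entropy, dE/dy * dy/dnet collapses to y_i - onehot_i.
    errorSignal[i] = activations[i] - onehot;
  }
  // Backpropagation then applies w_ij += -rate * (dE/dnet_i) * x_j,
  // as described in the method summary above.
}
```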