| Interface | Description |
|---|---|
| Neurons.Matrix | Abstract matrix interface |
| Neurons.Vector | Abstract vector interface |
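
These two interfaces abstract over the dense and sparse storage classes listed in the class table below (Neurons.DenseVector, Neurons.SparseVector, Neurons.DenseRowMatrix, and so on). As a rough illustration of why such a contract is useful, here is a minimal sketch of what abstract vector and matrix interfaces could look like; the method names are assumptions chosen for clarity, not the actual H2O signatures.

```java
// Illustrative sketch only -- not the actual H2O interfaces. The method
// names (get, set, add, size, rows, cols) are hypothetical and show how
// dense and sparse backends can share one contract.
interface Vector {
  float get(int i);          // read element i
  void  set(int i, float v); // overwrite element i
  void  add(int i, float v); // accumulate into element i
  int   size();              // number of elements
}

interface Matrix {
  float get(int row, int col);
  void  set(int row, int col, float v);
  void  add(int row, int col, float v);
  int   rows();
  int   cols();
}
```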
| Class | Description |
|---|---|
| DeepLearning | Deep Learning Neural Net implementation based on MRTask2 |
| DeepLearningModel | The Deep Learning model. It contains a DeepLearningModelInfo with the most up-to-date model, a scoring history, and some helpers to indicate progress. |
| DeepLearningModel.DeepLearningModelInfo | |
| DeepLearningModel.Errors | |
| DeepLearningTask | |
| DeepLearningTask2 | DRemoteTask-based Deep Learning |
| Dropout | Helper class for dropout training of Neural Nets |
| Neurons | This class implements the concept of a neuron layer in a Neural Network. During training, every MRTask2 F/J thread is expected to create these neurons for every map call (they are cheap to make). |
| Neurons.DenseColMatrix | Dense column matrix implementation |
| Neurons.DenseRowMatrix | Dense row matrix implementation |
| Neurons.DenseVector | Dense vector implementation |
| Neurons.Input | Input layer of the Neural Network. This layer differs from the others in that it has no incoming weights; instead, it takes its activation values from the training points. |
| Neurons.Linear | Output neurons for regression - linear units |
| Neurons.Maxout | Maxout neurons |
| Neurons.MaxoutDropout | Maxout neurons with dropout |
| Neurons.Output | Abstract class for output neurons |
| Neurons.Rectifier | Rectified linear unit (ReLU) neurons |
| Neurons.RectifierDropout | Rectified linear unit (ReLU) neurons with dropout |
| Neurons.Softmax | Output neurons for classification - Softmax |
| Neurons.SparseRowMatrix | Sparse row matrix implementation |
| Neurons.SparseVector | Sparse vector implementation |
| Neurons.Tanh | Tanh neurons - most common, most stable |
| Neurons.TanhDropout | Tanh neurons with dropout |
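
To make the hidden-layer classes more concrete, the sketch below walks through what a dense Tanh layer with dropout does during a training forward pass: multiply incoming activations by the weight matrix, add the bias, apply tanh, then randomly drop units. This is not the H2O Neurons.TanhDropout implementation; all names and the layer sizes are hypothetical and chosen only to mirror the concepts in the table above.

```java
import java.util.Random;

// Illustrative forward pass of a dense Tanh layer with dropout (training mode).
// Hypothetical sketch, not H2O code: weight matrix w, bias vector b, previous
// activations x, and a dropout mask applied to the layer's outputs.
public class TanhDropoutSketch {
  public static void main(String[] args) {
    Random rng = new Random(42);
    int in = 4, out = 3;
    double rate = 0.5;                       // probability of dropping a unit

    double[] x = {0.1, -0.4, 0.7, 0.2};      // activations from previous layer
    double[][] w = new double[out][in];      // incoming weights
    double[] b = new double[out];            // biases
    for (int o = 0; o < out; o++)
      for (int i = 0; i < in; i++)
        w[o][i] = rng.nextGaussian() * 0.1;  // small random init

    double[] a = new double[out];
    for (int o = 0; o < out; o++) {
      double sum = b[o];
      for (int i = 0; i < in; i++) sum += w[o][i] * x[i];
      a[o] = Math.tanh(sum);                 // Tanh activation
      // Dropout during training: zero the unit with probability `rate`,
      // scale the survivors so the expected activation stays unchanged.
      if (rng.nextDouble() < rate) a[o] = 0.0;
      else a[o] /= (1.0 - rate);
    }

    for (double v : a) System.out.printf("%.4f%n", v);
  }
}
```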
| Enum | Description |
|---|---|
| DeepLearning.Activation | Activation functions |
| DeepLearning.ClassSamplingMethod | |
| DeepLearning.InitialWeightDistribution | |
| DeepLearning.Loss | Loss functions; CrossEntropy is recommended |
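
The recommended CrossEntropy loss pairs naturally with the Softmax output neurons from the class table: softmax turns the output-layer sums into class probabilities, and cross-entropy penalizes the negative log probability of the true class. The small numeric sketch below shows that pairing; it is illustrative only, not H2O's implementation, and the max-subtraction is just the standard trick for numerical stability.

```java
// Softmax over raw output-layer sums followed by cross-entropy against a
// one-hot label. Hypothetical example values; not H2O code.
public class SoftmaxCrossEntropySketch {
  public static void main(String[] args) {
    double[] logits = {2.0, 0.5, -1.0};   // raw output-layer sums
    int label = 0;                         // index of the true class

    // softmax: p_k = exp(z_k - max z) / sum_j exp(z_j - max z)
    double max = Double.NEGATIVE_INFINITY;
    for (double z : logits) max = Math.max(max, z);
    double sum = 0.0;
    double[] p = new double[logits.length];
    for (int k = 0; k < logits.length; k++) {
      p[k] = Math.exp(logits[k] - max);
      sum += p[k];
    }
    for (int k = 0; k < p.length; k++) p[k] /= sum;

    // cross-entropy for a one-hot label: L = -log p_label
    double loss = -Math.log(p[label]);
    System.out.printf("probabilities: %.4f %.4f %.4f, loss: %.4f%n",
                       p[0], p[1], p[2], loss);
  }
}
```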