The module module provides a modular architecture for building neural networks.
These modules are either Standalone modules, i.e. modules that do not have submodules, or Container modules, i.e. modules composed of one or more other modules. Container modules are typically used to create arbitrarily complex neural network architectures, while Standalone modules act as Linear regression layers or non-linear Activation layers.
See their respective documentation for more details about their use.
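The Standalone/Container split described above can be pictured as a small class hierarchy. This is an illustrative sketch only: the class names mirror the summary below, but the method names (e.g. forward) and constructor signatures are assumptions, not the library's actual API.

```python
# Hypothetical sketch of the module hierarchy; names beyond the class
# names in the summary table are illustrative assumptions.
class Module:
    """A part of a neural network architecture; may have parameters."""
    def forward(self, x):
        raise NotImplementedError

class Standalone(Module):
    """Computes its outputs without relying on other modules."""

class Container(Module):
    """Computes its outputs by delegating to one or more submodules."""
    def __init__(self, *modules):
        self.modules = list(modules)
```

A Container holds its submodules and combines their outputs; how it combines them is what distinguishes, e.g., Sequential from Concat.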
Module
    A Module is a part of a neural network architecture that may have parameters.
Standalone
    A Standalone module computes its outputs without relying on other modules, i.e. it has no submodules.
Linear
    A Linear module computes its outputs as a linear transformation of its inputs.
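The linear transformation a Linear module computes is y = Wx + b. A minimal pure-Python sketch, assuming a list-based representation (the function name and signature are illustrative, not the library's API):

```python
def linear_forward(W, b, x):
    """Compute y_i = sum_j W[i][j] * x[j] + b[i] (a linear transformation)."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

W = [[1.0, 0.0], [0.0, 2.0]]  # weight matrix
b = [0.5, -0.5]               # bias vector
print(linear_forward(W, b, [3.0, 4.0]))  # → [3.5, 7.5]
```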
Container
    A Container module computes its outputs by means of other modules, i.e. it is composed of one or more submodules.
Sequential
    A Sequential module computes its outputs sequentially, i.e. each submodule's output is fed as input to the next.
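The sequential composition can be sketched as a simple fold over the submodules; here each "module" is just a function, an illustrative stand-in for the library's module objects:

```python
def sequential_forward(modules, x):
    # Feed the output of each module into the next, in order.
    for module in modules:
        x = module(x)
    return x

def double(v):
    return [2 * t for t in v]

def shift(v):
    return [t + 1 for t in v]

print(sequential_forward([double, shift], [1.0, 2.0]))  # → [3.0, 5.0]
```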
Concat
    A Concat module computes its outputs in parallel, i.e. every submodule is applied to the same input and their outputs are concatenated.
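The parallel composition can be sketched in the same style; again, plain functions stand in for module objects, and the concatenation of submodule outputs is an assumption based on the module's name:

```python
def concat_forward(modules, x):
    # Apply every module to the same input and concatenate the outputs.
    out = []
    for module in modules:
        out.extend(module(x))
    return out

def double(v):
    return [2 * t for t in v]

def shift(v):
    return [t + 1 for t in v]

print(concat_forward([double, shift], [1.0, 2.0]))  # → [2.0, 4.0, 2.0, 3.0]
```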
Activation
    An Activation module computes its outputs without any parameters, applying a function f : ℝⁿ → ℝⁿ to its input vector.
Tanh
    A Tanh activation module computes its outputs with the non-linear element-wise hyperbolic tangent function, defined as tanh(x)ᵢ = (exp(xᵢ) − exp(−xᵢ)) / (exp(xᵢ) + exp(−xᵢ)), for i = 1, …, n, with x = [x₁, x₂, …, xₙ] ∈ ℝⁿ.
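The element-wise formula above translates directly into code; this sketch computes tanh from the exponentials exactly as defined (the function name is illustrative):

```python
import math

def tanh_forward(x):
    # Element-wise: (exp(x_i) - exp(-x_i)) / (exp(x_i) + exp(-x_i))
    return [(math.exp(t) - math.exp(-t)) / (math.exp(t) + math.exp(-t))
            for t in x]

print(tanh_forward([0.0, 1.0]))  # tanh(0) = 0.0, tanh(1) ≈ 0.7616
```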
Sigmoid
    A Sigmoid activation module computes its outputs with the non-linear element-wise sigmoid function, defined as σ(x)ᵢ = (1 + tanh(xᵢ ⁄ 2)) ⁄ 2 = 1 ⁄ (1 + exp(−xᵢ)), for i = 1, …, n, with x = [x₁, x₂, …, xₙ] ∈ ℝⁿ.
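A sketch of the sigmoid using the second form of the definition; the tanh identity from the first form can be used as a cross-check (function name illustrative):

```python
import math

def sigmoid_forward(x):
    # Element-wise: 1 / (1 + exp(-x_i)), equivalently (1 + tanh(x_i / 2)) / 2.
    return [1.0 / (1.0 + math.exp(-t)) for t in x]

print(sigmoid_forward([0.0]))  # → [0.5]
```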
Softmax
    A Softmax activation module computes its outputs with the non-linear softmax function, defined as softmax(x)ᵢ = exp(xᵢ) ⁄ ∑ⱼ₌₁ⁿ exp(xⱼ), for i = 1, …, n, with x = [x₁, x₂, …, xₙ] ∈ ℝⁿ.
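A sketch of the softmax formula; subtracting max(x) before exponentiating is a standard numerical-stability trick that leaves the result mathematically unchanged (the function name and the stability trick are not claimed to be part of this library's implementation):

```python
import math

def softmax_forward(x):
    # exp(x_i - max(x)) / sum_j exp(x_j - max(x)): same result as the
    # textbook formula, but avoids overflow for large inputs.
    m = max(x)
    exps = [math.exp(t - m) for t in x]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax_forward([0.0, 0.0]))  # → [0.5, 0.5]
```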
Generated by Epydoc 3.0.2 on Thu Aug 20 13:34:14 2015 (http://epydoc.sourceforge.net).