IActivationFunction Interface
This interface allows various activation functions to be used with the neural network. Activation functions are applied to the output from each layer of a neural network and scale that output into the desired range. Methods are provided to compute both the activation function and its derivative. Some training algorithms, particularly backpropagation, require that the derivative of the activation function be available. Not all activation functions support derivatives; if you implement an activation function that is not differentiable, its DerivativeFunction implementation should throw an exception. Non-differentiable activation functions are perfectly valid; they simply cannot be used with every training algorithm.
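The following sketch, which is not part of Encog, shows one possible implementation of this contract: a hypothetical step activation that does not support differentiation and therefore throws from DerivativeFunction. The member signatures used here (in-place array activation, two-argument derivative, read-only properties) are assumed from Encog 3.3 and should be verified against your installed version.

using System;
using Encog.Engine.Network.Activation;

// Hypothetical example, not part of Encog: a step activation with no
// derivative. Member signatures are assumed from Encog 3.3.
public class StepActivationSketch : IActivationFunction
{
    // Applies the step function in place to "size" elements of "x",
    // beginning at index "start".
    public void ActivationFunction(double[] x, int start, int size)
    {
        for (int i = start; i < start + size; i++)
        {
            x[i] = x[i] >= 0.0 ? 1.0 : 0.0;
        }
    }

    // The step function is not differentiable, so this throws, as the
    // interface documentation above recommends.
    public double DerivativeFunction(double b, double a)
    {
        throw new NotSupportedException(
            "The step activation function has no derivative.");
    }

    // Signals to trainers that gradient-based algorithms cannot be used.
    public bool HasDerivative
    {
        get { return false; }
    }

    // This activation function takes no parameters.
    public double[] Params
    {
        get { return new double[0]; }
    }

    public string[] ParamNames
    {
        get { return new string[0]; }
    }

    public object Clone()
    {
        return new StepActivationSketch();
    }
}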

Namespace: Encog.Engine.Network.Activation
Assembly: encog-core-cs (in encog-core-cs.dll) Version: 3.3.0.0 (3.3.0.0)
Syntax

public interface IActivationFunction : ICloneable

The IActivationFunction type exposes the following members.

Methods

ActivationFunction
    Implements the activation function. The array is modified in place according to the activation function being used. See the implementing class's description for details on the specific function.

Clone
    Creates a new object that is a copy of the current instance. (Inherited from ICloneable.)

DerivativeFunction
    Calculates the derivative. For performance reasons, two values are provided. The value "b" is the number whose derivative is to be calculated, that is, the value before the activation function was applied. The value "a" is the value returned by the activation function when presented with "b". Both are passed because the derivatives of some of the most common activation functions can be expressed in terms of the activation function's own output, and it would be bad for performance to calculate that output twice. Not all derivatives work this way, however; by providing both the value before the activation function is applied ("b") and the value after it is applied ("a"), an implementation can use whichever is most efficient. Two contrasting cases appear in the sketch after this list.
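As referenced above, the following fragments contrast the two cases. They are illustrative helpers, not Encog API: a sigmoid's derivative is cheapest to express in terms of the activation output "a", while a sine activation's derivative needs the pre-activation value "b".

using System;

// Illustrative helpers, not part of Encog, showing why DerivativeFunction
// receives both the pre-activation value "b" and the activation output "a".
public static class DerivativeExamples
{
    // Sigmoid: the derivative is a * (1 - a), expressed entirely in terms
    // of the activation output "a"; "b" is not needed at all.
    public static double SigmoidDerivative(double b, double a)
    {
        return a * (1.0 - a);
    }

    // Sine: the derivative is cos(b), which needs the pre-activation value
    // "b" instead; the activation output "a" goes unused.
    public static double SineDerivative(double b, double a)
    {
        return Math.Cos(b);
    }
}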
Properties

HasDerivative
    Indicates whether this activation function has a derivative.

ParamNames
    The names of this activation function's parameters.

Params
    The parameters of this activation function.
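The HasDerivative property lets callers check whether gradient-based training is possible before configuring a trainer. The following is a minimal sketch of such a guard; EnsureDifferentiable is a hypothetical helper, not part of Encog, and it assumes HasDerivative is exposed as a read-only property as listed above.

using System;
using Encog.Engine.Network.Activation;

// Hypothetical helper, not part of Encog: rejects activation functions
// that cannot be used with gradient-based training.
public static class TrainerGuard
{
    public static void EnsureDifferentiable(IActivationFunction activation)
    {
        if (!activation.HasDerivative)
        {
            throw new InvalidOperationException(
                "This activation function has no derivative and cannot be " +
                "used with gradient-based training such as backpropagation.");
        }
        // Safe to proceed with a gradient-based trainer here.
    }
}

A caller would invoke TrainerGuard.EnsureDifferentiable(activation) before wiring up a backpropagation-style trainer.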