PROWAREtech

.NET: Common Activation Functions with Their Derivatives in C#

The neural network activation functions Rectified Linear Unit (ReLU), Leaky Rectified Linear Unit (Leaky ReLU), Exponential Linear Unit (ELU), Hyperbolic Tangent (tanh), and Sigmoid, with the derivative of each.

Leaky ReLU addresses the "dying ReLU" problem: once a ReLU neuron's pre-activation goes negative, both its output and its gradient are zero, so the neuron can stop learning entirely. Leaky ReLU is therefore generally recommended over the original ReLU. Sigmoid is still used for the output layer of binary-classification networks, and for some networks Sigmoid and tanh suffice on their own.
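To make the problem concrete, here is a minimal sketch (the pre-activation value of -2.5 is invented for illustration) using the ReLUPrime and LeakyReLUPrime functions from the listing below:

// For any negative pre-activation, ReLU's gradient is exactly zero, so
// backpropagation can never update the neuron again; Leaky ReLU keeps a
// small, nonzero gradient flowing.
double preActivation = -2.5; // arbitrary example value
Console.WriteLine(ActivationFunctionsWithDerivatives.ReLUPrime(preActivation));      // prints 0    (dead: no learning signal)
Console.WriteLine(ActivationFunctionsWithDerivatives.LeakyReLUPrime(preActivation)); // prints 0.01 (small gradient survives)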

See the neural network and convolutional neural network examples.


public static class ActivationFunctionsWithDerivatives
{
	public static double ReLU(double x) // Rectified Linear Unit function
	{
		return x > 0 ? x : 0;
	}

	public static double ReLUPrime(double x) // derivative of ReLU
	{
		return x > 0 ? 1 : 0;
	}

	public static double LeakyReLU(double x, double alpha = 0.01) // Leaky variant of the Rectified Linear Unit function, more modern than ReLU; the default alpha of 0.01 can be tuned larger or smaller
	{
		return x >= 0 ? x : (alpha * x);
	}

	public static double LeakyReLUPrime(double x, double alpha = 0.01) // derivative of Leaky ReLU; make sure to use same alpha value as passed to LeakyReLU()
	{
		return x >= 0 ? 1 : alpha;
	}

	public static double ELU(double x, double alpha = 1.0) // Exponential Linear Unit function
	{
		return x >= 0 ? x : (alpha * (Math.Exp(x) - 1.0));
	}

	public static double ELUPrime(double x, double alpha = 1.0) // derivative of ELU; make sure to use same alpha value as passed to ELU()
	{
		return x >= 0 ? 1 : (alpha * Math.Exp(x));
	}

	public static double Tanh(double x) // Hyperbolic Tangent function, equivalent to Math.Tanh(x)
	{
		double n = Math.Exp(-x);
		double p = Math.Exp(x);
		return (p - n) / (p + n);
	}

	public static double TanhPrime(double x) // derivative of Tanh
	{
		double t = Tanh(x);
		return 1.0 - (t * t); // this is simply: 1 - (tanh(x) * tanh(x))
	}

	public static double Sigmoid(double x) // an oldie but goodie; ReLU has largely replaced it in hidden layers
	{
		return 1.0 / (1.0 + Math.Exp(-x));
	}

	public static double SigmoidPrime(double x) // derivative of Sigmoid
	{
		double n = Math.Exp(-x);
		return (1.0 / (1.0 + n)) * (1.0 - (1.0 / (1.0 + n))); // this is simply: Sigmoid(x) * (1.0 - Sigmoid(x))
	}
}
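
To sanity-check the listing, here is a short usage sketch (not part of the original class; the ActivationDemo and Numerical names are made up here) that compares each analytic derivative against a numerical central-difference approximation. The two values printed on each line should agree to several decimal places.

using System;
using static ActivationFunctionsWithDerivatives;

public static class ActivationDemo
{
	// Central finite difference: f'(x) ~ (f(x + h) - f(x - h)) / (2h)
	static double Numerical(Func<double, double> f, double x, double h = 1e-6)
	{
		return (f(x + h) - f(x - h)) / (2.0 * h);
	}

	public static void Main()
	{
		double x = 0.5; // arbitrary sample input, away from ReLU's kink at 0
		Console.WriteLine($"ReLU':      {ReLUPrime(x)} vs {Numerical(ReLU, x)}");
		Console.WriteLine($"LeakyReLU': {LeakyReLUPrime(x)} vs {Numerical(v => LeakyReLU(v), x)}");
		Console.WriteLine($"ELU':       {ELUPrime(x)} vs {Numerical(v => ELU(v), x)}");
		Console.WriteLine($"Tanh':      {TanhPrime(x)} vs {Numerical(Tanh, x)}");
		Console.WriteLine($"Sigmoid':   {SigmoidPrime(x)} vs {Numerical(Sigmoid, x)}");
	}
}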
