PROWAREtech
.NET: Convert a Softmax Loss to a Percentage Accuracy
How to convert a softmax cross-entropy neural network loss value to an approximate accuracy value between 0 and 1 for any number of classes or categories; written in C#.
- For perfect predictions, the loss approaches 0, which maps to an accuracy of 1.0 (100%)
- For uniform random predictions, the loss approaches -ln(1/numClasses) = ln(numClasses), which maps to an accuracy of 0.0 (0%)
- The function linearly interpolates between these two points and clamps the result to the range [0, 1]
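As a concrete sanity check (using 10 classes as an example): maxLoss = -ln(1/10) = ln(10) ≈ 2.3026. A loss of 0 maps to an accuracy of 1 - 0/2.3026 = 1.0, a loss of 2.3026 maps to 1 - 2.3026/2.3026 = 0.0, and a midpoint loss of about 1.15 maps to roughly 0.5.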
/// <summary>
/// Converts a softmax cross-entropy loss value to an accuracy percentage (0-1).
/// For a perfectly predicted class, the loss approaches 0, resulting in accuracy approaching 1.
/// For completely random predictions, the loss approaches -ln(1/numClasses), resulting in accuracy approaching 0.
/// </summary>
/// <param name="loss">The cross-entropy loss value from the network</param>
/// <param name="numClasses">Number of possible classes in the classification task</param>
/// <returns>Accuracy value between 0 and 1 (0% to 100%)</returns>
public static float ConvertSoftmaxLossToAccuracy(float loss, int numClasses)
{
    // With a single class, maxLoss would be ln(1) = 0 and cause a divide by zero
    if (numClasses < 2)
        throw new ArgumentOutOfRangeException(nameof(numClasses), "At least two classes are required.");

    // Maximum possible loss: a uniform (random) prediction assigns
    // probability 1/numClasses to each class, giving a loss of -ln(1/numClasses)
    float maxLoss = -MathF.Log(1.0f / numClasses);

    // Linear interpolation: accuracy is 1 when loss is 0, 0 when loss equals maxLoss
    float accuracy = 1.0f - loss / maxLoss;

    // Clamp to [0, 1] to handle losses above maxLoss or slightly negative inputs
    return Math.Clamp(accuracy, 0.0f, 1.0f);
}
How to use this static method:
float loss = 0.01f;
int numberOfClasses = 10;
float percentAccuracy = ConvertSoftmaxLossToAccuracy(loss, numberOfClasses);
Console.WriteLine($"{percentAccuracy:P2}");
// --- OR ---
float percentAccuracy2 = ConvertSoftmaxLossToAccuracy(loss, numberOfClasses) * 100;
Console.WriteLine($"{percentAccuracy2:F2}%");
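A quick endpoint check can confirm the interpolation behaves as described. The sketch below includes a local copy of the method so it runs on its own (the original static method would normally live in a class):

```csharp
// Local copy of the article's method for a self-contained check.
static float ConvertSoftmaxLossToAccuracy(float loss, int numClasses)
{
    float maxLoss = -MathF.Log(1.0f / numClasses);
    return Math.Clamp(1.0f - loss / maxLoss, 0.0f, 1.0f);
}

float maxLoss = -MathF.Log(1.0f / 10);                        // ln(10) ≈ 2.3026
Console.WriteLine(ConvertSoftmaxLossToAccuracy(0.0f, 10));    // perfect prediction: prints 1
Console.WriteLine(ConvertSoftmaxLossToAccuracy(maxLoss, 10)); // random prediction: prints 0
Console.WriteLine(ConvertSoftmaxLossToAccuracy(1.1513f, 10)); // midpoint loss: ≈ 0.5
```

Note that a loss greater than maxLoss (worse than random) still clamps to 0, and any negative loss clamps to 1.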