
Sunday, 24 February 2019

Loss Functions in Machine Learning (MAE, MSE, RMSE)

A loss function indicates the difference between the actual values and the predicted values. If the magnitude of the loss is high, it means our model's predictions deviate a lot from the actual values and the model needs to be corrected.

Let's look at the types of loss functions in Machine Learning in detail. There are broadly two types of losses, based on the type of problem the algorithm is solving:

Types of Losses:

1. Regression Losses
2. Classification Losses

Let's first discuss the regression losses:

1. Mean Absolute Error (MAE) or (L1 Loss)
2. Mean Squared Error (MSE) or (Quadratic Loss) or (L2 Loss)
3. Root Mean Squared Error (RMSE)
4. Mean Bias Error

1. Mean Absolute Error (MAE) or (L1 Loss)

This is the average of the absolute differences between the predicted values and the actual values.
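
In symbols, MAE = (1/n) * sum of |actual_i - predicted_i|. Below is a minimal NumPy sketch; the arrays are made-up example data, just for illustration:

import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])  # actual values (made-up example data)
y_pred = np.array([2.5,  0.0, 2.0, 8.0])  # predicted values (made-up example data)

mae = np.mean(np.abs(y_true - y_pred))    # average of the absolute differences
print(mae)                                # 0.5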

2. Mean Squared Error (MSE) or (Quadratic Loss) or (L2 Loss)

This is the average of the squared differences between the predicted values and the actual values. Because the errors are squared before averaging, large errors are penalized more heavily than in MAE.
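
In symbols, MSE = (1/n) * sum of (actual_i - predicted_i)^2. Continuing the NumPy sketch from the MAE section:

# y_true and y_pred are the same made-up arrays as in the MAE sketch above
mse = np.mean((y_true - y_pred) ** 2)     # average of the squared differences
print(mse)                                # 0.375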

3. Root Mean Squared Error (RMSE)

This is the square root of the MSE (L2 loss). Taking the square root brings the error back to the same units as the target variable.
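
Continuing the same NumPy sketch:

rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))  # square root of the MSE
print(rmse)                                      # about 0.612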



4. Mean Bias Error

Just like MAE (L1), but we do not take the absolute value here, so negative errors can cancel out positive errors. That is why it is not a very popular loss function. Although less useful as an overall error measure, it can tell us whether the model has a positive bias or a negative bias.
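
A tiny sketch of MBE = (1/n) * sum of (actual_i - predicted_i), using the same made-up arrays as above (sign convention here: actual minus predicted, matching the MAE sketch):

mbe = np.mean(y_true - y_pred)            # signed errors can cancel each other out
print(mbe)                                # -0.25, even though the MAE was 0.5
# the negative sign suggests the model overestimates on average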

Regression Loss Functions in Scikit Learn Library in Python

mean_absolute_error and mean_squared_error are available in sklearn.metrics. Import them like this (NumPy is imported as well, because we use it for the square root in RMSE):

import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

You can calculate the loss functions as shown below. Here y_test contains the actual values and y_pred contains the predicted values.

meanAbsoluteError = mean_absolute_error(y_test, y_pred)   # MAE
meanSquaredError = mean_squared_error(y_test, y_pred)     # MSE
rootMeanSquaredError = np.sqrt(meanSquaredError)          # RMSE is the square root of MSE

print('Mean Absolute Error:', meanAbsoluteError)
print('Mean Squared Error:', meanSquaredError)
print('Root Mean Squared Error:', rootMeanSquaredError)

I will write about classification losses in a later post, so stay tuned.
