
Thursday, 9 May 2019

Linear Regression is a supervised machine learning algorithm which is very easy to learn and implement. Following are the advantages and disadvantages of Linear Regression:
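As a minimal sketch of how simple the algorithm is to use (assuming scikit-learn and NumPy are installed; the synthetic data here is illustrative, not from the article):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: y ≈ 3x + 2 plus a little Gaussian noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.5, size=50)

model = LinearRegression()
model.fit(X, y)
print(model.coef_[0], model.intercept_)  # close to 3 and 2
```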

Advantages:

1. Linear Regression performs well when the relationship in the data is approximately linear. We can use it to find the nature of the relationship among the variables.

2. Linear Regression is easy to implement and interpret, and very efficient to train.

3. Linear Regression is prone to over-fitting, but this can be avoided using dimensionality reduction techniques, regularization (L1 and L2) and cross-validation.
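The regularization mentioned above can be sketched as follows (assuming scikit-learn; Ridge implements the L2 penalty, Lasso the L1 penalty, and the data and alpha values are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 10))                # few samples relative to features
y = 2.0 * X[:, 0] + rng.normal(0, 0.1, 40)   # only the first feature matters

results = {}
for name, model in [("Ridge (L2)", Ridge(alpha=1.0)),
                    ("Lasso (L1)", Lasso(alpha=0.1))]:
    # cross_val_score reports out-of-sample R^2, exposing overfitting
    results[name] = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV R^2 = {results[name]:.3f}")
```

Cross-validation here serves as the check: a model that merely memorizes noise would score poorly on the held-out folds.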

Disadvantages:

1. The main limitation of Linear Regression is its assumption of linearity between the dependent variable and the independent variables. In the real world, relationships are rarely perfectly linear, so the assumed straight-line relationship between the dependent and independent variables is often incorrect.

2. Prone to noise and overfitting: If the number of observations is less than the number of features, Linear Regression should not be used; it is likely to overfit, because it starts fitting the noise in this scenario while building the model.

3. Prone to outliers: Linear Regression is very sensitive to outliers (anomalies), so outliers should be analyzed and removed before applying Linear Regression to the dataset.

4. Prone to multicollinearity: Before applying Linear Regression, multicollinearity should be removed (for example, using dimensionality reduction techniques), because the model assumes there is no relationship among the independent variables.
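A simple way to screen for the multicollinearity described above is to inspect the pairwise correlation matrix (a sketch using only NumPy; the 0.9 threshold and synthetic features are illustrative assumptions, and variance inflation factors are a more thorough alternative):

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
x2 = 0.95 * x1 + rng.normal(0, 0.1, size=100)  # nearly collinear with x1
x3 = rng.normal(size=100)                       # independent feature
X = np.column_stack([x1, x2, x3])

corr = np.corrcoef(X, rowvar=False)
# Flag feature pairs whose absolute correlation exceeds 0.9
pairs = [(i, j) for i in range(corr.shape[0])
         for j in range(i + 1, corr.shape[1])
         if abs(corr[i, j]) > 0.9]
print(pairs)  # expect [(0, 1)] with this seed
```

Flagged pairs are candidates for dropping one feature or combining them (e.g., via PCA) before fitting.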

In summary, Linear Regression is a great tool to analyze the relationships among variables, but it is not recommended for many practical applications because it over-simplifies real-world problems by assuming a linear relationship among the variables.
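The outlier sensitivity noted in the disadvantages can be made concrete with a small sketch (assuming scikit-learn; HuberRegressor is one robust alternative, named here as an illustration rather than the article's recommendation):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(3)
X = rng.uniform(0, 10, size=(50, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.5, size=50)
y[0] = 500.0  # inject one extreme outlier

ols = LinearRegression().fit(X, y)      # ordinary least squares
huber = HuberRegressor().fit(X, y)      # robust loss, downweights the outlier

# The OLS line is dragged noticeably by the single outlier,
# while the Huber fit stays near the true slope 3 and intercept 2.
print("OLS:  ", ols.coef_[0], ols.intercept_)
print("Huber:", huber.coef_[0], huber.intercept_)
```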