SAMI THAKUR
Linear Regression
Linear regression is a fundamental statistical and machine learning technique used to model the
relationship between a dependent variable (target) and one or more independent variables
(features). It assumes a linear relationship between the input variables and the output.
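A minimal sketch of this idea, using hypothetical one-feature data (the x and y values below are made up for illustration): fit a straight line y ≈ slope·x + intercept and read off the two coefficients.

```python
import numpy as np

# Hypothetical data: one feature x and a target y with a roughly linear trend.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Fit y ≈ slope * x + intercept (a degree-1 polynomial in least-squares sense).
slope, intercept = np.polyfit(x, y, 1)
print(slope, intercept)  # slope near 2, intercept near 0
```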
Assumptions of Linear Regression:
• Linearity: The relationship between features and target is linear.
• Independence: Observations are independent of each other.
• Homoscedasticity: The variance of residuals is constant across all levels of the
independent variables.
• Normality: Residuals are normally distributed (important for inference).
• No multicollinearity: Independent variables are not highly correlated with each other.
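The multicollinearity assumption can be checked with a simple pairwise correlation matrix. Below is a sketch on a hypothetical feature matrix whose second column is deliberately close to a multiple of the first, so their correlation comes out near 1.

```python
import numpy as np

# Hypothetical feature matrix: column 1 is nearly twice column 0,
# so we expect strong multicollinearity between them.
X = np.array([
    [1.0, 2.1, 5.0],
    [2.0, 3.9, 3.0],
    [3.0, 6.2, 8.0],
    [4.0, 8.1, 1.0],
])

# Feature-by-feature Pearson correlation matrix (columns are features).
corr = np.corrcoef(X, rowvar=False)
print(corr[0, 1])  # close to 1.0 -> columns 0 and 1 are highly correlated
```

A common rule of thumb is to investigate (or drop/combine) features whose pairwise correlation exceeds roughly 0.8-0.9.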
Cost Function in Linear Regression
A cost function is a mathematical function that measures the difference between the actual and
predicted values in a machine learning model. It helps in optimizing the model by minimizing
this difference.
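For linear regression the usual cost function is the mean squared error, MSE = (1/n) Σ (y_i − ŷ_i)². A small sketch with made-up actual and predicted values:

```python
import numpy as np

# Mean squared error: the average of the squared residuals
# between actual values and model predictions.
def mse(y_true, y_pred):
    residuals = y_true - y_pred
    return np.mean(residuals ** 2)

y_true = np.array([3.0, 5.0, 7.0])
y_pred = np.array([2.5, 5.5, 7.0])
print(mse(y_true, y_pred))  # (0.25 + 0.25 + 0.0) / 3
```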
Least Squares Method
The Least Squares Method is a mathematical approach used in Linear Regression to minimize
the differences between actual and predicted values. It helps determine the best-fitting line by
minimizing the sum of the squared residuals.