There is no shortage of modeling methods and techniques for statistical analysts to draw on when working with data. It can be helpful to have a resource that compares modeling methods based on the data available and the kind of information you want to pull from a dataset.
To that end, I have curated a list of statistical techniques and when to use each. Eventually, a follow-up article will cover the Python packages that make these techniques easy to access and deploy. Enjoy!
1. Linear Regression:
- Summary: Linear regression models the relationship between a dependent variable and one or more independent variables using a linear equation. It’s used when you want to predict a continuous target variable based on linear relationships with one or more predictors.
- When to use: Use linear regression when you have a clear linear relationship between the target variable and predictor(s), and the assumptions of linearity, independence, and constant variance are met.
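As a minimal sketch of the idea, simple linear regression can be fit in closed form with plain Python (the follow-up article will cover library implementations); the data here is made up for illustration:

```python
# Simple linear regression via ordinary least squares (closed form):
# slope = cov(x, y) / var(x); intercept = mean(y) - slope * mean(x).

def fit_line(x, y):
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return slope, intercept

# Toy data generated from y = 2x + 1, so the fit recovers those values exactly.
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]
slope, intercept = fit_line(x, y)
print(slope, intercept)  # 2.0 1.0
```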
2. Multiple Linear Regression:
- Summary: Multiple linear regression extends linear regression to multiple independent variables, allowing you to model more complex relationships. It’s useful when you have multiple predictors influencing the dependent variable.
- When to use: Use multiple linear regression when there are several predictors affecting the target variable, and you want to assess their combined impact.
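With several predictors, the same least-squares idea extends by stacking the predictors into a design matrix. A minimal sketch using NumPy (the example data and coefficients are invented for illustration):

```python
import numpy as np

# Multiple linear regression via least squares. Each row of X is one
# observation; each column is one predictor.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1]  # generated from known coefficients

# Prepend a column of ones so the model includes an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # ≈ [1. 2. 3.]  (intercept, then one weight per predictor)
```

The recovered vector gives the intercept followed by one coefficient per predictor, which is how you assess each predictor's contribution to the combined model.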
3. Polynomial Regression:
- Summary: Polynomial regression models relationships by fitting a polynomial equation to the data. It’s employed when the relationship between the variables is nonlinear and can be approximated by a polynomial curve.
- When to use: Use polynomial regression when a simple linear model doesn’t capture the underlying relationship between the variables, and you suspect a curved pattern.
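Polynomial regression is still linear regression under the hood: the model is linear in the coefficients, just applied to powers of x. A minimal sketch with NumPy's `polyfit`, on made-up data from a known quadratic:

```python
import numpy as np

# Polynomial regression: fit a degree-2 polynomial by least squares.
# Equivalent to linear regression on the features x and x**2.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y = 0.5 * x**2 - 1.0 * x + 2.0  # generated from a known quadratic

coeffs = np.polyfit(x, y, deg=2)  # returns highest-degree coefficient first
print(coeffs)  # ≈ [ 0.5 -1.   2. ]
```

Choosing the degree is the key design decision: too low and the curve misses the pattern, too high and the model overfits the noise.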
4. Ridge Regression: