Objective
Partial Least Squares (PLS) Regression is a form of regression that is especially useful when there are a large number of explanatory (i.e. independent) variables (especially when there are more such variables than observations) or when there is multicollinearity (i.e. correlation) among the independent variables.
Essentially, PLS Regression maps the independent variables into a smaller number of latent variables and then uses ordinary multiple regression (or multivariate regression when there are several dependent variables) to create the regression model. In this regard, it is similar to principal component analysis, except that here the data for both the independent and the dependent variables are used to construct the latent variables.
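To make this concrete, the following Python sketch uses scikit-learn's PLSRegression on simulated data with more (highly correlated) predictors than observations. The simulated data, the choice of two latent variables, and the use of scikit-learn rather than the Real Statistics add-in are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Simulated data (illustrative only): 15 observations of 40 highly
# correlated predictors, i.e. more explanatory variables than data rows,
# a situation where ordinary least squares cannot be applied directly.
rng = np.random.default_rng(0)
factors = rng.normal(size=(15, 2))                       # two hidden factors
X = factors @ rng.normal(size=(2, 40)) + 0.1 * rng.normal(size=(15, 40))
y = factors @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=15)

# PLS maps the 40 predictors onto 2 latent variables chosen to capture
# the covariance between X and y, then regresses y on those scores.
pls = PLSRegression(n_components=2).fit(X, y)

print(pls.transform(X).shape)   # (15, 2): scores on the latent variables
print(pls.predict(X)[:3])       # fitted values for the first 3 observations
```

How many latent variables to retain is a modeling decision in its own right; see Number of Latent Vectors to Use below.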
Topics
- Basic Concepts and NIPALS Algorithm
- Example
- Residuals and Predictions
- Eigenvalues and Eigenvectors
- Number of Latent Vectors to Use
- Real Statistics Support