Method of Least Squares for Multiple Regression Detailed

Property 1: The regression line has the form

$$\hat{y} = b_0 + b_1 x_1 + \cdots + b_k x_k$$

where $b_0 = \bar{y} - b_1\bar{x}_1 - \cdots - b_k\bar{x}_k$ and the coefficients $b_m$ are the solutions to the following $k$ equations in $k$ unknowns ($j = 1, \ldots, k$):

$$\sum_{m=1}^k b_m \sum_{i=1}^n (x_{im}-\bar{x}_m)(x_{ij}-\bar{x}_j) = \sum_{i=1}^n (x_{ij}-\bar{x}_j)(y_i-\bar{y})$$

Proof: Our objective is to find the values of the coefficients $b_0, b_1, \ldots, b_k$ for which the sum of the squared errors

$$SSE = \sum_{i=1}^n (y_i - \hat{y}_i)^2$$

is a minimum, where $\hat{y}_i$ is the $y$-value on the best-fit line corresponding to $x_{i1}, \ldots, x_{ik}$. Now,

$$SSE = \sum_{i=1}^n (y_i - b_0 - b_1 x_{i1} - \cdots - b_k x_{ik})^2$$
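As a quick numerical sanity check, the two ways of writing the sum of squared errors above agree. A minimal Python/NumPy sketch, using made-up illustrative data and arbitrary coefficients:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(10, 2))                 # illustrative columns x_1, x_2
y = rng.normal(size=10)                      # illustrative y values
b = np.array([0.5, 1.0, -1.0])               # arbitrary b_0, b_1, b_2

yhat = b[0] + X @ b[1:]                      # y-values on the line
sse1 = np.sum((y - yhat) ** 2)               # sum of (y_i - yhat_i)^2
sse2 = np.sum((y - b[0] - X @ b[1:]) ** 2)   # expanded form of SSE
print(np.isclose(sse1, sse2))  # True
```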

For any given values of $(x_{11}, \ldots, x_{1k}, y_1), \ldots, (x_{n1}, \ldots, x_{nk}, y_n)$, this expression can be viewed as a function of the $b_j$, namely $g(b_0, \ldots, b_k)$:

$$g(b_0, \ldots, b_k) = \sum_{i=1}^n (y_i - b_0 - b_1 x_{i1} - \cdots - b_k x_{ik})^2$$

By calculus, the minimum value occurs where all the partial derivatives are zero, i.e. for $j = 1, \ldots, k$

$$\frac{\partial g}{\partial b_j} = -2\sum_{i=1}^n x_{ij}\,(y_i - b_0 - b_1 x_{i1} - \cdots - b_k x_{ik}) = 0$$

$$\frac{\partial g}{\partial b_0} = -2\sum_{i=1}^n (y_i - b_0 - b_1 x_{i1} - \cdots - b_k x_{ik}) = 0$$
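The vanishing-partials condition can be checked numerically: at the least-squares solution the gradient of $g$ is zero to machine precision. A minimal sketch assuming NumPy, with made-up data:

```python
import numpy as np

def grad_g(b, X, y):
    """Gradient of g(b0,...,bk) = sum_i (y_i - b0 - sum_m b_m x_im)^2."""
    A = np.column_stack([np.ones(len(y)), X])   # design matrix with intercept
    r = y - A @ b                               # residuals y_i - yhat_i
    return -2 * A.T @ r                         # vector of partial derivatives

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))                    # illustrative predictors
y = 1.0 + X @ np.array([0.5, -2.0]) + rng.normal(scale=0.2, size=40)

# least-squares fit with an intercept column
b_hat, *_ = np.linalg.lstsq(np.column_stack([np.ones(40), X]), y, rcond=None)
print(np.allclose(grad_g(b_hat, X, y), 0))  # True: the partials vanish at the minimum
```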

Transposing terms we have, for $j = 1, \ldots, k$,

$$\sum_{i=1}^n x_{ij}\, y_i = b_0 \sum_{i=1}^n x_{ij} + \sum_{m=1}^k b_m \sum_{i=1}^n x_{ij}\, x_{im}$$

$$\sum_{i=1}^n y_i = n b_0 + \sum_{m=1}^k b_m \sum_{i=1}^n x_{im}$$

Further simplifying, using $\sum_{i=1}^n y_i = n\bar{y}$ and $\sum_{i=1}^n x_{ij} = n\bar{x}_j$,

$$\sum_{i=1}^n x_{ij}\, y_i = n b_0 \bar{x}_j + \sum_{m=1}^k b_m \sum_{i=1}^n x_{ij}\, x_{im}$$

$$n\bar{y} = n b_0 + \sum_{m=1}^k b_m \left[\sum_{i=1}^n (x_{im}-\bar{x}_m) + n\bar{x}_m\right]$$

But since $\sum_{i=1}^n (x_{im}-\bar{x}_m) = 0$, the last equation becomes

$$b_0 = \bar{y} - \sum_{m=1}^k b_m \bar{x}_m$$

Substituting this value of $b_0$, the remaining $k$ equations are:

$$\sum_{i=1}^n x_{ij}\,(y_i - \bar{y}) = \sum_{m=1}^k b_m \sum_{i=1}^n x_{ij}\,(x_{im} - \bar{x}_m) \qquad j = 1, \ldots, k$$

Since $\sum_{i=1}^n (y_i - \bar{y}) = 0$ and $\sum_{i=1}^n (x_{im} - \bar{x}_m) = 0$, these are equivalent to

$$\sum_{m=1}^k b_m \sum_{i=1}^n (x_{im}-\bar{x}_m)(x_{ij}-\bar{x}_j) = \sum_{i=1}^n (x_{ij}-\bar{x}_j)(y_i-\bar{y})$$

Since we have $k$ equations in $k$ unknowns (the $b_m$), there is a unique solution provided these equations are linearly independent.
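Property 1 can also be verified numerically: build the $k \times k$ system of centered normal equations, solve for $b_1, \ldots, b_k$, recover $b_0$ from $b_0 = \bar{y} - \sum b_m \bar{x}_m$, and compare with a direct least-squares fit. A minimal sketch in Python/NumPy, using arbitrary illustrative data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))              # illustrative predictor columns x_1, ..., x_k
y = 2.0 + X @ np.array([1.5, -0.7, 0.3]) + rng.normal(scale=0.1, size=n)

# Centered normal equations: sum_m b_m * S_xx[j, m] = S_xy[j]
Xc = X - X.mean(axis=0)                  # x_im - xbar_m
yc = y - y.mean()                        # y_i - ybar
S_xx = Xc.T @ Xc                         # k x k centered cross-product matrix
S_xy = Xc.T @ yc                         # k-vector of centered cross-products with y
b = np.linalg.solve(S_xx, S_xy)          # b_1, ..., b_k
b0 = y.mean() - X.mean(axis=0) @ b       # b_0 = ybar - sum_m b_m xbar_m

# Compare with a direct least-squares fit that includes an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.allclose(coef, np.concatenate([[b0], b])))  # True
```

Solving the centered system is exactly what Property 1 asserts; the direct fit serves only as an independent cross-check.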
