Ordinary Least Squares (OLS)


For those of you who love mathematics and would like to know how the linear regression formula was derived, in this section of the tutorial you will learn a powerful method called Ordinary Least Squares (OLS). I assume that you know calculus well enough to perform the OLS method. Knowing this method is important because it lets you derive many regression formulas by yourself.

Let us start with notation.

$x_i$ is the value of the independent variable from observation $i$

$\bar{x}$ is the mean of $x$

$y_i$ is the value of the dependent variable from observation $i$

$\bar{y}$ is the mean of $y$

$\hat{y}_i$ is the estimate of $y_i$ that is produced by the regression model

$n$ is the number of observations

To perform the ordinary least squares method, you do the following steps:

  1. Take the difference between the dependent variable and its estimate: $e_i = y_i - \hat{y}_i$
  2. Square the difference: $e_i^2 = (y_i - \hat{y}_i)^2$
  3. Take the summation over all data: $E = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$
  4. To get the parameters that make the sum of squared differences minimum, take the partial derivative of $E$ with respect to each parameter and equate it to zero.
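The steps above can be sketched numerically. This is a minimal illustration for the straight-line model $\hat{y} = a + b x$; the data values are made up, and the closed-form expressions for $a$ and $b$ are the standard OLS results:

```python
# Made-up data for illustration only.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

def sum_sq_diff(a, b):
    # Steps 1-3: difference, square, then sum over all observations.
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

# Step 4 is done analytically; here we just evaluate the resulting
# closed-form OLS parameters and confirm they beat a nearby guess.
x_bar = sum(x) / n
y_bar = sum(y) / n
b = (sum(xi * yi for xi, yi in zip(x, y)) - n * x_bar * y_bar) / \
    (sum(xi ** 2 for xi in x) - n * x_bar ** 2)
a = y_bar - b * x_bar

print(a, b)
print(sum_sq_diff(a, b) <= sum_sq_diff(a + 0.1, b))  # True: OLS minimizes E
```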

For example:

Find the model parameters $a$ and $b$ of the model estimation $\hat{y} = a + b x$ using Ordinary Least Squares!

Answer:

The model has only two parameters, that is, $a$ and $b$.

We take the partial derivative of the sum of squared differences with respect to the first parameter and equate it to zero, $\frac{\partial E}{\partial a} = 0$. In taking the partial derivative, we assume $x_i$, $y_i$, and $b$ are constant, while $a$ is the only variable:

$$\frac{\partial E}{\partial a} = \sum_{i=1}^{n} 2 \left( y_i - a - b x_i \right) (-1)$$

Equating it to zero, we have

$$-2 \sum_{i=1}^{n} \left( y_i - a - b x_i \right) = 0$$

Actually, the two parameters, $a$ and $b$, are real constants, so they can be taken out of the summation sign. The constant 2 is surely not equal to zero, so we can cancel it out to simplify:

$$\sum_{i=1}^{n} y_i - n a - b \sum_{i=1}^{n} x_i = 0$$

We know that $\sum_{i=1}^{n} y_i = n \bar{y}$ and $\sum_{i=1}^{n} x_i = n \bar{x}$, thus we can simplify the last equation into

$$a = \bar{y} - b \bar{x} \qquad (1)$$

Now, we take the partial derivative of the sum of squared differences with respect to the second parameter and equate it to zero, $\frac{\partial E}{\partial b} = 0$. Similar to before, in taking the partial derivative we assume $x_i$, $y_i$, and $a$ are constant, while $b$ is the only variable:

$$\frac{\partial E}{\partial b} = \sum_{i=1}^{n} 2 \left( y_i - a - b x_i \right) (-x_i)$$

Equating it to zero, we have

$$-2 \sum_{i=1}^{n} x_i \left( y_i - a - b x_i \right) = 0$$

Again, the two parameters, $a$ and $b$, are real constants and can be taken out of the summation sign. The constant 2 is surely not equal to zero, so we can cancel it out to simplify:

$$\sum_{i=1}^{n} x_i y_i - a \sum_{i=1}^{n} x_i - b \sum_{i=1}^{n} x_i^2 = 0$$

We know that $\sum_{i=1}^{n} x_i = n \bar{x}$, thus we can further simplify the last equation into

$$\sum_{i=1}^{n} x_i y_i - n a \bar{x} - b \sum_{i=1}^{n} x_i^2 = 0 \qquad (2)$$

Substituting equation (1) into equation (2), we have

$$\sum_{i=1}^{n} x_i y_i - n \left( \bar{y} - b \bar{x} \right) \bar{x} - b \sum_{i=1}^{n} x_i^2 = 0$$

$$\sum_{i=1}^{n} x_i y_i - n \bar{x} \bar{y} + b n \bar{x}^2 - b \sum_{i=1}^{n} x_i^2 = 0$$

Thus, the parameters of the regression model are

$$b = \frac{\sum_{i=1}^{n} x_i y_i - n \bar{x} \bar{y}}{\sum_{i=1}^{n} x_i^2 - n \bar{x}^2} \quad \text{and} \quad a = \bar{y} - b \bar{x}$$

Notice that $b = \frac{\sum x_i y_i - n \bar{x} \bar{y}}{\sum x_i^2 - n \bar{x}^2}$ is actually equivalent to $b = \frac{\sum (x_i - \bar{x})(y_i - \bar{y})}{\sum (x_i - \bar{x})^2}$ by simple algebra.
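You can confirm that equivalence numerically. Below is a small check on made-up data that both expressions for the slope give the same value:

```python
# Two algebraically equivalent expressions for the OLS slope b,
# evaluated on made-up data.
x = [1.0, 2.0, 4.0, 7.0]
y = [3.0, 5.0, 4.0, 10.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n

# Form 1: (sum xy - n*x_bar*y_bar) / (sum x^2 - n*x_bar^2)
b1 = (sum(xi * yi for xi, yi in zip(x, y)) - n * x_bar * y_bar) / \
     (sum(xi ** 2 for xi in x) - n * x_bar ** 2)

# Form 2: sum of centered cross-products over sum of centered squares
b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
     sum((xi - x_bar) ** 2 for xi in x)

print(abs(b1 - b2) < 1e-12)  # True: the two forms agree
```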

Example:

Find the model parameter $b$ of the model estimation $\hat{y} = b x$ using Ordinary Least Squares!

Answer:

The model has only one parameter, $b$.

We take the derivative of the sum of squared differences $E = \sum_{i=1}^{n} (y_i - b x_i)^2$ and equate it to zero:

$$\frac{dE}{db} = -2 \sum_{i=1}^{n} x_i \left( y_i - b x_i \right) = 0$$

$$\sum_{i=1}^{n} x_i y_i - b \sum_{i=1}^{n} x_i^2 = 0$$

Thus, the parameter of the regression model is

$$b = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}$$

You may compare that the slopes $b$ of the two models $\hat{y} = a + b x$ and $\hat{y} = b x$ are not the same.
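A quick numeric comparison makes this concrete. On the made-up data below, the slope of the full model $\hat{y} = a + b x$ and the slope of the through-the-origin model $\hat{y} = b x$ come out different:

```python
# Made-up data for illustration.
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 4.0, 6.0, 7.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n

# Two-parameter model: b = (sum xy - n*x_bar*y_bar) / (sum x^2 - n*x_bar^2)
b_full = (sum(xi * yi for xi, yi in zip(x, y)) - n * x_bar * y_bar) / \
         (sum(xi ** 2 for xi in x) - n * x_bar ** 2)

# One-parameter (through-origin) model: b = sum(xy) / sum(x^2)
b_origin = sum(xi * yi for xi, yi in zip(x, y)) / \
           sum(xi ** 2 for xi in x)

print(b_full, b_origin)  # the two slopes differ
```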

Source: people.revoledu.com
