
Quantitative Methods in Management
Day 5

Simple Regression
Pages 430-445
Recap
• Introduction
• Definition
• Terms and terminologies
• Types of statistics
• Types of data
• Levels of measurements
• Application of statistics in business
• Sources of data
Organizing and visualizing variables
• Tables
• Frequency distribution
• Relative frequency distribution
• Relative percent frequency distribution
• Cumulative frequency distribution
• Univariate
• Bivariate / cross tabulation
• Diagrams
• Bar charts
• Pie charts
• Graphs
• Histogram
• Frequency polygon
• Frequency curve
• Cumulative frequency curve (Ogive)
• EDA
• Stem and leaf plot
• Scatter diagram
• Dot plots
• Pareto chart
Numerical descriptive statistics
Measures of location
Measures of dispersion
Measures of shapes
Kurtosis
Relative location
- Z score
- Chebyshev's inequality
- Empirical rule
Exploratory data analysis
- Five number summary
- Box plot
Relationship between two variables
- Covariance
- Correlation
Simple linear regression
Chapter 12
Learning Objectives
• How to use regression analysis to predict the
value of a dependent variable based on an
independent variable
• The meaning of the regression coefficients b0
and b1
• Measures of variation (SSE, SSR, SST)
• Coefficient of determination
Correlation vs. Regression
• A scatter plot can be used to show the relationship between
two variables
• Correlation analysis is used to measure the strength of the
association (linear relationship) between two variables
• Correlation is only concerned with strength of the relationship
• No causal effect is implied with correlation
Introduction to
Regression Analysis
• Regression analysis is used to:
• Predict the value of a dependent variable based on the value of at least
one independent variable
• Explain the impact of changes in an independent variable on the
dependent variable
Dependent variable: the variable we wish to predict or explain
Independent variable: the variable used to predict or explain the
dependent variable
Regression Analysis
• Regression analysis is a tool for building
mathematical and statistical models that characterize
relationships between a dependent (ratio) variable
and one or more independent, or explanatory
variables (ratio or categorical), all of which are
numerical.
• Simple linear regression involves a single
independent variable.
• Multiple regression involves two or more
independent variables.
Simple Linear Regression Model

• Only one independent variable, X
• Relationship between X and Y is described by a linear function
• Changes in Y are assumed to be related to changes in X
Using Statistics
• Regression refers to the statistical technique of modeling the
relationship between variables.
• In simple linear regression, we model the relationship
between two variables.
• One of the variables, denoted by Y, is called the dependent
variable and the other, denoted by X, is called the
independent variable.
• The model we will use to depict the relationship between X and
Y will be a straight-line relationship.
• A graphical sketch of the pairs (X, Y) is called a scatter plot.
Using Statistics

[Figure: Scatterplot of Advertising Expenditures (X) and Sales (Y); Advertising on the x-axis (0-50), Sales on the y-axis (0-140)]

This scatterplot locates pairs of observations of advertising expenditures on the x-axis and sales on the y-axis. We notice that:
• Larger (smaller) values of sales tend to be associated with larger (smaller) values of advertising.
• The scatter of points tends to be distributed around a positively sloped straight line.
• The pairs of values of advertising expenditures and sales are not located exactly on a straight line.
• The scatter plot reveals a more or less strong tendency rather than a precise linear relationship.
• The line represents the nature of the relationship on average.
Types of Relationships

Linear relationships vs. curvilinear relationships
[Figure: four sketches of Y against X; two linear (one rising, one falling) and two curvilinear]
Types of Relationships (continued)

Strong relationships vs. weak relationships
[Figure: four sketches of Y against X; in strong relationships the points lie close to the line, in weak relationships they scatter widely around it]
Types of Relationships (continued)

No relationship
[Figure: Y plotted against X with no discernible pattern]
Simple Linear Regression Model

Yi = β0 + β1Xi + εi

where:
Yi = dependent variable
Xi = independent variable
β0 = population Y intercept
β1 = population slope coefficient
εi = random error term

β0 + β1Xi is the linear component; εi is the random error component.
Simple Linear Regression Model (continued)

[Figure: the line Y = β0 + β1X plotted against X, with intercept β0 and slope β1. For a given Xi, the observed value of Y differs from the predicted value on the line by the random error εi.]
Simple Linear Regression Equation (Prediction Line)

The simple linear regression equation provides an estimate of the population regression line:

Ŷi = b0 + b1Xi

where:
Ŷi = estimated (or predicted) Y value for observation i
b0 = estimate of the regression intercept
b1 = estimate of the regression slope
Xi = value of X for observation i
The Least Squares Method

b0 and b1 are obtained by finding the values that minimize the sum of the squared differences between Yi and Ŷi:

min Σ(Yi − Ŷi)² = min Σ(Yi − (b0 + b1Xi))²
Simple Linear Regression Model

• The equation that describes how y is related to x and an error term is called the regression model.

• The simple linear regression model is:

y = β0 + β1x + ε

where:
β0 and β1 are called parameters of the model,
ε is a random variable called the error term.
Simple Linear Regression Equation

• Positive Linear Relationship

[Figure: E(y) against x; the regression line rises from intercept b0 with positive slope b1]
Simple Linear Regression Equation

• Negative Linear Relationship

[Figure: E(y) against x; the regression line falls from intercept b0 with negative slope b1]
Simple Linear Regression Equation

• No Relationship

[Figure: E(y) against x; the regression line is horizontal at intercept b0, i.e. slope b1 is 0]
Estimated Simple Linear Regression Equation

 The estimated simple linear regression equation

ŷ = b0 + b1x

• The graph is called the estimated regression line.


• b0 is the y intercept of the line.
• b1 is the slope of the line.
• ŷ is the estimated value of y for a given x value.
Estimation Process

Regression model: y = β0 + β1x + ε
Regression equation: E(y) = β0 + β1x
Unknown parameters: β0, β1

Sample data: (x1, y1), (x2, y2), ..., (xn, yn)

Estimated regression equation: ŷ = b0 + b1x
Sample statistics: b0, b1

b0 and b1 provide estimates of β0 and β1.
Least Squares Method

• Least Squares Criterion

min Σ(yi − ŷi)²

where:
yi = observed value of the dependent variable for the ith observation
ŷi = estimated value of the dependent variable for the ith observation
Least Squares Method

• Slope for the Estimated Regression Equation

b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²

where:
xi = value of the independent variable for the ith observation
yi = value of the dependent variable for the ith observation
x̄ = mean value of the independent variable
ȳ = mean value of the dependent variable
Least Squares Method

• y-Intercept for the Estimated Regression Equation

b0 = ȳ − b1x̄
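The two formulas above translate directly into code. A minimal sketch in Python (the function name `least_squares_fit` is ours, not from the textbook):

```python
def least_squares_fit(x, y):
    """Return (b0, b1) for the least-squares line y-hat = b0 + b1*x."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
    b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
         / sum((xi - x_bar) ** 2 for xi in x)
    # b0 = y_bar - b1 * x_bar
    b0 = y_bar - b1 * x_bar
    return b0, b1
```

For example, the points (1, 2), (2, 4), (3, 6) lie exactly on y = 2x, so the function returns b0 = 0 and b1 = 2.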
Simple Linear Regression
 Example: Reed Auto Sales

Reed Auto periodically has a special week-long sale.


As part of the advertising campaign Reed runs one or
more television commercials during the weekend
preceding the sale. Data from a sample of 5 previous
sales are shown on the next slide.
Simple Linear Regression

 Example: Reed Auto Sales

Number of TV Ads (x)   Number of Cars Sold (y)
        1                      14
        3                      24
        2                      18
        1                      17
        3                      27
     Σx = 10                Σy = 100
     x̄ = 2                  ȳ = 20
Estimated Regression Equation

• Slope for the Estimated Regression Equation

b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² = 20/4 = 5

• y-Intercept for the Estimated Regression Equation

b0 = ȳ − b1x̄ = 20 − 5(2) = 10

• Estimated Regression Equation

ŷ = 10 + 5x
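As a check, the Reed Auto fit can be reproduced numerically with the same formulas (an illustrative sketch, not part of the original example):

```python
# Reed Auto sample: TV ads (x) and cars sold (y)
x = [1, 3, 2, 1, 3]
y = [14, 24, 18, 17, 27]

n = len(x)
x_bar = sum(x) / n                                              # 2.0
y_bar = sum(y) / n                                              # 20.0

sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))  # 20.0
sxx = sum((xi - x_bar) ** 2 for xi in x)                        # 4.0

b1 = sxy / sxx               # slope = 5.0
b0 = y_bar - b1 * x_bar      # intercept = 10.0
```

This reproduces the slide's result, ŷ = 10 + 5x.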
Using Excel’s Chart Tools for Scatter Diagram & Estimated Regression Equation

[Figure: Reed Auto Sales estimated regression line; TV Ads on the x-axis (0-4), Cars Sold on the y-axis (0-30), with fitted line y = 5x + 10]
Coefficient of Determination

• Relationship Among SST, SSR, SSE

SST = SSR + SSE

Σ(yi − ȳ)² = Σ(ŷi − ȳ)² + Σ(yi − ŷi)²

where:
SST = total sum of squares
SSR = sum of squares due to regression
SSE = sum of squares due to error
Coefficient of Determination

 The coefficient of determination is:

r2 = SSR/SST

where:
SSR = sum of squares due to regression
SST = total sum of squares

• Goodness of fit
• Perfect fit: SSR = SST, so SSR/SST = 1
• Poorer fits result in larger values of SSE (the poorest fit occurs when SSR = 0 and SSE = SST)
Coefficient of Determination

r2 = SSR/SST = 100/114 = .8772


The regression relationship is very strong; 87.72%
of the variability in the number of cars sold can be
explained by the linear relationship between the
number of TV ads and the number of cars sold.
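The 100/114 figure follows from computing the three sums of squares on the Reed Auto data. A quick sketch, assuming the fitted line ŷ = 10 + 5x from earlier:

```python
# Reed Auto data and the fitted line y-hat = 10 + 5x
x = [1, 3, 2, 1, 3]
y = [14, 24, 18, 17, 27]
y_hat = [10 + 5 * xi for xi in x]
y_bar = sum(y) / len(y)

sst = sum((yi - y_bar) ** 2 for yi in y)                # total: 114
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)            # regression: 100
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))   # error: 14

r_squared = ssr / sst                                   # about 0.8772
```

Note that SST = SSR + SSE (114 = 100 + 14), as the decomposition requires.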
Sample Correlation Coefficient

rxy = (sign of b1) √(Coefficient of Determination)

rxy = (sign of b1) √r²

where:
b1 = the slope of the estimated regression equation ŷ = b0 + b1x
Sample Correlation Coefficient

rxy = (sign of b1) √r²

The sign of b1 in the equation ŷ = 10 + 5x is “+”.

rxy = +√.8772
rxy = +.9366
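A small sketch of this sign-and-square-root rule (illustrative only):

```python
import math

b1 = 5                   # slope of the fitted line y-hat = 10 + 5x
r_squared = 100 / 114    # coefficient of determination from above

# r_xy takes the sign of the slope and the square root of r-squared
r_xy = math.sqrt(r_squared) if b1 > 0 else -math.sqrt(r_squared)
# r_xy is about +0.9366
```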
Simple Linear Regression Example

• A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet)
• A random sample of 10 houses is selected
• Dependent variable (Y) = house price in $1000s
• Independent variable (X) = square feet
Simple Linear Regression
Example: Data
House Price in $1000s Square Feet
(Y) (X)
245 1400
312 1600
279 1700
308 1875
199 1100
219 1550
405 2350
324 2450
319 1425
255 1700
Simple Linear Regression Example: Scatter Plot

House price model: Scatter Plot
[Figure: Square Feet on the x-axis (0-3000) vs. House Price in $1000s on the y-axis (0-450)]
Simple Linear Regression Example: Using Excel
Simple Linear Regression Example: Excel Output

Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

The regression equation is:
house price = 98.24833 + 0.10977 (square feet)

ANOVA
             df   SS           MS           F         Significance F
Regression    1   18934.9348   18934.9348   11.0848   0.01039
Residual      8   13665.5652   1708.1957
Total         9   32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%   Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720   232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374     0.18580
Simple Linear Regression Example: Minitab Output

The regression equation is
Price = 98.2 + 0.110 Square Feet

Predictor      Coef      SE Coef   T      P
Constant       98.25     58.03     1.69   0.129
Square Feet    0.10977   0.03297   3.33   0.010

S = 41.3303   R-Sq = 58.1%   R-Sq(adj) = 52.8%

Analysis of Variance
Source           DF   SS      MS      F       P
Regression        1   18935   18935   11.08   0.010
Residual Error    8   13666   1708
Total             9   32600

That is, house price = 98.24833 + 0.10977 (square feet)
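The headline coefficients in the Excel and Minitab output above can be reproduced from the raw data with the least-squares formulas; a plain-Python sketch (the packages do the same arithmetic):

```python
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]   # in $1000s

n = len(sqft)
x_bar = sum(sqft) / n
y_bar = sum(price) / n

sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(sqft, price))
sxx = sum((x - x_bar) ** 2 for x in sqft)

b1 = sxy / sxx             # slope, about 0.10977
b0 = y_bar - b1 * x_bar    # intercept, about 98.24833
```

These match the "Square Feet" and "Intercept" coefficients reported by both packages.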
Simple Linear Regression Example: Graphical Representation

House price model: Scatter Plot and Prediction Line
[Figure: Square Feet on the x-axis (0-3000) vs. House Price in $1000s on the y-axis (0-450), with the fitted line; intercept = 98.248, slope = 0.10977]

house price = 98.24833 + 0.10977 (square feet)
Simple Linear Regression Example: Interpretation of b0

house price = 98.24833 + 0.10977 (square feet)

• b0 is the estimated mean value of Y when the value of X is zero (if X = 0 is in the range of observed X values)
• Because a house cannot have a square footage of 0, b0 has no practical application here
Simple Linear Regression Example: Interpreting b1

house price = 98.24833 + 0.10977 (square feet)

• b1 estimates the change in the mean value of Y as a result of a one-unit increase in X
• Here, b1 = 0.10977 tells us that the mean price of a house increases by 0.10977($1000) = $109.77, on average, for each additional square foot of size
Simple Linear Regression Example: Making Predictions

Predict the price for a house with 2000 square feet:

house price = 98.25 + 0.1098 (sq. ft.)
            = 98.25 + 0.1098(2000)
            = 317.85

The predicted price for a house with 2000 square feet is 317.85($1000s) = $317,850
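The arithmetic above, as a two-line sketch (using the rounded coefficients from the slide):

```python
# Rounded coefficients from the fitted house-price model
b0, b1 = 98.25, 0.1098

sqft = 2000
predicted_price = b0 + b1 * sqft   # in $1000s
# 98.25 + 0.1098 * 2000 = 317.85, i.e. about $317,850
```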
Simple Linear Regression Example: Making Predictions

• When using a regression model for prediction, only predict within the relevant range of data

[Figure: the house price scatter plot and fitted line; the relevant range for interpolation covers the observed square footages (roughly 1000 to 2500 square feet). Do not try to extrapolate beyond the range of observed X’s.]
Measures of Variation

• Total variation is made up of two parts:

SST = SSR + SSE

Total Sum of Squares:       SST = Σ(Yi − Ȳ)²
Regression Sum of Squares:  SSR = Σ(Ŷi − Ȳ)²
Error Sum of Squares:       SSE = Σ(Yi − Ŷi)²

where:
Ȳ = mean value of the dependent variable
Yi = observed value of the dependent variable
Ŷi = predicted value of Y for the given Xi value
Measures of Variation (continued)

• SST = total sum of squares (Total Variation)
  • Measures the variation of the Yi values around their mean Ȳ
• SSR = regression sum of squares (Explained Variation)
  • Variation attributable to the relationship between X and Y
• SSE = error sum of squares (Unexplained Variation)
  • Variation in Y attributable to factors other than X
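These three quantities can be computed directly for the house-price data; a sketch that refits the line and verifies SST = SSR + SSE (the values match the Excel output):

```python
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(sqft)
x_bar, y_bar = sum(sqft) / n, sum(price) / n

# Least-squares fit (same line as in the Excel output)
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(sqft, price)) \
     / sum((x - x_bar) ** 2 for x in sqft)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * x for x in sqft]

sst = sum((y - y_bar) ** 2 for y in price)                # total variation
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)              # explained
sse = sum((y - yh) ** 2 for y, yh in zip(price, y_hat))   # unexplained
# SST = SSR + SSE: 32600.5 = 18934.93 + 13665.57
```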
Measures of Variation (continued)

[Figure: for a single observation (Xi, Yi), the vertical distances illustrate the decomposition: SSE = Σ(Yi − Ŷi)², SSR = Σ(Ŷi − Ȳ)², and SST = Σ(Yi − Ȳ)².]
Coefficient of Determination, r2
• The coefficient of determination is the portion of
the total variation in the dependent variable that is
explained by variation in the independent variable
• The coefficient of determination is also called r-
squared and is denoted as r2

r² = SSR / SST = regression sum of squares / total sum of squares

note: 0 ≤ r² ≤ 1
R²

• R² (R-squared) is a measure of the “fit” of the line to the data.
• The value of R² will be between 0 and 1.
• A value of 1.0 indicates a perfect fit and all data points would lie on the line; the larger the value of R², the better the fit.
Examples of r² Values

r² = 1

[Figure: two scatter plots, one rising and one falling, in which every point lies exactly on the line]

Perfect linear relationship between X and Y: 100% of the variation in Y is explained by variation in X
Examples of r² Values

0 < r² < 1

[Figure: two scatter plots in which the points cluster loosely around a sloped line]

Weaker linear relationships between X and Y: some but not all of the variation in Y is explained by variation in X
Examples of r² Values

r² = 0

[Figure: a scatter plot with a horizontal fitted line and no pattern in Y as X varies]

No linear relationship between X and Y: the value of Y does not depend on X (none of the variation in Y is explained by variation in X)
Simple Linear Regression Example: Coefficient of Determination, r² in Excel

From the Excel output shown earlier (R Square = 0.58082; ANOVA: SSR = 18934.9348, SSE = 13665.5652, SST = 32600.5000):

r² = SSR/SST = 18934.9348/32600.5000 = 0.58082

58.08% of the variation in house prices is explained by variation in square feet.
