Day-5
Simple Regression
Page: 430-445
Recap
• Introduction
• Definition
• Terms and terminologies
• Types of statistics
• Types of data
• Levels of measurements
• Application of statistics in business
• Sources of data
Organizing and visualizing variables
• Tables
• Frequency distribution
• Relative frequency distribution
• Relative percent frequency distribution
• Cumulative frequency distribution
• Univariate
• Bivariate / cross tabulation
• Diagrams
• Bar charts
• Pie charts
• Graphs
• Histogram
• Frequency polygon
• Frequency curve
• Cumulative frequency curve (Ogive)
• EDA
• Stem and leaf plot
• Scatter diagram
• Dot plots
• Pareto chart
Numerical descriptive statistics
Measures of location
Measures of dispersion
Measures of shapes
Kurtosis
Relative location
- Z score
- Chebyshev's inequality
- Empirical rule
Exploratory data analysis
- Five number summary
- Box plot
Relationship between two variables
- Covariance
- Correlation
Simple linear regression
Chapter 12
Learning Objectives
• How to use regression analysis to predict the
value of a dependent variable based on an
independent variable
• The meaning of the regression coefficients b0
and b1
• Measures of variation (SSE, SSR, SST)
• Coefficient of determination
Correlation vs. Regression
• A scatter plot can be used to show the relationship between
two variables
• Correlation analysis is used to measure the strength of the
association (linear relationship) between two variables
• Correlation is only concerned with strength of the relationship
• No causal effect is implied with correlation
Introduction to
Regression Analysis
• Regression analysis is used to:
• Predict the value of a dependent variable based on the value of at least
one independent variable
• Explain the impact of changes in an independent variable on the
dependent variable
Dependent variable: the variable we wish to predict or explain
Independent variable: the variable used to predict or explain the
dependent variable
Regression Analysis
• Regression analysis is a tool for building
mathematical and statistical models that characterize
relationships between a dependent (ratio) variable
and one or more independent, or explanatory
variables (ratio or categorical), all of which are
numerical.
• Simple linear regression involves a single
independent variable.
• Multiple regression involves two or more
independent variables.
Simple Linear Regression Model
[Scatter plot: Sales (0–100) vs. Advertising (0–50)]
Larger (smaller) values of sales tend to be associated with larger (smaller) values of advertising.
The scatter of points tends to be distributed around a positively sloped straight line.
The pairs of values of advertising expenditures and sales are not located exactly on a
straight line.
The scatter plot reveals a more or less strong tendency rather than a precise linear
relationship.
The line represents the nature of the relationship on average.
Types of Relationships
Linear relationships vs. curvilinear relationships
[Scatter plots contrasting linear patterns (positive and negative slope) with curvilinear patterns]
Types of Relationships
(continued)
Strong relationships vs. weak relationships
[Scatter plots contrasting strong (tightly clustered) and weak (widely scattered) linear patterns]
Types of Relationships
(continued)
No relationship
[Scatter plots with no apparent pattern between Y and X]
Simple Linear Regression Model
Yi = β0 + β1Xi + εi
where:
Yi = dependent variable
Xi = independent variable
β0 = population Y intercept
β1 = population slope coefficient
εi = random error term
β0 + β1Xi is the linear component; εi is the random error component.
Simple Linear Regression Model
(continued)
Yi = β0 + β1Xi + εi
[Graph: the line with intercept β0 and slope β1; the observed value of Y for Xi lies a vertical distance εi from the line]
Simple Linear Regression
Equation (Prediction Line)
The simple linear regression equation provides an estimate of the
population regression line
Ŷi = b0 + b1Xi
where:
Ŷi = estimated (or predicted) Y value for observation i
b0 = estimate of the regression intercept
b1 = estimate of the regression slope
Xi = value of X for observation i
The Least Squares Method
b0 and b1 are obtained by finding the values that minimize the sum of the squared differences between Y and Ŷ:
y = b0 + b1x +e
where:
b0 and b1 are called parameters of the model,
e is a random variable called the error term.
Simple Linear Regression Equation
Positive Linear Relationship
[Graph: regression line E(y) = b0 + b1x with intercept b0 and positive slope b1]
Simple Linear Regression Equation
Negative Linear Relationship
[Graph: regression line E(y) = b0 + b1x with intercept b0 and negative slope b1]
Simple Linear Regression Equation
No Relationship
[Graph: horizontal regression line E(y) = b0; slope b1 = 0]
Estimated Simple Linear Regression Equation
Estimated Regression Equation: ŷ = b0 + b1x
The sample statistics b0 and b1 provide estimates of the population parameters β0 and β1.
Least Squares Method
• Least Squares Criterion
min Σ(yi − ŷi)²
where:
yi = observed value of the dependent variable for the ith observation
ŷi = estimated value of the dependent variable for the ith observation
Least Squares Method
• Slope for the Estimated Regression Equation
b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²
where:
xi = value of independent variable for the ith observation
yi = value of dependent variable for the ith observation
x̄ = mean value for the independent variable
ȳ = mean value for the dependent variable
Least Squares Method
• y-Intercept for the Estimated Regression Equation: b0 = ȳ − b1x̄
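The two least squares formulas above can be sketched as a small helper function. This is an illustrative Python sketch, not something from the text; the function name is arbitrary.

```python
# Illustrative sketch of the least squares formulas above.
def least_squares(x, y):
    """Return (b0, b1) minimizing the sum of squared residuals."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # b1 = sum of (xi - x_bar)(yi - y_bar) over sum of (xi - x_bar)^2
    b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
          / sum((xi - x_bar) ** 2 for xi in x))
    # b0 = y_bar - b1 * x_bar
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Points lying exactly on y = 1 + 2x recover b0 = 1, b1 = 2.
print(least_squares([0, 1, 2, 3], [1, 3, 5, 7]))  # (1.0, 2.0)
```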
Simple Linear Regression
Example: Reed Auto Sales
Number of Number of
TV Ads (x) Cars Sold (y)
1 14
3 24
2 18
1 17
3 27
Σx = 10   Σy = 100
x̄ = 2    ȳ = 20
Estimated Regression Equation
Slope for the Estimated Regression Equation
b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² = 20/4 = 5
y-Intercept: b0 = ȳ − b1x̄ = 20 − 5(2) = 10
Estimated regression equation: ŷ = 10 + 5x
[Scatter plot: Cars Sold vs. TV Ads with the fitted line y = 5x + 10]
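The Reed Auto arithmetic can be reproduced with a short script. This is an illustrative sketch, not part of the original example; the variable names are arbitrary.

```python
# Illustrative sketch reproducing the Reed Auto least squares fit.
x = [1, 3, 2, 1, 3]       # TV ads
y = [14, 24, 18, 17, 27]  # cars sold

x_bar = sum(x) / len(x)   # 2.0
y_bar = sum(y) / len(y)   # 20.0

# b1 = 20 / 4 = 5, exactly as computed on the slide
b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
      / sum((xi - x_bar) ** 2 for xi in x))
b0 = y_bar - b1 * x_bar   # 20 - 5(2) = 10

print(f"y-hat = {b0} + {b1}x")  # y-hat = 10.0 + 5.0x
```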
Coefficient of Determination
• Relationship Among SST, SSR, SSE
SST = SSR + SSE
Σ(yi − ȳ)² = Σ(ŷi − ȳ)² + Σ(yi − ŷi)²
where:
SST = total sum of squares
SSR = sum of squares due to regression
SSE = sum of squares due to error
Coefficient of Determination
r2 = SSR/SST
where:
SSR = sum of squares due to regression
SST = total sum of squares
• Goodness of fit
• Perfect fit: SSR = SST, i.e. SSR/SST = 1
• A poorer fit results in larger values of SSE (the worst case occurs when SSR = 0 and SSE = SST)
Coefficient of Determination
For the Reed Auto example: r² = 0.8772

Sample Correlation Coefficient
rxy = (sign of b1) √r²
where:
b1 = the slope of the estimated regression equation ŷ = b0 + b1x
rxy = +√0.8772 = +0.9366
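These quantities can be checked numerically for the Reed Auto data. The script below is an illustrative sketch that uses the fitted line ŷ = 10 + 5x from earlier; it is not part of the original slides.

```python
import math

# Sums of squares and correlation for the Reed Auto data (illustrative sketch).
x = [1, 3, 2, 1, 3]
y = [14, 24, 18, 17, 27]
b0, b1 = 10, 5                       # fitted earlier: y-hat = 10 + 5x
y_bar = sum(y) / len(y)              # 20.0
y_hat = [b0 + b1 * xi for xi in x]   # [15, 25, 20, 15, 25]

sst = sum((yi - y_bar) ** 2 for yi in y)               # total sum of squares
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)           # regression sum of squares
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # error sum of squares
assert sst == ssr + sse              # SST = SSR + SSE

r2 = ssr / sst                           # coefficient of determination
r_xy = math.copysign(math.sqrt(r2), b1)  # sample correlation, sign of b1
print(round(r2, 4), round(r_xy, 4))  # 0.8772 0.9366
```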
Simple Linear Regression Example
ANOVA
df SS MS F Significance F
Regression 1 18934.9348 18934.9348 11.0848 0.01039
Residual 8 13665.5652 1708.1957
Total 9 32600.5000
Analysis of Variance
Source DF SS MS F P
Regression 1 18935 18935 11.08 0.010
Residual Error 8 13666 1708
Total 9 32600
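The entries in the ANOVA output above are linked by simple arithmetic (MS = SS/df, F = MSR/MSE, r² = SSR/SST). A quick check, with the SS and df values copied from the table; this sketch is illustrative, not from the slides:

```python
# Check the ANOVA table arithmetic (SS and df values copied from the output above).
ssr, sse = 18934.9348, 13665.5652
df_reg, df_res = 1, 8

sst = ssr + sse        # 32600.5000, the Total line
msr = ssr / df_reg     # 18934.9348
mse = sse / df_res     # about 1708.1957, as in the table
f = msr / mse          # about 11.0848, as in the table
r2 = ssr / sst         # about 0.581: roughly 58% of variation explained
```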
Simple Linear Regression Example: Graphical
Representation
[Graph: house price ($1000s) vs. square feet with the fitted regression line]
ŷ = 98.25 + 0.1098(2000) = 317.85
The predicted price for a house with 2000 square feet is 317.85 ($1,000s) = $317,850
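The prediction above can be sketched in code. The coefficients 98.25 and 0.1098 are taken from the fitted model shown; the function name is just illustrative.

```python
# Point prediction from the fitted model above:
# price ($1000s) = 98.25 + 0.1098 * square feet.
def predict_price(square_feet):
    return 98.25 + 0.1098 * square_feet

print(round(predict_price(2000), 2))  # 317.85, i.e. $317,850
```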
Simple Linear Regression Example:
Making Predictions
• When using a regression model for prediction, only
predict within the relevant range of data
Relevant range for interpolation
[Scatter plot: House Price ($1000s) vs. Square Feet with the fitted line. The relevant range for interpolation spans the observed X values; do not try to extrapolate beyond the range of observed X's.]
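The interpolation-only rule above can be expressed as a small guard. The observed_sqft values below are invented for illustration; only the coefficients 98.25 and 0.1098 come from the example.

```python
# Sketch of the relevant-range rule: predict only inside the observed X range.
# The observed_sqft list is made up for illustration; the coefficients come
# from the fitted model above (price in $1000s = 98.25 + 0.1098 * sq ft).
def predict_within_range(sqft, observed_sqft, b0=98.25, b1=0.1098):
    lo, hi = min(observed_sqft), max(observed_sqft)
    if not lo <= sqft <= hi:
        raise ValueError(f"{sqft} sq ft is outside the observed range "
                         f"[{lo}, {hi}]; refusing to extrapolate")
    return b0 + b1 * sqft

observed_sqft = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
print(round(predict_within_range(2000, observed_sqft), 2))  # 317.85
```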
Measures of Variation
SST = Σ(Yi − Ȳ)²   (total sum of squares)
SSR = Σ(Ŷi − Ȳ)²   (regression sum of squares)
SSE = Σ(Yi − Ŷi)²   (error sum of squares)
[Graph: at X = Xi, the total deviation Yi − Ȳ splits into an explained part Ŷi − Ȳ and an unexplained part Yi − Ŷi]
Coefficient of Determination, r2
• The coefficient of determination is the portion of
the total variation in the dependent variable that is
explained by variation in the independent variable
• The coefficient of determination is also called r-
squared and is denoted as r2
note: 0 ≤ r² ≤ 1

Examples of r² Values
r² = 1
[Scatter plots: perfect linear relationship; all points lie exactly on the regression line]
Examples of r2 Values
0 < r² < 1
[Scatter plot: weaker linear relationship; some but not all of the variation in Y is explained by X]
Examples of r2 Values
r² = 0
[Scatter plot: no linear relationship between Y and X]