
Prof. Dr. H. Türkoğlu
CURVE FITTING
x_i :  x_1   x_2   x_3   x_4   ...   x_n
y_i :  y_1   y_2   y_3   y_4   ...   y_n
To use the data effectively, a curve (an expression) is fitted to the given data set, as

y(x) = a_0 + a_1 x                  (linear function)
y(x) = a_0 + a_1 x + a_2 x^2        (polynomial)
y(x) = a_1 e^(b_1 x)                (exponential)
y(x) = B x^a                        (power function)
Unknown coefficients a_0, a_1, a_2, B, and b_1 are to be determined.
The objective of curve fitting is to represent a set of discrete data y = f(x), such as that given in the table above, by a function (curve).

There are two general approaches for curve fitting:
1) Least squares regression
2) Interpolation
Least squares regression: The strategy is to derive a single curve that best represents the general trend of the data set. We make no effort to intersect every data point. Rather, the curve is designed to follow the pattern of the data points taken as a group. Least squares regression is applied to data sets that may contain some error or noise; any individual data point may be incorrect.
Interpolation is used to fit a curve that passes directly through each of the data points. Such data usually originate from tables.
[Figures: linear regression, polynomial regression, linear interpolation, and curvilinear interpolation (y vs. x)]
LEAST SQUARES METHOD
A curve is fitted to a given data set such that the sum of the squares of the differences between the given data and the values obtained from the fitted curve is a minimum.
1. Linear Regression
Data is fitted to a linear (first order) function.
y(x) = a_0 + a_1 x,      a_0 = ?,  a_1 = ?
The sum of the squares of the differences can be obtained as

R = Σ_{i=1}^{n} [a_0 + a_1 x_i - y_i]^2

a_0 and a_1 are obtained in such a way that R is a minimum. This is done as follows:
∂R/∂a_0 = 0  →  2 Σ_{i=1}^{n} (a_0 + a_1 x_i - y_i) = 0

∂R/∂a_1 = 0  →  2 Σ_{i=1}^{n} (a_0 + a_1 x_i - y_i) x_i = 0
Arranging these equations, we obtain,
[ n       Σx_i    ] [a_0]   [ Σy_i     ]
[ Σx_i    Σx_i^2  ] [a_1] = [ Σx_i y_i ]
From this set of equations, the unknown coefficients a_0 and a_1 are solved.
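The following is a minimal Python sketch of this procedure, assuming NumPy is available; the function name linear_regression and the sample data are illustrative only.

import numpy as np

def linear_regression(x, y):
    """Fit y ~ a0 + a1*x by least squares, using the 2x2 normal equations above."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    A = np.array([[n,       x.sum()],
                  [x.sum(), (x**2).sum()]])      # coefficient matrix of the normal equations
    b = np.array([y.sum(), (x * y).sum()])       # right-hand side
    a0, a1 = np.linalg.solve(A, b)
    return a0, a1

# Illustrative usage with made-up data
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
a0, a1 = linear_regression(x, y)
print(f"y(x) = {a0:.3f} + {a1:.3f} x")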
2. Second Order Polynomial Regression
Given data is fitted to a second order polynomial as,
y(x) = a_0 + a_1 x + a_2 x^2,      a_0 = ?,  a_1 = ?,  a_2 = ?
The sum of the squares of the differences can be obtained as

R = Σ_{i=1}^{n} [a_0 + a_1 x_i + a_2 x_i^2 - y_i]^2

Differentiating R with respect to a_0, a_1, and a_2 and equating to zero, we obtain the following set of algebraic equations for the unknown coefficients.
[ n        Σx_i      Σx_i^2 ] [a_0]   [ Σy_i       ]
[ Σx_i     Σx_i^2    Σx_i^3 ] [a_1] = [ Σx_i y_i   ]
[ Σx_i^2   Σx_i^3    Σx_i^4 ] [a_2]   [ Σx_i^2 y_i ]
Solution of these equations gives the unknown coefficients a_0, a_1, and a_2.
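A similar sketch for the quadratic case (again assuming NumPy; quadratic_regression is an illustrative name): it assembles the 3x3 normal equations above and solves them.

import numpy as np

def quadratic_regression(x, y):
    """Fit y ~ a0 + a1*x + a2*x^2 by solving the 3x3 normal equations."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    s = [(x**k).sum() for k in range(5)]          # sums of x^0 .. x^4  (s[0] = n)
    A = np.array([[s[0], s[1], s[2]],
                  [s[1], s[2], s[3]],
                  [s[2], s[3], s[4]]])
    b = np.array([y.sum(), (x * y).sum(), (x**2 * y).sum()])
    return np.linalg.solve(A, b)                  # [a0, a1, a2]

For comparison, np.polyfit(x, y, 2) solves the same least-squares problem and returns the coefficients in the opposite order (a2, a1, a0).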
MULTIPLE LINEAR REGRESSION
Linear regression method can be used to obtain a linear function of two or more variables. For example, y might be a linear function of x_1 and x_2, as in

y(x_1, x_2) = a_0 + a_1 x_1 + a_2 x_2
For this two-dimensional case, the regression line becomes a plane. The coefficients are again determined by setting up the sum of the squares of the residuals,
R = Σ_{i=1}^{n} [a_0 + a_1 x_{1,i} + a_2 x_{2,i} - y_i]^2
Differentiating with respect to each of the unknown coefficients and setting to zero, we get a set of linear equations for the unknown coefficients a_0, a_1, and a_2. Arranging these equations as a matrix equation, we obtain,
[ n          Σx_{1,i}           Σx_{2,i}          ] [a_0]   [ Σy_i         ]
[ Σx_{1,i}   Σx_{1,i}^2         Σx_{1,i} x_{2,i}  ] [a_1] = [ Σx_{1,i} y_i ]
[ Σx_{2,i}   Σx_{1,i} x_{2,i}   Σx_{2,i}^2        ] [a_2]   [ Σx_{2,i} y_i ]
Solution of this set of linear equations gives the unknown coefficients a_0, a_1, and a_2 of the above linear function.
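A possible Python sketch of this two-variable case, under the same assumptions (NumPy available, names illustrative):

import numpy as np

def multiple_linear_regression(x1, x2, y):
    """Fit y ~ a0 + a1*x1 + a2*x2 by solving the 3x3 normal equations."""
    x1, x2, y = (np.asarray(v, dtype=float) for v in (x1, x2, y))
    n = len(y)
    A = np.array([[n,         x1.sum(),        x2.sum()],
                  [x1.sum(),  (x1**2).sum(),   (x1 * x2).sum()],
                  [x2.sum(),  (x1 * x2).sum(), (x2**2).sum()]])
    b = np.array([y.sum(), (x1 * y).sum(), (x2 * y).sum()])
    return np.linalg.solve(A, b)                  # [a0, a1, a2]

The same result can also be obtained with np.linalg.lstsq applied to the design matrix whose columns are 1, x1, and x2.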
TRANSFORMATION FOR DATA LINEARIZATION
Linear regression provides a powerful technique for fitting the best line to a data set. This technique is used if the relation between the dependent and independent variables is linear. However, this is not always the case. For some cases, polynomial regression is appropriate. For others, a transformation can be used to express the data in a form that is compatible with linear regression.

[Figure: the curve y = a_1 e^(b_1 x) plotted as y vs. x; after linearization, ln y vs. x is a straight line]
To linearize this expression, we take the logarithm of both sides as follows:

y = a_1 e^(b_1 x)

ln y = ln a_1 + b_1 x

This can be written as
Y = c_1 + c_2 t

where Y = ln y, c_1 = ln a_1, c_2 = b_1, and t = x.

This is a linear equation. Hence, the above linear regression method can be used to find the unknown coefficients c_1 and c_2.
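A brief sketch of this transformation in Python, reusing the illustrative linear_regression() helper from the Linear Regression section (NumPy assumed; all y values must be positive for the logarithm):

import numpy as np

def fit_exponential(x, y):
    """Fit y ~ a1 * exp(b1 * x) by linear regression on the transformed data Y = ln y."""
    x = np.asarray(x, dtype=float)
    Y = np.log(np.asarray(y, dtype=float))   # Y = ln y = c1 + c2*x
    c1, c2 = linear_regression(x, Y)         # helper sketched in the Linear Regression section
    return np.exp(c1), c2                    # a1 = exp(c1), b1 = c2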
EXAMPLE: Temperature T of a small copper sphere cooling in the air is
measured as a function of time t to yield the following data set:
t (s) 0.2 0.6 1.0 1.8 2.0 3.0 5.0 6.0 8.0
T (°C) 146.0 129.55 114.8 90.3 85.1 63.0 34.6 25.6 14.1
Fit this data to a curve of the form

T = A e^(-a t)
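One possible way to perform this fit in Python (a sketch assuming NumPy; np.polyfit of degree 1 does the linear regression on the transformed data ln T):

import numpy as np

t = np.array([0.2, 0.6, 1.0, 1.8, 2.0, 3.0, 5.0, 6.0, 8.0])               # s
T = np.array([146.0, 129.55, 114.8, 90.3, 85.1, 63.0, 34.6, 25.6, 14.1])  # deg C

# T = A*exp(-a*t)  =>  ln T = ln A - a*t, a straight line in t
slope, intercept = np.polyfit(t, np.log(T), 1)   # np.polyfit returns the slope first
A = np.exp(intercept)
a = -slope
print(f"T = {A:.1f} * exp(-{a:.3f} t)")          # roughly A = 155, a = 0.30 for this data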
Example: Tomato paste was tested in a viscometer and the following data were obtained. Determine whether the fluid is Newtonian and find its descriptive equation.

τ (N/m^2)      51     71.6   90.8   124.0   162.0
dV/dy (rad/s)  0.95   4.7    12.3   40.6    93.5
Solution: A plot of these data appears in the figure. The figure implies that the fluid is non-Newtonian (pseudoplastic), so we assume the equation should be of the form
τ = K (dV/dy)^n
Using data linearization, the unknowns K and n can be determined with the least squares method.
Taking the natural logarithm of both sides, we get

ln τ = ln K + n ln(dV/dy)
We can write this in simpler notation as

T = b_0 + b_1 V

where T = ln τ, b_0 = ln K, b_1 = n, and V = ln(dV/dy).
The unknown constants b_0 and b_1 can be determined using the method of least squares from the transformed data given in the table below:

V = ln(dV/dy)   -0.051   1.55   2.51   3.70   4.54     Σ = 12.2
T = ln τ         3.93    4.27   4.51   4.82   5.09     Σ = 22.62
[ n      ΣV_i    ] [b_0]   [ ΣT_i     ]
[ ΣV_i   ΣV_i^2  ] [b_1] = [ ΣV_i T_i ]

Substituting the values,

[ 5      12.2 ] [b_0]   [ 22.62 ]
[ 12.2   43.0 ] [b_1] = [ 58.6  ]
Solving for b_0 and b_1, we get

b_0 = 3.90
b_1 = 0.257
From these,

b_0 = ln K  →  K = exp(b_0) = exp(3.90) = 49.3
n = b_1 = 0.257
Final equation for the tomato paste
257 . 0
3 . 49
|
|
.
|

\
|
=
dy
dV
t
Based on the shear stress versus shear rate curve, we conclude that tomato paste is a pseudoplastic non-Newtonian fluid.
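A short sketch (NumPy assumed) that reproduces this worked example by regressing ln τ on ln(dV/dy):

import numpy as np

tau  = np.array([51.0, 71.6, 90.8, 124.0, 162.0])   # shear stress, N/m^2
dvdy = np.array([0.95, 4.7, 12.3, 40.6, 93.5])      # dV/dy

# ln(tau) = ln K + n*ln(dV/dy): a straight line in the transformed variables
b1, b0 = np.polyfit(np.log(dvdy), np.log(tau), 1)   # slope b1 = n, intercept b0 = ln K
K, n = np.exp(b0), b1
print(f"tau = {K:.1f} * (dV/dy)^{n:.3f}")           # close to K = 49.3, n = 0.257 from above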
EXAMPLE: Flow rate in a pipe is measured for different pressure drops and pipe diameters. Results of the measurements are given in the table below. Fit a curve to the given data of the form

Q = B D^a ΔP^b
Measured flow rate Q for each diameter and pressure drop:

D (m) \ ΔP (atm)    0.3     0.5     1.0     1.4
0.5                 0.13    0.43    2.1     4.55
0.9                 0.25    0.81    4.0     8.69
1.2                 0.34    1.12    5.5     11.92
1.8                 0.54    1.74    8.59    18.63
Solution: Linearizing the data, the multiple linear least squares regression method can be used to determine the unknowns B, a, and b.
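One possible sketch of this solution in Python (NumPy assumed; the flattening of the table into (D, ΔP, Q) triples is my own bookkeeping): taking logarithms gives ln Q = ln B + a ln D + b ln ΔP, which is a multiple linear regression.

import numpy as np

D_vals  = np.array([0.5, 0.9, 1.2, 1.8])      # pipe diameters, m (table rows)
dP_vals = np.array([0.3, 0.5, 1.0, 1.4])      # pressure drops, atm (table columns)
Q = np.array([[0.13, 0.43, 2.1,  4.55],
              [0.25, 0.81, 4.0,  8.69],
              [0.34, 1.12, 5.5,  11.92],
              [0.54, 1.74, 8.59, 18.63]])

# Flatten the grid into (D, dP, Q) triples and linearize: ln Q = ln B + a*ln D + b*ln dP
D, dP = np.meshgrid(D_vals, dP_vals, indexing="ij")
X = np.column_stack([np.ones(Q.size), np.log(D).ravel(), np.log(dP).ravel()])
coef, *_ = np.linalg.lstsq(X, np.log(Q).ravel(), rcond=None)
B, a, b = np.exp(coef[0]), coef[1], coef[2]
print(f"Q = {B:.2f} * D^{a:.2f} * dP^{b:.2f}")   # roughly B = 4.5, a = 1.1, b = 2.3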
LAGRANGE INTERPOLATION
The Lagrange interpolation can be formulated as
f_n(x) = Σ_{i=0}^{n} L_i(x) f(x_i),   where   L_i(x) = Π_{j=0, j≠i}^{n} (x - x_j) / (x_i - x_j)
The interpolation curve passes through all the data points considered.

[Figures: linear interpolation and curvilinear interpolation (y vs. x)]
Subscript n indicates the order of the interpolation polynomial.

First (linear) and second order interpolation polynomials can be written as
f_1(x) = [(x - x_1) / (x_0 - x_1)] f(x_0) + [(x - x_0) / (x_1 - x_0)] f(x_1)

f_2(x) = [(x - x_1)(x - x_2)] / [(x_0 - x_1)(x_0 - x_2)] f(x_0)
       + [(x - x_0)(x - x_2)] / [(x_1 - x_0)(x_1 - x_2)] f(x_1)
       + [(x - x_0)(x - x_1)] / [(x_2 - x_0)(x_2 - x_1)] f(x_2)
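A minimal Python sketch of the Lagrange formula above (the function name lagrange_interpolate is illustrative):

def lagrange_interpolate(x_pts, y_pts, x):
    """Evaluate the Lagrange interpolating polynomial through (x_pts, y_pts) at x."""
    n = len(x_pts)
    total = 0.0
    for i in range(n):
        L_i = 1.0
        for j in range(n):
            if j != i:
                L_i *= (x - x_pts[j]) / (x_pts[i] - x_pts[j])   # product term of L_i(x)
        total += y_pts[i] * L_i
    return total

# Illustrative usage with the data of the example below:
# lagrange_interpolate([1, 4, 6], [0.0, 1.3863, 1.7918], 2.0) is approximately 0.566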
EXAMPLE: Use a Lagrange interpolating polynomial of second order to fit a curve to the points

x_0 = 1,   f(x_0) = 0
x_1 = 4,   f(x_1) = 1.3863
x_2 = 6,   f(x_2) = 1.7918
Solution: The general interpolation polynomial can be written as

f_n(x) = Σ_{i=0}^{n} L_i(x) f(x_i),   where   L_i(x) = Π_{j=0, j≠i}^{n} (x - x_j) / (x_i - x_j)

The second order interpolation polynomial is therefore

f_2(x) = [(x - x_1)(x - x_2)] / [(x_0 - x_1)(x_0 - x_2)] f(x_0)
       + [(x - x_0)(x - x_2)] / [(x_1 - x_0)(x_1 - x_2)] f(x_1)
       + [(x - x_0)(x - x_1)] / [(x_2 - x_0)(x_2 - x_1)] f(x_2)

Substituting the values,
f_2(x) = [(x - 4)(x - 6)] / [(1 - 4)(1 - 6)] (0)
       + [(x - 1)(x - 6)] / [(4 - 1)(4 - 6)] (1.3863)
       + [(x - 1)(x - 4)] / [(6 - 1)(6 - 4)] (1.7918)

f_2(x) = -0.231 (x - 1)(x - 6) + 0.17918 (x - 1)(x - 4)
Performing the calculations, we obtain

f_2(x) = -0.0519 x^2 + 0.7215 x - 0.6696
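As a quick numerical check (a minimal sketch using NumPy), the expanded polynomial can be evaluated at the three given points to confirm that it reproduces f(x_0), f(x_1), and f(x_2):

import numpy as np

coef = [-0.0519, 0.7215, -0.6696]            # f2(x) = -0.0519 x^2 + 0.7215 x - 0.6696
for xi, fi in [(1, 0.0), (4, 1.3863), (6, 1.7918)]:
    print(xi, np.polyval(coef, xi), fi)      # each evaluation matches f(x_i) to rounding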