LBP_{P,R}(x) = Σ_{p=0}^{P-1} H(I(x_p) - I(x)) 2^p    (3.1)

where H is the Heaviside step function, I is the image, x is the
centre pixel, and x_p are the P local samples taken at a radius R
around x, with sample coordinates

x_p = x + R [sin(2πp/P), cos(2πp/P)]^T.
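The LBP computation described above can be sketched in Python. This is a minimal illustration, not the paper's implementation: it uses nearest-neighbour sampling on the circle (bilinear interpolation is common in practice), and the function name and test image are invented for the example.

```python
import numpy as np

def lbp(image, x, y, P=8, R=1.0):
    """Basic LBP code for the pixel at (row y, column x).

    Each of the P samples on a circle of radius R is compared to the
    centre pixel via the Heaviside step, and the resulting bits are
    weighted by powers of two.
    """
    centre = image[y, x]
    code = 0
    for p in range(P):
        angle = 2.0 * np.pi * p / P
        # Sample coordinates on the circle around the centre pixel.
        sx = int(round(x + R * np.sin(angle)))
        sy = int(round(y + R * np.cos(angle)))
        if image[sy, sx] >= centre:  # Heaviside step H(I(x_p) - I(x))
            code += 2 ** p
    return code

img = np.array([[1, 2, 1],
                [0, 5, 0],
                [1, 2, 1]], dtype=float)
print(lbp(img, 1, 1))  # centre is the local maximum, so all bits are 0
```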
c) Feature Histogram
In statistics, a histogram is a graphical representation,
showing a visual impression of the distribution of
experimental data. It is an estimate of the probability
distribution of a continuous variable and was first introduced
by Karl Pearson. A histogram consists of tabular frequencies,
shown as adjacent rectangles, erected over discrete intervals
(bins), with an area equal to the frequency of the observations
in the interval. The height of a rectangle is also equal to the
frequency density of the interval, i.e., the frequency divided
by the width of the interval. The total area of the histogram is
equal to the number of data points. A histogram may also be
normalized to display relative frequencies; it then shows the
proportion of cases that fall into each of several categories,
with the total area equalling 1. In an image processing context,
the histogram of an image normally refers to a histogram of
the pixel intensity values. This histogram is a graph showing
the number of pixels in an image at each different intensity
value found in that image. Histograms are used to plot density
of data, and often for density estimation: estimating the
probability density function of the underlying variable. The
total area of a histogram used for probability density is always
normalized to 1. The histogram plots the number of pixels in
the image (vertical axis) with a particular brightness value
(horizontal axis).
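As a small illustration of the above, NumPy's `histogram` can count pixel intensities and normalise the result so the total area equals 1; the image values here are arbitrary.

```python
import numpy as np

# Intensity histogram of a small "image": count how many pixels fall
# into each intensity bin, then normalise to relative frequencies.
image = np.array([[0, 0, 1],
                  [1, 1, 2],
                  [2, 3, 3]])

counts, bin_edges = np.histogram(image, bins=4, range=(0, 4))
print(counts)            # pixels per intensity value: [2 3 2 2]
total = counts.sum()     # equals the number of pixels (9)

# Normalised (relative-frequency) histogram: proportions summing to 1.
relative = counts / total
print(relative.sum())    # 1.0
```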
d) Quantification
The identification and precise quantification of
pulmonary emphysema in life are an attractive proposition,
but are made virtually impossible by the pathological basis of
its definition. Pulmonary emphysema is a progressive lung
disease mostly affecting elderly people. Because emphysema
is defined as "abnormal permanent enlargement of the air
spaces distal to the terminal bronchioles, accompanied by
destruction of their walls and without obvious fibrosis,"
characterization of the disease with morphological and/or
morphometric parameters is a prerequisite for studying its
pathogenesis. Although lung function tests are useful tools to
diagnose chronic obstructive pulmonary disease, it is difficult
to distinguish emphysema from other obstructive lung
diseases.
Prior to classification of the lung field, the lung
parenchyma pixels are segmented in the HRCT slice using a
combination of thresholding techniques. Manual editing was needed
afterward in one third of the cases and required simple
outlining of a few of the larger airways. Each segmented lung
parenchyma pixel is classified by classifying the ROI centred
on the pixel.
It should be noted that pixels that are not part of the
lung segmentation are not classified, but they can still
contribute to the classification. For example, part of the
exterior of the lung is in the local neighbourhood when
classifying a pixel at the border of the lung. In this way, all
potentially relevant structural information is incorporated,
such as proximity to the border of the lung or to the large
vessels and airways. The pixel probabilities are fused to obtain
one measure for the complete lung field that can be used for
emphysema quantification. There are several ways of doing
this, e.g., averaging, voting, or the maximum rule. The
considered quantitative measure for emphysema is the mean class
posterior over the segmented lung field S, given by

P(ω_i | S) = (1/|S|) Σ_{x_j ∈ S} P(ω_i | x_j)
International Journal of Computer Trends and Technology (IJCTT), volume 4, Issue 6, June 2013
ISSN: 2231-2803 http://www.ijcttjournal.org Page 1922
where |S| is the number of pixels in the segmentation, P(ω_i | x_j)
is the probability of the class ω_i, and x_j is the centre pixel.
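A minimal sketch of the fusion step, with hypothetical per-pixel posteriors (the variable names and values are invented for illustration): averaging the posteriors gives one measure for the whole lung field, alongside the voting and maximum rules mentioned above.

```python
import numpy as np

# Hypothetical per-pixel posteriors P(emphysema | x_j) for the pixels
# in the lung segmentation S, e.g. produced by a k-NN classifier.
posteriors = np.array([0.9, 0.1, 0.4, 0.8, 0.3])

# Mean class posterior: (1/|S|) * sum over x_j in S of P(omega_i | x_j).
mean_posterior = posteriors.mean()

# Alternative fusion rules mentioned in the text:
voting = (posteriors > 0.5).mean()   # fraction of pixels voting for the class
maximum = posteriors.max()           # maximum rule

print(mean_posterior, voting, maximum)
```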
e) Classifier (KNN)
In pattern recognition, the k-nearest neighbours algorithm (k-
NN) is a method for classifying objects based on closest
training examples in the feature space. k-NN is a type of
instance-based learning, or lazy learning where the function is
only approximated locally and all computation is deferred
until classification. The k-nearest neighbour algorithm is
amongst the simplest of all machine learning algorithms: an
object is classified by a majority vote of its neighbours, with
the object being assigned to the class most common amongst
its k nearest neighbours (k is a positive integer, typically
small). If k = 1, then the object is simply assigned to the class
of its nearest neighbour.
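The majority-vote rule described above can be sketched as follows; the training pairs, labels, and the helper name `knn_classify` are invented for illustration.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify `query` by a majority vote among its k nearest
    training examples, using Euclidean distance in feature space."""
    # train: list of (feature_vector, label) pairs.
    by_dist = sorted(train, key=lambda fl: math.dist(fl[0], query))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

train = [((0.0, 0.0), "normal"), ((0.1, 0.2), "normal"),
         ((1.0, 1.0), "emphysema"), ((0.9, 1.1), "emphysema")]
print(knn_classify(train, (0.95, 1.0), k=3))  # "emphysema"
```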
The same method can be used for regression, by
simply assigning the property value for the object to be the
average of the values of its k nearest neighbours. It can be
useful to weight the contributions of the neighbours, so that
the nearer neighbours contribute more to the average than the
more distant ones. (A common weighting scheme is to give
each neighbour a weight of 1/d, where d is the distance to the
neighbour. This scheme is a generalization of linear
interpolation.)
The neighbours are taken from a set of objects for
which the correct classification (or, in the case of regression,
the value of the property) is known. This can be thought of as
the training set for the algorithm, though no explicit training
step is required. The k-nearest neighbour algorithm is
sensitive to the local structure of the data.
Nearest neighbour rules in effect compute the
decision boundary in an implicit manner. It is also possible to
compute the decision boundary itself explicitly, and to do so
in an efficient manner so that the computational complexity is
a function of the boundary complexity. The k-NN algorithm
can also be adapted for use in estimating continuous variables.
One such implementation uses an inverse distance weighted
average of the k-nearest multivariate neighbours.
This algorithm functions as follows:
1. Compute the Euclidean distance from the target sample to
each of the stored training samples.
2. Order the samples by increasing distance.
3. Choose a heuristically optimal number k of nearest
neighbours, based on the RMSE obtained by a cross-validation technique.
4. Calculate an inverse distance weighted average over
the k nearest multivariate neighbours.
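Steps 1, 2, and 4 above can be sketched as follows (step 3, choosing k by cross-validated RMSE, is omitted for brevity; the function name and sample data are invented for illustration).

```python
import math

def knn_idw_estimate(train, query, k=3):
    """Compute Euclidean distances, sort, take the k nearest samples,
    and return their inverse-distance weighted average (1/d weights)."""
    # train: list of (feature_vector, value) pairs with known values.
    dists = sorted((math.dist(x, query), v) for x, v in train)
    nearest = dists[:k]
    # If the query coincides with a sample, return that value directly.
    for d, v in nearest:
        if d == 0:
            return v
    weights = [1.0 / d for d, _ in nearest]
    return sum(w * v for w, (_, v) in zip(weights, nearest)) / sum(weights)

train = [((0.0,), 1.0), ((1.0,), 3.0), ((2.0,), 5.0), ((4.0,), 9.0)]
print(knn_idw_estimate(train, (1.0,), k=3))  # exact hit: 3.0
```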
KNN is a versatile algorithm and is used in a huge
number of fields.
Texture measures such as LBP and Gaussian measures can be used for
quantitative analysis of pulmonary emphysema in CT images
of the lung. The ROI classification using LBP showed good
classification performance compared to the Gaussian measure; thus LBP
seems to perform better than the Gaussian measure in finding the
quantitative value. KNN also has a greater sensitivity to
emphysema. LBP analysis is a sensitive method for diagnosing
emphysema, assessing its severity, and determining its
subtype, since both visual and quantitative HRCT assessments
are closely correlated with the pathological extent of emphysema.
SCOPE FOR FURTHER STUDIES
LBP has shown promising results in various applications in
computer vision and can be successfully applied to other
medical image analysis tasks, e.g., mammographic mass
detection and magnetic resonance image analysis of the brain.
5. REFERENCES
1. A. H. Mir, M. Hanmandlu, and S. N. Tandon, "Texture analysis of
CT images," IEEE Eng. Med. Biol. Mag., vol. 14, no. 6, pp. 781-786,
Nov./Dec. 1995.
2. A. Bharathi and A. M. Natarajan, "Minimal feature selection using
SVM based on ANOVA," Journal of Theoretical and Applied Information
Technology, Mar. 2009.
3. Ching Ming Jimmy Wang, Mamatha Rudrapatna, and Arcot Sowmya,
"Lung disease detection using frequency spectrum analysis," Amer. J.
Respir. Crit. Care Med., Apr. 2005.
4. Daniel I. Morariu, Lucian N. Vintan, and Volker Tresp,
"Meta-classification using SVM classifiers for text documents,"
International Journal of Mathematical and Computer Sciences, Jan. 2005.
5. G. Zhao and M. Pietikäinen, "Dynamic texture recognition using
local binary patterns with an application to facial expressions," IEEE
Trans. Pattern Anal. Mach. Intell., Jun. 2007.
6. Lauge Sørensen, Saher B. Shaker, and Marleen de Bruijne,
"Quantitative analysis of pulmonary emphysema using local binary
patterns," IEEE Transactions on Medical Imaging, vol. 29, no. 2,
Feb. 2010.
7. Lin Zhang, Lei Zhang, Zhenhua Guo, and David Zhang,
"Monogenic-LBP: A new approach for rotation invariant texture
classification," Proceedings of the 2010 IEEE 17th International
Conference on Image Processing, Hong Kong, Sep. 26-29, 2010.
8. L. Sørensen, S. B. Shaker, and M. de Bruijne, "Texture
classification in lung CT using local binary patterns," D. N. Metaxas,
L. Axel, G. Fichtinger, and G. Székely, Eds. New York:
Springer-Verlag, Sep. 2008.
9. Michael Fitzpatrick and Milan Sonka, "Automatic detection system
for pulmonary emphysema on 3-D chest images," Proceedings of SPIE,
Mar. 2008.