
CORRELATIONAL RESEARCH

This is also a type of descriptive research in which we try to study the existing relationships
between two or more variables. It should be remembered that the main aim of educational
research is not only to discover what is presently unknown but also to predict the future
relationships between various variables. Such predictions become comparatively easy to make once strong existing relationships between certain variables have been identified. In order to explore these relationships, we conduct correlational studies.
1. Purposes of Correlational Research
The major aim of correlational research is to explore the correlation between or
among variables. These correlations help us understand conditions and events in a
meaningful way and make predictions about future conditions and events. Such
studies ultimately enable us to explain, predict and, to some extent, control certain
conditions and events.
For example, according to B. F. Skinner, the great behavioral psychologist, most events could be
expressed as X = f(Y), i.e., X is a function (f) of Y, and such an expression is possible only
because the two variables are correlated. In his experiments, X refers to the behaviour of the
pigeon and Y refers to the reinforcement given to the pigeon after it performs some
particular behaviour (e.g., pecking at a tray). The pigeon learns to peck at the tray
because pecking leads to a reward (food). On the basis of his experiments, he
concluded that one thing caused another; that is, the proper administration of
reinforcement led, or caused, the bird to behave in a certain manner.
Now, on the basis of this information and knowledge, we can conduct a
correlational study in educational and/or classroom settings, predict the
behaviour of the students, and, to some extent, control their behaviour by
applying various types or schedules of reinforcement.
2. Major Topics of Correlational Research
In educational settings, correlational research is targeted toward the following
four broad categories of topics:
Researching various human traits related to learning, viz. personality,
motivation, intelligence, etc.
Researching various classroom conditions related to learning, viz. class size,
teacher behaviour, peer interaction, etc.
Researching various teaching practices, procedures, and materials related to
learning.
Researching the validation of educational tests and measurements.
3. Sources of Data for Correlational Research
Correlational research requires only a few sources of data, but these
sources must supply two measures or scores for each subject studied. For
example, if we want to explore the relationship between the level of anxiety and
student performance, we need a pair of scores, one on each of these two variables,
for every subject in the sample.


4. Types of Correlations
As just described, we require a pair of scores for each subject in order to calculate
the correlation between the two variables. The nature of these pairings (i.e., of the
data obtained) determines which of the following methods of correlation we use:
Pearson r : It is the most commonly used correlational procedure. Here we need
pairs of raw scores on two tests, one pair for each subject of the sample, e.g., the
marks obtained by each student on tests of science and mathematics.
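To illustrate, here is a minimal sketch in Python, assuming hypothetical science and mathematics marks for ten students and that scipy is available:

```python
# Pearson r on hypothetical pairs of raw scores (one pair per student).
from scipy.stats import pearsonr

science = [56, 61, 48, 72, 65, 58, 70, 49, 63, 55]   # hypothetical marks
maths   = [60, 58, 45, 75, 68, 54, 73, 50, 66, 52]   # hypothetical marks

r, p = pearsonr(science, maths)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```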
Spearman r : Sometimes we cannot obtain raw scores but we obtain the ranking of
the subjects. Then, we have to calculate Spearman's rank order correlation. For
example, to explore the relationship between self-confidence and leadership we
shall have to use this method. Here we may be unable to obtain raw scores on these
two variables, but we can rank the subjects on both.
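A minimal sketch, assuming the subjects have simply been ranked on both traits (the ranks below are hypothetical):

```python
# Spearman rank-order correlation on hypothetical ranks.
from scipy.stats import spearmanr

self_confidence_rank = [1, 3, 2, 5, 4, 7, 6, 8]   # hypothetical ranks
leadership_rank      = [2, 1, 3, 4, 6, 5, 8, 7]   # hypothetical ranks

rho, p = spearmanr(self_confidence_rank, leadership_rank)
print(f"Spearman rho = {rho:.2f}")
```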
Biserial r : It is calculated when we have scores for the subjects on one variable or
trait, but on the second variable we can only place them into a dichotomy
(dichotomous means 'cut into two parts'), i.e., into one of two categories. For
example, we may plan to explore the relationship between mental age (M.A., a
variable measurable in scores) and the number of parents in the family (a
dichotomous variable: either one parent, i.e., mother or father, or two, i.e., both
mother and father).
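scipy has no biserial routine as such, but its close relative, the point-biserial correlation (appropriate when the dichotomy is treated as a genuine two-category variable), gives the flavour of the computation; the scores below are hypothetical:

```python
# Point-biserial correlation: one continuous variable, one dichotomy (0/1).
from scipy.stats import pointbiserialr

mental_age  = [9.5, 11.0, 8.0, 12.5, 10.0, 9.0, 13.0, 10.5]  # hypothetical M.A. scores
two_parents = [1,   1,    0,   1,    0,    0,   1,    1]     # 1 = two parents, 0 = one parent

r_pb, p = pointbiserialr(two_parents, mental_age)
print(f"point-biserial r = {r_pb:.2f}")
```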
Tetrachoric r : In biserial r, we have one continuous variable (expressed in test
scores) and a second dichotomous variable (a two-fold classification). Sometimes,
however, both variables are dichotomous (giving a 2 x 2, or four-fold, table), and
then we have to compute tetrachoric r. Here neither variable is measured in scores,
but both are capable of being separated into two categories. For example, we may
wish to discover the relationship between intelligence (above average/below
average) and self-confidence (above average/below average). Here we have, as per
our research objectives, decided to study the relationship between two categories of
intelligence and two categories of self-confidence.
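There is no tetrachoric routine in scipy itself; as a rough illustration, the classical "cosine-pi" approximation can be computed by hand from the four cell counts of the 2 x 2 table (the counts below are hypothetical, and dedicated statistical packages estimate tetrachoric r more precisely):

```python
# Rough "cosine-pi" approximation to tetrachoric r from a 2 x 2 table.
# Cells:                      self-confidence above | self-confidence below
#   intelligence above                 a            |          b
#   intelligence below                 c            |          d
import math

a, b, c, d = 30, 10, 12, 28   # hypothetical cell counts

r_tet = math.cos(math.pi / (1 + math.sqrt((a * d) / (b * c))))
print(f"approximate tetrachoric r = {r_tet:.2f}")
```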
Partial Correlation : In the correlational approach, the third-variable problem often
prevents us from drawing inferences on the basis of the observed r between two
variables. According to Christensen (1994), "the third variable problem refers to the
fact that the two variables may be correlated not because they are causally related
but because some third variable caused both of them." For example, it is found that
reading ability and vocabulary are highly correlated, but, in fact, both of these
variables are strongly affected by intelligence. Hence, anybody who wants to study the
actual correlation between these two variables must first partial out the effect of
intelligence, which is done by the method of partial correlation.
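The first-order partial correlation can be obtained directly from the three pairwise Pearson r's; a minimal sketch with illustrative (hypothetical) coefficients:

```python
# Partial correlation of reading ability (x) and vocabulary (y),
# controlling for intelligence (z), from the three pairwise Pearson r's:
#   r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz**2) * (1 - r_yz**2))
import math

r_xy, r_xz, r_yz = 0.70, 0.65, 0.60   # hypothetical pairwise correlations

r_xy_z = (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))
print(f"partial r (intelligence held constant) = {r_xy_z:.2f}")
```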
5. Research Tools
As you have just read, in correlational research, we require data in the form of
numbers, rankings or dichotomies. To obtain these types of data, as per our research
design, we may use "standardized tests" (like intelligence tests), "other measuring
devices" (e.g., heart beat, pulse rate, etc.), or "established criteria" (to be used in
making rankings and dichotomies).


6. Steps in Correlational Research
Correlational research is relatively simple to conduct. It involves
the following steps, most of which you will find similar to those of other research
methods:
Selecting and Defining a Problem : Like other types of research,
correlational research requires the researcher, first of all, to select and
define his/her research problem. Here the researcher should select at least
"two variables" (one can select even more).
Formulating Hypothesis : Generally, a null hypothesis is formulated in
correlational research because it seems much easier to reject a null
hypothesis than to retain a research hypothesis. It simply states, "No
relationship exists between A and B."
Data Collection : As per the nature of the variables, the next step of this
research method is to collect the data in pairings of scores, rankings, or
groupings by applying the appropriate research tools.
Data Compilation : After collecting the data, we must next compile it in
such a way that two measures (i.e. scores, rankings, or groupings) can be
shown for each subject of the sample.
Analysis and Interpretation of Data : Our next step is to treat the data
statistically by applying the appropriate correlational technique to compute the
correlation between the two sets of scores. We then interpret our findings in
the light of (i) the size of the correlation, (ii) the direction of the correlation
(positive or negative), and (iii) its significance level.
a. The Size of the Correlation: The degree, size, or strength of the
relationship between two variables is expressed by the coefficient of
correlation. Whether positive or negative, the larger the coefficient,
the stronger or closer the relationship. It should be noted that,
irrespective of the correlational procedure followed, the coefficient
lies between 0 (no relationship at all) and 1.00 (perfect correlation).
In practice, however, values of exactly 0 or 1.00 are rarely, if ever,
obtained.
b. The Direction of the Correlation: The two variables may be
correlated in either a positive or a negative direction. The direction
of the correlation is independent of its size. Correlations of +.62 and
-.62 are of exactly the same size but show different types of
relationship (the former is a positive correlation and the latter a
negative one). A positive correlation indicates that an increase or
decrease in one variable tends to accompany an increase or decrease
in the other variable in parallel fashion. If, on the other hand, an
increase in one variable tends to accompany a decrease in the other,
and vice versa, the correlation is negative. The larger the correlation
(whether positive or negative), the more accurately we can predict
one variable from the other.
c. Significance Level of Correlation: As far as the significance
level of the obtained correlation is concerned, we first compute the
standard error (SE) of the correlation and then multiply this SE by
1.96 (for the .05 level of significance) or by 2.58 (for the .01 level of
significance). The obtained correlation is considered significant if it
is larger than the value obtained by multiplying the SE by either
1.96 or 2.58. (A short computational sketch follows this list of steps.)
To learn the statistical computation of the various types of correlations and
of the SE, we advise you to carefully and extensively go through Garrett
(1981), Ferguson (1966), Charles (1988), or any other available textbook on
statistical procedures and/or educational research that you may find useful.
Report Writing : This is, as in other types of research, the last step of
correlational research, in which we skillfully and logically report our
problem, variables, hypothesis, results of the study, and analysis and
interpretation of the findings in terms of the coefficient, direction, and
significance of the correlation obtained between or among the variables.
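As a rough illustration of the analysis step, the sketch below computes r for hypothetical anxiety and performance scores and checks its significance two ways: the SE-based rule of thumb described above (using one common textbook formula for the SE of r) and the p-value reported by scipy:

```python
# Compute r, then judge its significance two ways:
# (1) the rule of thumb above: r is significant if it exceeds SE_r * 1.96 (.05) or SE_r * 2.58 (.01),
#     taking SE_r = (1 - r**2) / sqrt(N) (one common textbook formula);
# (2) the p-value returned by scipy's pearsonr.
import math
from scipy.stats import pearsonr

anxiety     = [12, 18, 9, 22, 15, 11, 20, 14, 17, 10]   # hypothetical scores
performance = [70, 58, 80, 50, 62, 75, 55, 66, 60, 78]  # hypothetical scores

r, p = pearsonr(anxiety, performance)
n = len(anxiety)
se_r = (1 - r**2) / math.sqrt(n)

print(f"r = {r:.2f}, SE_r = {se_r:.3f}, p = {p:.4f}")
print("significant at .05 (rule of thumb):", abs(r) > 1.96 * se_r)
print("significant at .01 (rule of thumb):", abs(r) > 2.58 * se_r)
```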




THE MODEL UNDERLYING CORRELATIONAL
RESEARCH METHODS

Correlational research designs are founded on the assumption that
reality is best described as a network of interacting and mutually
causal relationships. Everything affects--and is affected by--
everything else. This web of relationships is not linear, as in
experimental research.
Thus, the dynamics of a system--how each part of the whole system
affects every other part--is more important than causality. As a rule,
correlational designs do not indicate causality. However, some
correlational designs, such as path analysis and cross-lagged panel
designs, do permit causal statements. Correlational research is
quantitative.

Types of Correlational Research Designs
1. BIVARIATE CORRELATION
The relationship between two variables is measured.
The relationship has a degree and a direction.
The degree of relationship (how closely they are
related) is usually expressed as a number between -1
and +1, the so-called correlation coefficient. A zero
correlation indicates no relationship. As the correlation coefficient moves toward either -1 or +1,
the relationship gets stronger until there is a "perfect
correlation" at either extreme.
The direction of the relationship is indicated by the "-
" and "+" signs. A negative correlation means that as
scores on one variable rise, scores on the other
decrease. A positive correlation indicates that the
scores move together, both increasing or both
decreasing.
A student's grade and the amount of studying done,
for example, are generally positively correlated.
Stress and health, on the other hand, are generally
negatively correlated.

2. REGRESSION AND PREDICTION

If there is a correlation between two variables, and we
know the score on one, the second score can be
predicted. Regression refers to how well we can make
this prediction. As the correlation coefficient
approaches either -1 or +1, our predictions get better.
For example, there is a relationship between stress
and health. If we know my stress score, we can
predict my future health status score.
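A minimal sketch of this idea, using scipy's linregress on hypothetical stress and health-status scores:

```python
# Simple linear regression: predict a health-status score from a stress score.
from scipy.stats import linregress

stress = [2, 5, 8, 3, 9, 6, 4, 7]          # hypothetical stress scores
health = [85, 70, 55, 80, 50, 65, 75, 60]  # hypothetical health-status scores

fit = linregress(stress, health)
print(f"r = {fit.rvalue:.2f}")

my_stress = 5                                # hypothetical new observation
predicted_health = fit.intercept + fit.slope * my_stress
print(f"predicted health status = {predicted_health:.1f}")
```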

3. MULTIPLE REGRESSION

This extends regression and prediction by adding
several more variables. The combination gives us
more power to make accurate predictions.
What we are trying to predict is called the
CRITERION VARIABLE.
What we use to make the prediction, the known
variables, are called PREDICTOR VARIABLES.
If we know not only my stress score, but also a health
behavior score (how well I take care of myself) and
how my health has been in the past (whether I am
generally healthy or ill), we can more closely predict
my health status. Thus, there are 3 predictors--stress,
health behavior, and previous health status--and one
criterion--future health.
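A minimal sketch of such a prediction, using ordinary least squares in numpy on hypothetical scores for the three predictors and the criterion:

```python
# Multiple regression with three predictors and one criterion, via least squares.
import numpy as np

# Hypothetical predictor scores: stress, health behavior, previous health status.
X = np.array([
    [2, 8, 9],
    [5, 6, 7],
    [8, 3, 4],
    [3, 7, 8],
    [9, 2, 3],
    [6, 5, 6],
    [4, 7, 7],
    [7, 4, 5],
], dtype=float)
y = np.array([85, 72, 50, 80, 45, 65, 74, 58], dtype=float)  # hypothetical future health

X1 = np.column_stack([np.ones(len(X)), X])        # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)     # [intercept, b_stress, b_behavior, b_previous]

new_person = np.array([1, 5, 6, 7], dtype=float)  # intercept term + hypothetical scores
print("predicted future health:", round(float(new_person @ coef), 1))
```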



4. FACTOR ANALYSIS
This statistical procedure identifies underlying
patterns of variables. A large number of variables are
correlated and the presence of high inter-correlations
indicates a common underlying factor.
For example, we could measure a great many aspects
of physical, emotional, mental, and spiritual health.
Each question would give us a score. High
correlations (either positive or negative) among
several of these scores would indicate a common
underlying factor. Many different questions might all
be measuring an "emotional health" factor, in which
case there would be high correlations between questions
about anger, anxiety, depression, etc. Or, on the other
hand, if these are each separate factors, there would
be little correlation between the questions relating to
anger, anxiety, and so on.
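A minimal illustration of the first step, inspecting the inter-correlations among hypothetical questionnaire scores; a full factor analysis would then extract factors from a matrix like this one:

```python
# Inspect inter-correlations among hypothetical questionnaire scores.
import numpy as np

# Rows = respondents; columns = anger, anxiety, depression items (hypothetical 1-5 ratings).
scores = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [1, 2, 1],
    [3, 3, 4],
    [5, 5, 5],
    [2, 1, 2],
], dtype=float)

corr = np.corrcoef(scores, rowvar=False)   # 3 x 3 inter-correlation matrix
print(np.round(corr, 2))
# Uniformly high inter-correlations here would point to a single underlying
# "emotional health" factor; low inter-correlations would suggest separate factors.
```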

5. CORRELATIONAL DESIGNS USED TO MAKE CAUSAL CONCLUSIONS
Two designs used to make statements of cause and
effect use correlational methods. These are PATH
ANALYSIS and CROSS-LAGGED PANEL
DESIGNS. I won't go into much detail on them here,
but it is important to know that there are times
when correlational designs can be used to determine
causality.
PATH ANALYSIS is used to determine which of a
number of pathways connects one variable with
another. For instance, we know there is a relationship
between stress and health. Path analysis has been used
to show that while there is a small path that "goes
through" physiology, the predominate path
connecting stress and health goes through health
behaviors. That is, we know stress affects
physiological factors such as coronary and immune
functions. We also know that when we are stressed,
we stop taking good care of ourselves, we sleep less,
eat less well, fail to get proper exercise, etc.
Research has shown that there is a stronger
connection between stress, health behaviors, and
health than there is between stress, physiology, and
health. And this research used correlational statistics
to draw this conclusion.


A CROSS-LAGGED PANEL DESIGN measures two
variables at two points in time. It has been used, for
example, to show that watching violence on TV leads
to violent behavior more than the other way around.
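A minimal sketch of the cross-lagged comparison on hypothetical two-wave data: if the correlation between TV viewing at time 1 and aggression at time 2 clearly exceeds the correlation between aggression at time 1 and TV viewing at time 2, the TV-to-aggression direction is the more plausible one.

```python
# Cross-lagged panel comparison on hypothetical two-wave scores.
from scipy.stats import pearsonr

tv_t1         = [10, 25, 5, 30, 15, 20, 8, 28]   # hypothetical TV-violence exposure, time 1
aggression_t1 = [3,  6,  2, 7,  4,  5,  2, 6]    # hypothetical aggression, time 1
tv_t2         = [12, 24, 6, 28, 16, 22, 9, 30]   # hypothetical TV-violence exposure, time 2
aggression_t2 = [4,  8,  2, 9,  5,  7,  3, 8]    # hypothetical aggression, time 2

r_tv1_agg2, _ = pearsonr(tv_t1, aggression_t2)
r_agg1_tv2, _ = pearsonr(aggression_t1, tv_t2)
print(f"TV(t1) with aggression(t2): r = {r_tv1_agg2:.2f}")
print(f"aggression(t1) with TV(t2): r = {r_agg1_tv2:.2f}")
```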

6. SYSTEMS ANALYSIS
This involves the use of complex mathematical
procedures to determine dynamic processes, i.e.,
changes over time, feedback loops, and the elements
and flow of relationships.
It has been used, for example, to diagram the
differences between successful and unsuccessful
elementary schools. Some of the elements in these
systems are teachers' expectations of student
performance, teaching effort, and student
performance. Each of these affects the other and
changes over time.


Purpose
Correlation is a way to
measure how associated or
related two variables are.
The researcher looks at
things that already exist
and determines if and in
what way those things are
related to each other. The
purpose of doing
correlations is to allow us
to make a prediction about
one variable based on what
we know about another
variable.
For example, there is a
correlation between
income and education. We
find that people with
higher income have more
years of education. (You
can also phrase it that people with more years of
education have higher
income.) When we know
there is a correlation
between two variables, we
can make a prediction. If
we know a group's
income, we can predict
their years of education.

Direction
There are two types or
directions of correlation. In
other words, there are two
patterns that correlations
can follow. These are
called positive correlation
and negative correlation.
Remember that in a
correlational study, the
researcher is measuring
conditions that already
exist. She or he is asking
questions of a sample of
participants, and finding
out in what way pairs of
variables are related. For
example, a researcher
could ask about the
participants' yearly income
and years of education, to
see if those two attributes
are correlated.
Positive correlation
In a positive correlation, as
the values of one of the
variables increase, the
values of the second variable also increase.
Likewise, as the value of
one of the variables
decreases, the value of the
other variable also
decreases. The example
above of income and
education is a positive
correlation. People with
higher incomes also tend
to have more years of
education. People with
fewer years of education
tend to have lower income.
Here are some examples of
positive correlations:
1. SAT scores and college achievement: among college students, those
with higher SAT scores also have higher grades
2. Happiness and helpfulness: as people's happiness level increases,
so does their helpfulness (conversely, as people's happiness level
decreases, so does their helpfulness)
This table shows some
sample data. Each person
reported income and years
of education.
Participant   Income    Years of Education
#1            125,000   19
#2            100,000   20
#3             40,000   16
#4             35,000   16
#5             41,000   18
#6             29,000   12
#7             35,000   14
#8             24,000   12
#9             50,000   16
#10            60,000   17
In this sample, the
correlation is .79.
We can make a graph,
which is called a
scatterplot. On the
scatterplot below, each
point represents one
person's answers to
questions about income
and education. The line is
the best fit to those points.
All positive correlations
have a scatterplot that
looks like this. The line
will always go in that
direction if the correlation
is positive.
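Both the coefficient and the scatterplot can be reproduced from the table above; a minimal sketch, assuming numpy, scipy, and matplotlib are available:

```python
# Pearson r and scatterplot with a best-fit line for the income/education sample.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import pearsonr

income = [125000, 100000, 40000, 35000, 41000, 29000, 35000, 24000, 50000, 60000]
years  = [19, 20, 16, 16, 18, 12, 14, 12, 16, 17]

r, _ = pearsonr(income, years)
print(f"r = {r:.2f}")          # approximately .79

slope, intercept = np.polyfit(income, years, 1)   # best-fit (least-squares) line
xs = np.linspace(min(income), max(income), 100)

plt.scatter(income, years)
plt.plot(xs, slope * xs + intercept)
plt.xlabel("Income")
plt.ylabel("Years of education")
plt.show()
```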






Negative correlation
In a negative correlation, as the values of one of the variables
increase, the values of the second variable decrease. Likewise,
as the value of one of the variables decreases, the value of the
other variable increases.
This is still a correlation; it is sometimes called an inverse correlation.
The word negative is a label that shows the direction of the
correlation.
There is a negative correlation between TV viewing and class
grades: students who spend more time watching TV tend to
have lower grades (or, phrased the other way, students with higher grades
tend to spend less time watching TV).
Here are some other examples of negative correlations:
1. Education and years in jail: people who have more years of
education tend to have fewer years in jail (or, phrased the other way, people
with more years in jail tend to have fewer years of education)
2. Crying and being held: among babies, those who are held
more tend to cry less (or, phrased the other way, babies who are held less
tend to cry more)
We can also plot the grades and TV viewing data, shown in
the table below. The scatterplot below shows the sample data
from the table. The line on the scatterplot shows what a
negative correlation looks like. Any negative correlation will
have a line with that direction.
Participant   GPA   TV (hours per week)
#1 3.1 14
#2 2.4 10
#3 2.0 20
#4 3.8 7
#5 2.2 25
#6 3.4 9
#7 2.9 15
#8 3.2 13
#9 3.7 4
#10 3.5 21
In this sample, the correlation is -.63.
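The same kind of sketch applied to this table confirms the negative coefficient; swapping the GPA and TV columns into the plotting lines shown earlier would reproduce the downward-sloping scatterplot as well.

```python
# Pearson r for the TV-viewing/GPA sample (a downward-sloping relationship).
from scipy.stats import pearsonr

gpa      = [3.1, 2.4, 2.0, 3.8, 2.2, 3.4, 2.9, 3.2, 3.7, 3.5]
tv_hours = [14, 10, 20, 7, 25, 9, 15, 13, 4, 21]

r, _ = pearsonr(gpa, tv_hours)
print(f"r = {r:.2f}")   # approximately -.63
```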







Strength
Correlations, whether positive or negative, range in their
strength from weak to strong.
Positive correlations will be reported as a number between 0
and 1. A score of 0 means that there is no correlation (the
weakest measure). A score of 1 is a perfect positive
correlation, which does not really happen in the real world.
As the correlation score gets closer to 1, it is getting stronger.
So, a correlation of .8 is stronger than .6; but .6 is stronger
than .3.
The correlation of the sample data above (income and years of
education) is .79.
Negative correlations will be reported as a number between 0
and -1. Again, a 0 means no correlation at all. A score of -1 is
a perfect negative correlation, which does not really happen.
As the correlation score gets closer to -1, it is getting stronger.
So, a correlation of -.7 is stronger than -.5; but -.5 is stronger
than -.2.
Remember that the negative sign does not indicate anything
about strength. It is a symbol to tell you that the correlation is
negative in direction. When judging the strength of a
correlation, just look at the number and ignore the sign.
The correlation of the sample data above (TV viewing and
GPA) is -.63.

Imagine reading four correlational studies with the following
scores. You want to decide which study had the strongest
results:
-.3 -.8 .4 .7
In this example, -.8 is the strongest correlation. The negative
sign means that its direction is negative.
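In code, "ignore the sign and compare magnitudes" is simply a comparison of absolute values:

```python
# Rank four correlation coefficients by strength (absolute value), strongest first.
correlations = [-.3, -.8, .4, .7]
print(sorted(correlations, key=abs, reverse=True))   # [-0.8, 0.7, 0.4, -0.3]
```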


Advantage
1. An advantage of the correlation method is that we can make
predictions about things when we know about correlations. If
two variables are correlated, we can predict one based on the
other. For example, we know that SAT scores and college
achievement are positively correlated. So when college
admission officials want to predict who is likely to succeed at
their schools, they will choose students with high SAT scores.
We know that years of education and years of jail time are
negatively correlated. Prison officials can predict that people
who have spent more years in jail will need remedial
education, not college classes.



Disadvantage
1. The problem that most students have with the correlation
method is remembering that correlation does not measure
cause. Take a minute and chant to yourself: Correlation is not
Causation! Correlation is not Causation! I always have my in-
class students chant this, yet some still forget this very crucial
principle.
We know that education and income are positively correlated.
We do not know if one caused the other. It might be that
having more education causes a person to earn a higher
income. It might be that having a higher income allows a
person to go to school more. It might also be some third
variable.
A correlation tells us that the two variables are related, but we
cannot say anything about whether one caused the other. This
method does not allow us to come to any conclusions about
cause and effect.
