Research Proposal
Is there a relationship between preferred testing methods and student success on the
English 10 provincial exam?

Courtney O'Connor
University of British Columbia

Introduction
Over the years, computer word processors have evolved and are increasingly used by students to type written assignments. Throughout the school year, students are able to type the majority of their assignments and are encouraged to complete the editing process using a computer. According to several studies, writing with a computer can increase the amount of writing students produce and the extent to which they edit their writing, and both of these factors lead to higher-quality writing (Russell & Plati, 2000). However, when it comes to formal testing, the majority of students in the Surrey School District write their exams using pen and paper.
In December 2013, I was preparing my two English 10 classes to write their provincial exam in January. This exam constitutes 20% of their overall grade and tests their reading comprehension and writing. At my school, only students who have an Individual Education Plan (IEP) have the option to write the exam electronically on a computer, due to limited space and computer availability. My interest in this study came about when I was required to submit to my administration the names of students who did not have an IEP but whom I felt would benefit from taking their exam electronically. While some schools in the province have all their students write their exams electronically, many schools do not have the option to provide an e-exam to all their students. Therefore, little is known about the effects on student success and performance of writing an electronic exam as opposed to a paper one. In informal discussions with my students, each class differed in its preferred method of testing. What would happen if our school district were able to provide students the option to pick which exam format they would prefer to write? Would student success improve if they were able to write the exam using their preferred method? The purpose of this study is to examine whether there is a relationship between student success on the English 10 provincial exam and preferred testing methods within the Surrey School District.
Problem Statement
As provincial exams are worth a substantial portion of a student's grade, it is important that, as educators, we provide multiple testing options for students to be successful. The main question driving this research is: Is there a relationship between preferred testing methods and student success on the English 10 provincial exam? There are several other questions that I wish to explore stemming from my original question:

How does familiarity with writing on a computer influence student success on the e-exam?
Will student preference towards a specific exam type help reduce anxiety?
As English teachers from the same school mark the written section of the exam, would online exams provide more anonymity and effectively remove marker bias?

Literature Review
A variety of research supports higher student success when testing methods match student preference and familiarity. Research exploring the use of students' preferred testing methods can be examined in four categories: emergence of technological assessment; marker bias; student preference and learning styles; and familiarity with writing.

Emergence of Technological Assessment


Over the years, technology has been growing and changing to become more adaptable, cost-effective and easily available. Cowan and Morrison (2001) argue that, given the investment of large sums of money in new technologies for schools and the time spent training teachers to incorporate technology into their classrooms, it is remarkable that old and outdated models of assessment still remain. It seems unfair that we encourage students to take a 21st-century approach to learning while the assessment that is worth the majority of their grade does not match that goal. Darling-Hammond, Ancess and Falk (1995) argue that, in order to be authentic, assessment needs to be connected to students' lives and to their learning experiences. As educators, we give multiple opportunities for our students to produce and show their learning. However, when it comes to major testing, students are limited in their ability to showcase what they have learned.
Thomas, Price, Paine and Richards (2002) conducted a survey in which they asked students about their experiences with an electronic examination. The study confirmed that the majority of the participants found the experience at least as good as a traditional pen and paper examination. However, their study did find that anxiety regarding the length of the examination did not change for students writing their exam electronically. Even when typing their responses, some students were still not able to answer all questions, as they ran out of time. The study was conducted with students in higher education, so a similar survey of high school students' feedback is needed. It should also be noted that the students were in remote locations and did not have physical access to the institution that was testing them.

Marker Bias
As every learner is different, every marker has their own approach; therefore, it is likely that marker bias will occur in any scenario. In an attempt to determine the influence of computer-printed responses on marker scores, Russell (2002) presented the same set of written responses to markers in three different formats: handwritten; typed, double-spaced, in a 12-point Times font; and typed, double-spaced, in a 14-point script font. His results found that responses presented in handwritten form received significantly higher scores than the same responses presented in a typed format. He concluded that this effect was due to the greater visibility of errors in, and higher expectations for, computer-generated responses. Alternatively, as markers were able to partially identify individuals by their personal handwriting, they may have felt connected to the writer and been more sympathetic in their overall grading. Russell's study replicated a similar study from 1992. In both cases, the researchers inaccurately predicted that computer-generated texts would receive higher marks.
On the other hand, Klein and Taub (2005) attempted to determine if there was a relationship between a student's grade and the legibility of their written response in an examination. In their study, fifty-three teachers marked essay answers that had already been awarded grades of 80% by impartial experts. Their results concluded that papers that had illegible penmanship or used an illegible typeface scored significantly lower than papers that were easier to read. No matter the mode of delivery, if the paper was difficult to read, it received a lower mark. Their findings also show that compositions with legible writing, few errors or typed text scored on average 8% higher than the grades awarded by the original markers. Many students struggle to receive an accurate assessment of their work because their penmanship is illegible. It is interesting to note that the marking criteria for most tests do not include legibility, yet legibility still appears to influence a student's grade. If there were an alternative mode of delivery, some students might be able to increase their mark. Klein and Taub recognized that it is still not clear how markers can succeed in strictly following unbiased assessment; therefore, additional studies are needed in this field.
Student Preference and Learning Styles
Researchers argue that when students become accustomed to one mode of writing, whether on a computer or not, assessment needs to be authentic to that mode of writing (Russell & Haney, 1997). Russell and Haney conducted a study examining the effect that mode of administration, electronic exam or traditional pen and paper exam, has on middle school students' performance on multiple-choice and written test questions. Their findings concluded that multiple-choice results did not differ much; however, students who were accustomed to writing on a computer produced substantially stronger responses on their electronic exam than in their responses written by hand. As educators, we should be able to recognize our students' strengths and abilities and give them an opportunity to perform in an environment in which they feel comfortable and will be able to maximize their performance.
As every learner is different, not all students will be successful with the same form of assessment. Another study, conducted by Wingenbach (2000), attempted to determine if there was a statistical relationship between academic achievement and exam delivery method for students enrolled in a university agriculture course. In the study, independent learners reported significantly more enjoyment of computers than dependent learners did. Independent learners also had less anxiety concerning the introduction of the computer and the electronic exam. The students reported that taking the exams electronically was not as easy as taking the exams in the more traditional paper fashion. As the majority of students now have access to a computer in their everyday lives, it would be important to replicate this study to see if the increase in the use of computer technology would have an impact on the results.
Familiarity with Writing
With the rise in educational technologies available to teachers and students, there has been a push to change current forms of curriculum and assessment. Technology helps to engage students in particular activities and lessons. Russell and Plati (2000) explored the effects of familiarity with writing on a computer in a study that focused on students in grades 8 and 10. Their sample of students generally tested high on state exams and came from a suburban location where computers are easily available. Students were divided into two groups: one wrote their exams electronically and the other used pen and paper. The exams consisted of extended composition items that were designed to be completed in a time span of 45-60 minutes. Before the exam, students were tested on their computer familiarity and their typing ability. The study concluded that the majority of the sample students had extensive experience with writing on a computer. The results of the study found that responses given on the electronic exam were longer and received higher scores in terms of topic development and overall English standards. Students who wrote the exam electronically were also able to finish their test in a shorter amount of time.

Research Method
This study will use correlational quantitative research methods along with two qualitative surveys. Students who participate in this study will write two English 10 British Columbia provincial exams: one in the electronic exam format and the other using the traditional pen and paper method. Students will also be required to fill out a survey before their first exam, indicating which method of testing they prefer. Following their final exam, students will complete a second survey with the same questions as the first.
Participants
The participants in this study will be a pool of grade 10 students from the 19 high schools within the Surrey School District. A group of 100-150 students is preferred in order to ensure variety among the participants and to keep the group manageable for the research team. The number of students who participate in the study will depend on how many are willing to participate and have their parent/guardian complete the permission form.
It is preferred to have a group of students who vary in terms of their gender, IQ level, whether English is their first language, ethnic background and preferred testing method. Students will be randomly divided into testing groups in order to ensure that the characteristics and experiences of the groups are equal. In order to recruit students to participate, an agreement will need to be in place with the Surrey School District in which students who participate in the study receive volunteer hours towards their graduation application. A group of teachers will also need to be recruited in order to mark the exams. Teachers who mark the English 10 provincial exam are not paid, as it is within their job description to mark the exams collectively as a department. However, teachers who are selected to mark the exams of research participants will need to train as a group in order to ensure similar marking practices. As this study will require extra marking and training, these teachers will be paid for their time and participation in the study.
The students who participate in this study will be between 14 and 16 years old. In order to address ethical concerns, it will be important to receive parental permission for the participants. Participants and guardians will also need to be fully informed about the nature of the study and any possible risks that might arise. It is also important to maintain the confidentiality of the participants and not to disclose their identity or information. Additionally, students who have an Individual Education Plan will be excluded from this study, as their IEP may prohibit them from writing the exam using pen and paper. It would be unethical to test students using a method that is inappropriate for their success.
Instruments
The English 10 provincial exam is a standardized test created by the British Columbia Ministry of Education. There are seven opportunities throughout the year for students to take the exam, and each sitting uses a different version of the exam. Very few versions of the exam are released to the public, and teachers are not allowed to keep copies. Each exam is designed to be fair, so that the assessment of students will not vary between versions. Both tests include multiple-choice questions that are marked using a Scantron machine. Teachers who mark the exam use a 6-point scale to mark both written sections holistically. Each written section will be marked twice in order to ensure validity and reliability of results.
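The proposal does not specify how agreement between the two markings of each written section would be quantified. A minimal sketch in Python, using hypothetical 6-point scores (the function names and data below are illustrative assumptions, not part of the exam's actual marking procedure), might look like this:

    # Hypothetical check of double-marking agreement on the 6-point scale.
    from statistics import mean, stdev

    def pearson_r(xs, ys):
        # Pearson correlation between two equal-length score lists.
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        return cov / ((len(xs) - 1) * stdev(xs) * stdev(ys))

    def agreement(first, second):
        # Proportion of exact and within-one-point matches between markings.
        n = len(first)
        exact = sum(a == b for a, b in zip(first, second)) / n
        adjacent = sum(abs(a - b) <= 1 for a, b in zip(first, second)) / n
        return exact, adjacent

    # Illustrative scores from two independent markings of the same essays.
    marking_1 = [4, 3, 5, 2, 4, 6, 3, 4]
    marking_2 = [4, 4, 5, 2, 3, 6, 3, 5]

    exact, adjacent = agreement(marking_1, marking_2)
    print(f"exact: {exact:.0%}, within one point: {adjacent:.0%}")
    print(f"Pearson r: {pearson_r(marking_1, marking_2):.2f}")

Disagreements beyond one point could then be flagged for a third marking, a common practice in holistic scoring.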
Two identical surveys will also be administered. Within the surveys, students will have the opportunity to indicate their preferred method of testing and why. As many students within the Surrey School District have had little experience with e-exams, the second survey will help identify which students changed their preference after having the opportunity to write both exams.
Procedure
Beginning in September 2014, a team of researchers will visit all 19 Surrey School District high schools to recruit students to participate in the study. After receiving completed student applications and signed parental/guardian permission forms, the team of researchers will randomly divide the list of willing participants into equal testing groups based on their IQ, preferred testing method, ethnic background and gender. The successful participants will fill out their initial survey with a member of the research team in a spare classroom after school on the date outlined in the schedule of activities, and will be informed ahead of time of the date and location of the survey. Students will write their exams at their own high schools to avoid changing their environment. Pen and paper exams will be written in the gym, and electronic exams will be written where large numbers of computers are available (likely in computer labs and libraries, although this will vary depending on the school). Following provincial exam procedures, an administrator and a group of four teachers with a prep block in first semester will invigilate the pen and paper exams. For electronic exams, a second administrator and a second group of four teachers will invigilate. Even though there are separate locations, all tests will be written at the same time. Students who are participating in the study will write alongside their classmates who are not participating. All provincial exams will initially be handed in to administration. The exams of students who participated in the study will then be separated from those of students who did not and given to the research team, who will deliver them to the trained markers.
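The proposal leaves the mechanics of the random division unstated. One minimal sketch in Python, assuming hypothetical participant records with a field for each balancing characteristic (the field names and sample values are illustrative, not drawn from the actual recruitment data), is:

    # Hypothetical stratified random assignment into two equal testing groups.
    import random
    from collections import defaultdict

    def assign_groups(participants, seed=2014):
        # Group participants by stratum, then alternate assignment within
        # each stratum so both groups end up with a similar composition.
        rng = random.Random(seed)
        strata = defaultdict(list)
        for p in participants:
            key = (p["iq_band"], p["preference"], p["background"], p["gender"])
            strata[key].append(p)
        group_1, group_2 = [], []
        for members in strata.values():
            rng.shuffle(members)
            for i, p in enumerate(members):
                (group_1 if i % 2 == 0 else group_2).append(p)
        return group_1, group_2

    # Illustrative records; real data would come from the permission forms.
    participants = [
        {"name": "Student A", "iq_band": "average", "preference": "electronic",
         "background": "X", "gender": "F"},
        {"name": "Student B", "iq_band": "average", "preference": "electronic",
         "background": "X", "gender": "F"},
    ]
    group_1, group_2 = assign_groups(participants)
    print(len(group_1), len(group_2))

Shuffling within strata rather than across the whole pool is what keeps the two groups comparable on each recorded characteristic.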
Design and Analysis
The research will follow a correlational, counterbalanced design. A correlational approach has been selected in order to examine whether there is a relationship between the two variables of student success and preferred method of testing. Additionally, a counterbalanced design is the most appropriate selection in order to help control variables within the group testing. Both groups will receive the same pen and paper test and the same electronic test, but in a different order. Importance has been placed on having two equal groups, as a counterbalanced design requires equal groups receiving equal treatment.
Students will be randomly placed into either Group 1 or Group 2. Group 1 will write the electronic test first and Group 2 will write the pen and paper test first. Four days later, in order to give the students a break, the groups will switch formats. After the final test, students will answer a survey in which they can indicate whether their exam preference has changed.
The markers for the exams will also be split into two groups. One group of markers will be responsible for marking all exams written by Group 1, and the second group of markers will be responsible for marking all exams written by Group 2. Through this design, markers will have the chance to mark both the electronic exams and the traditional pen and paper exams, and each participant's work will be marked twice, once in each format. Having the same markers for both versions of the exams will help ensure consistency in marking and will help researchers determine if there is a bias towards neater printing.
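Because the same markers score each participant's pen and paper and electronic responses, the bias check reduces to examining paired score differences. A minimal sketch, with purely hypothetical scores, could be:

    # Hypothetical paired comparison of one marker's scores for the same
    # participants in the two formats (pen and paper minus electronic).
    from math import sqrt
    from statistics import mean, stdev

    def paired_bias(paper_scores, electronic_scores):
        # Mean paired difference and a simple paired t statistic.
        diffs = [p - e for p, e in zip(paper_scores, electronic_scores)]
        t = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))
        return mean(diffs), t

    # Illustrative 6-point scores for eight participants.
    paper = [4, 5, 3, 4, 5, 4, 3, 5]
    electronic = [4, 4, 3, 3, 5, 4, 3, 4]

    diff, t = paired_bias(paper, electronic)
    print(f"mean difference (paper - electronic): {diff:+.2f}, t = {t:.2f}")

A consistently positive difference would echo Russell's (2002) finding that handwritten responses tend to receive higher scores.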
Figure 1.1 Group assignment to various conditions

November 2014: All participants complete a pre-survey indicating their preferred method of testing.
January 26, 2015: Group 1 writes the electronic exam; Group 2 writes the pen and paper exam.
January 30, 2015: Group 1 writes the pen and paper exam; Group 2 writes the electronic exam.
January 31, 2015: All participants complete a post-survey indicating whether their preferred method of testing has changed since November 2014.

The results of the two exams will be recorded to determine the variance between the two scores, both individually and as a group. Additionally, scores will be compared to note whether a student performed better on their first exam or their second exam. Individual student surveys regarding preferred method of testing will be read and graphed to see if the preferred testing method resulted in higher scores. Additionally, researchers will determine (based on student preference) which exam format is most preferred overall and whether that has a positive or negative relationship with exam results. Finally, as markers will mark both the written and electronic exams, their scores will be compared to see if there is evidence of marker bias towards electronic exams.
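One simple way to operationalize the core comparison is to compute, for each participant, the score earned in the preferred format minus the score earned in the other format. A minimal sketch, again with hypothetical records, could be:

    # Hypothetical summary of the score advantage on the preferred format.
    from statistics import mean

    def preference_advantage(records):
        # Average of (score on preferred format - score on other format).
        gains = []
        for r in records:
            if r["preference"] == "electronic":
                gains.append(r["electronic"] - r["paper"])
            else:
                gains.append(r["paper"] - r["electronic"])
        return mean(gains)

    # Illustrative participant records (percent scores on each format).
    records = [
        {"preference": "electronic", "electronic": 74, "paper": 70},
        {"preference": "paper", "electronic": 65, "paper": 69},
        {"preference": "electronic", "electronic": 81, "paper": 80},
    ]
    print(f"mean advantage on preferred format: "
          f"{preference_advantage(records):+.1f} points")

A mean advantage reliably above zero would support the study's hypothesis; order-of-administration effects would still need to be checked, which the counterbalanced design makes possible.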

Schedule of Activities
September 2014: Students are placed into their specific English classrooms for the semester.

September 2014 - December 2014: Students will be recruited to participate in the study. Researchers will randomly divide the list of willing participants into equal testing groups based on their IQ, preferred testing method, ethnic background and gender. Markers will also be recruited within this time frame.

November 2014: Students will complete the first survey and indicate their preferred method of testing. A member of the research team will visit all 19 schools within the month to have the survey filled out. Markers will be trained as a group in order to ensure consistency in marking.

Monday, January 26, 2015: Students will write their first exam. Group 1 will write the electronic exam and Group 2 will write the pen and paper exam.

Friday, January 30, 2015: Students will write their second exam. Group 1 will write the pen and paper exam and Group 2 will write the electronic exam.

Saturday, January 31, 2015: Students will fill out the second survey to indicate whether their preferred testing methods have changed since November.

January - February 2015: Exams are marked by instructors and then sent to the Ministry of Education to record and post marks.

Discussion

As with any other research, there are internal and external threats to validity that could affect my research. As students will write the electronic exam and pen and paper exam on different dates and in a different order, there is the possibility of treatment diffusion: different treatment groups may communicate with one another and learn from each other about the exam. Students will also be writing both exams within a short period of time. As a result, there is the opportunity for pre-test sensitization, with students improving their performance on their second exam regardless of the order of testing methods. Additionally, as participation is voluntary, students may feel that writing two exams will be too difficult when the time approaches and drop out of the study. Furthermore, the provincial exam is not designed by me. While the exam is intended to be fair for all students, there is the possibility of inconsistency in the difficulty of multiple-choice questions and essay topics across versions. As with any type of technology, there is also the possibility that it will break down. Instructors will need to ensure that the electronic exam does not crash and that students do not lose their answers.
While electronic exams are being embraced by many schools within British Columbia, many schools, including those in the Surrey School District, have not been able to take advantage of this different testing opportunity. As indicated in the literature review, students who have an increased familiarity with one environment have shown increased success when tested in that same environment. If students are given the option to be tested in a format that highlights their strengths as writers, it could lead to a greater sense of confidence in their writing and help them continue to improve their skills for the future. Through my research, I predict that students will have greater success on the English 10 provincial exam when using their preferred method of testing.

References
Cowan, P., & Morrison, H. (2001). Assessment in the twenty-first century: A role of computerised adaptive testing in national curriculum subjects. Teacher Development, 5(2), 241-257.
Darling-Hammond, L., Ancess, J., & Falk, B. (1995). Authentic assessment in action. New York, NY: Teachers College Press.
Klein, J., & Taub, D. (2005). The effect of variations in handwriting and print on evaluation of student essays. Assessing Writing, 10(2), 134-148.
Russell, M. (2002). The influence of computer-print on rater scores. Technology and Assessment Study Collaborative, Boston College. Retrieved from http://www.bc.edu/research/intasc/PDF/ComputerPrintRaterScores.pdf
Russell, M., & Haney, W. (1997). Testing writing on computers: An experiment comparing student performance on tests conducted via computer and via paper-and-pencil. Education Policy Analysis Archives, 5(3).
Russell, M., & Plati, T. (2000). Effects of computer versus paper administrations of a state-mandated writing assessment. Technology and Assessment Study Collaborative, Boston College. Retrieved from http://www.bc.edu/research/intasc/PDF/ComputerVsPaperStateWriting.pdf
Thomas, P., Price, B., Paine, C., & Richards, M. (2002). Remote electronic examinations: Student experiences. British Journal of Educational Technology, 33(5), 537-549.
Wingenbach, G. J. (2000). Agriculture students' computer skills and electronic exams. Journal of Agricultural Education, 41(1), 69-78.
