www.jtla.org
A publication of the Technology and Assessment Study Collaborative
Caroline A. & Peter S. Lynch School of Education, Boston College
Volume 2, Number 1
The Effect of Computers on Student Writing: A Meta-analysis of Studies from 1992 to 2002
Amie Goldberg, Michael Russell, and Abigail Cook
JTLA is a free on-line journal, published by the Technology and Assessment Study Collaborative,
Caroline A. & Peter S. Lynch School of Education, Boston College.
Copyright ©2002 by the Journal of Technology, Learning, and Assessment (ISSN 1540-2525).
Permission is hereby granted to copy any article provided that the Journal of Technology, Learning,
and Assessment is credited and copies are not sold.
Preferred citation:
Goldberg, A., Russell, M., & Cook, A. (2003). The effect of computers on student writing: A meta-
analysis of studies from 1992 to 2002. Journal of Technology, Learning, and Assessment, 2(1).
Available from http://www.jtla.org.
Abstract:
Meta-analyses were performed on 26 studies conducted between 1992 and 2002 that compared K–12 students writing with computers versus paper-and-pencil. Significant mean effect sizes in favor of computers were found for quantity of writing (d = .50, n = 14) and quality of writing (d = .41, n = 15). Studies focused on revision behaviors between these two writing conditions (n = 6) revealed mixed results. Other studies collected for the meta-analysis that did not meet the statistical criteria were also reviewed briefly. These articles (n = 35) indicate that the writing process is more collaborative, iterative, and social in computer classrooms than in paper-and-pencil environments. For educational leaders questioning whether computers should be used to help students develop writing skills, the results of the meta-analyses suggest that, on average, students who use computers when learning to write are not only more engaged and motivated in their writing, but also produce written work that is longer and of higher quality.
The Effect of Computers on Student Writing:
A Meta-analysis of Studies from 1992 to 2002
Introduction
Over the past two decades, the presence of computers in schools has increased
rapidly. While schools had one computer for every 125 students in 1983, they had
one for every 9 students in 1995, one for every 6 students in 1998, and one for
every 4.2 students in 2001 (Glennan & Melmed, 1996; Market Data Retrieval, 1999,
2001). Today, some states, such as South Dakota, report a student-to-computer ratio of 2:1 (Bennett, 2002).
Just as the availability of computers in schools has increased, their use has
also increased. A national survey of teachers indicates that in 1998, 50 percent
of K–12 teachers had students use word processors, 36 percent had them use CD-ROMs, and 29 percent had them use the World Wide Web (Becker, 1999). More recent national data indicate that 75 percent of elementary school-aged students
and 85 percent of middle and high school-aged students use a computer in school
(U.S. Department of Commerce, 2002). Today, the most common educational use
of computers by students is for word processing (Becker, 1999; inTASC, 2003).
Given that, it is logical to ask: Do computers have a positive effect on students’ writing process and on the quality of the writing they produce?
As is described more fully below, the study presented here employs meta-analytic techniques, commonly used in the fields of medicine and economics, to integrate
the findings of studies conducted between 1992–2002. This research synthesis
allows educators, administrators, policymakers, and others to more fully capitalize
on the most recent findings regarding the impact of word processing on students’
writing.
The study presented here differs in two ways from the two previous meta-
analyses described above. First, while Cochran-Smith’s (1991) study was qualita-
tive in nature and Bangert-Drowns’ (1993) employed a quantitative meta-analytic
technique, this study combines quantitative and qualitative methods in order to
provide a richer, more encompassing view of all data available for the time period
under study.
Second, the quantitative component expands the scope to student- and learning-environment-level variables in relation to writing performance. These supplemental analyses include factors such as students’ grade level, keyboarding skills, and school setting (urban, suburban, rural).
The specific research questions addressed in this study are:
• Does word processing impact K–12 student writing? If so, in what ways
(i.e., is quality and/or quantity of student writing impacted)?
• Does the impact of word processing on student writing vary accord-
ing to other factors, such as student-level characteristics (as described
above)?
Methodology
Meta-analytic procedures refer to a set of statistical techniques used to sys-
tematically review and synthesize independent studies within a specific area of
research. Gene Glass first proposed such methods and coined the term “meta-
analysis” in 1976. “Meta-analysis refers to the analysis of analyses … it …refer[s] to
the statistical analysis of a large collection of results from individual studies for the
purpose of integrating the findings. It connotes a rigorous alternative to the casual,
narrative discussions of research studies which typify our attempts to make sense
of the rapidly expanding research literature” (Glass, 1976, p. 3). The meta-analytic portion of the
study was conducted using procedures set forth by Lipsey and Wilson (2001) and
Hedges and Olkin (1985). The methodology followed five phases:
• identification of relevant studies,
• determination for inclusion,
• coding,
• effect size extraction and calculation, and
• data analyses.
Each of these phases is described separately below.
[Figure: Percentage of studies by outcome measure(s) examined — Quality: 31% (n=8); Quantity: 23% (n=6); Quality & Quantity: 19% (n=5); Revision: 12% (n=3); Quantity & Revision: 8% (n=2); Quality & Revision: 4% (n=1); Quality, Quantity, & Revision: 4% (n=1).]
Hannafin, 1992; Peterson, 1993), while other studies were more qualitative in the
ways in which they measured revision. The latter of these studies (Hagler, 1993;
Head, 2000; Olson, 1994; Peterson, 1993; Seawel, 1994) measured ‘surface’ and
format revisions (spelling, grammar, punctuation, etc.) as well as revisions that
resulted in changes in content and meaning.
After all studies were coded, a variable representing “methodological quality” (Moher & Olkin, 1995; Shadish & Haddock, 1994) was derived from a subset of the coded variables. For each study, methodological quality was scored on a 16-point scale, derived as follows:
• one point for each dichotomous variable coded as “yes” in the “Research Methodology” category,
• one point for studies obtained from refereed journals (“Publication Type”),
• a maximum of three points for the “Intervention time/Duration of study” and “Sample size” variables,
• one point each for heterogeneity of the sample’s gender and race/ethnicity (“Student Characteristics”), and
• one point for mention of at least one demographic descriptor for the study’s sample (e.g., gender, race, geographic setting (rural, urban, suburban)).
Finally, there was some ambiguity in study reporting that sometimes made coding study features a challenge. Where the presence or absence of a feature
could not be reasonably detected (explicitly or by implication), an additional code,
“no information available,” was employed.
The codes assigned to each study along with all data used to calculate effect
sizes are presented in the data file that accompanies this paper.
Where a study reported more than one measure for a particular outcome for the same sample (i.e., “writing quality” was often measured in more than one way per study; mechanics, content, organization, etc. were frequently encountered sub-domains), overall means and standard deviations across these measures were calculated and used to compute a single effect size. In this way, the assumption of independence was preserved and inflated Type I error rates were controlled for, yet no study findings were ignored.
At the outset of the study, we had hoped to base the calculation of effect sizes on gain scores (the difference between scores on post-test and pre-test measures). Unfortunately, a considerable number of studies either lacked a pre-post design or failed to report pre-test data. This precluded the most compelling comparison from being meta-analyzed: gain scores for paper-and-pencil versus computer writing groups.
In order to maximize the number of studies included in the analysis, the
few pre- and post-test designs were analyzed only in terms of post-test data. This
enabled results from the pre/post studies to be analyzed with post-only design
data. For all three outcomes (i.e., quantity of writing, quality of writing, and revi-
sions), the standardized mean difference effect size statistic was employed. Because this effect size index has been shown to be upwardly biased when based on small sample sizes, Hedges’ (1981) correction was applied.
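The standardized mean difference and Hedges’ correction are standard formulas; a minimal sketch (not the authors’ code) follows:

```python
import math

def standardized_mean_difference(mean_c, sd_c, n_c, mean_p, sd_p, n_p):
    """(computer mean - paper mean) divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_c - 1) * sd_c ** 2 + (n_p - 1) * sd_p ** 2)
                          / (n_c + n_p - 2))
    return (mean_c - mean_p) / pooled_sd

def hedges_g(d, n_c, n_p):
    """Apply Hedges' (1981) small-sample correction for the upward bias of d."""
    correction = 1 - 3 / (4 * (n_c + n_p) - 9)   # approaches 1 as samples grow
    return correction * d
```

For example, with group means of 60 and 50, a common SD of 10, and 20 students per group, d = 1.0 and the corrected g ≈ 0.98; the correction matters most when samples are small.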
Effect sizes from data in the form of t- and F-statistics, frequencies, and
p-values were computed via formulas provided by Lipsey & Wilson (2001).
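For two-group designs, such conversions take the following general form — a sketch of the standard textbook formulas, not a reproduction of Lipsey and Wilson’s tables:

```python
import math

def d_from_t(t, n1, n2):
    """Standardized mean difference recovered from an independent-samples t."""
    return t * math.sqrt((n1 + n2) / (n1 * n2))

def d_from_f(f, n1, n2):
    """Recovered from a two-group, one-way F (where F = t**2); the sign of
    the difference must be read from the study itself."""
    return math.sqrt(f * (n1 + n2) / (n1 * n2))
```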
Data Analysis
Three types of data analyses were performed. First, using the effect size
extracted from each study, an overall effect size across studies was calculated and
tested for statistical significance. Second, analyses were performed to investigate
the potential effects of publication bias. Finally, to investigate the extent to which
study features moderated the effect on outcome measures, regression analyses
were performed. Below, we describe the methods used to explore publication bias
and moderating effects.
Publication Bias
Publication bias analyses were performed via Forest plots, funnel plots, and
the fail-safe N analysis. Forest plots were used to visually convey the contribution
of each study to its meta-analysis, by plotting study effect sizes and correspond-
ing confidence interval bars in a single display. Funnel plots, another widely used technique for detecting publication bias, were also employed. These plots graphically reveal possible gaps among the studies’ findings by plotting effect sizes against sample sizes. Finally, a fail-safe N analysis (Orwin, 1983) was conducted for each meta-analysis. This analysis addresses the “file-drawer” problem
in meta-analytic research and provides an estimate of the number of insignificant,
unpublished studies that would have to exist in order to render a statistically sig-
nificant meta-analytic finding insignificant.
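Orwin’s fail-safe N has a simple closed form. In the sketch below, the criterion effect size — the level at which the mean effect would no longer be considered meaningful — is the analyst’s choice; Cohen’s “small” benchmark of .2 is used purely as an illustration and is not necessarily the criterion used in this study.

```python
def orwin_fail_safe_n(mean_es, k, criterion_es):
    """Number of unpublished zero-effect studies needed to pull the
    mean effect size of k studies down to criterion_es (Orwin, 1983)."""
    return k * (mean_es - criterion_es) / criterion_es
```

For instance, 14 studies with a mean effect of .50 and an illustrative criterion of .2 would require 21 null studies to fall below that criterion.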
For each outcome variable, frequencies of study feature variables were exam-
ined. After suitable independent variables were identified, variables with more
than two levels were recoded into dummy variables. These variables were then
categorized into groups by theme. For example, variables such as “presence of
control group,” “length of intervention,” “type of publication,” and “conversion of
handwritten student work to word processed format” fell under the theme labeled
“Study’s Methodological Quality.” Variables such as “technical assistance provided to students,” “student participation in peer editing,” and “students receive teacher feedback” were included in the “Student Support” theme.
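Recoding a multi-level feature into dummy variables can be sketched as follows; the feature and level names are illustrative, not the study’s actual coding:

```python
def dummy_code(setting, reference="rural"):
    """Recode a three-level 'school setting' variable into two 0/1 dummies,
    leaving the reference level as the omitted category."""
    levels = ["urban", "suburban", "rural"]
    return {level: int(setting == level)
            for level in levels if level != reference}
```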
Ideally, for each outcome, each themed group of variables would be entered as a single block and themed groups would be entered stepwise into a single regression model. However, this was not statistically possible due to the small number of effect sizes. Instead, each themed group of variables was entered as a single block of independent variables, and each theme was analyzed in a separate regression model.
Summary of Findings
In this section, we present a summary of the findings. Readers who are famil-
iar with meta-analytic techniques or who desire a more technical presentation of
the findings are encouraged to read Appendix B.
The analyses focused on three outcome variables commonly reported by stud-
ies that examine the impact of word processors on student writing. These variables
include: Quantity of Writing, Quality of Writing, and Number of Revisions. Below,
findings for each of these variables are presented separately.
Quantity of Writing
Fourteen studies included sufficient information to calculate effect sizes that
compare the quantity of writing, as measured by word count, between computer
and paper-and-pencil groups.
Figure 3 depicts the effect sizes and the 95 percent confidence intervals for all 14 studies, sorted by publication year. The fifteenth entry depicts the mean weighted effect size across all fourteen studies, along with its 95 percent confidence interval.
[Figure 3: Forest plot — author, publication year, grand N, adjusted effect size with lower and upper 95% CI, plotted from −2 to 2.]
Figure 3 indicates that 4 of the 14 studies had effect sizes that were approximately zero or negative, but which did not differ significantly from zero. Figure 3 also shows that 4 of the 14 studies had positive effect sizes that differed significantly from zero. In addition, the mean weighted effect size across all 14 studies is .50, which differs significantly from zero. Thus, across the fourteen studies, the meta-analysis indicates that students who write with word processors tend to produce longer passages than students who write with paper-and-pencil.
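A “mean weighted effect size” of this kind is conventionally the inverse-variance (fixed-effect) average, in which more precise studies count for more; the mean differs significantly from zero when its confidence interval excludes zero. A generic sketch, not the authors’ code:

```python
def weighted_mean_effect(effect_sizes, variances):
    """Inverse-variance weighted mean effect size with a 95% confidence
    interval around it."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * es for w, es in zip(weights, effect_sizes)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5          # standard error of the mean effect
    return mean, (mean - 1.96 * se, mean + 1.96 * se)
```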
Recognizing that our search may have missed studies that have not been published, a “fail-safe N” analysis (Orwin, 1983) was conducted to estimate the number of no-effect studies that would be needed to nullify the mean
adjusted effect size. This analysis indicates that in order to reverse the effect size
found, there would need to be 24 unpublished studies that found no effect. Given
that only 14 studies that fit the selection criteria were found and that only four of
these had non-positive effect sizes, it seems highly unlikely that an additional 24
studies that found non-positive effects exist. This suggests that our meta-analytic
findings are robust to publication bias.
As described above, regression analyses were performed to explore factors that
may influence the effect of word processing on the quantity of student writing.
These analyses indicated that student supports (i.e., keyboard training, technical
assistance, teacher feedback, and peer editing) were not significant factors affect-
ing the quantity of student writing. Similarly, student characteristics (i.e., keyboard
experience prior to the study, student achievement level, school setting, and grade
level) also were not significant factors affecting the quantity of student writing,
although grade level did approach statistical significance. Finally, the study charac-
teristics (i.e., publication type, presence of control group, pre-post design, length of
study) were not related to the effect of word processing on the quantity of student
writing.
Recognizing that studies lasting less than six weeks may not provide enough time for the use of word processors to impact student writing, a separate set of regression analyses was performed for the sub-set of studies that lasted more than six weeks. For this sub-set of studies, a significant relationship between school level and effect size was found. On average, effect sizes were larger for studies that focused on middle and high school students than for studies of elementary students. All other factors remained insignificant.
In short, the meta-analysis of studies that focused on the effect of word processing on the quantity of student writing found a positive overall effect of about one-half of a standard deviation. This effect tended to be larger for middle and high school students than for elementary students.
Quality of Writing
Fifteen studies included sufficient information to calculate effect sizes that
compare the quality of writing between computer and paper-and-pencil groups.
Figure 4 depicts the effect sizes and the 95 percent confidence intervals for all 15 studies, sorted by publication year. The 16th entry depicts the mean weighted effect size across all fifteen studies, along with its 95 percent confidence interval.
Figure 4 indicates that 4 of the 15 studies had effect sizes that were approximately zero or negative, but which did not differ significantly from zero. Since the power of meta-analysis lies in the aggregation of findings across many studies, it is not unusual to find a subset of studies that contradict the overall trend of findings. In this case, a qualitative examination did not reveal any systematic differences among these studies’ features as compared with those studies reporting positive effect sizes. Figure 4 also shows that the 11 remaining studies had positive effect sizes and that seven of these effect sizes differed significantly from zero. In addition,
the mean adjusted effect size across all 15 studies is .41, which differs significantly from zero. According to Cohen’s criteria for effect sizes, this is considered a small to moderate effect. Thus, across the 15 studies, the meta-analysis indicates that students who write with word processors tend to produce higher quality passages than students who write with paper-and-pencil.
[Figure 4: Forest plot — author, publication year, grand N, adjusted effect size with lower and upper 95% CI, plotted from −2 to 2.]
Recognizing that our search for studies may have missed some studies that
have never been published, the “fail-safe N” analysis was again conducted. This
analysis indicates that in order to reverse the effect size found, there would have to
be 16 unpublished studies that found no effect. Given that only 15 studies that fit the selection criteria were found and that only four of these had non-positive effect sizes, it seems highly unlikely that an additional 16 studies that found non-positive
effects exist.
As described above, regression analyses were performed to explore factors
that may influence the effect of word processing on the quality of student writing.
These analyses indicated that student supports (i.e., keyboard training, technical
assistance, teacher feedback, and peer editing) were not significant factors affect-
ing the quality of student writing. Similarly, the study characteristics (i.e., type of
publication, employment of random assignment, employment of pre-post design,
single vs. multiple classroom sampling, length of study, etc.) were not related to
the effect of word processing on the quality of student writing. However, when
examining student characteristics (i.e., keyboard experience prior to the study, stu-
dent achievement level, school setting, and grade level), a statistically significant
relationship was detected between grade level and quality of writing: as school level
increased, the magnitude of the effect size increased.
Recognizing that studies that lasted for less than six weeks may not provide enough time for the use of word processors to impact student writing, a separate set of regression analyses was performed for the sub-set of studies that lasted more than six weeks. For this sub-set of studies, no significant relationships were
found. This suggests that the relationship between school level and quality of writ-
ing occurred regardless of the length of study.
In short, the meta-analysis of studies that focused on the effect of word pro-
cessing on the quality of student writing found a positive overall effect that was
about four tenths of a standard deviation. As with the effect for quantity, this effect
tended to be larger for middle and high school students than for elementary stu-
dents.
Revisions
Only 6 of the 30 studies that met the criteria for inclusion in this study
included measures related to revisions. Of these six studies, half were published in
refereed journals, half took place in elementary schools, and only one employed a
sample size greater than 30.
Because of the small sample size (only 6 studies), coupled with the reporting of multiple measures of revision that could not be combined into a single measure for each study, it was not possible to calculate an average effect size. Nonetheless, these six studies all report that students made more changes to their writing between drafts when word processors were used than when paper-and-pencil was used. In studies that focused on both revision and quality of writing, revisions made by students using word processors resulted in higher quality writing than revisions made by students using paper and pencil. It should also be noted that one
study found that students writing with paper-and-pencil produced more content-
related revisions than did students who used word processors.
In short, given the small number of studies that compared revisions made on
paper with revisions made with word processors coupled with the multiple meth-
ods used to measure revisions, it is difficult to estimate the effect of computer use
on student revisions.
In a three-year study that examined the effect of computers on student writing, Owston and Wideman (1997) compared changes in the quantity and quality of writing of students attending a school with one computer for every fifteen students versus a school with one computer for every three students. After three years, Owston and Wideman found that the quality of writing improved at a faster rate in the high-access school and that the mean length of composition was three times longer in the high-access school. The researchers, however, acknowledged that their findings do not take into account differences between teachers or the demographics of the students. Nonetheless, the researchers state that these variables did not appear to explain the superior writing produced by students in the high-access school.
Not all studies, however, report positive effects of computers on student writ-
ing. In a three-year study in which 72 third-grade students wrote on computer
and paper, Shaw, Nauman, and Burson (1994) report that the length and quality
of writing produced on paper was higher than writing produced on computer.
This finding occurred even though students who wrote on computer had received
keyboarding instruction. The authors described writing produced on computer as
“stilted” and less creative.
Discussion
This study employed meta-analytic techniques to summarize findings across multiple studies in order to systematically examine the effects of computers on student writing. Although a large number of studies initially identified for inclusion in the meta-analysis had to be eliminated, either because they were qualitative in nature or because they failed to report the statistics required to calculate effect sizes, the analyses indicate that instructional uses of computers for writing are having a positive impact on student writing. This positive impact was found in each independent set of meta-analyses: for quantity of writing as well as quality of writing.
Early research consistently found large effects of computer-based writing on
the length of passages and less consistently reported small effects on the quality
of student writing. In contrast, although our meta-analyses of research conducted
since 1992 found a larger overall effect size for the quantity of writing produced
on computer, the relationship between computers and quality of writing appears
to have strengthened considerably. When aggregated across all studies, the mean
effect size indicated that, on average, students who develop their writing skills
while using a computer produce written work that is .4 standard deviations higher
in quality than those students who learn to write on paper. On average, the effect of
writing with computers on both the quality and quantity of writing was larger for
middle and high school students than for elementary school students.
Glass is quoted in Morton Hunt’s (1997) How Science Takes Stock: The Story of Meta-Analysis as saying,
What I’ve come to think meta-analysis really is – or rather, what it ought to
be – is not little single-number summaries such as ‘This is what psychother-
apy’s effect is’ but a whole array of study results that show how relationships
between treatment and outcome change as a function of all sorts of other
conditions – the age of the people in treatment, what kinds of problems they
had, the training of the therapist, how long after therapy you’re measuring
change, and so on. That’s what we really want to get – a total portrait of all
those changes and shifts, a complicated landscape rather than a single central
point. That would be the best contribution we could make. (p. 163)
Following this purpose, regression analyses were conducted in order to inves-
tigate key factors that may affect the relationship between computers and writ-
ing. These analyses indicated that computers had a greater impact on writing for
middle and high school students than for elementary school students, for both
quantity and quality of writing. Other factors investigated, such as students’ key-
boarding experience and students’ academic achievement level, were not found to
play a significant role for either quantity or quality of writing. However, it is impor-
tant to note that these additional analyses were conducted with a small number of
studies, which often makes detecting effects difficult.
In addition, the findings reported in the excluded studies are consistent with both the findings of our quantitative meta-analyses and many of the findings presented in Cochran-Smith’s (1991) and Bangert-Drowns’ (1993) summaries of research conducted prior to 1992. In general, research over the past two decades consistently finds that when students write on computers, writing becomes a more social process in which students share their work with each other. When using computers, students also tend to make revisions while producing, rather than after producing, text. Between initial and final drafts, students also tend to make more revisions when they write with computers. In most cases, students also tend to produce longer passages when writing on computers.
For educational leaders questioning whether computers should be used to
help students develop writing skills, the results of our meta-analyses suggest that
on average students who use computers when learning to write produce written
work that is about .4 standard deviations better than students who develop writ-
ing skills on paper. While teachers undoubtedly play an important role in helping
students develop their writing skills, the analyses presented here suggest that
when students write with computers, they engage in the revising of their work
throughout the writing process, more frequently share and receive feedback from
their peers, and benefit from teacher input earlier in the writing process. Thus,
while there is clearly a need for systematic and high quality research on computers
and student learning, those studies that met the rigorous criteria for inclusion in
our meta-analyses suggest that computers are valuable tools for helping students
develop writing skills.
Endnote
1. The smallest grand sample size among the 14 studies measuring “quantity of writing” was 12, while the largest grand sample size was 136. This variation in sample size resulted in a mean inverse variance weight of 12.30 (SD = 8.75), and a range from 2.52 through 31.03. The two largest weights were slightly greater than two standard deviations above the mean in value, and therefore were winsorized down to the value of two standard deviations above the mean, 29.80.
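The winsorizing step described in the endnote — capping weights at two standard deviations above the mean — can be sketched as follows (a generic illustration, not the authors’ code):

```python
from statistics import mean, stdev

def winsorize_weights(weights, n_sd=2):
    """Cap any inverse variance weight at the mean plus n_sd sample
    standard deviations; smaller weights pass through unchanged."""
    ceiling = mean(weights) + n_sd * stdev(weights)
    return [min(w, ceiling) for w in weights]
```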
References
Albertson, L. R. & Billingsley, F. F. (1997, March). Improving young writers’
planning and reviewing skills while story-writing. Paper presented at the Annual
Meeting of the American Educational Research Association, Chicago, IL.
Allen, G. & Thompson, A. (1994, April). Analysis of the effect of networking on
computer-assisted collaborative writing in a fifth grade classroom. Paper presented
at the Annual Meeting of the American Educational Research Association,
New Orleans, LA.
Allison, B. (1999). Facilitating student achievement in writing through the deft
employment of computer technology. Unpublished Master’s Action Research
Project, Saint Xavier University/IRI.
Baker, E. & Kinzer, C. K. (1998). Effects of technology on process writing: Are they
all good? National Reading Conference Yearbook, 47, 428–440.
Bangert-Drowns, R. L. (1993). The word processor as an instructional tool: A
meta-analysis of word processing in writing instruction. Review of Educational
Research, 63(1), 69–93.
Barrera, M. T., III, Rule, A. C., & Diemart, A. (2001). The effect of writing with
computers versus handwriting on the writing achievement of first-graders.
Information Technology in Childhood Education Annual, 28.
Becker, H. J. (1999). Internet use by teachers: conditions of professional use and
teacher-directed student use. Irvine, CA: Center for Research on Information
Technology and Organizations.
Bennett, R.E. (2002). Inexorable and inevitable: The continuing story of
technology and assessment. Journal of Technology, Learning and Assessment,
1(1). Retrieved November 1, 2002, from http://www.bc.edu/research/intasc/jtla/journal/v1n1.shtml
Biesenbach-Lucas, S. & Weasenforth, D. (2001). E-mail and word processing in the ESL classroom: How the medium affects the message. Language Learning & Technology, 5(1), 135–165.
Bogard, E. A. (1998). The effects of computer-mediated writing on the quality and
quantity of foreign language composing. Unpublished doctoral dissertation,
University of South Florida.
DeFoe, M. C. (2000). Using directed writing strategies to teach students writing skills
in middle grades language arts. Unpublished Ed.D. Practicum II Report, Nova
Southeastern University.
D’Odorico, L. & Zammuner, V. L. (1993). The influence of using a word processor
on children’s story writing. European Journal of Psychology of Education, 8(1),
51–64.
Dodson, L. E. (2000). The effects of using technology to enhance student ability
to read, organize and write informative text. Unpublished master’s thesis,
University of Manitoba, British Columbia, Canada.
Dybdahl, C. S., Shaw, D. G., & Blahous, E. (1997). The impact of the computer on
writing: No simple answers. Computers in the Schools, 13(3), 41–53.
Ediger, M. (1996). Middle school pupil writing and the word processor (ED393101).
Escobedo, T. H. & Allen, M. (1996, April). Preschoolers’ emergent writing at
the computer. Paper presented at the Annual Meeting of the American
Educational Research Association, New York, NY.
Etchison, C. (1989). Word processing: A helpful tool for basic writers. Computers
and Composition, 6(2), 33–43.
Fan, H.-L. & Orey, M. (2001). Multimedia in the classroom: Its effect on student
writing ability. Journal of Research on Technology in Education, 33(5).
Fletcher, D. C. (2001). Second graders decide when to use electronic editing tools.
Information Technology in Childhood Education Annual, 155–74.
Freitas, C. V. & Ramos, A. (1998, February). Using technologies and cooperative
work to improve oral, writing, and thinking skills: Voices from experience.
Proceedings of the Selected Research and Development Presentations at the
National Convention of the Association for Educational Communications and
Technology (AECT), St. Louis, MO.
Gaddis, B., Napierkowski, H., Guzman, N., & Muth, R. (2000, October). A
comparison of collaborative learning and audience awareness in two computer-
mediated writing environments. Paper presented at the Annual Proceedings
of Selected Research and Development Papers Presented at the National
Convention of the Association for Educational Communications and
Technology, 23rd, Denver, CO.
Gallick-Jackson, S. A. (1997). Improving narrative writing skills, composition skills,
and related attitudes among second grade students by integrating word processing,
graphic organizers, and art into a process approach to writing. Unpublished M.S.
Practicum Project, Nova Southeastern University, Ft. Lauderdale, FL.
Glass, G. V. (1976). Primary, secondary, and meta-analysis of research.
Educational Researcher, 5, 3–8.
Glass, G. V., McGaw, B., & Smith, M.L. (1981). Meta-analysis in social research.
Beverly Hills, CA: Sage Publications.
Glennan, T. K. & Melmed, A. (1996). Fostering the use of educational technology:
Elements of a national strategy. Santa Monica, CA: Rand.
Godsey, S. B. (2000). The effects of using Microsoft Word® on journal word counts
in the high school English classroom. Unpublished Master’s Action Research
Project, Johnson Bible College, Knoxville.
Greenleaf, C. (1994). Technological indeterminacy: The role of classroom writing
practices and pedagogy in shaping student use of the computer. Written
Communication, 11(1), 85–130.
Grejda, G. F. & Hannafin, M. J. (1992). Effects of word processing on sixth
graders’ holistic writing and revisions. Journal of Educational Research, 85(3),
144–149.
Haas, C. & Hayes, J. R. (1986). Pen and paper versus the machine: Writers
composing in hard-copy and computer conditions (CDC Technical Report
No. 16). Pittsburgh, PA: Carnegie-Mellon University, Communication Design
Center.
Hagler, W. J. (1993). The effects of the word processor on the revision behaviors of
sixth-grade students. Unpublished doctoral dissertation, Auburn University,
Auburn.
Hannafin, M. J. & Dalton, D. W. (1987, July/August). The effects of word
processing on written composition. The Journal of Educational Research, 80,
338–342.
Hartley, J. (1993). Writing, thinking, and computers. British Journal of Educational
Technology, 24(1), 22–31.
Head, B. B. (2000). Revision instruction and quality of writing by eighth-grade
students using paper and pencil or word processing. Unpublished doctoral
dissertation, Oakland University, Rochester.
Hedges, L. (1981). Distribution theory for Glass’s estimator of effect size and
related estimators. Journal of Educational Statistics, 6, 107–128.
Hedges, L. & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL:
Academic Press.
Hood, L. M. (1994). Effects of computer correspondence on student writing (Technical
Report). Curry School of Education: University of Virginia, Charlottesville.
Hunt, M. (1997). How science takes stock: The story of meta-analysis. New York:
Russell Sage Foundation.
Hydrick, C. J. (1993). The interaction of a nine-year-old with a word processor.
Unpublished doctoral dissertation, Arizona State University, Tempe.
Langone, J., Levine, B., Clees, T.J., Malone, M., & Koorland, M. (1996). The
differential effects of a typing tutor and microcomputer-based word
processing on the writing samples of elementary students with behavior
disorders. Journal of Research on Computing in Education, 29(2), 141–158.
Lerew, E. L. (1997). The use of computers to improve writing skills among low-
achieving Hispanic students. Unpublished doctoral dissertation, University of
La Verne, La Verne.
Lewis, R. B. (1998). Enhancing the writing skills of students with learning disabilities
through technology: An investigation of the effects of text entry tools, editing tools,
and speech synthesis. Final Report. San Diego, CA: Department of Special
Education, San Diego State University.
Lewis, R. B., Ashton, T. M., Haapa, B., Kieley, C. L., & Fielden, C. (1999).
Improving the writing skills of students with learning disabilities: Are word
processors with spelling and grammar checkers useful? Learning Disabilities:
A Multidisciplinary Journal, 9(3), 87–98.
Lichtenstein, N. (1996). The effect of word processing on writing achievement.
Unpublished master’s project, Kean College of New Jersey, Union.
Lipsey, M. W. (1992). Juvenile delinquency treatment: A meta-analytic inquiry
into the variability of effects. In T. D. Cook, H. Cooper, D. S. Cordray, H.
Hartmann, L. V. Hedges, R. J. Light, T. A. Louis, & F. Mosteller (Eds.), Meta-
analysis for explanation. New York: Russell Sage Foundation.
Lipsey, M. W. & Wilson, D. B. (2001). Practical meta-analysis (Vol. 49).
Thousand Oaks, CA: Sage Publications.
Lohr, L., Ross, S. M., & Morrison, G. R. (1996). Using a hypertext environment
for teaching process writing: An evaluation study of three student groups.
In Proceedings of Selected Research and Development Presentations
at the 1996 National Convention of the Association for Educational
Communications and Technology (18th, Indianapolis, IN).
Lomangino, A. G., Nicholson, J., & Sulzby, E. (1999). The nature of children’s
interactions while composing together on computers (CIERA Report). Ann Arbor,
MI: Center for the Improvement of Early Reading Achievement.
Lowther, D. L., Ross, S. M., & Morrison, G. R. (2001). Evaluation of a laptop program:
Successes and recommendations. Paper presented at the National Educational
Computing Conference Proceedings (22nd, Chicago, IL, June 25–27, 2001).
Lund, D. M. & Hildreth, D. (1997, December). Integrating the computer into
language arts in a fifth grade classroom: A Developing instructional model. Paper
presented at the annual meeting of the National Reading Conference.
MacArthur, C.A., Graham, S., & Schwartz, S.S. (1994). Peers + word processing
+ strategies = a powerful combination for revising student writing. Teaching
Exceptional Children, 27(1), 24–29.
• Gender description
– Heterogeneous
– Homogeneous
• Race/ethnic description
– Heterogeneous
– Homogeneous
• School-setting
– Rural
– Suburban
– Urban
• Type of students
– Mainstream
– SPED/At-risk
– Gifted
– ESL/ESOL
• Writing ability of students
– Low
– Average
– High
Appendix B: Results
Descriptive Highlights
As detailed in Table B1, 64 percent of the studies (n=9) were published in
refereed journals, 14.3 percent (n=2) employed random assignment, and more than
half (n=8) sampled from multiple classrooms. For 57 percent of the studies (n=8),
the research duration lasted between six weeks and one semester, and 86 percent
(n=12) utilized standardized writing tasks across groups. In 43 percent (n=6) of the
studies, students were provided with keyboarding training. Individual writing (as
opposed to collaborative writing) was the focus in all 14 studies, and peer editing,
teacher feedback, and technical assistance were available to students in 21 percent
(n=3) of the studies. It was unclear whether teacher feedback and/or technical
assistance were study features in five (n=5) and nine (n=9) studies, respectively.
With respect to student demographics, only three studies (21 percent) provided
sufficient information to indicate that the sample was gender diverse, and four
studies (29 percent) indicated that they had racially/ethnically-diverse student
samples. Over half of the studies did not provide sufficient information about the
participating students to classify their gender or racial/ethnic diversity. All but
two studies (n=12) focused on mainstream education samples, and half (n=7) of
the studies were conducted with elementary school students. Finally, two studies
occurred in rural, three in urban, and four in suburban settings, while three
studies lacked any geographic information.
WP format

                                            Yes          No           No Information
Technical assistance provided to students   3 (21.4%)    2 (14.3%)    9 (64.3%)
Teacher’s feedback provided to students     3 (21.4%)    6 (42.9%)    5 (35.7%)

                                High         Average      Low         Mixed        No Information
Student sample ability level    3 (21.4%)    2 (14.3%)    1 (7.1%)    4 (28.6%)    4 (28.6%)

                    Rural        Urban        Suburban    Mixed        No Information
School setting      2 (14.3%)    3 (21.4%)    4 (28.6%)   2 (14.3%)    3 (21.4%)
[Figure: adjusted effect sizes plotted on a y-axis from –.2 to 1.2 against an x-axis from 0 to 80.]
To identify which, if any, of the coded study features had a significant
moderating effect on the relationship between computer use and quantity of
writing, regression analyses were performed.
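The moderator analysis described above can be sketched as a simple fixed-effect meta-regression: regress the study effect sizes on a coded study feature, weighting each study by the inverse of its sampling variance. The effect sizes, variances, and binary moderator below are hypothetical values chosen for illustration, not data from the studies in this meta-analysis.

```python
# Sketch of a fixed-effect meta-regression: does a binary study feature
# (e.g., intervention longer than six weeks) moderate the effect sizes?
# All values below are hypothetical, for illustration only.
import numpy as np

d = np.array([0.2, 0.5, 0.8, 0.4, 0.6, 0.3])        # study effect sizes
v = np.array([0.05, 0.04, 0.10, 0.06, 0.08, 0.05])  # sampling variances
x = np.array([0, 1, 1, 0, 1, 0])                    # moderator: 1 = longer study

w = 1.0 / v                                # inverse-variance weights
X = np.column_stack([np.ones_like(d), x])  # design matrix: intercept + moderator

# Weighted least squares: solve (X'WX) b = X'W d
XtW = X.T * w
cov = np.linalg.inv(XtW @ X)               # covariance matrix of the estimates
b = cov @ (XtW @ d)                        # [intercept, moderator slope]
se = np.sqrt(np.diag(cov))                 # standard errors of the coefficients
z = b / se                                 # z-statistics for each coefficient
```

With a binary moderator, the intercept is the weighted mean effect size for studies coded 0 and the slope is the difference between the two groups' weighted means; a significant z for the slope indicates a moderating effect.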
Sensitivity Analysis
When heterogeneity among effect sizes is found in a meta-analysis, the
“robustness” of the main findings can be examined through sensitivity analyses
(Lipsey & Wilson, 2001). A sensitivity analysis explores whether the main
findings remain consistent when the ways in which the data are aggregated or
included in the overall meta-analysis are varied. For example, to provide a sense
of how sensitive the main findings are across subgroups (say, school level),
sensitivity analyses can focus on a particular level of a variable.
A key variable of interest in this analysis is length of study. It can reasonably
be argued that measuring the impact of computer use on students’ writing in
studies of short duration (i.e., six weeks or less) differs from measuring that
impact over a longer period. In studies conducted over longer periods, students
have time to become more adept at keyboarding, grow more comfortable with the
features of word processing programs, and adapt their writing strategies to
exploit those features.
Considering this, a sensitivity analysis was conducted that focused only on
those studies for which the length of intervention was greater than six weeks.
This selection procedure eliminated six of the fourteen studies from the analysis.
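The procedure above can be illustrated with a short sketch: re-estimate the inverse-variance-weighted mean effect size after restricting the pool to studies whose intervention lasted longer than six weeks. The effect sizes, variances, and durations below are hypothetical values for illustration, not the data analyzed in this paper.

```python
# Sketch of a sensitivity analysis: recompute the fixed-effect weighted mean
# effect size using only studies longer than six weeks. Hypothetical values.
import numpy as np

d = np.array([0.2, 0.5, 0.8, 0.4, 0.6, 0.3])        # effect sizes
v = np.array([0.05, 0.04, 0.10, 0.06, 0.08, 0.05])  # sampling variances
weeks = np.array([4, 10, 12, 5, 16, 6])             # intervention length in weeks

def weighted_mean_es(d, v):
    """Inverse-variance (fixed-effect) weighted mean effect size and its SE."""
    w = 1.0 / v
    return np.sum(w * d) / np.sum(w), np.sqrt(1.0 / np.sum(w))

overall, overall_se = weighted_mean_es(d, v)
keep = weeks > 6                                    # the paper's inclusion rule
subset, subset_se = weighted_mean_es(d[keep], v[keep])
# Comparing `overall` with `subset` shows how much the pooled estimate
# shifts when short-duration studies are excluded.
```

If the subset estimate stays close to the overall estimate, the main finding is robust to the exclusion; a large shift would suggest study duration drives the pooled result.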
Descriptive Highlights
As detailed in Table B7, 60 percent (n=9) of the included studies that focused
on the quality of writing were published in refereed journals. Sixty percent of the
studies also employed samples drawn from multiple classrooms, 20 percent (n=3)
employed random assignment, and for 60 percent (n=9) the research duration
lasted between six weeks and one semester. Thirteen of the fifteen studies (87
percent) utilized standardized writing tasks across groups, and in 40 percent (n=6)
of the studies, students were provided with keyboarding training. Individual
writing (as opposed to collaborative writing) was the focus in all 15 studies. Peer
editing was a component in three (20 percent) of the studies, teacher feedback on
writing was present in four of the studies (27 percent), and technical assistance
was available to students in 27 percent (n=4) of the studies. It was unclear whether
teacher feedback and/or technical assistance were study features in four (n=4) and
nine (n=9) studies, respectively.
WP format

Technical assistance provided to students   5 (33.3%)    1 (6.7%)     –
Teacher’s feedback provided to students     4 (26.7%)    7 (46.7%)    9 (60.0%)
In terms of student demographics, only five studies (33 percent) had samples
that were documented as gender-diverse, three studies (20 percent) reported
racially/ethnically-diverse samples, and 60 percent (n=9) of the studies did not
describe the gender or race/ethnic characteristics of the sample. All but two
studies (87 percent) focused on mainstream education samples. Forty-seven
percent (n=7) of the studies were conducted with elementary school students, 33
percent (n=5) were situated in middle schools, and the remaining 20 percent (n=3)
were conducted in high schools.
Geographically speaking, the studies were distributed across rural (n=2),
urban (n=3), suburban (n=6), and mixed (n=1) settings; three studies lacked any
geographic description.
In short, the demographic descriptions of the studies included in this meta-
analysis did not appear to differ considerably from those of the studies included
in the “quantity of writing” meta-analysis.
[Figure: adjusted effect sizes plotted on a y-axis from –1.0 to 1.5 against an x-axis from 0 to 160.]
Sensitivity Analysis
The sensitivity analysis focused on those studies for which the length of
intervention was greater than six weeks. This selection procedure eliminated
four of the 13 studies from the analysis.
The Journal of Technology, Learning, and Assessment
Editorial Board
Michael Russell, Editor (Boston College)
Allan Collins (Northwestern University)
Cathleen Norris (University of North Texas)
Edys S. Quellmalz (SRI International)
Elliot Soloway (University of Michigan)
George Madaus (Boston College)
Gerald A. Tindal (University of Oregon)
James Pellegrino (University of Illinois at Chicago)
Katerine Bielaczyc (Harvard University)
Larry Cuban (Stanford University)
Lawrence M. Rudner (University of Maryland)
Mark R. Wilson (UC Berkeley)
Marshall S. Smith (Stanford University)
Paul Holland (ETS)
Randy Elliot Bennett (ETS)
Robert J. Mislevy (University of Maryland)
Ronald H. Stevens (UCLA)
Seymour A. Papert (MIT)
Terry P. Vendlinski (UCLA)
Walt Haney (Boston College)
Walter F. Heinecke (University of Virginia)
www.jtla.org
Technology and Assessment Study Collaborative
Caroline A. & Peter S. Lynch School of Education, Boston College