
Running Head: SUCCESS IN AN ONLINE LEARNING ENVIRONMENT

Success in an Online Learning Environment: Examining Differences Between Older and

Younger MLIS Students in a Foundations Course

Holly H. Stiegel

Valdosta State University


Abstract

This correlation study will examine differences between younger and older adult learners enrolled in an online degree program. All Valdosta State MLIS students taking the Foundations class will be eligible (and will be encouraged) to participate in the study. Paper-and-pencil data to be collected before the beginning of the semester include demographic information and a completed Online Technologies Self-Efficacy Scale (OTSES) developed by Miltiadou and Yu (2000). Online and email data to be gathered at the end of the semester include student self-reported grades and BlazeVIEW archived information about online courseware use. It is expected that the duration of the study will be between 18 and 19 weeks. A correlation matrix of factor coefficients (r values) and levels of significance (p values) will be included in the comparison of the student's age, the number of online courses previously completed, the number of hours using Foundations online courseware, the online technology self-efficacy score, and the final course grade. Descriptive statistics for each factor will be reported (range, frequency distribution, measures of central tendency, and standard deviations). The research results will include bivariate scatter plots and contingency tables. Significant correlations and online self-efficacy scores may provide guidance to instructors in determining where a student may need additional help to successfully complete the course.


Success in an Online Learning Environment: Examining Differences Between Older and Younger MLIS Students in a Foundations Course

The information age has sparked a new trend in education: learning anytime, anyplace (Mariani, 2001). Web-based education now offers learners of every age the opportunity to enroll virtually at public, private, technical, single-sex, and historically religious or ethnic institutions in every state. The National Center for Education Statistics (2007) reported that nearly 74,000 college degree programs were offered through distance education at over 4,200 institutions. With a plethora of options now available, geographic boundaries and time constraints are mitigated, and career and family commitments can be more easily managed with the flexibility offered by enrolling in asynchronous online classes. The 2011 American Library Association (ALA) web site lists 65 ALA-accredited MLIS institutions in the US and Canada, with 16 offering 100% online programs (Dare, 2010). ALA (2008) accreditation standards for master's programs in library and information studies encourage graduate library and information science programs to ensure that their curricula reflect the diverse histories and information needs of all people that are served (p. 5), regardless of the forms or locations of delivery selected by the school (p. 8). Although the average age of an LIS student is between 30 and 35 (Davis, n.d.), the ALA guidelines can be challenging when the age range of enrolled online students can span more than five decades; students' educational experiences range from face-to-face classes to online distance education. Some students begin the program immediately after graduating from college, while others enter MLIS programs years after receiving a baccalaureate degree.


Enrollment requirements (GRE scores, GPA, letters of recommendation) help admissions committees select students who are academically ready to begin an MLIS program. If all students complete the OTSES prior to the beginning of a semester, advisors and Foundations instructors can evaluate the self-reported scores of perceived levels of confidence in online technologies. Responses on the OTSES range from very confident to not confident at all. These scores can potentially identify problem areas for students who are new to an asynchronous online program, and suggest where instructors may guide students to remedial resources that will help them successfully complete the course. This study will examine the concept of success in an online learning environment by using a descriptive, correlational cohort study, focusing on what Neuman (2007) calls a category of people who share a similar life experience in a specified time period (p. 19).

Literature Review

Distance education has a long history dating back to the delivery of instruction through correspondence courses, radio, and television (Calvin & Freeburg, 2010). Today, adult learners are the fastest growing segment of web-based distance education (Derrick, 2003, as cited in Calvin & Freeburg, 2010, p. 65). Socioeconomic shifts and technological advances prompt many adults to further their education as the need to keep up with industry trends increases and new career opportunities appear. There is a systematic need for life-long learning in an increasingly diverse population of students (Yu, Kim, & Roh, 2001). The World Wide Web has become a ubiquitous pedagogical tool (Chyung, 2007) to deliver online instruction. With the increase in the number of courses offered exclusively


online, much research has shifted from examining ways to improve the traditional teacher-centered model to examining ways to improve a learner-focused model. The learning environment in this Internet milieu is highly interactive. Researchers want to examine the characteristics of online learners and determine how best to serve this expanding, multigenerational, multicultural population (DeTure, 2004; Diaz, 2000; Schuyler, 1997; Sengpiel & Dittberner, 2008). Students enrolling in online classes possess varying levels of experience and competence with computer technology. They may be only one or two years beyond high school or college, or they may be returning to school after many years, even decades. Many of these students were educated exclusively in face-to-face classes that provided structure and social outlets. The online learning environment, however, requires a different set of skills for success, including online computer competency and the technical knowledge to navigate both the synchronous and asynchronous nature of distance learning (DeTure, 2004; Dupin-Bryant, 2004; Miltiadou & Yu, 2000; Pillay, Irving, & Tones, 2007; Yu, Kim, & Roh, 2001). Many studies that have examined age differences in adult learners' computer self-efficacy (CS-E) have encountered obstacles when attempting to generalize their conclusions. Problems have included small sample sizes, the lack of a clear definition of "younger" and "older" students, missing information on subjects' average or median ages, and sample populations skewed to younger ages (Chyung, 2007). Some studies have come tantalizingly close to providing additional information about adult learners. Dupin-Bryant (2004) found significant positive correlations between online course completion and pre-entry variables in distance learners (cumulative GPA, class rank,


Internet training, and previous online course completion) with a sample size of n = 464. The age range was between 17 and 59 with an average age of 29, yet none of the pre-entry variables were correlated with age. Calvin and Freeburg (2010) also had a large sample size (n = 510) in their correlational study, which investigated computer competencies believed to be important to online learning. Respondents, enrolled in online undergraduate courses, rated their self-perceived level of computer expertise. Positive correlations were found between technology competence and components of computer use, but neither variable was correlated with age. Older students expressed issues with time management and completing assignments on time, but no definition of older and younger was given, nor was there any information about the age distribution. The stated average age was 36, and over half the respondents were active duty military. Refining an assessment tool from an earlier study, Pillay et al. (2007) proposed the "Tertiary Students' Readiness for Online Learning" (TSROL) instrument. The authors offer empirical evidence that older students scored lower on computer self-efficacy measures than younger students, but despite the large sample size (n = 252), the age distribution in this Australian study was skewed to younger ages. Over 80 percent were under age 30 (n = 205), while the other 20 percent (n = 47) were 31 and older. Only two were post-graduates. Interestingly, 60 percent of older students accessed a computer on a daily basis versus 53 percent of younger students. Similarly, 75 percent of students in the older category accessed the Internet on a daily basis as compared to 70 percent of younger students. A correlational analysis by Yu et al. (2001) examined factors including age, computer competence, frequency of Web use, and technology training in a random sampling of online


users of the Internet in a cross-sectional survey. No age information is given about the 257 participants, yet the authors claim a negative correlation between age and computer competence (r = -.29, p < .01). The overall correlation between Web use and computer competence was positive (r = .32, p < .01), leading the authors to conclude that "participants' computer competence level has a positive impact on the amount of their Web use" (p. 11). Other studies provide ample information about the ages of their subjects. Age and gender were the two independent variables in a study of online students by Chyung (2007). Data were gathered on the number of student posts to discussion boards, final grades, and changes in pre- and post-test self-efficacy scores on topics to be covered in an online class. The median age of the 81 graduate students was 39; this number was used to establish the younger and older groups. The more significant findings were in the younger group, whose scores increased an average of 143 percent from pre-test to post-test; the older group's test scores increased an average of 107 percent. Final grades for both genders in the older group were the same as the class average of 92-93 percent. Younger females, however, scored an average of 95 percent, while younger males' average scores were 89-90 percent. A short, objective instrument to derive a computer literacy scale (CLS) for older adults was developed by Sengpiel and Dittberner (2008). The test included both terms and symbols commonly used when interacting with computer technology. The small sample included 17 older adults (mostly pensioners) and 17 younger adults (mostly students or working). The average ages for the groups were 67.2 years and 25.4 years, respectively. Intensity of computer use was correlated with scores on the CLS. Out of a maximum of 26 points on the CLS, the older adults scored an average of 14.4, while the younger adults scored an average of 23.9. Significant


differences were also observed in intensity of computer use, with younger adults averaging nearly 27 hours per week on the computer, and older adults averaging less than three hours per week. The authors concluded that the CLS was well suited for older adults but too easy for younger and more computer literate people (p. 14). [For the purposes of this research paper, it would have been interesting to see CLS results had this test been administered to distance education students.] Both Calvin and Freeburg (2010) and Yu et al. (2001) found that a significant number of their respondents indicated a desire for additional training. Participants also indicated that both formal instruction and informal learning through friends or colleagues would facilitate Web use. The lack of studies that specifically target older students returning to a learning environment requiring computer skills leaves a gap in what educators know about this age group. This study will be conducted to see whether significant correlations exist between age, the number of online courses completed, OTSES scores, online courseware use, and academic performance as measured by the final grade.

Definition of Variables

This project will explore the concept of success in an online learning environment, specifically investigating first-semester MLIS students who are taking VSU's Foundations course. Successful completion of the course is defined as receiving a passing grade of C or better. Grades, however, do not fully explain whether the student experienced a successful outcome in terms of the learning environment. What level of technical skills do incoming students possess? Is age a factor? Have they had previous experiences in online classes which


might cause anxiety or a lack of confidence, and thus undermine their beliefs in their self-efficacy with online technology? Did students become more comfortable with the technology as the semester progressed? Are there indicators which will alert an instructor to a student who is struggling? To provide insight into these questions, this project contains five variables. Each can be scored on a continuous scale, with an interval level of measurement: the student's age at the time of enrollment in an MLIS 7000 Foundations course; the number of online courses the student has previously completed (with a passing grade of C or better, in any academic discipline); the number of hours spent online using Valdosta State's BlazeVIEW courseware for MLIS 7000; the student's self-reported final grade for the Foundations course; and the Online Technologies Self-Efficacy Scale score, which asks students to evaluate their confidence levels in performing certain computer tasks. These scores may be the most indicative of student success in an online learning environment.
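For illustration only, the five variables could be stored as one record per participant. The minimal Python sketch below is not part of the study design; the field names are hypothetical placeholders for the factors defined above.

```python
from dataclasses import dataclass

@dataclass
class ParticipantRecord:
    """One hypothetical participant record holding the five interval-level factors."""
    age: int                        # age at enrollment in the MLIS 7000 Foundations course
    online_courses_completed: int   # prior online courses passed with a C or better
    blazeview_hours: float          # hours spent in BlazeVIEW courseware for MLIS 7000
    otses_score: int                # Online Technologies Self-Efficacy Scale total score
    final_grade: float              # self-reported final grade for the Foundations course

# Example record with made-up values
record = ParticipantRecord(age=34, online_courses_completed=2,
                           blazeview_hours=72.5, otses_score=101, final_grade=93.0)
```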

Research Question

Using the sample population of enrolled online MLIS 7000 students, are any of the following factors significantly correlated in a way that would indicate success in an online learning environment: the student's age, the number of hours spent using online courseware, the number of previously completed online courses, an online self-efficacy score, and the final grade? If so,


is there a correlation that would serve as an indicator that a newly enrolled student might require remedial or focused attention from an instructor to achieve success?

Purpose and Objectives

This research will focus on the concept of computer self-efficacy and its indicators, which instructors may use to determine whether students (either young or old) require intervention to prevent attrition. The construction of a knowledge base of VSU student CS-E scores and other correlations can help identify areas where students need help to succeed in an online learning environment.

Assumptions

Two assumptions lead the researcher to use a correlation study instead of random sampling with experimental and control groups. First, the sample population is small: an average of n = 50 across all Foundations classes per semester. Second, although every student will be solicited, not all students will agree to participate. The researcher hopes that all students will agree and thus help build a database of information about MLIS 7000 students. It is further assumed that variations between course sections will be kept to a minimum, e.g., every instructor will use the same text, assign identical readings, post the same discussion questions, maintain uniform deadlines for assignment due dates, and return graded assignments using an agreed-upon schedule.


Limitations

Any significant negative correlations between age and final grade should be handled with discretion. A grade in one introductory course may or may not be a predictor of successful completion of a degree. VSU does not want to inadvertently discriminate against older applicants on the basis of a very limited study, or discourage students from applying based solely on age. There are several possible confounding influences among the five variables when subjects who have previous experience with distance education are included. A student who has previously completed at least one online course is likely to spend less time using online courseware. Dupin-Bryant (2004, p. 204) found that previous online course experience gives a student an awareness of university expectations and a familiarity with the online distance learning milieu. It is possible that a student who has previously completed at least one online course will have a higher score on the OTSES. On the other hand, a student's negative experience with a previous online course may be reflected in a lower self-confidence score on the OTSES. As the semester progresses, student confidence with the online technology should improve and might influence both the final grade and the time spent using online courseware. The ideal sample population would contain only students who have never taken an online course, but due to the small number of incoming MLIS students it is unrealistic to exclude anyone from participating in the study.


Methodology

Sample

Prior to any study of human subjects, an Application for Use of Human Participants in Research will be submitted to Valdosta State's Institutional Review Board. If approved, the sample for this study will consist of all students enrolled in Valdosta State's MLIS 7000 Foundations course(s) offered during either a spring or fall semester. Three weeks prior to the beginning of the semester, all enrolled students will receive an email that will include details about this research project. All students will be encouraged to participate because, as Nahl (2001) states, larger samples give a more accurate representation of the population than smaller samples (p. 128). It will be clearly stated that participation is voluntary. A student's decision about whether or not to participate, their answers to questionnaires, surveys, and emails, their end-of-semester grades, and their online usage of BlazeVIEW courseware will be kept confidential. Student anonymity will be protected by reporting only aggregate data in the final report. The research will be completed within nineteen weeks, which will encompass a sixteen-week fall or spring semester.

Instruments

Students participating in the study will receive three paper-and-pencil instruments to be filled out and returned to the researcher before the beginning of the semester: a demographic survey, the Online Technologies Self-Efficacy Scale developed by Miltiadou and Yu (2000), and an informed consent letter to be signed. The demographic survey will contain objective questions, e.g., age at the beginning of the semester, gender, and number of previously completed online classes. The online self-efficacy questionnaire will be subjective, and will ask students to rate


themselves on a Likert-type scale in terms of their level of confidence with computer technology. The letter of informed consent will include a request that students not communicate with each other before or during the semester about the project or their participation. At the conclusion of the semester, emails will be sent to all participants requesting their final grade for the course. The BlazeVIEW usage archives will be downloaded for analysis.

Data Collection and Analysis

Raw data on five factors will be collected. Age, the number of online courses completed, and the OTSES score will be collected before classes begin (see Appendixes). Grades will be reported at the end of the semester. Data for the fifth factor will be obtained from Valdosta State's database statistics of student usage of BlazeVIEW. This courseware option, labeled My Progress, will be activated by Foundations instructors in the My Tools area. A correlation matrix will show relationships between the factors, expressed as r values. A p-value level of significance will be generated by SPSS software. The strength of association between factors will be expressed as a percentage (r²). Using Nahl's (2001) Correlation Table of Significance Levels of r, if the number of participants is 52, significance at the 95% level could be attained with a correlation as low as .273, but achieving the 99% level would require a correlation of .354 (p. 127). Scatter charts for each pair of factors will provide visual evidence of the correlations. Histograms or bar graphs will be used to display measures of central tendency and to see whether the data are skewed or normally distributed.
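Although the proposal specifies SPSS, the planned analysis can be sketched in a few lines of Python for illustration. The example below is not part of the study design: the column names and values are hypothetical, and the critical_r helper simply reproduces the kind of thresholds found in Nahl's (2001) table (for n = 52 it returns approximately .273 at the 95% level and .354 at the 99% level).

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical data standing in for the five study factors.
df = pd.DataFrame({
    "age":             [24, 31, 45, 52, 28, 39],
    "online_courses":  [0, 2, 1, 0, 3, 1],
    "blazeview_hours": [60, 45, 80, 95, 40, 55],
    "otses_score":     [110, 98, 74, 62, 112, 90],
    "final_grade":     [92, 88, 85, 79, 95, 87],
})

def correlation_report(data: pd.DataFrame) -> pd.DataFrame:
    """Pearson r, two-tailed p-value, and r-squared for every pair of factors."""
    cols = list(data.columns)
    rows = []
    for i, x in enumerate(cols):
        for y in cols[i + 1:]:
            r, p = stats.pearsonr(data[x], data[y])
            rows.append({"pair": f"{x} vs {y}",
                         "r": round(r, 3), "p": round(p, 3),
                         "r_squared": round(r * r, 3)})
    return pd.DataFrame(rows)

def critical_r(n: int, alpha: float) -> float:
    """Smallest |r| that reaches significance at a two-tailed alpha for n participants."""
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)
    return t_crit / np.sqrt(t_crit ** 2 + (n - 2))

print(correlation_report(df))
print(round(critical_r(52, 0.05), 3))   # about .273 (95% level)
print(round(critical_r(52, 0.01), 3))   # about .354 (99% level)
```

A scatter plot for each factor pair and a histogram of each factor could be produced in the same script (for example with matplotlib), mirroring the visual displays described above.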


Conclusion

As adult learners of all ages continue to enroll in online courses, their experiences with computer technology and their self-efficacy with online technologies will remain important components in whether they succeed in an online learning environment. For learners to succeed, instructors must have foreknowledge of any obstacles that could interfere with course completion, especially in the area of computer competence: every assignment, discussion, lesson plan, and announcement is online. By building a database of OTSES scores for every entering MLIS student, along with the other correlations examined in this study, advisors may develop a good indicator of where remedial computer instruction should be focused.


References

American Library Association. (2008). Standards for accreditation of master's programs in library & information studies. Retrieved from http://www.ala.org/ala/educationcareers/education/accreditedprograms/standards/standards_2008.pdf

American Library Association. (2011). Directory of accredited programs [Data file]. Retrieved from http://www.ala.org/Template.cfm?Section=lisdirb&Template=/cfapps/lisdir/index.cfm

Calvin, J., & Freeburg, B. (2010). Exploring adult learners' perceptions of technology competence and retention in web-based courses. Quarterly Review of Distance Education, 11(2), 63-72.

Chyung, S. Y. (2007). Age and gender differences in online behavior, self-efficacy, and academic performance. Quarterly Review of Distance Education, 8(3), 213-222.

Combes, B., & Anderson, K. (2006). Supporting first year e-learners in courses for the information professions. Journal of Education for Library and Information Science, 47(4), 259-277.

Dare, L. (Ed.). (2010). ALA accreditation at a glance. Prism: The Office for Accreditation Newsletter, 18(2). Retrieved from http://www.ala.org/ala/aboutala/offices/accreditation/prp/prism/prismarchive/Prism_fall10.pdf


Davis, D. (n.d.). Library retirements: What we can expect. Office for Research & Statistics, American Library Association. Retrieved from http://www.ala.org/ala/research/librarystaffstats/recruitment/lisgradspositionsandretirements_rev1.pdf

Diaz, D. (2000). Carving a new path for distance education. The Technology Source, March/April. Retrieved from http://technologysource.org/article/carving_a_new_path_for_distance_education_research/

Dupin-Bryant, P. (2004). Pre-entry variables related to retention in online distance education. American Journal of Distance Education, 18(4), 199-206.

Mariani, M. (2001). Distance learning in post-secondary education: Learning wherever, whenever. Occupational Outlook Quarterly. Retrieved from http://www.bls.gov/opub/ooq/2001/Summer/art01.pdf

Miltiadou, M., & Yu, C. H. (2000). Validation of the online technologies self-efficacy scale (OTSES). Retrieved from http://www.eric.ed.gov/PDFS/ED445672.pdf

Nahl, D. (2001). Strategic research approaches for reference librarians. Dubuque, IA: Kendall/Hunt.

Neuman, W. L. (2007). Basics of social research: Qualitative and quantitative approaches (2nd ed.). Boston, MA: Pearson Education.


Pillay, H., Irving, K., & Tones, M. (2007). Validation of the diagnostic tool for assessing tertiary students' readiness for online learning. Higher Education Research and Development, 26(2), 217-234.

Sengpiel, M., & Dittberner, D. (2008). The computer literacy scale (CLS) for older adults: Development and validation. In M. Herczeg & M. C. Kindsmuller (Eds.), Mensch & Computer 2008: Viel mehr Interaktion. Retrieved from http://computerliteracy.net/CLS/CLS_V14s_english.pdf

United States Department of Education, Institute of Education Sciences. (2007). Fast facts. Retrieved from http://nces.ed.gov/fastfacts/display.asp?id=80

Yu, B., Kim, K., & Roh, S. (2001, June). A user analysis for web-based distance education. Paper presented at the Third Annual Topics on Distance Learning Conference, Hammond, IN. Retrieved from http://www.eric.ed.gov/PDFS/ED455830.pdf


APPENDIX A

ONLINE TECHNOLOGIES SELF-EFFICACY SCALE (OTSES)

Thank you for agreeing to fill out this questionnaire. The following questions ask how confident you feel with using online technologies (such as the Internet, email, etc.) in order to succeed in an online course. If you do not have much computer experience, just complete the questionnaire to the best of your knowledge. DO NOT WORRY! Remember that each section begins with the statement, "I would feel confident" performing an activity, and not "I have done it before." It does not matter whether you have had experience with the activities described. We would like to find out what your perceptions are of performing the activities below. There is no right or wrong answer; just answer as accurately as possible.

Please read the directions below and then fill in ALL items. The survey requires you to indicate your level of confidence with the statements below by writing an X in the box which corresponds to how you feel about the activity, from "Very Confident" to "Not Confident At All." If you do not know what a statement means, choose "Not Confident At All."

Questions about using the Internet (Internet Competencies)

Sample Answer Form
                              VERY         SOMEWHAT     NOT VERY     NOT CONFIDENT
                              CONFIDENT    CONFIDENT    CONFIDENT    AT ALL
I would feel confident...         X

1. Opening a web browser (e.g., Netscape or Explorer)
2. Reading text from a web site
3. Clicking on a link to visit a specific web site
4. Accessing a specific web site by typing the address (URL)
5. Bookmarking a web site
6. Printing a web site
7. Conducting an Internet search using one or more keywords
8. Downloading (saving) an image from a web site to a disk


9. Copying a block of text from a web site and pasting it to a document in a word processor

Questions about chatting live via a synchronous chat system such as WebCT Chat rooms, Wimba, Elluminate, and GoToMeeting (some people call this Synchronous Interaction)

I would feel confident...

10. Providing a nickname within a synchronous chat system (if necessary)
11. Reading messages from one or more members of the synchronous chat system
12. Answering a message or providing my own message in a synchronous chat system (one-to-many interaction)
13. Interacting privately with one member of the synchronous chat system (one-to-one interaction)

Questions about using an e-mail system such as Windows Outlook, G-Mail, or Netscape Mail to communicate with friends, instructors or other students who are not online at the same time (Asynchronous)

I would feel confident...

14. Logging on and off an e-mail system
15. Sending an e-mail message to a specific person (one-to-one interaction)
16. Sending one e-mail message to more than one person at the same time (Courtesy Copy or one-to-many interaction)
17. Replying to an e-mail message
18. Forwarding an e-mail message
19. Deleting messages received via e-mail
20. Creating an address book


21. Saving a file attached to an e-mail message to a local disk and then viewing the contents of that file
22. Attaching a file (image or text) to an e-mail message and then sending it off

Questions about posting a message to a newsgroup, a bulletin board or on the discussion board of a conferencing system where participants are not online at the same time (such as WebCT, LinkedIn, Yahoo groups)

I would feel confident...

23. Signing on and off an asynchronous conferencing system
24. Posting a new message to an asynchronous conferencing system (creating a new thread)
25. Reading a message posted on an asynchronous conferencing system
26. Replying to a message posted on an asynchronous conferencing system so that all members can view it (reply to all)
27. Replying to a message posted on an asynchronous conferencing system so that only one member can view it (reply to sender)
28. Downloading (saving) a file from an asynchronous conferencing system to a local disk
29. Uploading (sending) a file to an asynchronous conferencing system

Four-point Likert scale:
4 = very confident
3 = somewhat confident
2 = not very confident
1 = not confident at all (select this if you do not know what the statement means)
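For illustration, a total OTSES score could be computed by summing the 29 item ratings. The sketch below assumes the 1-4 scoring shown above, which would give totals between 29 and 116; the function name and response labels are placeholders rather than part of the published instrument.

```python
# Assumes each of the 29 items above is scored 1-4 on the scale just listed.
LIKERT_POINTS = {
    "very confident": 4,
    "somewhat confident": 3,
    "not very confident": 2,
    "not confident at all": 1,
}

def otses_total(responses: list[str]) -> int:
    """Sum the 29 item ratings; with 1-4 scoring the total falls between 29 and 116."""
    if len(responses) != 29:
        raise ValueError(f"The OTSES has 29 items; got {len(responses)} responses")
    return sum(LIKERT_POINTS[answer.strip().lower()] for answer in responses)

# Example with made-up answers: 29 items all marked "somewhat confident" total 87.
print(otses_total(["somewhat confident"] * 29))
```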


APPENDIX B

Correlation Matrix

Factors: (1) Age; (2) Number of Completed Online Courses; (3) Number of Hours Spent on BlazeVIEW Courseware; (4) Online Technology Self-Assessment Score; (5) Final Course Grade

                                   (1)        (2)        (3)        (4)        (5)
(1) Age                            r = 1.0    --         --         --         --
(2) Completed Online Courses                  r = 1.0    --         --         --
(3) Hours on BlazeVIEW                                   r = 1.0    --         --
(4) Self-Assessment Score                                           r = 1.0    --
(5) Final Course Grade                                                         r = 1.0


APPENDIX C

Measures of Central Tendency

                                                   COUNT   RANGE   MEAN   MEDIAN   MODE   STD. DEV.
Student Ages
Number of Completed Online Courses
Number of Hours Spent on BlazeVIEW Courseware
Online Technology Self-Assessment Score
Final Course Grade


APPENDIX D

Example of Aggregate Demographic Data*

                                                      NUMBER OF STUDENTS     PERCENTAGE
Age of students
  < 30 years
  31-40 years
  41-50 years
  > 50 years
Number of Previously Completed Online Courses
  0
  1
  2
  3+
OTSES Score
  0-29
  30-58
  59-87
  88-116
Hours Spent Using BlazeVIEW Courseware (per week)
  0-7
  8-14
  15-21
  22-28
  29-35
  > 36
Final Grade
  A
  B
  C
  D
  F
  Dropped class

*Adapted from Combes and Anderson (2006)
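As an illustration of how the aggregate table above could be produced from raw survey data, the short sketch below bins hypothetical ages into the listed age bands and reports counts and percentages; the same pattern would apply to the other factors. The data values and bin edges here are placeholders, not study results.

```python
import pandas as pd

# Hypothetical ages standing in for the demographic survey responses.
df = pd.DataFrame({"age": [24, 29, 33, 38, 45, 52, 61, 27, 41]})

# Bin edges approximating the age bands in the table above (right-inclusive intervals).
age_bins = [0, 30, 40, 50, 120]
age_labels = ["< 30 years", "31-40 years", "41-50 years", "> 50 years"]

counts = pd.cut(df["age"], bins=age_bins, labels=age_labels).value_counts().sort_index()
summary = pd.DataFrame({
    "Number of Students": counts,
    "Percentage": (counts / counts.sum() * 100).round(1),
})
print(summary)
```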
