Sociation Today ®
The Official Journal of The North Carolina Sociological Association: A Refereed Web-Based Publication
ISSN 1542-6300
Editor: George H. Conklin, North Carolina Central University

Editorial Board: Bob Davis, North Carolina Agricultural and Technical State University; Richard Dixon, UNC-Wilmington; Ken Land, Duke University; Miles Simpson, North Carolina Central University; Ron Wimberley, N.C. State University; Robert Wortham, North Carolina Central University

Editorial Assistants: Rob Tolliver, North Carolina Central University; Shannon O'Connor, North Carolina Central University; John W.M. Russell, Technical Consultant

Volume 5, Number 1

Spring 2007

Comparing On-Line to In-Person Course Delivery: An Empirical Study

by

Jammie Price
Department of Sociology and Social Work
Appalachian State University

and

Leslie Hossfeld
Department of Sociology and Criminal Justice
University of North Carolina Wilmington
 

Introduction

    Web-based technologies have been used in the classroom for over 15 years, including websites, email, listservs, library reserves, and textbooks. Among these options, social scientists range widely in their web usage, from simply posting syllabi on-line to delivering a course fully on-line in asynchronous learning networks (Jaffee, 1997; Kuechler, 1999).  Use of web-based technology for instructional purposes is increasing, as is enrollment in distance education courses and on-line course offerings (Jaffee, 2003).  Many administrators and faculty promote on-line instruction as the solution to managing increased college enrollments, particularly among non-traditional students (Dietz, 2002).  However, are the academic outcomes of on-line instruction similar to those of traditional in-person instruction?  Few empirical studies have been done, and most, except Schutte (2004), are one-group studies that provide no comparison to a traditional course. 
Are we putting the cart before the horse by promoting web-based resources, even fully on-line courses, without empirically assessing outcomes?  To explore this, we designed a study comparing the academic outcomes of a sociological research methods course offered on-line and in-person.  Before presenting our research design and results, we first review the literature on the outcomes of web-based instruction. 

Literature Review

    We found dozens of articles in the social science literature on using web-based resources in traditional courses or as fully on-line courses.  Many were empirical studies comparing on-line and in-person outcomes and processes, but only one used an experimental design with a control or comparison group: Schutte (2004) randomly assigned students to an on-line or a traditional section of the same course.  The studies without control or comparison groups documented patterns in the use of web-based resources (Brooks, 1997; Jaffee, 1997; Southard, 1997; Schneider, 1998; Steiger and Levine, 1999; Ammarell, 2000; Dietz, 2002; Persell, 2004; King, 1994; Scarboro, 2004).  Further, four articles offered theoretical discussions of the advantages and disadvantages of on-line instruction (Kuechler, 1999; Stivers, 1999; Thompson, 2002; Jaffee, 2003), one surveyed the literature on on-line instruction (Benson et al., 2002), and one offered reflections on on-line instruction drawn from the coauthors' years of teaching (Edwards, Cordray and Dorbolo, 2000). 

    Most of these authors detected positive outcomes in using web-based resources either to supplement traditional courses or to deliver fully on-line courses.  Specifically, these authors argued that web-based instruction exposed students to more perspectives on sociological issues (Southard, 1997), facilitated discussion of substantive topics (Benson et al., 2002), provided more diversity to the class (Steiger and Levine, 1999), and increased students' exposure to computers and access to information (Kuechler, 1999).  Other studies established that web-based resources provided more opportunities for students and instructors to assess learning and encouraged instructors to critically evaluate course material, organization, and evaluation methods (Edwards, Cordray and Dorbolo, 2000; Jaffee, 2003).  For example, on-line discussions revealed gaps in student understanding, which instructors addressed immediately with further readings, exercises, and discussion (Persell, 2004).  Developing on-line course tools also forced instructors to anticipate problems and better organize the course, lectures, and assignments. 

    Many authors contend that web-based resources encouraged active, participatory, and peer learning (King, 1994; Kuechler, 1999; Jaffee, 2003). For example, in on-line courses students spent more time with the material (Benson et al., 2002; Schutte, 2004) and more structured time studying and self-learning (Edwards, Cordray and Dorbolo, 2000).  On-line resources alleviated problems with scheduling group projects (Edwards, Cordray and Dorbolo, 2000).  Most researchers confirmed that web-based technology increased communication among students and instructors by breaking down the social barriers associated with face-to-face communication (King, 1994; Jaffee, 1997, 2003; Southard, 1997; Kuechler, 1999; Steiger and Levine, 1999; Ammarell, 2000; Edwards, Cordray and Dorbolo, 2000; Benson et al., 2002).  On-line discussion allowed students more time to prepare thoughtful contributions and enabled instructors to limit free-riding by requiring students to enter these discussions.  In sum, utilizing web-based resources expanded learning methods and increased opportunities for student and instructor success (Kuechler, 1999; Benson et al., 2002).  As evidence of this, studies observed higher exam scores and student course evaluations in on-line courses (Schutte, 2004) and increased enthusiasm for course subject material (King, 1994). 

    In contrast, Stivers (1999) offered a negative review of on-line teaching, maintaining that on-line instruction dilutes material and discourages critical thought.  Only Dietz (2002) found no difference in outcomes between in-person and on-line courses.  Specifically, Dietz discovered that the use of technology did not predict exam scores; rather, attendance, reading assigned material, and forming study groups did. 

    Several studies noted the many problems with utilizing various electronic technologies.  Jaffee (1997) determined that weak student writing skills became even more problematic in typically writing-intensive on-line courses.  Instructors spent a great deal of time replying to student emails and discussion entries (King, 1994).  Kuechler (1999) observed weak student initiative in accessing course pages, challenges in assessing the validity of web materials, and the ease of plagiarism in on-line courses.  Further, many authors pointed to difficulties associated with hardware, software, and the technological skills required of students in on-line courses (King, 1994; Jaffee, 1997; Kuechler, 1999). 

Methods

    Building on the literature on the outcomes of web-based instruction, we co-developed a sociological research methods course centering on group discussion, projects, and exercises.  The course covered the standard research methods topics: the research process, conceptualizing research questions, ethics, developing hypotheses, experiments, surveys, personal interviews, observation, measurement, sampling, and an introduction to data analysis.  We used Neuman's Social Research Methods: Qualitative and Quantitative Approaches as the text.  For each week of the semester, we co-developed identical lectures, course discussion topics, group exercises, and assignments.  Together, we developed identical exams composed of multiple choice, true/false, and short essay questions.

    In the Spring of 2002 one author taught this course in-person, the other fully on-line.  Then in the Summer of 2003 the author who initially taught the course on-line taught it in-person. Students self-selected into the courses. In both semesters, the authors used the same text, supplemental readings, lectures, assignments, class exercises, discussion topics, study guides, and exams.  All exams were administered in-person, and the short essays were graded by both instructors to check for reliability of measurement.  The instructors met weekly during the semester to discuss the curriculum, course interaction, and plans for the next week. 

Population

    We conducted the study at a regional comprehensive university in the southeastern U.S. with approximately 10,000 undergraduate students.  At the time of the study, 60% of the students were women, 90% were White, and 5.2% were African-American.  First-year students comprised 19% of the student body, sophomores 23%, juniors 25%, and seniors 33% (the remaining students were graduate students).  The average age among full-time students was 20, with 82% between 18 and 23 years old.  Approximately 80% of students lived off campus.  The average SAT score among undergraduates was 1,104 (543 verbal, 561 math).  The Department of Sociology had approximately 340 majors, all required to pass the 300-level research methods course, which they usually took in the junior or senior year. 

Data Collection and Analysis

    We compared the exam scores, participation points, final averages, and final grades of (1) students in the in-person and on-line courses in Spring 2002, with different instructors; (2) students in the Summer 2003 in-person and Spring 2002 on-line courses, with the same instructor; and (3) students in the Spring 2002 and Summer 2003 in-person courses, with different instructors.  We made these comparisons using t-tests on means and chi-square tests on proportions.  We also intended to obtain and control for demographic data on all students (age, sex, race, year in school), GPA, number of credits taken that semester, number of previous math courses, and the standardized student course evaluations from the university databases.  Although we were granted IRB approval, the university would not release this supplemental data when we requested it, citing the need to protect student confidentiality and academic integrity. (Paradoxically, our students had already graduated by the time we requested the data, so we had no influence over their grades.)  Race, year in school, and age would not have been valid controls in any case, given the lack of variability on these characteristics in the student population. The lack of data on these variables is a potential weakness of our analysis. 
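Because the tables below report each group's mean, standard deviation, and n, the t-tests can be reproduced from summary statistics alone. A minimal sketch in Python using SciPy, applied to the exam 3 scores reported in Table 1; the choice of Welch's unequal-variance test is our assumption (the paper does not name the variant used), but it reproduces the reported t=2.91, p=.007:

```python
from scipy.stats import ttest_ind_from_stats

# Summary statistics for exam 3, taken from Table 1
# (in-person n=29, on-line n=23).
t, p = ttest_ind_from_stats(
    mean1=84.10, std1=17.58, nobs1=29,  # in-person section
    mean2=59.57, std2=37.24, nobs2=23,  # on-line section
    equal_var=False,  # Welch's t-test -- an assumption on our part
)
print(round(t, 2), round(p, 3))  # the paper reports t=2.91, p=.007
```

Feeding the same call each row of summary statistics should reproduce the remaining t-tests in Tables 1 through 3.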

Results

    The Spring 2002 courses ended with a total of 29 students enrolled in the in-person section and 23 students on-line.  The on-line course started with 27 students, but four dropped after two weeks.  The Summer 2003 in-person course ended with 21 students enrolled.

Comparing In-Person and On-Line Outcomes, Different Instructors

    When comparing the Spring 2002 in-person class to the on-line class, taught by two different instructors, the students' grades on exams 1 and 2 did not differ statistically (see Table 1).  The in-person average was 83 on both exams; the on-line averages were 73 and 78, respectively.  By exam 3, however, the in-person students began doing significantly better than the on-line students (t=2.91, p=.007).  The in-person average on exam 3 was 84; the on-line average was 60.  The in-person average on exam 4 was 80, the on-line average 51 (t=3.54, p=.001).  The students' participation points in the two classes were similar (75 and 64, respectively).  The overall averages, not surprisingly, were better in the in-person class (t=2.57, p=.015): 81 in-person versus a significantly lower 65 on-line.  Lastly, when comparing the final grades issued in each course, more of the in-person students received A's and B's than the on-line students (24% and 45% vs. 9% and 26%, respectively; chi-square=9.7, p=.046), and more of the on-line students received F's (30% vs. 3%). 

Table 1
In-Person and On-Line Outcomes, Across Instructors

                     In-Person   On-Line   Test Statistic    p Value
Exam 1 Mean          83.03       72.65     t=1.53            0.14
Exam 1 SD            11.87       30.78
Exam 2 Mean          82.93       78.26     t=0.72            0.48
Exam 2 SD            18.75       26.22
Exam 3 Mean          84.10       59.57     t=2.91            0.007
Exam 3 SD            17.58       37.24
Exam 4 Mean          79.90       51.52     t=3.54            0.001
Exam 4 SD            17.95       34.99
Participation Mean   75.07       64.29     t=1.43            0.16
Participation SD     19.82       31.51
Overall Mean         81.01       65.26     t=2.57            0.015
Overall SD           15.14       26.11
Final Grade                                Chi Square=9.7    0.046
  A                  24%         9%
  B                  45%         26%
  C                  21%         30%
  D                  7%          4%
  F                  3%          30%
N                    29          23

Comparing In-Person to On-Line Outcomes, Same Instructor

    In this comparison, the same instructor taught the course on-line and in-person.  Here we hoped to find results similar to the across-instructor comparison above; similar findings would lend validity to those results.  When comparing the in-person class to the on-line class taught by the same instructor, the students in the in-person class did statistically better on exams 3 and 4, on participation points, on the overall average, and in the final grades issued (see Table 2). 

Table 2
On-Line to In-Person Outcomes, Same Instructor

                     In-Person   On-Line   Test Statistic    p Value
Exam 1 Mean          81.10       72.65     t=1.20            0.24
Exam 1 SD            13.18       30.78
Exam 2 Mean          87.86       78.26     t=1.71            0.10
Exam 2 SD            5.72        26.22
Exam 3 Mean          85.52       59.57     t=3.28            0.003
Exam 3 SD            7.15        37.24
Exam 4 Mean          80.19       51.52     t=3.69            0.001
Exam 4 SD            12.29       34.99
Participation Mean   85.05       64.29     t=3.02            0.006
Participation SD     9.29        31.51
Overall Mean         84.44       65.26     t=3.43            0.002
Overall SD           5.95        26.11
Final Grade                                Chi Square=11.27  0.02
  A                  29%         9%
  B                  48%         26%
  C                  24%         30%
  D                  0%          4%
  F                  0%          30%
N                    21          23

    The on-line students averaged 73 and 78 on exams 1 and 2, respectively; the in-person students averaged 81 and 88.  Again, by exam 3 the in-person students began doing significantly better than the on-line students (t=3.28, p=.003).  The on-line students' average on exam 3 was 60, compared to 86 among the in-person students.  The on-line students' average on exam 4 was 52, compared to 80 among the in-person students (t=3.69, p=.001).  The participation average among on-line students was 64, versus 85 in-person (t=3.02, p=.006).  The final overall averages were 65 among the on-line students and 84 among the in-person students (t=3.43, p=.002).  More in-person students received A's and B's (29% and 48%, respectively) compared to on-line students (9% and 26%, respectively; chi-square=11.27, p=.02).  Far more on-line students received F's (30%) than in-person students (0%). We next compared the course outcomes when both instructors taught the course in-person. 

Comparing In-Person Outcomes, Different Instructors

    Here we expected to find no differences between the exam scores, overall averages, and final grades issued.   This finding would lend more validity to our assertion that course structure, lessons, expectations, and delivery were equal, or at least quite similar, when the two instructors taught the course.  The data show that there was no difference in exam scores, overall average, or final grades issued between the two instructors when teaching the course in-person (see Table 3). 

Table 3
In-Person Outcomes, Across Instructors

                     Spring 2002  Summer 2003  Test Statistic    p Value
Exam 1 Mean          83.03        81.10        t=0.54            0.60
Exam 1 SD            11.87        13.18
Exam 2 Mean          82.93        87.86        t=-1.33           0.19
Exam 2 SD            18.75        5.72
Exam 3 Mean          84.10        85.52        t=-0.39           0.70
Exam 3 SD            17.58        7.15
Exam 4 Mean          79.90        80.19        t=-0.07           0.95
Exam 4 SD            17.95        12.29
Participation Mean   75.07        85.05        t=-2.38           0.02
Participation SD     19.81        9.29
Overall Mean         81.01        84.44        t=-1.11           0.28
Overall SD           15.14        5.95
Final Grade                                    Chi Square=2.34   0.67
  A                  24%          29%
  B                  45%          48%
  C                  21%          24%
  D                  7%           0%
  F                  3%           0%
N                    29           21

There was a significant difference in participation points: the Spring 2002 in-person students had a lower overall participation score than the Summer 2003 in-person students (75 vs. 85; t=-2.38, p=.02).  We think this difference is likely due to the condensed summer schedule, in which every class day is the equivalent of a week in a traditional semester, so students attend more regularly. 

Discussion

    We find that the in-person version of a sociological research methods course results in higher exam scores and final grades than an on-line version.  We tested this with different instructors teaching the course (one in-person, the other on-line) and with the same instructor teaching the course (in-person and on-line).  The instructors co-developed the course for delivery in-person and on-line, using the same text, supplemental readings, lectures, assignments, class exercises, discussion topics, study guides, and exams in all sections. Although we were not able to include several control variables in our analysis, we find a clear pattern of results between and within instructors and across time.  We believe this limits the strength of alternative explanations for our findings.

    Given the unequal academic outcomes across delivery methods, we question the promotion of fully on-line instruction (Brent, 1999; Sellani and Harrington, 2002).  A potential solution would be to create on-line courses that approach, or even exceed, in-person courses in quality.  However, this proves labor-intensive for faculty (Brahler, Peterson and Johnson, 1999), and hence costly for schools.  Further, these higher costs would negate the reason why many administrators promote fully on-line courses: to reduce costs and increase tuition revenue (Rhoades, 1998; Kuechler, 1999). 

    Further, the work required to develop high-quality on-line courses is not currently valued, and few incentives exist.  Many schools do not count the development of web-based instructional tools toward tenure and promotion, particularly affecting the assistant professors who often engage in this work.  Other schools either do not pay faculty for developing on-line tools and courses, or pay them once and then take ownership of the intellectual property, hiring cheaper labor to repeatedly administer the course thereafter.

    Hence, we conclude that the best use of web-based technology may be to supplement traditional in-person instruction. Many previous studies conclude similarly (Jaffee, 1997; Kuechler, 1999).   No matter how small the course size or the amount of discussion time, virtual interaction alone may not provide the social bond that many students need to excel (Brooks, 1997).  Interactivity, the hallmark of on-line instruction, however, can be used across the curriculum (Scheel, 2002; Thompson, 2002; Jaffee, 2003).  Regardless of delivery method, students benefit from regular mediated course discussion, active learning, and faculty responsiveness (Jaffee, 1997). 

References

Ammarell, G. 2000.  "Network Newsgroups as a Teaching Tool in the Social Sciences."  Teaching Sociology 28: 153-159.

Brahler, C.J., Peterson, N.S., and Johnson, E.C. 1999. "Developing On-Line Learning Materials for Higher Education: An Overview of Current Issues." Educational Technology and Society 2(2). http://ifets.ieee.org/periodical/vol_2_99/jayne_brahler.html

Benson, D.E., Haney, W., Ore, T.E., Hodges Persell, C., Schulte, A., Steele, J., and Winfield, I. 2002. "Digital Technologies and the Scholarship of Teaching and Learning in Sociology." Teaching Sociology 30: 140-157.

Brent, E. 1999. "Computers in the Undergraduate Classroom: Lessons from the first 2,000 Students." Social Science Computer Review 17: 162-175.

Brooks, M.J. 1997. "Beyond Teaching and Learning Paradigms: Trekking in the Virtual University." Teaching Sociology 27: 1-14.

Dietz, T.L. 2002. "Predictors of Success in Large Enrollment Introductory Courses: An Examination of the Impact of Learning Communities and Virtual Learning Resources on Student Success in an Introductory Level Sociology Course." Teaching Sociology 30: 80-88. 

Edwards, M.E., Cordray, S. and Dorbolo, J. 2000. "Unintended Benefits of Distance-Education Technology for Traditional Classroom Teaching." Teaching Sociology 28: 386-391.

Jaffee, D. 1997. "Asynchronous Learning: Technology and Pedagogical Strategy in a Distance Learning Course." Teaching Sociology 25: 262-277.

------ 2003. "Virtual Transformation: Web-Based Technology and Pedagogical Change." Teaching Sociology 31: 227-236.

King, K.M. 1994. "Leading Classroom Discussions: Using Computers for a New Approach." Teaching Sociology 22: 174-182.

Kuechler, M. 1999. "Using the Web in the Classroom."  Social Science Computer Review 17: 144-161.

Neuman, W.L. 2000. Social Research Methods: Qualitative and Quantitative Approaches, 4th edition. Boston: Allyn and Bacon. 

Persell, C.H. 2004. "Using Focused Web-Based Discussion to Enhance Student Engagement and Deep Understanding." Teaching Sociology 32: 61-78.

Rhoades, G. 1998. Managed Professionals: Unionized Faculty and Restructuring Academic Labor.  Albany, NY: State University of New York Press. 

Scarboro, A. 2004. "Bringing Theory Closer to Home Through Active Learning and Online Discussion." Teaching Sociology 32: 222-231.

Scheel, E.D. 2002. "Using Active Learning Projects to Teach Research Skills Throughout the Sociology Curriculum." Sociological Practice, 4: 145-170.

Schneider, A. 1998. "Sociology: the Internet as an Extended Classroom."  Social Science Computer Review 16: 53-57.

Schutte, J.G. 2004. "Virtual Teaching in Higher Education: The New Intellectual Superhighway or Just Another Traffic Jam?" On-line paper. www.csun.edu/sociology/virexp.htm

Sellani, R., and Harrington, W. 2002. "Addressing Administrator/Faculty Conflict in an Academic Online Environment." The Internet and Higher Education 5: 131-145.

Southard, P.A. D. 1997.  "Expanding Horizons: Electronic Communication and Classroom Teaching in Sociology." Social Science Computer Review 15: 427-430.

Steiger, T. L. and Levine, R.F. 1999. "Using Computer Listserves to Achieve a More Diverse Classroom: The 'Virtual Salon'."  Critical Sociology 25: 36-58.

Stivers, R. 1999. "The Computer and Education: Choosing the Least Powerful Means of Instruction." Bulletin of Science, Technology and Society 19: 99-104.

Thompson, H. 2002. "Cyberspace and Learning." Electronic Journal of Sociology 6(1). http://www.sociology.org/content/vol006.001/thompson.html



©2007 by the North Carolina Sociological Association