Journal of the Scholarship of Teaching and Learning, Vol. 10, No. 2, June 2010, pp. 1-13.
Creating effective student engagement in online courses:
What do students find engaging?
Marcia D. Dixson¹
Abstract: While this paper set out to discover what activities and/or interaction
channels might be expected to lead to more highly engaged students, what it
found was a bit different. After first creating a scale to measure online student
engagement, and then surveying 186 students from six campuses in the Midwest,
the results indicate that there is no particular activity that will automatically help
students to be more engaged in online classes. However, the results also suggest
that multiple communication channels may be related to higher engagement and
that student-student and instructor-student communication are clearly strongly
correlated with higher student engagement with the course in general. Thus, the advice for
online instructors is still to use active learning but to be sure to incorporate multiple,
meaningful ways of interacting with students and to encourage/require students to interact
with each other.
Keywords: active learning, online teaching, social presence, student engagement
There are two primary reasons for studying student engagement in online courses. The first is
that online courses are here to stay and their enrollment continues to grow dramatically, so we
need to do them well. In fall 2005, 3.2 million higher education students in the United States
were taking at least one online course, up from 2.3 million the previous year (Allen and
Seaman, 2006). The second reason is that one of the primary
components of effective online teaching (or any other teaching, for that matter) is student
engagement. Therefore, it is imperative that we learn what engages students in order to offer
effective online learning environments.
I. Effective Online Instruction.
Research into effective online instruction supports three conclusions: 1) online instruction can be
as effective as traditional instruction; 2) to achieve that effectiveness, online courses need
cooperative/collaborative (active) learning; and 3) they need strong instructor presence.
A. As effective as traditional.
Several researchers have found that online students can and often do outperform traditional
students (Maki and Maki, 2007). Maki and Maki (2007) found that students were often required
to do more in online courses than in traditional courses. They also concluded that, to be effective,
online instruction required strong methodology and opportunities for students to interact with
each other and the instructor. Other researchers have echoed these findings, discovering that
¹ Department of Communication, Indiana University Purdue University Fort Wayne, 2101 E. Coliseum, Fort Wayne, IN 46805
online students report learning more and spending more time on task (Robertson, Grant, and
Jackson, 2005), being more engaged than traditional students according to the NSSE (National
Survey of Student Engagement) averages (Robinson and Hullinger, 2008), having higher
achievement and performing better (Connolly et al., 2007; Lim et al., 2008). Like Maki and Maki,
Zhao, Lei, Yan, Lai, and Tan (2005) reported that students do better with instructor interaction and
communication. The potential for online courses to be as or more effective than traditional
courses is there. What does it take to accomplish this? Other research indicates the potential may
be realized with active learning strategies.
B. Cooperation/collaboration.
One of the recurrent themes in the literature is the effectiveness of using collaborative activities,
group discussions, and other forms of student-student interaction. Gaytan and McEwen (2007)
found rapport and collaboration between students, thought-provoking questions, and dynamic
interaction among the top instructional processes identified by instructors and students. They
believe an interactive and cohesive environment that includes group work, regular assignments,
and solid feedback is needed for success. Levy (2008) found that collaborative activities, along with
other interactions such as reading students’ posts, were valued by students. Graham et al. (2001)
state that a “well designed discussion facilitates meaningful cooperation” (p. 2).
Collaborative/interactive activities seem to be a necessary component to effective online
instruction.
A few articles state that a variety of instructional methods are needed for effective online
instruction (Chickering and Ehrmann, 1996; Gaytan and McEwen, 2007). However, only one
article mentions specific strategies, such as moving away from recorded lectures, readings,
homework, and tests toward more interactive and active learning environments like virtual teams,
games, and case studies (Johnson and Aragon, 2003). Active learning is also touted as a way to
engage students in the online environment (Chickering and Ehrmann, 1996). However, active
learning, like collaboration, is a broad term and can encompass everything from students being
given the opportunity to “talk about what they are learning” to students using simulation
software and designing “radio antenna” (Chickering and Ehrmann, 1996). One area that deserves
investigation is the specific types of active learning or collaboration in online courses that
students find engaging. Thus, the first two research questions are posited:
RQ1: What types of active learning in online courses do students report as engaging?
RQ2: Is there a difference in the active learning activities reported by high engagement versus low engagement students?
C. Instructor presence.
The third conclusion from the literature is that instructors need to be actively involved in the
learning of their students (Gaytan and McEwen, 2007; Young, 2006). At a minimum, instructors
should be active in discussions (Dennen et al., 2007; Levy, 2008; Shea, Li, and Pickett, 2006;
Young, 2006) and use email appropriately (Dennen et al., 2007; Gaytan and McEwen, 2007;
Levy, 2008). Dennen et al. (2007) did find, however, that too much instructor participation in
discussion boards can actually decrease student participation.
Social presence of instructors and students is a concern of online researchers. Social
presence is the phenomenon that helps translate virtual activities into impressions of “real”
people. Kehrwald (2008) defines social presence as “performative, that is, it was demonstrated
by visible activity; posting messages, responding to others, and participating in the activities of
the groups” (pp. 94-95). Such activities offer clues about the individual such as histories,
personalities, and current circumstances and help online participants experience “other
participants as both real in the sense of being a real person (a human being) and present in the
sense of being there in (coexisting, inhabiting) the virtual environment.” (p. 95). “Effective
design, facilitation, and direction of cognitive and social processes” are the defining activities of
teacher presence, according to Shea, Li, and Pickett (2006). Several researchers feel that social
presence, especially on the part of instructors, is a necessary component of effective online
instruction (Dennen et al., 2007; Goertzen and Kristjansson, 2007; Hughes, 2007; Kehrwald,
2008; Shea, Li, and Pickett, 2006).
Emphasis on the social presence of instructors makes sense in light of research finding
that students need to feel connected to the instructor and other students in the course (Garrison,
Anderson and Archer, 2001; Lewis and Abdul-Hamid, 2006; Russo and Campbell, 2004; Song
and Singleton, 2004; Swan, 2002; Swan, Shea, et al., 2000) as well as to the content being
studied. In an online course, where the risk of students feeling isolated is of greater concern
(Lewis and Abdul-Hamid, 2006; Ortiz-Rodriguez et al., 2005; Russo and Campbell, 2004; Song
and Singleton, 2004), it may be even more important that learning include student-to-student and
student-to-instructor communication.
What communication activities between students, and between students and instructors, are most
likely to help students feel connected to and engaged with the course? The last two research
questions are:
RQ3: What types of student-student communication are reported by highly engaged students versus students who report less engagement?
RQ4: What types of student-instructor communication are reported by highly engaged students versus students who report less engagement?
Finally, given the previous conclusions, both instructor-student and student-student
communication should be significantly related to the student’s report of overall engagement with
the course:
H1: Reported level of instructor presence will be significantly correlated with student engagement.
H2: Reported level of student presence will be significantly correlated with student engagement.
II. Methods.
A. Instrumentation.
Because there was no scale to measure online student engagement, the first stage of the
project was to develop a measure of student engagement in online courses. Two student
engagement instruments and one measure of interaction within online courses were consulted:
The Classroom Survey of Student Engagement (CLASSE) (Smallwood, 2006), the Student
Course Engagement Questionnaire (SCEQ) (Handelsman, Briggs, Sullivan, and Towler, 2005)
and the Rubric for Assessing Interactive Qualities in Distance Courses (RAIQDC) (Roblyer and
Wiencke, 2004). Each of these instruments is a strong tool in its own right; none, however, is fully
appropriate for measuring student engagement in online courses. The first two instruments include items
such as “Came to class without having completed readings or assignments” (CLASSE) and
“Raising my hand in class” (SCEQ). The RAIQDC, designed for online courses, asks students to
rate such items as “By end of course, most students (50-75%) are replying to messages from the
instructor . . .” Rather than reporting their own experienced engagement with the course, students
are asked to report their perceptions of other students’ engagement, a less than optimal way to
measure student engagement.
The Student Course Engagement Questionnaire (SCEQ) was chosen as a foundation
because its creators contend that student course engagement consists of four factors: skills
engagement (staying up on readings, putting forth effort); emotional engagement (making the
course interesting, applying it to my life); participation/interaction engagement (having fun,
participating actively in small group discussions); and performance engagement (doing well on
tests, getting a good grade) (Handelsman, Briggs, Sullivan, and Towler, 2005, p. 187). These
factors not only make intuitive sense as indications of a student’s active pursuit of learning in a
course, but are also grounded in theories of motivation, self, and students’ mastery/performance
orientations.
Next, a focus group of online instructors was asked what students who were engaged in
an online course would “look like” in terms of skills, emotional, participation and performance
engagement. The results of the focus group were used to adapt the Handelsman et al. instrument
to the online environment. Some items, such as “Listening carefully in class,” and “Taking good
notes in class” were replaced with items like “Listening/reading carefully” and “Taking good
notes over readings, PowerPoints, or video lectures.”
Reliability of the pilot with 31 online students was strong (0.95), and the scale correlated
strongly with two global items on engagement with the course (r = 0.73; p < 0.01) and with two
global items of social presence (getting to know other students and your instructor) (r = 0.38; p <
0.05), thus supporting face validity.
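For readers who want to run a similar reliability check, the short Python sketch below computes Cronbach's alpha for a set of scale items and a Pearson correlation with a global item. It is only an illustration: the data frame, column names, and simulated responses are hypothetical, not the study's data or the author's analysis code.

import numpy as np
import pandas as pd
from scipy.stats import pearsonr

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score).
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 31 respondents x 30 OSE items (1-5 Likert responses).
rng = np.random.default_rng(0)
pilot = pd.DataFrame(rng.integers(1, 6, size=(31, 30)),
                     columns=[f"SE{i}" for i in range(1, 31)])
# Hypothetical global engagement rating for each respondent.
global_item = pilot.mean(axis=1) + rng.normal(0, 0.3, size=31)

alpha = cronbach_alpha(pilot)
r, p = pearsonr(pilot.mean(axis=1), global_item)
print(f"alpha = {alpha:.2f}, r = {r:.2f}, p = {p:.3f}")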
B. Data gathering.
Online instructors on multiple main and regional campuses of two large Midwestern universities
were contacted to request they pass along an email/announcement to their students inviting them
to complete the online survey of student engagement. Instructors were contacted via a teaching
organization, through the schedule of classes, and through campus teaching centers that were
asked to pass along the request. To give instructors more incentive to participate, they were offered
aggregate data from their own course if five or more students participated.
Participants. A total of 186 students from six campuses and 38 courses completed surveys. The
sample included students from courses in communication, economics, English, nursing,
psychology, sociology, and tourism management. Because of the offer to share aggregate data
with instructors and to lessen potential student fears about instructors’ abilities to identify
individual student responses, no demographic data beyond campus and course were requested.
Scale validation. An exploratory factor analysis was run to validate the scale
measurement of the four types of engagement: skills, emotional, participation and performance.
As recommended by Allen, Titsworth, and Hunt (2009, pp. 180-182), a predetermined number of
factors (four) was entered into a principal axis factoring analysis with promax rotation. An item
was only considered for a factor if it had a loading of 0.60 or higher on that factor and no
secondary loading of 0.40 or higher. The results of the KMO and Bartlett’s Test were appropriate
to continue the factor analysis. Nineteen of the thirty items loaded onto the four factors (see
Appendix A for KMO and Bartlett’s Test results and pattern matrix of factor loadings). The 19
items yielded a Cronbach alpha of 0.91 and had a significant correlation with the global course
engagement item (r = 0.67; p < 0.001). Therefore, only these 19 items were used in the rest of
the analysis (see Appendix B for the Online Student Engagement Scale).
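For those wishing to replicate this kind of scale validation, the sketch below shows how a four-factor principal axis factoring with promax rotation, preceded by KMO and Bartlett's tests, might be run in Python with the third-party factor_analyzer package. The file name and DataFrame are assumptions, and the retention rule simply mirrors the cutoffs described above; it is not the author's original procedure or software.

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Assumed input: respondents x 30 OSE item scores (1-5), one column per item.
items = pd.read_csv("ose_items.csv")  # hypothetical file name

# Sampling adequacy and sphericity checks before factoring.
chi_square, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_total = calculate_kmo(items)
print(f"Bartlett chi-square = {chi_square:.1f}, p = {p_value:.4f}; KMO = {kmo_total:.3f}")

# Predetermined four factors, principal axis factoring, promax (oblique) rotation.
fa = FactorAnalyzer(n_factors=4, method="principal", rotation="promax")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=[f"Factor{i}" for i in range(1, 5)])

# Keep items loading >= 0.60 on one factor with no secondary loading >= 0.40.
primary = loadings.abs().max(axis=1)
secondary = loadings.abs().apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
retained = loadings[(primary >= 0.60) & (secondary < 0.40)]
print(retained.round(2))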
Besides the scale of engagement and global engagement items, students were also asked
three other questions: 1) What assignments, activities, requirements of this course
helped/encouraged/required you to really think about and be interested in the content of the
course (just list one or two)?; 2) What assignments, activities, requirements of this course
helped/encouraged/required you to interact with the instructor (just list one or two)?; 3) What
assignments, activities, requirements of this course helped/encouraged/required you to interact
with other students (just list one or two)?
Analysis. The answers to the three open-ended questions were then grouped into categories of
ways of communicating and/or activities. For instance, activities to engage with course content
included quizzes/tests, papers, application of the content, discussion forums, projects, and
lectures/connect session. Ways of interacting with other students included forums, group papers
or projects, chats and connect sessions, e-mailing, and peer review. Ways of interacting with the
instructor included chats and connect sessions, feedback on assignments, e-mail, forums, and
lectures. There was a wide variety of activities in each of the three categories. For each student,
the first activity they listed was the one coded.
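The grouping of open-ended answers described above was done by reading and categorizing responses. Purely as an illustration of how a first rough pass might be automated, the sketch below maps the first matching keyword in each response to a category; the keyword list and sample responses are made up and are not the study's coding scheme.

import pandas as pd

# Hypothetical keyword-to-category map for first-listed content activities.
CATEGORIES = {
    "quiz": "Quizzes/Tests", "test": "Quizzes/Tests",
    "paper": "Papers", "forum": "Discussion forums",
    "discussion": "Discussion forums", "project": "Projects",
    "lecture": "Lectures/Connect", "connect": "Lectures/Connect",
    "case": "Application", "apply": "Application",
}

def code_response(text: str) -> str:
    # Return the category of the first keyword (in dictionary order) found in the response.
    if not isinstance(text, str) or not text.strip():
        return "Missing"
    lowered = text.lower()
    for keyword, category in CATEGORIES.items():
        if keyword in lowered:
            return category
    return "Other"

responses = pd.Series(["weekly discussion forum posts", "applying concepts to case studies", ""])
print(responses.map(code_response).tolist())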
III. Results.
RQ1: What types of active learning in online courses do students report as engaging?
Students reported a number of types of activities as engaging. These included application
activities (having to apply the concepts to case studies or problem solving); discussion forums
about the concepts; labs and group projects; research papers; and current events assignments. To
confirm that such active learning assignments are more engaging than passive learning
assignments, an ANOVA was run to compare the engagement of students reporting Active
activities (listed above) with those reporting Passive activities (reading, taking quizzes,
watching/looking at PowerPoints or video lectures) and those reporting none (no activity was
engaging). Students not answering the question were omitted from the analysis. There was a
significant difference in the reported engagement of students reporting Active (n = 102; M =
3.47; SD = 0.67), Passive (n = 36; M = 3.45; SD = 0.72), and No engaging activities (n = 8; M =
2.8; SD = 1.0); F (2, 143) = 3.28; p < 0.05. The Tukey HSD post-hoc comparisons indicate the
significant differences occurred between active and none (p = 0.03; mean difference = -0.66)
and between passive and none (p = 0.05; mean difference = -0.64). Therefore, students who
could report some type of activity that motivated them to interact with the content of the
course (whether passive or active) were significantly more engaged than students who did not feel
there were any such activities in the course.
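A one-way ANOVA with Tukey HSD post-hoc comparisons of this kind could be run as in the sketch below. The file and column names are assumptions; the output depends on the actual data and is not meant to reproduce the exact values reported above.

import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Assumed input: one row per student, with the mean OSE score ('engagement')
# and the coded activity type ('Active', 'Passive', or 'None').
df = pd.read_csv("engagement_by_activity.csv")  # hypothetical file name

groups = [g["engagement"].to_numpy() for _, g in df.groupby("activity_type")]
f_stat, p_val = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_val:.3f}")

# Post-hoc pairwise comparisons (Tukey HSD) among the three groups.
tukey = pairwise_tukeyhsd(endog=df["engagement"], groups=df["activity_type"], alpha=0.05)
print(tukey.summary())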
RQ2: Do highly engaged students report different activities than students who report less engagement?
None of the chi-square tests run to determine whether highly engaged students reported
significantly different kinds of activities than less engaged students was significant; the research
question was answered in the negative. Highly engaged students, those who reported
engagement scores above the mean of 3.4, did not report significantly different activities than
students who reported low engagement: Chi-square (df = 10) = 11.23, ns. Table 1 shows the
cross-tabulation of course activity by level of student engagement.
Table 1. Table for Course Activity by Level of Engagement.

                                           Course Activity
Engagement Level  Missing  Application  Lecture/Connect  None  Other  Papers  Project  Quiz  Readings  Research  Total
Low                  15        12              2           5     11      4       4       4       7         6       76
High                  9        13              9           4     15      8       7       3       6        12      100
Total                24        25             11           9     26     12      11       7      13        18      176
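The chi-square tests of independence used for RQ2 through RQ4 can be computed with scipy, as in the minimal sketch below. The 2 x 3 contingency table is invented for illustration and is not the study's data.

import numpy as np
from scipy.stats import chi2_contingency

# Illustrative contingency table: engagement level (rows: low, high) by
# first-listed activity category (columns). Counts are made up.
observed = np.array([
    [12, 20, 8],
    [25, 15, 10],
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"Chi-square (df = {dof}) = {chi2:.2f}, p = {p:.3f}")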
RQ3: Do highly engaged students report different student-student communication activities than students who report less engagement?
The results for this question approached significance: Chi-square (df = 7) = 14.03; p =
0.051. Both highly engaged and less engaged students reported similar channels of student-
student communication: discussion forums, group work, peer reviews, and chat/connect sessions.
However, highly engaged students were twice as likely to report using discussion forums to
interact with other students and were the only students who reported web projects and webpages
as a means of interaction. Table 2 shows the breakdown of student-student communication by
level of engagement.
Table 2. Table for Student-Student Interaction by Level of Engagement.

                             Student-Student Interaction Channel
Engagement Level  Missing  Chat/Connect  Forum  Group Projects  None  Other  Peer Revisions  Webpages  Total
Low                  15         3          15         3          20     12         8             0       76
High                 16         2          29         5          14     12        12            10      100
Total                31         5          44         8          34     24        20            10      176
RQ4: Do highly engaged students report different instructor-student communication activities than students who report less engagement?
This result was not significant: Chi-square (df = 8) = 9.05, ns. Both sets of students reported
email, feedback on assignments, connect/chat sessions, lectures and discussion forums as ways
they interacted with their instructors. Table 3 breaks down the instructor-student communication
by level of student engagement.
Table 3. Student-Instructor Interaction by Level of Engagement.

                             Instructor-Student Interaction Channel
Engagement Level  Missing  Feedback on Assignments  Chat/Connect  Email  Forums  Lecture  None  Other  Quizzes/Tests  Total
Low                  18               7                   3          8      6        6      16     7          5          76
High                 17              16                   6         18      8        8       9    11          7         100
Total                35              23                   9         26     14       14      25    18         12         176

A. Follow-up.

Although students were requested to list “one or two” activities or ways they interacted with
fellow students or with instructors, many listed “none” (not the same as not answering the
question), just one, or several. The fact that some would spontaneously list more than requested
was somewhat surprising. Because of this, a follow-up test was run to compare the simple
number of reported activities with reported engagement. Table 4 indicates that students
spontaneously reporting multiple ways of interacting with other students and of communicating with
instructors had significantly higher levels of engagement with the course in general than those
who reported “None.” The finding suggests that multiple opportunities for communication may
be more important than any particular channel. However, given that this was not a proposed
research question, more data would need to be gathered to confirm this suggestion.
Table 4. Means for engagement based on number of activities reported.

Number of activities/            Content        Student-Student  Instructor-Student
communication methods reported   Activities     Interaction      Interaction
None                             3.03           3.24             3.11
One                              3.44           3.42             3.47
Two or more                      3.51           3.73             3.70
F test                           1.60           3.46*            5.62**
                                 (df = 2, 148)  (df = 2, 143)    (df = 2, 143)

*significant at 0.05; **significant at 0.01
H1: Reported level of instructor presence will be significantly correlated with student engagement.
The hypothesis was supported. The mean for engagement was 3.41 (SD = 0.70) while the
mean for instructor presence was 2.96 (SD = 1.24); r = 0.41, p < 0.001.
H2: Reported level of student presence will be significantly correlated with student engagement.
The second hypothesis was also supported. As stated previously, the mean for engagement
was 3.41 (SD = 0.70), while the mean for student presence was 1.83 (SD = 0.98), not terribly
high on a 5-point scale; r = 0.42, p < 0.001.
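The tests of H1 and H2 are simple Pearson correlations. A minimal sketch, assuming a data frame with an engagement score and the two global presence items (the column and file names are hypothetical), is shown below.

import pandas as pd
from scipy.stats import pearsonr

# Assumed columns: 'engagement', 'instructor_presence', 'student_presence'.
df = pd.read_csv("survey_scores.csv")  # hypothetical file name

for presence in ["instructor_presence", "student_presence"]:
    paired = df[["engagement", presence]].dropna()
    r, p = pearsonr(paired["engagement"], paired[presence])
    print(f"{presence}: r = {r:.2f}, p = {p:.4f}")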
IV. Discussion.
While the findings are somewhat disappointing, a couple of interesting results emerged from this
study. First, the finding of no significant difference in student engagement levels between those
reporting active vs. passive activities indicates that a myriad of content activities can be used to
engage students in online courses. However, active learning assignments, particularly discussion
forums and web pages, may serve the secondary purpose of helping to develop students’ social
presence. Given the research regarding the potential for social isolation (Lewis and Abdul-
Hamid, 2006; Ortiz-Rodriguez, et al., 2005; Russo and Campbell, 2004; Song and Singleton,
2004) of the online learner, instructors should consider learning assignments that engage students
with the content and with each other. Across many types of courses, when students readily
identified multiple ways of interacting with other students as well as of communicating with
instructors, they reported higher engagement in the course. The importance of this idea is further
supported by the significant correlation of student course engagement with both the global item
on instructor presence and the global item on student presence. So, instructors should consider
assignments in which students interact with each other and with the content of the course. Instructors
need to create not just opportunities for students to interact but the requirement that they do so.
Students who are working on group projects together, doing peer review of one another’s papers,
or interacting within a discussion forum on a particular topic are likely to feel more engaged in the
course. Simply offering the opportunity, i.e., having an open discussion forum in which they can
(but are not required to) participate, is probably not enough.
Beyond this, the findings indicate instructors also need to provide multiple ways of
interacting with students themselves to create their own social presence that the literature
confirms is an integral component of a successful online course (Dennen et al., 2007; Goertzen
and Kristjansson, 2007; Hughes, 2007; Kehrwald, 2008; Shea, Li, and Pickett, 2006). For
instance, instructors can use several channels to enhance engagement: announcements on the
homepage of the course delivery system, e-mails to students, discussion forums in which the
instructor interacts, and online lectures or connect sessions and chats. However, as stated earlier,
the result that more channels of communication are reported by more highly engaged students
cannot be considered with confidence until further testing is completed.
Clearly, the path to student engagement, based on these data, is not about the type of
activity/assignment but about multiple ways of creating meaningful communication between
students and with their instructor: it’s all about connections. While the study did not find
specific activities that engage students more in an online course, it did yield some interesting
insights into teaching online and the importance of social presence of both other students and the
instructor.
Beyond the results of the study itself, the introduction of a scale to measure online
student engagement is a step forward in our understanding of online teaching. The scale, with
further validation, could prove very useful to research into online learning and teaching.
A. Limitations.
Limitations of the study are standard. While the sample is fairly good sized, all of the students
are from two Midwestern universities, although the inclusion of both main and regional
campuses allows for varying campus sizes and a pool of traditional and nontraditional
students. The primary limitation of this study is that, in order to get at the information desired,
a different methodological design may be indicated. To discover whether discussion forums work
better than e-mails for students interacting with each other, students using both would need to
rate their relative effectiveness regarding course engagement.
B. Implications.
As usual, the study raises as many questions as it answers: it confirms the importance of
student-to-student interaction and instructor-to-student interaction but suggests that more than one
method for such interaction may be important for students to be engaged in the course. However, the
findings indicate that particular types of activities are not necessarily more effective in engaging
students in the online learning community. Comparing assignments that actively engage students
with content and with each other against assignments that do not accomplish both tasks would be
worth pursuing. Clearly, more research is desired and required.
In conclusion, this study emphasizes the importance of developing real connections in
online courses. Instructors need to create active learning situations in which students can
meaningfully apply what they are learning. However, meaningful communication opportunities
also need to be integrated into online courses. Such connections really help students to feel
engaged with the courses they are taking despite the lack of the physical presence of the
instructor or other students.
Acknowledgements
This research was partially supported by a Mack Fellowship Grant from the Mack Center
of Indiana University.
Appendices
Appendix A. Factor Analysis Tables for the Online Student Engagement Scale (OSE).
KMO and Bartlett's Test

Kaiser-Meyer-Olkin Measure of Sampling Adequacy: 0.910
Bartlett's Test of Sphericity: Approx. Chi-Square = 3281.745; df = 435; Sig. = 0.000
Pattern Matrix

Item   Loading     Item   Loading     Item   Loading
SE1    0.643       SE11   0.986       SE21   1.000
SE2    0.643       SE12   0.863       SE22   0.529
SE3    0.411       SE13   0.560       SE23   --
SE4    0.650       SE14   0.687       SE24   0.886
SE5    0.647       SE15   0.500       SE25   0.394
SE6    0.740       SE16   0.568       SE26   0.907
SE7    0.942       SE17   0.678       SE27   0.569
SE8    0.724       SE18   0.870       SE28   0.427
SE9    0.453       SE19   0.623       SE29   0.682
SE10   0.997       SE20   0.994       SE30   0.490

Extraction Method: Principal Axis Factoring.
Rotation Method: Promax with Kaiser Normalization. Rotation converged in 6 iterations.
Note: each item's reported loading on the four extracted factors is shown; a dash indicates no loading was reported.
Appendix B. Online Student Engagement Scale (OSE).
1. Making sure to study on a regular basis SKILLS
2. Putting forth effort EMOTIONAL
3. Doing all the homework SKILLS
4. Staying up on the readings SKILLS
5. Looking over class notes between getting online to make sure I understand the material
SKILLS
6. Being organized SKILLS
7. Taking good notes over readings, PowerPoints, or video lectures SKILLS
8. Listening/reading carefully SKILLS
9. Entering the online class multiple times a week PARTICIPATION
10. Finding ways to make the course material relevant to my life EMOTIONAL
11. Applying course material to my life EMOTIONAL
12. Finding ways to make the course interesting to me EMOTIONAL
13. Thinking about the course between times I am online EMOTIONAL
14. Really desiring to learn the material EMOTIONAL
15. Visiting or calling the instructor with questions about the material and/or assignments
PARTICIPATION
16. Emailing or posting questions when I don’t understand the material and/or assignments
PARTICIPATION
17. Having fun in online chats, discussions or via email with the instructor or other students
PARTICIPATION
18. Participating actively in small-group discussion forums PARTICIPATION
19. Helping fellow students PARTICIPATION
20. Getting a good grade PERFORMANCE
21. Doing well on the tests/quizzes PERFORMANCE
22. Being confident that I can learn and do well in the class PERFORMANCE
23. Taking advantage of all class resources (i.e., extra links, readings etc.) SKILLS
24. Engaging in conversations online (chat, discussions, email) PARTICIPATION
25. Critically thinking about my own ethics, priorities, beliefs and values in the context of the
class EMOTIONAL
26. Posting in the discussion forum regularly PARTICIPATION
27. Emailing the instructor regarding my grade in the class PERFORMANCE
28. Checking my grades online PERFORMANCE
29. Getting to know other students in the class PARTICIPATION
30. Assessing my own learning and progress in the class PERFORMANCE
References
Allen, I.E. and Seaman, J. (2006). Making the Grade: Online Education in the United States
2006. Needham, MA: Sloan-C. Retrieved October 13, 2007 from Sloan Consortium Publications
website: http://www.sloan-c.org/publications/survey/survey06.asp
Allen, M., Titsworth, S. and Hunt, S.K. (2009). Quantitative Research in Communication.
London: Sage.
Chickering, A.W. and Ehrmann, S.C. (1996, October). Implementing the seven principles:
Technology as a Lever. AAHE Bulletin, 3-6. Retrieved October 7, 2007 from Teaching,
Learning and Technology Group Website: http://www.tltgroup.org/programs/seven.html.
Connolly, T.M., MacArthur, E., Stansfield, M. and McLellan, E. (2007). A quasi-experimental
study of three online learning courses in computing. Computers & Education 49, 345-359.
Dennen, V.P., Darabi, A.A. and Smith, L.J. (2007). Instructor-learner interaction in online
courses: The relative perceived importance of particular instructor actions on performance and
satisfaction. Distance Education, 28(1), 65-79.
Garrison, D.R., Anderson, T. and Archer, W. (2001). Critical thinking, cognitive presence and
computer conferencing in distance education. The American Journal of Distance Education,
15(1), 7-23.
Gaytan, J. and McEwen, B.C. (2007). Effective online instructional and assessment strategies.
The American Journal of Distance Education, 21(3), 117-132.
Goertzen, P. and Kristjansson, C. (2007). Interpersonal dimensions of community in graduate
online learning: Exploring social presence through the lens of systemic functional linguistics.
The Internet and Higher Education, 10(3), 212 – 230.
Graham, C., Cagiltay, K., Lim, B., Craner, J., and Duffy, T.M. (2001). Seven principles of
effective teaching: A practical lens for evaluating online courses. Assessment, March/April.
Handelsman, M.M., Briggs, W.L., Sullivan, N. and Towler, A. (2005). A measure of college
student course engagement. The Journal of Educational Research, 93(3), 184-191.
Hughes, G. (2008). Diversity, identity and belonging in e-learning communities: Some theories
and paradoxes. Teaching in Higher Education, 12(5-6), 709-720.
Johnson, S.D. and Aragon, S.R. (2003, Winter). An instructional strategy framework for online
learning environments. New Directions for Adult and Continuing Education, (100), 31-43.
Kehrwald, B. (2008). Understanding social presence in text-based online learning environments.
Distance Education, 29(1), 89-106.
Levy, Y. (2008). An empirical development of critical value factors (CVF) of online learning
activities: An application of activity theory and cognitive value theory. Computers & Education,
51, 1664-1675.
Lewis, C.C. and Abdul-Hamid, H. (2006). Implementing Effective Online Teaching Practices:
Voices of Exemplary Faculty. Innovative Higher Education, 31, 2, 83-98.
Lim, J., Kim, M., Chen, S.S., and Ryder, C.E. (2008). An empirical investigation of student
achievement and satisfaction in different learning environments. [Electronic version]. Journal of
Instructional Psychology, 35 (2).
Maki, R.H. and Maki, W.S. (2007). Online Courses. In F.T. Durso (Ed.), Handbook of applied
cognition (2nd ed., pp. 527-552). New York: Wiley & Sons, Ltd.
Ortiz-Rodríguez, M., Telg, R. W., Irani, T., Roberts, T. G. and Rhoades, E. (2005). College
students’ perceptions of quality in distance education: The importance of communication.
Quarterly Review of Distance Education, 6, 97-105.
Robertson, J.S., Grant, M.M. and Jackson, L. (2005). Is online instruction perceived as effective
as campus instruction by graduate students in education? Internet and Higher Education, 8, 73-
86.
Robinson, C.C. and Hullinger, H. (2008). New benchmarks in higher education: Student
engagement in online learning. [Electronic version]. Journal of Education for Business, 84(2),
101-109.
Roblyer, M.D. and Wiencke, W.R. (2004, December). Exploring the interaction equation:
Validating a rubric to assess and encourage interaction in distance courses. Journal of
Asynchronous Learning Networks, 8(4). Retrieved October 9, 2007 from http://www.sloan-
c.org/publications/jaln/v8n4/v8n4_roblyer.asp.
Russo, T. C. and Campbell, S. W. (2004). Perceptions of mediated presence in an asynchronous
online course: Interplay of communication behaviors and medium. Distance Education, 25, 215-
232.
Shea, P., Li, C.S., and Pickett, A. (2006). A study of teaching presence and student sense of
learning community in fully online and web-enhanced college courses. The Internet and Higher
Education, 9, 175-190.
Smallwood, B. (2006). Classroom survey of student engagement. Retrieved October 11, 2007
from University of Northern Florida, Assessment at UNF Website:
http://www.unf.edu/acadaffairs/assessment/classe/overview.html
Song, L. and Singleton, E. S. (2004). Improving online learning: Student perceptions of useful
and challenging characteristics. Internet & Higher Education, 7, 59-70.
Swan, K. (2002). Building learning communities in online courses: The importance of
interaction. Education, Communication & Information, 2, 23-49.
Swan, K., Shea, P., Fredericksen, E., Pickett, A., Pelz, W., and Maher, G. (2000). Building
knowledge building communities: Consistency, contact and communication in the virtual
classroom. Journal of Educational Computing Research, 23, 4, 359-383.
Young, S. (2006). Student views of effective online teaching in higher education. The American
Journal of Distance Education, 20(2), 65-77.
Zhao, Y., Lei, J., Yan, B., Lai, C., and Tan, H.S. (2005). What makes the difference? A practical
analysis of research on the effectiveness of distance education. Teachers College Record, 107(8),
1836-1884.