Students' perceptions of instructors' roles in blended and online learning environments: A comparative study


Computers & Education 81 (2015) 315–325

Journal homepage: www.elsevier.com/locate/compedu

    Min-Ling Hung a, *, Chien Chou b

a Teacher Education Center, Ming Chuan University, 5 De Ming Rd., Gui Shan District, Taoyuan County 333, Taiwan
b Institute of Education, National Chiao Tung University, 1001 Ta-Hsueh Rd., Hsinchu 30010, Taiwan

Article history: Received 9 September 2014; received in revised form 23 October 2014; accepted 24 October 2014; available online 1 November 2014.

Keywords: Distance education and telelearning; Interactive learning environments; Teaching/learning strategies; Evaluation methodologies

* Corresponding author. Tel.: +886 3 350 7001; fax: +886 3 359 3887. E-mail address: mlhong@mail.mcu.edu.tw (M.-L. Hung).

http://dx.doi.org/10.1016/j.compedu.2014.10.022
0360-1315/© 2014 Elsevier Ltd. All rights reserved.

Abstract

This study develops an instrument, the Online Instructor Role and Behavior Scale (OIRBS), and uses it to examine students' perceptions of instructors' roles in blended and online learning environments. A total sample of 750 university students participated in this study. Through a confirmatory factor analysis, the OIRBS was validated in five constructs: course designer and organizer (CDO), discussion facilitator (DF), social supporter (SS), technology facilitator (TF), and assessment designer (AD). The results show that the five-factor structure remained invariant across the blended and online learning environments. Both students in blended learning environments and students in online learning environments exhibited the greatest weight in the CDO dimension, followed by the TF and DF dimensions. In addition, students in the online learning environments scored higher in the DF dimension than did those in the blended learning environments.

1. Introduction

With the development of technology rapidly expanding into higher education, online instruction has emerged as a popular mode and a substantial supplement to traditional teaching. Over the past few years, a growing number of studies have explored the perspectives of online instructors who use various technologies and pedagogies for teaching (e.g., Bailey & Card, 2009; Ellis, Hughes, Weyers, & Riding, 2009; Motaghian, Hassanzadeh, & Moghadam, 2013; Zingaro & Porter, 2014). Research in this field has generally concluded that educators regard traditional education as chiefly instructor-centered and online education as chiefly student-centered. Owing to the ongoing shift from traditional classroom-based education to online education, many instructors no longer have direct control of the teaching process; they act more as facilitators than as traditional lecturers (Arbaugh, 2010; Schoonenboom, 2012).

Instructors have many concerns when taking on the role of online educators. The preliminary concern is how to adapt to the relatively new role and thus effectively shoulder the related responsibilities required by online education. A significant role adjustment for students may be required as well if they want to be successful in an online learning environment. Students may shift from being a traditional passive classroom learner to being an active online inquirer. With such changes in learning contexts and in the roles of instructors and students, corresponding changes may have taken place in students' expectations and perceptions regarding the competence with which teachers should provide assistance, whether in advance of engaging in online studies or while in the process of doing so (Matzat, 2013; Zingaro & Porter, 2014).

In addition to online education, blended teaching is growing in popularity. Educators regard it as an essential teaching component that promotes effective learning (Matzat, 2013; Ocak, 2011). Dziuban, Moskal, and Hartman (2005) identified two principal advantages from which participants in blended teaching can benefit: strengthened learning engagement and strengthened interaction. However, Humbert (2007) showed that faculty members are under sometimes oppressive pressure to deal with online interactions and technical issues in blended courses. Ocak (2011) proposed that the reasons for faculty members' lack of interest in teaching blended courses include formidably complex course structures, the necessity of intimidatingly careful preparation and planning, and a lack of effective communication. In response to such problems, Salmon and Lawless (2006) stated that instructors' changing roles constitute a critical issue in blended teaching.

The purpose of this study, therefore, is to examine students' perceptions of instructors' roles and associated behaviors in learning environments that are entirely or partially web-based. In this study, we have adopted the definition that Lin and Overbaugh (2009) assigned to


the term blended instruction: it is teaching in which a blend of both traditional classroom instruction and online learning activities is utilized, including synchronous and asynchronous communication modes (p. 999). To examine students' perceptions of instructors' roles in different web-based learning environments, we created the Online Instructor Role and Behavior Scale (OIRBS) and examined its psychometric properties, each corresponding to one of two samples of students. The first sample comprised students enrolled in a course with a blended learning environment, while the second sample comprised students enrolled in a course with an online learning environment. In this regard, we asked four research questions:

1. Can a measurement model of the OIRBS be established?
2. If it can be established, is the measurement model of the OIRBS invariant across two distinct learning environments?
3. What perceptions do college students have of their instructors' roles in the two distinct learning environments?
4. Do the learning environments correspond to any difference in college students' perceptions of the roles and the associated behaviors of instructors?

2. Literature review

    2.1. Studies on online instructor functions

There is a growing understanding that teaching online is different from teaching face-to-face. Cho and Cho (2014) pointed out that online instructors' scaffolding for interaction had a significantly positive influence on students' behavioral and emotional engagement. This finding strongly suggests that a particular set of pedagogies should be in place to help online teachers teach. Knowlton (2000) argued that instructors no longer amount to an umpire, a judge, or a dictator; instead, they serve students in the capacity of a coach, a counselor, a mentor, and a facilitator. In an interview-based study of online instructors, Hsieh (2010) examined interactive activities, evaluation criteria, and self-expectations to identify experiences of online instructors. In a similar study, Liu, Bonk, Magjuka, Lee, and Su (2005) conducted interviews with 28 faculty members and explored four dimensions of online teachers' roles: the pedagogical, managerial, social, and technical dimensions. The aforementioned study suggested that instructors attempting to establish a more engaging environment for online learning should play roles that have been transformed pedagogically, socially, and technologically.

Another relevant study was undertaken by Lim and Lee (2008). These researchers argued that teachers in computer-supported learning environments should have technical, managerial, and facilitative skills, and that discussions about teachers' roles should be open to a more diverse set of views. Similarly, Wilson, Ludwig-Hardman, Thornam, and Dunlap (2004) directly identified five significant tasks that instructors should perform: (1) providing a learning-oriented infrastructure that comprises syllabi, calendars, communication tools, and instruction resources; (2) modeling various strategies for effective participation, collaboration, and learning; (3) monitoring and assessing students' learning and providing them feedback, remediation, and grades; (4) troubleshooting and resolving instructional, interpersonal, and technical problems; and (5) creating a learning community characterized by an atmosphere of trust and reciprocal concern.

Most of the prior literature, as mentioned above, was based more on conceptual development and qualitative interview data than on quantitative data analysis of instructors' changing roles; however, recent research on online teaching has started to probe perspectives drawn from solid, diverse samples. Mazzolini and Maddison (2007) investigated how instructors' participation rates, the timing of instructor postings, and the nature of these postings are related to students' academic engagement and to their perception of this engagement. The findings indicate that instructors' efforts to post on forums could influence students' discussions and participation on the forums in unexpected ways. Cho and Cho (2014) used a sample of 158 college students and found that instructors' role as a facilitator for social interaction is critical in creating positive online learning environments, a pattern that in turn promotes academic engagement among students. In fact, recent research has examined how instructors' characteristics, attitudes, and behaviors can influence online courses. For example, Liaw, Huang, and Chen (2007) presented questionnaires to a sample of 30 instructors and 168 college students. The results indicate that the instructors had very positive attitudes toward e-learning, particularly in regard to perceived self-efficacy, enjoyment, usefulness, and behavioral intention of use. Liaw et al. also noted that system satisfaction and multimedia instruction could positively affect instructors' attitudes toward and enjoyment of e-learning. Similarly, Arbaugh (2010) evaluated faculty members' characteristics and behaviors on display in 46 MBA courses offered by a Mid-Western U.S. university. According to the findings, instructor behavior is an important factor in the enhancement of student learning outcomes. Teaching presence and immediacy behaviors were positive predictors of students' perceived learning and satisfaction with the educational delivery medium. Hence, Arbaugh suggested that instructors should structure and organize their courses in advance so that they can focus on efficient engagement with their students while class is in progress.

A number of studies have empirically investigated educational Internet use, which has the potential to motivate students and to strengthen their interactive behaviors and their autonomy in the educational process (Claudia, Steil, & Todesco, 2004). However, some studies have shown that online instructors lack the time, the relevant training, or the support to make proper use of such Internet tools (Muir-Herzig, 2004). While conducting a study in the Netherlands, Mahdizadeh, Biemans, and Mulder (2008) noted that instructors used e-learning tools mainly to present course announcements, news, course materials, and PowerPoint displays. These uses were all for preliminary presentation purposes rather than for advanced communication or collaboration purposes. In other words, even when all kinds of e-learning tools are available, instructors tend to use relatively basic tools for teaching, instead of tools for online communication or collaboration. In order to promote instructors' effective online teaching and to eliminate any barriers in the teaching process, educators in general should strive to understand instructors' roles in online learning environments as well as instructors' associated behaviors.

    2.2. Online instructors' roles and behaviors

To adequately examine students' perceptions and perspectives of online instructors' roles and behaviors, researchers need an appropriate framework and a valid instrument with which they can categorize and measure participants' perceptions. Kim and Bonk (2006) argued that the most important skills for an online instructor are the ability to moderate or facilitate learning and the ability to develop

or plan for high-quality online courses. Liu et al. (2005) placed a stronger emphasis on online instructors' pedagogical roles, including those of course designer, profession-inspirer, feedback-giver, and interaction-facilitator. However, online instructors in general carry out a diverse array of important roles to varying degrees, and neither Kim and Bonk's study nor Liu et al.'s study addressed the particularly important role of assessment designer. Thus, the current study reviews the following additional dimensions that may be involved in instructors' roles and behaviors: course designer and organizer, discussion facilitator, social supporter, technology facilitator, and assessment designer. These roles are also the focus of our proposed Online Instructor Role and Behavior Scale (OIRBS), which will be discussed in greater detail in the method section.

2.2.1. Course designer and organizer

Anderson, Rourke, Garrison, and Archer (2001) stated that instructors' development of digitally formatted courses can get instructors to think through the process, structure, evaluation, and interaction components of the courses well before the first day of class. Instructors should establish clear guidelines for in-class student participation and should present their students with information about course expectations and procedures (Bailey & Card, 2009; Eom, Wen, & Ashill, 2006). In order to integrate technology components into a course, instructors should use course-management software tools, give students links to web sites and supplemental course materials, and generally make all course materials available to students by the first day of class (Bailey & Card, 2009). These learning arrangements can help students shoulder more responsibility for their own learning, can engage students in deeper and broader interactions with course materials, can promote student collaboration in learning processes, and thus can enhance the degree to which students' learning experiences are positive (Heuer & King, 2004). Shea, Li, and Pickett (2006) conducted a study by using a sample of 1067 students across 32 different colleges and found that students were more likely to report experiencing a greater sense of a learning community when the courses had effective instructional design and organization.

The dimension of course design and organization includes instructors' role in coordinating learning activities and in handling overall course structure. Online instructors use clearly structured content and timetables to convey to students the course expectations, in turn improving the quality of the given course and facilitating students' positive learning experience (Liu et al., 2005). In order to construct relevant items for course design and organization in our proposed OIRBS, we created a pool of behavioral items by both writing new items and adapting items from Liu et al. (2005). In this way, we selected three items, an example of which is "The instructor provides clear syllabi (e.g., goals, organization, policies, expectations, and requirements) to students at the beginning of the course."

2.2.2. Discussion facilitator

Rovai (2007) characterized online constructivist learning environments as discourses, typically in the form of online discussions. Hara, Bonk, and Angeli (2000) reported that online discussions can constitute a text-based digital record of concepts, plans, answers to questions, and strategies, and thus can facilitate meaningful processing of information. Other research shows that online discussion can help students reflect on their own perspectives (MacKnight, 2000), foster their own metacognitive skills (McDuffie & Slavit, 2003), and strengthen their own critical-thinking skills (Jeong, 2003). Thus, instructors should seek and implement such effective strategies for facilitating online discussion as promoting students' motivation to engage in productive discussions and engaging students in socio-emotional discussions and authentic content- and task-oriented discussions (Rovai, 2007). When facilitating discussion, instructors must assess student comments, give feedback on student comments, share opinions with students, ask questions of students, encourage students to explore new concepts in the course, keep students focused on the tasks at hand, draw out shy or reserved students, and praise students for their productive efforts (Arbaugh, 2010; Dringus, Snyder, & Terrell, 2010).

Arbaugh (2001) pointed out that instructors could promote discussion and feedback through the use of text-based discussion, emoticons, personal examples, or audio clips. Instructors' attempts to reduce the social and psychological distance between themselves and their students are often on display in the behaviors that the instructors exhibit when directly responding to students' own behaviors. From the above-mentioned studies, we have concluded that the role of discussion facilitator corresponds to an essential dimension in our OIRBS. We have created four behavioral items in one of two ways: either by writing novel items or by adapting concepts from Rovai (2007) and Arbaugh (2010). One such item is "The instructor encourages students to engage in critical and reflective thinking in online discussion."

2.2.3. Social supporter

Bailey and Card (2009) identified the fostering of relationships as a significant means by which online instructors would express their empathy for students, their passion for teaching, and their strong desire to help students succeed in their college-level learning. In this same vein, Yuan and Kim (2014) uncovered evidence that the dropout problem for some online programs may be attributable to a lack of interaction between learners and instructors. The lack of interaction can also lead to learners' feelings of isolation. Thus, learning communities can strengthen learners' interactions with one another and their instructors, and can, by the same token, alleviate the learners' feelings of isolation.

Kreijns, Kirschner, Jochems, and van Buuren (2007) defined sociability as the extent to which people perceive a computer-supported collaborative learning (CSCL) environment to be capable of facilitating the emergence of a sound social space with attributes such as trust and belonging, a strong sense of community, and good working relationships. Shea et al. (2006) conducted a study using Rovai's (2002a, 2002b) Classroom Community Index as a diagnostic instrument to investigate college students' levels of connectedness and learning in both online courses and in classroom-based courses with online components. These researchers found that if an instructor would reinforce student contributions, thereby furthering students' own knowledge and confirming student understanding, students would be more likely to report a strong sense of learning community.

Kang and Im (2013) proposed that instructional interaction can predict learners' outcomes and satisfaction in online learning environments. However, the researchers also showed that social interaction such as social intimacy could negatively affect perceived learning achievement and satisfaction. In order to construct the social supporter-related items in our proposed OIRBS, we developed some behavioral items. One such item is "The instructor helps foster a sense of community in this online course."

2.2.4. Technology facilitator

One element of the successful and effective implementation of online learning is related to effective use of technology (Bailey & Card, 2009). Educators have made efforts to integrate emerging Internet technologies into the teaching and learning process in higher education (Roby, Ashe, Singh, & Clark, 2013). For example, researchers have reported the plausibility of using blogs (Martindale & Wiley, 2005) and wikis (Lamb, 2004) for online student collaboration and reflection. However, Condie and Livingston (2007) argued that, in their sample, most instructors expressed little confidence in the technical aspects of using information and communication technology (ICT) for teaching. Instructors not only criticized ICT use's alleged benefits for the teaching subject at hand, but also exhibited unwillingness to learn proper ICT use for the promotion of students' learning.

This evidence of unwillingness and reluctance on the part of instructors raises the question of the extent to which learning institutions require instructors to adapt their practices or adopt new approaches for the purpose of maximizing new technologies' potential support of learning and teaching (Condie & Livingston, 2007; Schoonenboom, 2014). Brill and Galloway (2007) suggested that learning institutions should provide instructors an appropriate degree of support clarifying the diverse positive influences that equally diverse technologies can have on the classroom (e.g., presentations, interactions, attention); in turn, instructors can develop positive attitudes and proficiency in selecting the most useful technologies given specific pedagogical goals.

Berge (1995) pointed out that online instructors' technical role required them to have a minimum degree of knowledge, skill, and comfort in the presence of communication tools. Online instructors' technical roles included supporting students with technical resources, addressing technical concerns, diagnosing and clarifying problems encountered, and allowing students sufficient time to learn new programs (Bonk, Kirkley, Hara, & Dennen, 2001). A general principle seems to be that technology can create or bolster unique opportunities for promoting reflective and collaborative learning (Frank, 2006). These technical functions depend on the degree to which instructors not only become comfortable and proficient with the technology being used but also can transfer that level of comfort to the learners (Liu et al., 2005). Since online instructors can strengthen learners' comfort with course-based technical supports, the current study proposes to examine behavioral items related to instructors' technology use and support in our OIRBS. One such item is "The instructor uses tools and technologies (e.g., PowerPoint, audio devices, video devices, multimedia) that foster our learning."

2.2.5. Assessment designer

Bransford, Brown, and Cocking (1999) argued that learner-centered environments emphasize forms of assessment that make learners' thinking visible to themselves. Traditional tests usually impose a standardized procedure on all students in a class at the same controlled time and location. In contrast, online assessment and evaluation activities, without face-to-face interactions and observations, can be a completely different process from those in a traditional teaching situation (Azza, 2001).

Online testing, if it is to be rigorous, must address the issues of identity security and academic honesty. Researchers have paid some attention to the growing ease with which students undertake text plagiarism using the Internet (Liu, Lo, & Wang, 2013; McMurtry, 2001). Searching the Internet for pertinent textual information, finding it, copying it and pasting it into a document, and passing the document off as an original work can be done easily and quickly (Rovai, 2000). This activity can be a grave problem for online tests. Another problem with online tests is that instructors cannot easily ensure the simultaneity of all students' test-taking (Olt, 2002). If simultaneity is not enforced, earlier test-takers can supply answers to later test-takers whenever the questions on the test remain unchanged. Thus, student assessments in online environments become more challenging for instructors, who cannot observe students' test-taking behaviors directly.

Webb, Gibson, and Forkosh-Baruch (2013) identified a need for alternative assessment approaches and instruments for measuring the complex, higher-order outcomes empowered by technology-enriched learning experiences. Moreover, with rapid advancements in technology, instructors can now assess students through personalized measures to evaluate the students' attainment of higher-order learning outcomes (Yeh, 2010). Assessments such as simulations, e-portfolios, and interactive games create a medium for analyzing a broad range of knowledge, even in real time (Clarke & Dede, 2010; Gibson, Aldrich, & Prensky, 2007).

Robles and Braathen (2002) argued in favor of online assessment requiring a more ongoing and systematic approach than that used with traditional instruction. Hence, online instructors need to use effective assessment strategies and techniques such as creative designs and approaches in projects, portfolios, self-assessments, peer evaluations, and weekly assignments, all coupled with immediate feedback (Gaytan & McEwen, 2007; Rovai, 2000). Consequently, designing forms of assessment to measure student performance is another important task for online teachers. In order to construct relevant items pertinent to assessment designers in our OIRBS, we created a pool of behavioral items by both writing new items and adapting some items from Gaytan and McEwen (2007). One such item is "The instructor designs exam questions or assessment activities that cover information in lectures and reading in proportion to the importance in the course."

As the research reviewed above indicates, online teaching creates a new type of educational experience that requires a re-examination of online instructors' roles and their associated behaviors. In this study, we propose that the five following dimensions of the OIRBS are empirically distinct from one another: course designer and organizer (CDO), discussion facilitator (DF), social supporter (SS), technology facilitator (TF), and assessment designer (AD). Additionally, the above-mentioned studies have led us to the conclusion that learners' perceptions of instructors' roles and associated behaviors in online learning environments are research issues worthy of further investigation. However, all of these studies have focused on a specific delivery format (i.e., only web-based instruction). There seems to be no cross-format comparison that would help us understand, from students' perspectives, the changing roles that online instructors play depending on different learning environments.

    3. Method

    3.1. Participants

The current study's sample consisted of 750 students. All the students owned a computer, which they typically kept at their home or dormitory. The participants' courses were of two types: blended courses (classroom-based courses with online components) and online courses. There were 367 undergraduate student participants enrolled in two blended general-education courses at a private university

in northern Taiwan: the first course was Introduction to Environmental Protection and the second was Taiwanese Ecology. The first course had 172 students and the second course had 195 students. Each course had one instructor and two teaching assistants. The students attended face-to-face classes once a week for each of the two blended courses, which provided students with digital-learning materials, including videos and slides, via the Moodle Learning Management System. Students were asked to post questions and comments in discussion spaces each week throughout the semester, and the instructors responded to students' postings. The grade distribution in these two courses was 10% for attendance and participation in face-to-face classroom time, 20% for group projects, 20% for online discussion, 25% for the midterm exam, and 25% for the final exam. The questionnaire was distributed by email after the midterm. The researchers in this study gathered the data pertaining to this learning group over the 2010 spring semester and the 2010 fall semester.
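The blended courses' grading scheme above amounts to a simple weighted sum. A minimal sketch follows; the weights come from the text, while the function name and the component scores are invented for illustration:

```python
# Grade weights for the blended courses, as described in the text.
WEIGHTS = {
    "attendance_participation": 0.10,
    "group_project": 0.20,
    "online_discussion": 0.20,
    "midterm_exam": 0.25,
    "final_exam": 0.25,
}

def final_grade(scores):
    """Weighted sum of component scores, each on a 0-100 scale."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(WEIGHTS[part] * scores[part] for part in WEIGHTS)

# Invented example scores, not study data.
example = {
    "attendance_participation": 90,
    "group_project": 85,
    "online_discussion": 80,
    "midterm_exam": 75,
    "final_exam": 88,
}
grade = final_grade(example)  # about 82.75 under this weighting
```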

In the online learning environment, 383 participants were enrolled in one cross-school general-education course entitled Internet Literacy and Ethics. This and similar types of courses were offered via a self-developed e-campus learning-management system. With semesters spanning 18 weeks, only three classes in the Internet Literacy and Ethics course were of the face-to-face variety, and these consisted of the orientation class, the midterm-exam class, and the final-exam class. One instructor and five teaching assistants were responsible for designing discussion topics, responding to individual postings, and providing general comments in online discussion forums. Online instructors required students to participate in online forums, where discussions would address topics assigned during the semester. This kind of participation counted for 30% of students' final grade. Students also needed to develop a project, which counted for 20% of the grade, and the midterm and final exams each counted for 25% of the grade. The instructor spent about two hours every other day answering students' questions and participating in the discussions. The instructor's postings accounted for approximately 25–30% of the total postings. The questionnaire was distributed by email before the final exam. The researchers in this study gathered the data pertaining to this learning group over four consecutive semesters (the spring and fall semesters of 2010 and of 2011; see Table 1).

3.2. Instrument

The instrument that the current study used was the Online Instructor Role and Behavior Scale (OIRBS), consisting of two sections (Section A and Section B) that were identical for the two learning groups examined here. Section A comprised questions regarding demographic characteristics (e.g., gender) and student grade level (e.g., freshman, sophomore, junior, senior). Section B comprised 16 statements (i.e., items) addressing students' post-course views about a given instructor's roles and associated behaviors (regardless of whether the course was of the online or blended variety). To provide students with a clear and understandable instrument, we (the researchers of the current study) revised and adapted the draft of the survey to the current study's framework on the basis of three individuals' opinions: an expert with five years of instructional-design experience in online settings and two online instructors. Students' agreement with each item was indicated by a five-point rating scale with the categories strongly agree, agree, neutral, disagree, and strongly disagree. The Appendix shows the mean and standard deviation of each item in the dimensions of the OIRBS.

4. Results

    4.1. Total-sample CFA

The CFA of the OIRBS model rested on our assumption that online-instructor roles and associated behaviors would exhibit a five-factor structure composed of course designer and organizer, discussion facilitator, social supporter, technology facilitator, and assessment designer. In other words, students' responses to the OIRBS could be explained by means of five first-order factors, and covariance among the first-order factors could be explained by means of the second-order factor. Also, each item would have non-zero loadings on the measured first-order factors.
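The hypothesized structure can be written compactly in lavaan-style model syntax (the notation used by SEM tools such as R's lavaan or Python's semopy). This is a sketch under stated assumptions: the item labels (cdo1 ... ad3) are placeholders, and the 3/3/3 split of the remaining items across SS, TF, and AD is assumed, since the text specifies only three CDO items and four DF items:

```python
# lavaan-style description of the hypothesized model: five first-order
# factors, each measured by its items, plus one second-order OIRBS factor.
# Item names are placeholders, not the actual OIRBS item labels.
OIRBS_MODEL = """
CDO   =~ cdo1 + cdo2 + cdo3
DF    =~ df1 + df2 + df3 + df4
SS    =~ ss1 + ss2 + ss3
TF    =~ tf1 + tf2 + tf3
AD    =~ ad1 + ad2 + ad3
OIRBS =~ CDO + DF + SS + TF + AD
"""

# Count the observed items implied by the measurement lines (should be 16).
n_items = sum(
    len(line.split("=~")[1].split("+"))
    for line in OIRBS_MODEL.strip().splitlines()
    if not line.startswith("OIRBS")
)
```

A specification string like this could be passed to an SEM package's model constructor; the point here is only to make the five-factor, second-order hypothesis concrete.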

Adequate model fit is often judged first by χ², but χ² is highly sensitive to sample size. We therefore used the ratio of χ² to its degrees of freedom (χ²/df), with a value below 5.0 indicating an acceptable fit between the hypothetical model and the sample data (Carmines & McIver, 1981). The other criteria were RMSEA values below 0.08, SRMR values at or below 0.05, and GFI and CFI values greater than 0.90 (Kline, 2005). Fit indices were good (χ²/df = 3.525, RMSEA = 0.058, SRMR = 0.050, GFI = 0.94, CFI = 0.99). In addition to fit indices, we examined such structural elements of the model as factor loadings and squared multiple correlations. As can be seen in Fig. 1, each item loads on its intended factor, with factor loadings ranging from 0.49 to 0.85, and each factor loading is statistically significant. The range of the factor loadings is consistent with the range recommended for social science research, namely 0.40 to 0.70 (Costello & Osborne, 2005). Fig. 1 presents the analytical results of the CFA.

Table 1
Participant characteristics.

                       Blended course                             Online course                           Total
Survey period          Spring & fall semesters of 2010            Spring & fall semesters of 2010 & 2011
Course name            Introduction to Environmental Protection;  Internet Literacy and Ethics
                       Taiwanese Ecology
Number of respondents  367                                        383                                     750
Gender: Male           107 (29.2%)                                197 (51.2%)                             304
Gender: Female         260 (70.8%)                                186 (48.8%)                             446
Grade: Freshman        41 (11.2%)                                 9 (2.3%)                                50
Grade: Sophomore       65 (17.7%)                                 42 (11.0%)                              107
Grade: Junior          106 (28.9%)                                170 (44.4%)                             276
Grade: Senior          155 (42.2%)                                162 (42.3%)                             317

  • Fig. 1. Results of the CFA: A five-factor model with factor loadings for the 16-item Online Instructor Role and Behavior Scale (OIRBS).
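The fit cut-offs cited above can be bundled into a small checker that flags whether a reported solution meets each threshold. The function and its name are our own illustration; it is applied here to the fit values reported for the total-sample CFA.

```python
# Cut-offs cited in the text: chi^2/df < 5.0 (Carmines & McIver, 1981);
# RMSEA < 0.08, SRMR <= 0.05, GFI > 0.90, CFI > 0.90 (Kline, 2005).
def acceptable_fit(chi2_df, rmsea, srmr, gfi, cfi):
    """Return (overall verdict, per-criterion results) for a CFA solution."""
    checks = {
        "chi2/df < 5.0": chi2_df < 5.0,
        "RMSEA < 0.08": rmsea < 0.08,
        "SRMR <= 0.05": srmr <= 0.05,
        "GFI > 0.90": gfi > 0.90,
        "CFI > 0.90": cfi > 0.90,
    }
    return all(checks.values()), checks

# Values reported for the total-sample CFA of the OIRBS:
ok, detail = acceptable_fit(chi2_df=3.525, rmsea=0.058, srmr=0.050,
                            gfi=0.94, cfi=0.99)
print(ok)  # True
```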

M.-L. Hung, C. Chou / Computers & Education 81 (2015) 315–325

4.2. Measurement invariance analysis (blended vs. online)

Since this study used two samples from a variety of courses with different requirements in two learning environments (blended vs. online), we had to test for measurement invariance. Testing measurement invariance involves a series of increasingly restrictive hypotheses: different levels of invariance are assessed with a hierarchical procedure comparing four multi-group models in a fixed order, from the least to the most restrictive model.
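One common way to operationalize this hierarchy is to compare the fit of each more-restrictive model with its predecessor. Because the χ² difference test inherits χ²'s sample-size sensitivity, a widely used rule of thumb holds that a drop in CFI of no more than 0.01 between adjacent nested models supports invariance. The sketch below applies that rule to the CFI values reported in Table 2; the rule is supplementary context we supply, not a procedure the authors describe.

```python
# CFI values for the four increasingly restrictive invariance models (Table 2).
cfi = {1: 0.99, 2: 0.98, 3: 0.98, 4: 0.98}

def invariance_supported(cfi_by_model, max_drop=0.01):
    """Invariance holds at each step if CFI drops by no more than max_drop
    relative to the previous (less restrictive) model."""
    models = sorted(cfi_by_model)
    drops = [round(cfi_by_model[a] - cfi_by_model[b], 3)
             for a, b in zip(models, models[1:])]
    return all(d <= max_drop for d in drops), drops

ok, drops = invariance_supported(cfi)
print(ok, drops)  # True [0.01, 0.0, 0.0]
```

By this criterion the reported values are consistent with the authors' conclusion that invariance held across the four models.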

An initial baseline model (Model 1) has no between-group invariance constraints on estimated parameters; the groups share the same form without any non-fixed parameters being restricted across models (Bollen, 1989). The baseline model shows a good fit, with an RMSEA of 0.060 and GFI and CFI greater than 0.90. In Model 2, the model form is the same across the groups and the factor pattern coefficients (loadings) are constrained to be identical across groups, because the pattern coefficients carry the information about the relationship between latent scores and observed scores (Steenkamp & Baumgartner, 1998). Model 2 also exhibits a good fit, with an RMSEA of 0.067, a GFI of 0.92, and a CFI of 0.98. The next step was to assess the model (Model 3) in which factor loadings and measurement error variances were constrained to be equal in the two groups. Model 3 also had good RMSEA, GFI, and CFI values. Model 4 was the most restrictive model in this hierarchy: all three parameter matrices (factor loadings, intercepts, and residual variances) were simultaneously tested for equality. In this case of strict factorial invariance, the measurements across groups were not biased in any way and were identical in terms of the construct's validity and reliability as captured by the latent variables. The results show that the five factors of the OIRBS remained invariant across the blended learning and online learning groups, so the scale can be used as a valid and reliable research instrument for further comparisons. Table 2 shows the results of the invariance-testing procedure.

4.3. Differences among students' scores in the five OIRBS dimensions

We explore here how students' perceptions of a given instructor's roles and associated behaviors varied across the two samples stemming from the two learning environments. Table 3 presents students' mean scores and standard deviations on the five dimensions in the blended courses. To calculate each student's mean score for every dimension, we summed the values of the answers to each item in a factor and then divided the sum by the number of that factor's items. All students' average scores for the different dimensions ranged from 3.66 to 3.99 on a 5-point Likert-type rating scale. To investigate the differences among the five dimensions of the scale, we conducted a multivariate repeated-measures one-way ANOVA. In comparing the means of the five dimensions, we noted a pattern: the higher the mean score, the greater the weight (i.e., importance) students attributed to instructors' roles and behaviors. The results show that Hotelling's trace was significant (F = 32.262, p < 0.001). A post hoc test further revealed that the mean score of course designer and organizer (CDO) is greater than the other four factors' mean scores; that the mean score of technology facilitator (TF) is greater than the mean scores of discussion facilitator (DF), social supporter (SS), and assessment designer (AD); and that the mean scores of factors DF and AD are greater than the mean score of factor SS.

Table 2
Invariance testing across two different groups.

Model  χ²      df   RMSEA(a)  CFI(b)  GFI(c)  SRMR(d)
1      451.62  194  0.060     0.99    0.93    0.049
2      572.47  214  0.067     0.98    0.92    0.043
3      574.13  226  0.064     0.98    0.91    0.040
4      595.92  230  0.065     0.98    0.91    0.040

(a) Root mean square error of approximation. (b) Comparative fit index. (c) Goodness-of-fit index. (d) Standardized root mean square residual.

Table 3
Results of a multivariate repeated one-way ANOVA and a post hoc test of the OIRBS for the blended learning environment (n = 367).

Factors                              Mean  SD    F value (Hotelling's trace)  Significant differences in the post hoc test
Course designer and organizer (CDO)  3.99  0.71  32.262***                    CDO > TF > DF, AD > SS
Discussion facilitator (DF)          3.73  0.67
Social supporter (SS)                3.66  0.70
Technology facilitator (TF)          3.85  0.75
Assessment designer (AD)             3.73  0.71

***p < 0.001. All items were measured on a 5-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree).

Table 4 presents students' mean scores and standard deviations on the five dimensions for the online courses. All students' average scores range from 3.62 to 4.02 on a 5-point Likert-type rating scale. We conducted a multivariate repeated-measures one-way ANOVA to examine the differences among the five dimensions of the scale. The results show that Hotelling's trace was significant (F = 39.698, p < 0.001). A post hoc test further revealed that the mean score of course designer and organizer (CDO) is greater than the other four factors' mean scores, and that the mean scores of discussion facilitator (DF) and technology facilitator (TF) are greater than the mean scores of assessment designer (AD) and social supporter (SS). As shown in Fig. 2, the radar graphs for the two learning environments have the same dimensions but exhibit different shapes.
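The scoring rule described above (sum a respondent's item answers within a factor, then divide by the number of items in that factor) can be sketched as follows; the response values are invented for illustration.

```python
# Map each OIRBS dimension to its items (item labels from the Appendix).
DIMENSIONS = {
    "CDO": ["CDO1", "CDO2", "CDO3"],
    "DF": ["DF1", "DF2", "DF3", "DF4"],
    "SS": ["SS1", "SS2", "SS3"],
    "TF": ["TF1", "TF2", "TF3"],
    "AD": ["AD1", "AD2", "AD3"],
}

def dimension_scores(responses):
    """Mean of a student's 5-point answers within each dimension."""
    return {dim: sum(responses[item] for item in items) / len(items)
            for dim, items in DIMENSIONS.items()}

# A hypothetical student's answers (1 = strongly disagree ... 5 = strongly agree).
answers = {"CDO1": 5, "CDO2": 4, "CDO3": 4,
           "DF1": 4, "DF2": 3, "DF3": 4, "DF4": 3,
           "SS1": 3, "SS2": 4, "SS3": 2,
           "TF1": 5, "TF2": 4, "TF3": 4,
           "AD1": 4, "AD2": 4, "AD3": 3}
scores = dimension_scores(answers)
print(scores["DF"])  # 3.5
```

Averaging these per-student scores within a group yields the group means reported in Tables 3 and 4.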

    4.4. Results of group comparison

    We conducted an independent samples t-test to explore the differences between the blended learning environment and the onlinelearning environment regarding the five measured factors of instructor roles.

The results of this analysis revealed a statistically significant group difference between blended and online learners in their views of one role (factor): discussion facilitator (DF) (p < 0.01). As shown in Table 5, students gave more weight to the role of discussion facilitator in online learning environments than to the same type of role in blended learning environments. No significant group differences were found in students' views of the other four instructor roles (factors): course designer and organizer (CDO), social supporter (SS), technology facilitator (TF), and assessment designer (AD) (Table 5).
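An independent-samples t statistic of the kind used here can be computed from each group's mean, standard deviation, and size alone. The sketch below implements the pooled-variance (Student's) form with made-up summary values, since we do not have the study's raw data.

```python
import math

def independent_t(m1, sd1, n1, m2, sd2, n2):
    """Pooled-variance independent-samples t statistic and its degrees of freedom."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, n1 + n2 - 2

# Illustrative (not the study's) summary statistics for two groups:
t, df = independent_t(m1=3.9, sd1=0.7, n1=360, m2=3.7, sd2=0.7, n2=380)
print(round(t, 2), df)  # 3.88 738
```

The resulting t is then referred to a t distribution with n1 + n2 - 2 degrees of freedom for a two-tailed p-value.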

    5. Discussion

    5.1. The measurement model of OIRBS

In this study, we first drew from past research to establish the Online Instructor Role and Behavior Scale (OIRBS), and then we statistically investigated whether the five-factor structure underlying this scale could reflect students' perceptions of five online-instructor roles and their associated behaviors. Five main roles for online teaching emerged: course designer and organizer, discussion facilitator, social supporter, technology facilitator, and assessment designer. The first research question probed the measurement model of our OIRBS, and the confirmatory factor analysis results support that model. The second research question asked whether the OIRBS would yield evidence of invariance across the two learning environments. We used multi-group confirmatory factor analysis to evaluate measurement invariance across the blended learning and online learning groups. Analytical results show that the factor structures of instructors' roles, as perceived by students, remained invariant across blended and online learning; that is, evidence emerged of an invariant pattern of factor loadings, measurement error variances, and factor variances between the two groups (learning environments). In other words, measurement invariance held across the two learning environments even though we had drawn our samples from different courses offered by different schools during different semesters. We therefore conclude that the OIRBS items operated equivalently, so that meaningful comparisons across these two groups could be made.

    5.2. College students' perceptions of five instructors' roles

The third research question inquired into college students' perceptions of their instructors' roles. Upon examining students' mean scores for the OIRBS factors as shown in Fig. 2, we found that both the college students engaged in blended learning and those engaged in online learning gave the greatest weight to the dimension of course designer and organizer. This result is consistent with Yusoff and Salim (2012), whose results indicate that both course organization and a clear articulation of course expectations are crucial to the success of a class. The findings of the current study also support the proposition that the most important role of an online instructor is to act as an instructional designer who plans and prepares the course and who provides direct instruction (Ke, 2010). Without such course management and direction, as Arbaugh and Hwang (2006) proposed, students may be lost in the virtual learning world, which differs notably from the traditional classroom.

Table 4
Results of a multivariate repeated one-way ANOVA and post hoc test of the OIRBS regarding the online learning environment (n = 383).

Dimension                            Mean  SD    F value (Hotelling's trace)  Significant differences in the post hoc test
Course designer and organizer (CDO)  4.02  0.69  39.698***                    CDO > DF, TF > AD > SS
Discussion facilitator (DF)          3.88  0.60
Social supporter (SS)                3.62  0.71
Technology facilitator (TF)          3.94  0.69
Assessment designer (AD)             3.74  0.67

***p < 0.001.

• Fig. 2. Radar charts for the mean comparison of the five roles in the two learning environments (unit: five-point Likert scale).

In addition to the role of course designer and organizer, instructors played a salient role as technology facilitator in both the blended and online learning groups, as shown in Fig. 2. This finding indicates that instructors, regardless of which of the two learning environments they teach in, should be knowledgeable about technology that supports instruction and learning. Jones (2004) showed that many teachers who do not consider themselves well skilled in using ICT "feel anxious about using it in front of a class who perhaps know more than they do" (p. 7). It is therefore crucial that instructors grasp how to use technology and have effective technological strategies for enhancing their courses.

Finally, as also shown in Fig. 2, this study's results show that the participating college students' mean scores for social supporter were the lowest for any of the five instructor roles. The significance of this finding is that, in each of the two learning environments, the social-supporter role was the least salient in comparison with the other roles. Although Garrison, Anderson, and Archer (2000) proposed that social messages are necessary to sustain interaction and to create and foster a community of discussion, instructors generally played a less active part in facilitating a learning community. Nevertheless, this finding does not suggest that instructors should refrain from becoming social supporters. Teachers, whether in an online or a blended environment, should endeavor to develop relationships of trust with students and to cultivate a general sense of belonging among them.

    5.3. Learning-group differences in instructor roles and associated behaviors

One of the research questions in the current study asked whether the learning environment makes any difference in college students' perceptions of instructor roles and associated behaviors. On the basis of the independent samples t-test results (see Table 5), we highlight two interesting findings. First, students in the two learning environments rated the four instructor roles of course designer and organizer, social supporter, technology facilitator, and assessment designer equally. These equal ratings suggest that students expect instructors to perform these roles similarly across the two learning environments. Second, students in online learning environments assigned higher scores to the dimension of discussion facilitator than did students in blended learning environments. Although teaching via an online asynchronous discussion forum presents a challenge to instructors (Mazzolini & Maddison, 2007), the current study's participants in online learning environments were more likely than those in blended environments to form positive perceptions of instructors as discussion facilitators or cheerleaders whose function is to promote interactions among students. In comparing the two learning environments, we found that students in online learning environments expected instructors to perform a more active, interactive, and reflective role than did students in blended learning environments. Instructors in blended learning environments generally have ample opportunities to interact with students and to give them immediate feedback during face-to-face encounters; thus, students in online learning environments, who lack such encounters, may be more likely than students in blended environments to expect instructors to facilitate discussion.

    6. Implications, limitations, and conclusion

The results of this study reveal that online instructors' role as social supporters merits special attention from researchers. Both the blended-learning mean score and the online-learning mean score for the social-supporter variable are the lowest among the mean scores for the five instructor roles. Duncan-Howell (2010) and Matzat (2013) characterized the need to belong as a desire for regular social contact with people to whom one feels connected; in this light, teachers might do well to establish and sustain students' sense of belongingness by developing interpersonal relationships and a sense of community, especially in online learning environments. For example, teachers can create learning communities in which group discussions, experience sharing, instant feedback, and the like keep students more involved in the course. Teachers can also, where possible, email or phone relatively passive students to ask the reason for their passivity and to draw them back into the course. Such efforts can help establish and sustain students' sense of belongingness. In addition, students attributed the highest degree of importance to the role of course designer and organizer in both the blended and online learning environments. These results are similar to those of Bailey and Card (2009), who argued that organization is an important practice for online teachers. The current study's findings regarding students' perceptions suggest that online instructors should provide their students with effective organization, which entails well-defined course goals and learning objectives, clear syllabi, firmly articulated expectations, satisfactory availability of all course materials, and the like. Effective use of technology is an important component of effective practice in blended learning environments no less than in online learning environments. From students' perspectives, instructors should consider using a wide variety of technological tools to deliver course materials and to assist with student learning. Thus, institutions can, where possible, focus on providing instructors with technological training to enhance their online teaching.

Table 5
Results of group comparison.

                                     Blended learning (n = 367)  Online learning (n = 383)  t
                                     M     SD                    M     SD
Course designer and organizer (CDO)  3.99  0.71                  4.02  0.69                 0.24
Discussion facilitator (DF)          3.73  0.67                  3.88  0.60                 2.52**
Social supporter (SS)                3.66  0.70                  3.62  0.71                 1.02
Technology facilitator (TF)          3.85  0.75                  3.94  0.69                 1.54
Assessment designer (AD)             3.73  0.71                  3.74  0.67                 0.05

**p < 0.01; ***p < 0.001 (two-tailed).

The limitations of our study merit acknowledgment here and discussion in future research. First, because of its exploratory nature, this study did not check the OIRBS's criterion-related validity; that is, we did not collect students' data on the OIRBS and other, similar scales concurrently. Future research may benefit from focusing on possible correlations between the OIRBS and similar scales for more concurrent evidence of validity. In particular, future research may find it wise to address the test-retest reliability of the OIRBS. Second, this study compared two different learning environments (blended learning and online learning) using two distinct samples. We did not probe into format-related differences as they pertained to course content, instructors, and schools. Future research seeking to examine the usefulness of the OIRBS across academic disciplines should consider focusing on students from diverse colleges and courses.

In this study, we examined instructors' roles and associated behaviors across two samples of students in two learning environments. In both blended and online learning contexts, students perceived their instructors as playing multiple roles, expected their instructors to play certain roles, and assigned varying degrees of importance to these roles. This study suggests that instructors reconsider their handling of these roles to support students' successful learning experiences in online as well as blended settings. The OIRBS helps clarify some of the benefits and drawbacks of the various behaviors that instructors perform when inhabiting a particular role, and to this end, the OIRBS will hopefully strengthen the efficiency, effectiveness, and overall appeal with which instructors operate in these contexts. The findings of our present study should spur discovery of further applications of the OIRBS in future research, and we anticipate that these applications will be as diverse as the many contexts of online and blended learning.

    Acknowledgments

The authors are grateful for funding from the Ministry of Science and Technology, Taiwan, grant numbers MOST 103-2511-S-130-001 and MOST 103-2511-S-009-009-MY2.

Appendix

List of constructs and items, mean, S.D., and inter-item correlation (N = 750).

Item no.  Dimension/item                                                                                                               Mean  S.D.  Inter-item correlation

Course designer and organizer (CDO)
CDO1  The instructor provides clear syllabi (e.g., goals, organization, policies, expectations, and requirements) to students at the beginning of the course.  4.02  0.80  0.732
CDO2  The instructor provides supplemental course materials for online courses.  4.08  0.79  0.732
CDO3  The instructor provides online courses that are well organized and presented.  3.89  0.82  0.752
Discussion facilitator (DF)
DF1  The instructor encourages students to engage in critical and reflective thinking during online discussion.  3.82  0.77  0.694
DF2  The instructor plays a role of facilitator, guide, or cheerleader in online discussion.  3.79  0.87  0.681
DF3  The instructor gives feedback to students in online discussion.  3.86  0.84  0.714
DF4  The instructor is helpful in guiding the class toward a reasonable understanding of course topics and concepts.  3.73  0.84  0.697
Social supporter (SS)
SS1  The instructor helps foster a sense of community in this online course.  3.79  0.83  0.722
SS2  The instructor establishes a harmonious learning climate in this course.  3.84  0.80  0.700
SS3  The instructor knows students through online interactions.  3.27  0.95  0.616
Technology facilitator (TF)
TF1  The instructor uses such tools and technologies as PowerPoint, audio, video, and multimedia devices, which are helpful in fostering learning.  3.92  0.87  0.685
TF2  The instructor exposes students to tools and technologies that are easy to use.  3.94  0.81  0.648
TF3  The instructor provides students with appropriate technical support when the students face such problems as system disconnects.  3.83  0.80  0.704
Assessment designer (AD)
AD1  The instructor designs exam questions that facilitate higher-order thinking skills (analysis, synthesis).  3.69  0.83  0.658
AD2  The instructor designs exam questions or assessment activities that cover information in lectures and reading in proportion to its importance in the course.  3.76  0.80  0.687
AD3  The instructor contacts students who have not completed assignments and helps them complete assignments.  3.74  0.95  0.580

1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree.

References

Arbaugh, J. B. (2001). How instructor immediacy behaviors affect student satisfaction and learning in web-based courses. Business Communication Quarterly, 64(4), 42–54.
Arbaugh, J. B., & Hwang, A. (2006). Does teaching presence exist in online MBA courses? The Internet and Higher Education, 9(1), 9–21.
Arbaugh, J. B. (2010). Sage, guide, both, or even more? An examination of instructor activity in online MBA courses. Computers & Education, 55(3), 1234–1244.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conference context. Journal of Asynchronous Learning Networks, 5(2), 1–17.
Azza, A. A. (2001). Learning from the Web: Are students ready or not? Journal of Educational Technology & Society, 4(4), 32–38.
Bailey, C. J., & Card, K. A. (2009). Effective pedagogical practices for online teaching: Perception of experienced instructors. Internet and Higher Education, 12(3–4), 152–155.
Berge, Z. L. (1995). Facilitating computer conferencing: Recommendations from the field. Educational Technology, 15(1), 22–30.
Bollen, K. A. (1989). Structural equations with latent variables. New York: John Wiley & Sons.
Bonk, C. J., Kirkley, J. R., Hara, N., & Dennen, N. (2001). Finding the instructor in post-secondary online learning: Pedagogical, social, managerial, and technological locations. In J. Stephenson (Ed.), Teaching and learning online: Pedagogies for new technologies (pp. 76–97). London, UK: Kogan Page.
Bransford, J., Brown, A. L., & Cocking, R. R. (Eds.). (1999). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.
Brill, J. M., & Galloway, C. (2007). Perils and promises: University instructors' integration of technology in classroom-based practices. British Journal of Educational Technology, 38(1), 95–105.
Carmines, E. G., & McIver, J. P. (1981). Analyzing models with unobserved variables: Analysis of covariance structures. In G. W. Bohrnstedt & E. F. Borgatta (Eds.), Social measurement: Current issues (pp. 65–115). Beverly Hills, CA: Sage Publications.
Cho, M. H., & Cho, Y. J. (2014). Instructor scaffolding for interaction and students' academic engagement in online learning: Mediating role of perceived online class goal structures. Internet and Higher Education, 21, 25–30.
Clarke, J., & Dede, C. (2010). Assessment, technology, and change. Journal of Research in Teacher Education, 42, 309–328.
Claudia, M., Steil, A., & Todesco, J. (2004). Factors influencing the adoption of the Internet as a teaching tool at foreign language schools. Computers and Education, 42(4), 353–374.
Condie, R., & Livingston, K. (2007). Blending online learning with traditional approaches: Changing practices. British Journal of Educational Technology, 38(2), 337–348.
Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7). Available from: http://pareonline.net/getvn.asp?v=10&n=7 Accessed 27.08.14.
Dringus, L. P., Snyder, M. M., & Terrell, S. R. (2010). Facilitating discourse and enhancing teaching presence: Using mini audio presentations in online forums. Internet and Higher Education, 13(1–2), 75–77.
Duncan-Howell, J. (2010). Teachers making connections: Online communities as a source of professional learning. British Journal of Educational Technology, 41, 324–340.
Dziuban, C., Moskal, P., & Hartman, J. (2005). Higher education, blended learning and the generations: Knowledge is power no more. In J. Bourne & J. C. Moore (Eds.), Elements of quality online education: Engaging communities. Needham, MA: Sloan Center for Online Education.
Ellis, R. A., Hughes, J., Weyers, M., & Riding, P. (2009). University teacher approaches to design and teaching and concepts of learning technologies. Teaching and Teacher Education, 25(1), 109–117.
Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students' perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4(2), 215–235.
Frank, M. (2006). How to teach using today's technology: Matching the teaching strategy to the e-learning approach. In L. T. W. Hin & R. Subramaniam (Eds.), Handbook of research on literacy in technology at the K–12 level (pp. 372–393). Hershey, PA: Idea Group Publishing.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. Internet and Higher Education, 2(2–3), 87–105.
Gaytan, J., & McEwen, B. C. (2007). Effective online instructional and assessment strategies. The American Journal of Distance Education, 21(3), 117–132.
Gibson, D., Aldrich, C., & Prensky, M. (2007). Games and simulations in online learning: Research and development frameworks. Hershey, PA: Information Science Publishing.
Hara, N., Bonk, C. J., & Anjeli, C. (2000). Content analysis of online discussions in an applied educational psychology course. Instructional Science, 28, 115–152.
Heuer, B. P., & King, K. (2004). Leading the band: The role of the instructor in online learning for educators. Journal of Interactive Learning Online, 3(1). Retrieved April 12, 2014, from http://www.ncolr.org/jiol/issues/pdf/3.1.5.pdf.
Hsieh, P. H. (2010). Globally-perceived experiences of online instructors: A preliminary exploration. Computers & Education, 54, 27–36.
Humbert, M. (2007). Adoption of blended learning by faculty: An exploratory analysis. In M. K. McCuddy (Ed.), The challenges of educating people to lead in a challenging world (pp. 423–436). Springer.
Jeong, A. (2003). Sequential analysis of group interaction and critical thinking in online threaded discussions. The American Journal of Distance Education, 17(1), 25–43.
Jones, A. (2004). A review of the research literature on barriers to the uptake of ICT by teachers. Retrieved August 2011, from http://www.becta.org.uk.
Kang, T., & Im, T. (2013). Factors of learner–instructor interaction which predict perceived learning outcomes in online learning environment. Journal of Computer Assisted Learning, 29, 292–301.
Kim, K.-J., & Bonk, C. J. (2006). The future of online teaching and learning in higher education: The survey says. EDUCAUSE Quarterly, 29(4), 22–30.
Ke, F. (2010). Examining online teaching, cognitive, and social presence for adult students. Computers & Education, 55(2), 808–820.
Kline, R. B. (2005). Principles and practice of structural equation modeling (2nd ed.). New York: Guilford Press.
Knowlton, D. S. (2000). A theoretical framework for the online classroom: A defense and delineation of a student-centered pedagogy. New Directions for Teaching and Learning, 84, 5–14.
Kreijns, K., Kirschner, P. A., Jochems, W., & van Buuren, H. (2007). Measuring perceived sociability of computer-supported collaborative learning environments. Computers & Education, 49, 176–192.
Lamb, B. (2004). Wide open spaces: Wikis, ready or not. EDUCAUSE Review, 39(5), 36–48.
Lim, K., & Lee, D. Y. (2008). A comprehensive approach to the teacher's role in computer supported learning environments. In Proceedings of the Society for Information Technology and Teacher Education International Conference. Chesapeake, VA.
Lin, S. Y., & Overbaugh, R. C. (2009). Computer-mediated discussion, self-efficacy and gender. British Journal of Educational Technology, 40(6), 999–1013.
Liu, X., Bonk, C. J., Magjuka, R. J., Lee, S., & Su, B. (2005). Exploring four dimensions of online instructor roles: A program level case study. Journal of Asynchronous Learning Networks, 9(4), 29–48.
Liaw, S. S., Huang, H. M., & Chen, G. D. (2007). Surveying instructor and learner attitudes toward e-learning. Computers & Education, 49(2), 1066–1080.
Liu, G. Z., Lo, H. Y., & Wang, H. C. (2013). Design and usability testing of a learning and plagiarism avoidance tutorial system for paraphrasing and citing in English: A case study. Computers & Education, 69, 1–14.
MacKnight, C. (2000). Teaching critical thinking through online discussions. EDUCAUSE Quarterly, 4, 38–41.
Mahdizadeh, H., Biemans, H., & Mulder, M. (2008). Determining factors of the use of e-learning environments by university teachers. Computers & Education, 51(1), 142–154.
Martindale, T., & Wiley, D. A. (2005). Using weblogs in scholarship and teaching. TechTrends, 49(2), 55–61.
Matzat, U. (2013). Do blended virtual learning communities enhance teachers' professional development more than purely virtual ones? A large scale empirical comparison. Computers & Education, 60(1), 40–51.
Mazzolini, M., & Maddison, S. (2007). When to jump in: The role of the instructor in online discussion forums. Computers & Education, 49(2), 193–213.
McDuffie, A. R., & Slavit, D. (2003). Utilizing online discussion to support reflection and challenge beliefs in elementary mathematics methods classrooms. Contemporary Issues in Technology and Teacher Education, 2(4), 447–465.
McMurtry, K. (2001). E-cheating: Combating a 21st century challenge. T.H.E. Journal, 29(4), 36–41.
Motaghian, H., Hassanzadeh, A., & Moghadam, D. K. (2013). Factors affecting university instructors' adoption of web-based learning systems: Case study of Iran. Computers & Education, 61, 158–167.
Muir-Herzig, R. G. (2004). Technology and its impact in the classroom. Computers and Education, 42(2), 111–131.
Ocak, M. A. (2011). Why are faculty members not teaching blended courses? Insights from faculty members. Computers & Education, 56(3), 689–699.
Olt, M. (2002). Ethics and distance education: Strategies for minimizing academic dishonesty in online assessment. Online Journal of Distance Learning Administration, 5(3). Retrieved August 27, 2014, from http://www.westga.edu/~distance/ojdla/fall53/olt53.html.
Robles, M., & Braathen, S. (2002). Online assessment techniques. The Delta Pi Epsilon Journal, 44(1), 39–49.


    M.-L. Hung, C. Chou / Computers & Education 81 (2015) 315–325
    Roby, T., Ashe, S., Singh, N., & Clark, C. (2013). Shaping the online experience: how administrators can influence student and instructor perceptions through policy and practice. Internet and Higher Education, 17, 29–37.
    Rovai, A. P. (2000). Online and traditional assessments: what is the difference? Internet and Higher Education, 3(3), 141–151.
    Rovai, A. P. (2002a). A preliminary look at structural differences in sense of classroom community between higher education traditional and ALN courses. Journal of Asynchronous Learning Networks, 6(1), 41–56.
    Rovai, A. P. (2002b). Development of an instrument to measure classroom community. Internet and Higher Education, 5, 197–211.
    Rovai, A. P. (2007). Facilitating online discussions effectively. Internet and Higher Education, 10(1), 77–88.
    Salmon, G., & Lawless, N. (2006). Management education for the twenty-first century. In C. J. Bonk, & C. Graham (Eds.), The handbook of blended learning: Global perspectives, local designs (pp. 387–399). San Francisco, CA: Pfeiffer Publications.
    Schoonenboom, J. (2012). The use of technology as one of the possible means of performing instructor tasks: putting technology acceptance in context. Computers & Education, 59(4), 1309–1316.
    Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, 71, 247–256.
    Shea, P., Li, C. S., & Pickett, A. (2006). A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses. Internet and Higher Education, 9(3), 175–190.
    Steenkamp, J. E. M., & Baumgartner, H. (1998). Assessing measurement invariance in cross-national consumer research. Journal of Consumer Research, 25(1), 78–107.
    Webb, M., Gibson, D., & Forkosh-Baruch, A. (2013). Challenges for information technology supporting educational assessment. Journal of Computer Assisted Learning, 29(5), 270–279.
    Wilson, B. C., Ludwig-Hardman, S., Thornam, C., & Dunlap, J. C. (2004, November). Bounded community: designing and facilitating learning communities in formal courses. The International Review of Research in Open and Distance Learning, 5(3). Retrieved August 27, 2014, from http://www.irrodl.org/index.php/irrodl/article/view/204/286.
    Yeh, S. S. (2010). Understanding and addressing the achievement gap through individualized instruction and formative assessment. Assessment in Education: Principles, Policy & Practice, 17, 169–182.
    Yuan, J., & Kim, C. (2014). Guidelines for facilitating the development of learning communities in online courses. Journal of Computer Assisted Learning, 30(3), 220–232.
    Yusoff, N. M., & Salim, S. S. (2012). Investigating cognitive task difficulties and expert skills in e-Learning storyboards using a cognitive task analysis technique. Computers & Education, 58(1), 652–665.
    Zingaro, D., & Porter, L. (2014). Peer instruction in computing: the value of instructor intervention. Computers & Education, 71, 87–96.


    Students' perceptions of instructors' roles in blended and online learning environments: A comparative study
    1. Introduction
    2. Literature review
    2.1. Studies on online instructor functions
    2.2. Online instructors' roles and behaviors
    2.2.1. Course designer and organizer
    2.2.2. Discussion facilitator
    2.2.3. Social supporter
    2.2.4. Technology facilitator
    2.2.5. Assessment designer
    3. Method
    3.1. Participants
    3.2. Instrument
    4. Results
    4.1. Total-sample CFA
    4.2. Measurement invariance analysis (blended vs. online)
    4.3. Differences among students' scores in the five OIRBS dimensions
    4.4. Results of group comparison
    5. Discussion
    5.1. The measurement model of OIRBS
    5.2. College students' perceptions of five instructors' roles
    5.3. Learning-group differences in instructor roles and associated behaviors
    6. Implications, limitations, and conclusion
    Acknowledgments
    Appendix
    References