Rating User Interface and Universal Instructional Design in MOOC Course Design

This study examines how college students rate Massive Open Online Courses (MOOCs) in terms of User Interface Design and Universal Instructional Design. The research participants were 115 undergraduate students from a public midwestern university in the United States. Each participant evaluated three randomly chosen MOOCs, all of which were developed on the Coursera platform, using rubrics for User Interface Design and Universal Instructional Design. The results indicated that students had an overall positive impression of each MOOC’s course design. This study concludes that overall course design strategies are not associated with the massive dropout rates currently documented in MOOC learning environments. The authors suggest the use of appropriate instructional design principles be further explored.


Introduction
In 2012, the interest in massive open online courses (MOOCs) was so intense that the New York Times declared 2012 "The Year of the MOOC." However, after two years of expectation and excitement over the trend, huge attrition rates, often believed to reach as high as 90-95 percent (Kolowich, 2013; Vu & Fadde, 2014), challenged MOOC pioneers. Many reasons have been identified, and solutions suggested, for these large attrition rates. For instance, Cheng et al. (2013) studied the behaviors of over 100,000 learners in the online discussion forums of 73 courses offered by Coursera and found various correlations with the dropout rate. One of these factors was the amount of peer-graded homework related to the courses. Another factor was the disorganization of learners' posts in the forums. The authors developed an automated system that identified small talk and filtered it out of the fire hose of posts. They argued that this method would help learners focus on the useful posts and so enhance the learning experience. Vu & Fadde (2014) introduced a model for MOOC enrollment called "Rings of Engagement," in which learners enrolled in a MOOC would be divided into three different tracks so that learners' performance and attrition rates in each circle could be tracked to determine the effectiveness of the MOOC. Guo (2014) examined the effect of video lecture length on MOOC learners' engagement and made recommendations on how to make video lectures more engaging. Guo (2014) suggested that to maximize student engagement, video lectures in MOOCs should be broken into small, bite-sized pieces of around six minutes or shorter, rather than following the hours-long, more traditional in-person lecture model. Nonetheless, course design, often emphasized in the instructional design and technology community, has not been widely addressed in studies of MOOCs.
To that end, this study examines whether the design of MOOCs on the prominent MOOC platform Coursera has any relationship with the current high dropout rate. Specifically, we examine two aspects of course design, user interface design and universal instructional design, to answer the research questions posed below. At a basic level, course design can be defined as the process one must go through at the start of any course to plan for successful student outcomes. Instructors must determine their values and teaching philosophy, situational factors, the overall goals for the learners, and what learners should be able to do at the end of the learning experience (Fink, 2005; Wiggins & McTighe, 2005).
Hutchinson & Waters (1989) define course design as a process composed of several stages, and state that the main goal of course design is to provide learners with knowledge that will help them perform well in a real situation. Graves (2008) defines course design, course development, or curriculum design as a process composed of several parts and constituents. The ability to work in stages enables course developers or instructional designers to build effective and structured learning experiences, taking into account learning theories and instructional design principles in the form of a course.
MOOC course design often follows a traditional course design model, as well as general instructional strategies. However, MOOCs possess complex and unique features that could benefit from a more customized set of design principles. For the sake of learning, any instructional model should employ a design that takes into account the learners' characteristics, needs, and dispositions towards learning.

RITPU • IJTHE
What is the relationship between course design and students' performance?
Recently, scholars from across a wide variety of disciplines have proposed innovative designs for learning environments with the purpose of strengthening the relationship between the conceptual world of classrooms and real-world practice in learning processes. Among the various learning models that resulted from educational innovation, one of the most promising and controversial trends is the rise of MOOCs in higher education. While there are important concerns about the MOOC trend with regard to factors such as rigor, authentic assessment, and skill-based learning, the design features and strategies adopted in MOOC learning environments have become a central question in the emerging research literature on MOOCs. Course design seems to play a significant role in learners' success as they journey through this new and unique learning environment. Fortunately, successful features of online learning have been well documented in the literature, and existing frameworks can contribute to, or at least inform, new online learning trends such as scalable or massive open online courses. Kearsley (2000) found that learning online offers a unique social context. In fact, both earlier and more recent research has consistently found that course design elements embedded in online courses, such as structure (Romiszowski & Cheng, 1992), comprehensibility (Eastmond, 1995), and community (Hrastinski, 2008; Irani, 1998), to mention just a few, are vital assets to the overall success of online learners. At the same time, connectivity and the sense of belonging remain controversial concepts in online learning, whether in a MOOC or in the more traditional online learning approach.
In the earlier stages of online learning, researchers found that success is determined by how the medium (technology) is used, not by the medium itself (Clark, 1983; Merisotis & Phipps, 1999; Owston, 1997). This notion has been widely accepted and discussed in scholarly work on both traditional online learning models and, more recently, in MOOC-related studies (Siemens, 2005). Little research has been conducted to identify the elements of course design that positively affect overall success from the perspective of learners in MOOCs.
A study by Cross (2013) investigated learner compliance with MOOC design. More precisely, the study looked at how well learning took place according to the expectations written into the course design. A survey containing sixteen core features was conducted, asking participants to rate the impact of each feature on their learning experience. Having an authentic design process structure was rated the third most important feature in MOOC learning. Cross (2013) also found that many aspects of course design affect learners' experience, expectations, and success in MOOC courses. For example, feedback from MOOC participants identified the use of more visuals such as diagrams, hands-on practice with required technology, more focus on course dynamics in the beginning stages through how-tos, and less multitasking as important items to consider in MOOC courses. Fini (2009) reported that MOOC participants expressed mixed opinions regarding the technologies employed in the course design. MOOCs are usually sustained by technology, integrating a variety of tools to distribute content and allow participants to accomplish tasks (Siemens, 2009). For example, courses may provide important content or learning materials in addition to requiring participants to access external sources such as social media sites, video or audio lectures, discussion forums, and video conferencing tools such as Google Hangouts and Google Chat. Fini (2009) found that participants' perceptions of the technology design varied according to learning style preferences, personal objectives, and time spent on tasks. For example, participants perceived social networking sites as offering no advantage to their learning, while access to daily newsletters was perceived as useful.
To maximize the experience for participants in MOOC environments, instructors should design course materials with different learning styles and learner characteristics in mind. Fini (2009) suggests that clean and simple course designs appeal to learners with varying levels of technology skills and may prevent many from feeling overwhelmed or frustrated. One can conclude that feelings of hopelessness in MOOC courses can be quite discouraging, especially when there is no obligation or cost associated with dropping out. Cross (2013) found similar concerns. Participants often reported "not understanding what they need to do" and often "getting lost." The author also reported technology issues, ranging from reliability to an overwhelming number of channels to choose from, as leading factors in failure to complete MOOCs.
Current research suggests that clean design approaches can potentially have a positive effect on the attitudes, success, and completion rates of those pursuing learning through MOOCs (Fini, 2009; Waite, Mackness, Roberts, & Lovegrove, 2013; Zutshi et al., 2012). A recent study by Margaryan, Bianco, & Littlejohn (2015) concluded that MOOCs are usually well organized and visually appealing, but most lack quality in terms of applying instructional design principles for online learning.

User interface design
One reason we chose user interface design (UID) as one of the key aspects of MOOC design is that MOOCs offer limited human interaction between learners and instructors. As in the fields of software production and computer systems design, where users receive little or no human support, user interfaces need to be designed so that users can navigate and understand every instruction with limited human-related support. In other words, the goal of UID is to make the user's interaction as simple and efficient as possible (Lee & Lochovsky, 1985; Opperman, 2002).
User interface design is a subfield of an area of study named human-computer interaction. According to the U.S. Department of Health and Human Services (2014), UID emphasizes both anticipating what users may need to do and ensuring that the interface has elements that are easy to access, understand, and use to accommodate those actions. Galitz (2007) argued that a well-designed interface is of essential importance to users. If the design or information presentation is confusing and inefficient, users or learners will have more difficulty achieving the intended outcomes. A poorly designed interface can also lead to frustration, increased stress, and aggravation. Specifically addressing UID in the learning setting, Najjar (1998) identified five design principles to improve learning: make the user interface interactive, use elaborative media, present multimedia synchronously, use multimedia in a supportive rather than decorative way, and use the medium that best conveys the information. The International Organization for Standardization (2014), in ISO 9241, also set standards for the organization of information (arrangement, alignment, grouping, labels, location), the display of graphical objects, and the coding of information (abbreviation, color, size, shape, visual cues), stated in the following seven attributes:
• Clarity: the information content is conveyed quickly and accurately.
• Discriminability: the displayed information can be distinguished accurately.
• Conciseness: users are not overloaded with extraneous information.
• Consistency: the same information is presented in the same way throughout the interface.
• Detectability: the user's attention is directed towards the information required.
• Legibility: the information is easy to read.
• Comprehensibility: the meaning is clearly understandable, unambiguous, interpretable, and recognizable.
In this study, these seven attributes were included in the rubric used by the research participants to evaluate the MOOC design in terms of UID.

Universal instructional design
Universal instructional design, frequently referred to as universal design for learning, is an approach that emphasizes meeting the learning needs of learners from different backgrounds, so that learners with and without disabilities, as well as learners with diverse learning needs, have equal access to educational opportunities and environments (CAST, 2001; Pliner & Johnson, 2004; Silver, Bourke, & Strehorn, 1998). Connell et al. (1997) identified seven principles of universal instructional design:
1. Equitable use. The design is useful and marketable to people with diverse abilities.
2. Flexibility in use. The design accommodates a wide range of individual preferences and abilities.
3. Simple and intuitive use. Use of the design is easy to understand, regardless of the user's experience, knowledge, language skills, or current concentration level.
4. Perceptible information. The design communicates necessary information effectively to the user, regardless of ambient conditions or the user's sensory abilities.
5. Tolerance for error. The design minimizes hazards and the adverse consequences of accidental or unintended actions.
6. Low physical effort. The design can be used efficiently, comfortably, and with a minimum of fatigue.
7. Size and space for approach and use. Appropriate size and space is provided for approach, reach, manipulation, and use regardless of the user's body size, posture, or mobility.
Several years later, Scott, McGuire, & Shaw (2003) added two principles to the original seven developed by Connell et al. (1997): a community of learners, and instructional climate. The expanded set of principles is listed in Table 1.

Research Method
This is a quantitative research study using an online Likert-scale grading rubric to evaluate the design of three MOOCs in Coursera. One hundred and fifteen (115) undergraduate students in four different courses at a public midwestern university in the United States participated in this study. All of the research participants indicated no prior online learning experience or knowledge of MOOCs. The participants ranged in age from 19 to 24; most of them were sophomores.
For the purpose of this study, a generic account in Coursera was created during a period from July 2013 to October 2014 to enroll in 12 MOOCs offered by different universities on the Coursera MOOC platform. To collect the research participants' evaluations of MOOC designs, we arranged four sessions in which we met with participants in computer labs during a regularly scheduled class. Before participants entered the lab, one of the authors of this research project used the generic account to log into three randomly selected courses from the 12 MOOCs the researchers had originally enrolled in and made them available on each of the computers. The online Likert-scale grading rubric was also made available on each of the computers. After all the computers in the lab had the courses and online rubric ready for the research participants to evaluate, the participants were invited to enter the lab and listen to an oral introduction covering MOOCs, the research purposes, and instructions on how to do the evaluation using the online rubric. Four sessions were conducted using the same procedure; each session had approximately 28-32 participants. The results of the research participants' evaluations were automatically saved and analyzed in Qualtrics, a software program that enables users to collect and analyze online data.

Validity and Reliability of the Grading Rubric
There are many existing rubrics for evaluating online course design. One of the most popular is the Quality Matters Higher Education Rubric, originally developed under a three-year grant (2003-2006) from the Fund for the Improvement of Postsecondary Education (Legon & Runyon, 2007). The rubric has 40 specific standards that can be grouped into eight general standards.

To gauge the content validity of the rubric, we asked nine professional instructional designers who design and evaluate online courses at five different universities to evaluate whether our grading items accurately assess the defined content, and to provide feedback on the rubric. Appropriate modifications were made to the grading rubric based on the instructional designers' feedback.
In addition to content validity checking, we conducted a reliability test to measure how consistently the items in the rubric measured the content. To be more specific, the reliability test examined whether all 13 items in the rubric related to aspects of the issues under investigation. The resulting alpha values are reported in Table 2 below. Cronbach's Alpha showed a value of .88, which, according to George & Mallery (2009), indicates that the survey items had good internal consistency.
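For readers wishing to verify this kind of reliability figure, Cronbach's alpha follows the standard formula α = k/(k−1) · (1 − Σsᵢ²/sₜ²), where k is the number of items, sᵢ² the variance of each item, and sₜ² the variance of respondents' total scores. The sketch below uses a small hypothetical response matrix for illustration; it is not the study's actual 13-item rubric data.

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                          # number of rubric items
    item_vars = responses.var(axis=0, ddof=1)       # sample variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 4 raters x 3 items
scores = [[4, 5, 4],
          [5, 5, 5],
          [3, 4, 4],
          [4, 4, 5]]
alpha = cronbach_alpha(scores)  # 0.75 for this toy matrix
```

With the full 99 x 13 response matrix exported from Qualtrics, the same function would reproduce the reported alpha of .88.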

Research Question 1: How Do Learners Evaluate the MOOC Design in Terms of User Interface Design?
Seven specific standards or items were used to assess the quality of User Interface Design: Clarity, Discriminability, Conciseness, Consistency, Detectability, Legibility, and Comprehensibility. Ninety-nine out of 110 responses were eligible for inclusion in the data analysis process. Eleven responses were removed from the data analysis because they were either incomplete or had inaccurate information. Tables 3, 4, and 5 present the findings of each evaluation in relation to User Interface Design.

Responses
As shown in Tables 3, 4, and 5, research participants gave high scores on each specific standard of the User Interface Design of the three MOOCs. Their scores were also consistent across the standards. Overall, participants' responses indicated that the design of the MOOCs in this study was easy to navigate, consistent, clear, and concise. In other words, the design of those MOOCs complied with the standards for good User Interface Design.

Research Question 2: How Do Learners Evaluate the MOOC Design in Terms of Universal Instructional Design?
Six specific standards were used to assess the quality of Universal Instructional Design: Equitable Use, Flexible Use, Simple and Intuitive, Perceptible Information, Low Technical Effort, and Community of Learners and Support. Out of 110 responses, 102 were eligible for inclusion in the data analysis process. Eight responses were removed from the data analysis because they were either incomplete or had inaccurate information. The findings in relation to Universal Instructional Design for each course are presented in Tables 6, 7, and 8.

Discussion
The mean score for each rubric item related to User Interface Design and Universal Instructional Design was above 4.0 on a scale of 1 to 5. This indicates that students' initial impression of each MOOC was very positive, and that the MOOC authors incorporated design properties that appear to be helpful to students. Because students had a positive impression of each of the MOOCs, it is unlikely that there is a relationship between User Interface Design, Universal Instructional Design, and students dropping out of MOOCs. As there does not appear to be a relationship between MOOC course design and the high dropout rate in MOOCs, the authors of this study recommend that further research be conducted to determine why students drop out of MOOCs at such high rates. The results of this research are supported by a recent study by Margaryan et al. (2015), who explored the instructional quality of seventy-six MOOC courses by conducting a survey to measure the use of instructional design principles in the selected courses. The study concluded that although MOOC courses are usually well organized and packaged (overall design, look and feel), the use of instructional design principles and the quality of instructional practices applied in MOOCs are very low.
The present study suggests that possible causes for the large attrition rate in MOOCs may instead be related to instructional factors such as the difficulty of the content, the types of assignments provided in the courses, lack of time to complete course activities, the quality and quantity of feedback provided, lack of meaningful interaction with other students, lack of commitment because MOOC courses are offered for free or at minimal cost to students, and limited contact with the instructor.
The researchers recognize the limitations of this study, one of which is that the subjects did not complete the MOOC courses they evaluated. Instead, participants received access to the courses in order to assess their design. The limited interaction and time spent exploring each of the MOOC courses may not have been enough to provide each participant with the in-depth experience with course content, assignments, readings, and required learning activities that would be gained by students actually enrolled in these courses. Participants entered the courses, explored various parts of them, listened to some of the lectures or videos, and clicked on quizzes or other assessments, but did not complete any required coursework.
It appears that MOOC course designers are currently implementing the elements of quality online courses as defined by User Interface Design and Universal Instructional Design research. In other words, MOOC courses are generally well packaged and organized. MOOC designers are able to provide visually appealing templates or content holders, making the online look and feel clean as well as easily accessible for students. Therefore, this study concludes that overall course design strategies are not associated with the infamous massive dropout rates currently documented in MOOC learning environments. We suggest that the use of appropriate instructional design principles be further explored. MOOCs offer an exciting opportunity for delivering online content to students, but further study needs to address why students drop out of MOOC courses.