xMOOCs and Completion Rates

Deconstructing Disengagement: Analyzing Learner Subpopulations in Massive Open Online Courses

The paper focuses on three computing MOOCs, so it’s xMOOC-focused. It will be interesting to compare their analysis with a cMOOC scenario.

 

“These MOOCs are structured learning environments that emphasize instructional videos and regular assessments, centralizing activities on a single platform. This is a distinct model from the set of learner-directed, open-ended courses that are now known as “cMOOCs” because of their grounding in connectivist theories of learning”

“The relatively low completion rates of MOOC participants has been a central criticism in the popular discourse. This narrative implies a binary categorization of learners: those who pass the class by adhering to the instructor’s expectations throughout the course – and everyone else. This monolithic view of so-called “noncompleters” obscures the many reasons that a learner might disengage from a MOOC”

 

This is a valuable point. While I would disagree that critics necessarily present a binary sense of this, focusing on the possible complexity of low completion rates is key. It’s possible to focus on non-completion constructively, but that conversation, as the writers suggest, needs to happen in a context where we look at the varied reasons for non-completion, some of which may be negative, some of which may not be.

“It also makes no allowances for learners who choose to participate in some aspects of the MOOC but not others, staying engaged with the course but not earning a statement of accomplishment.”

This is a standard-issue MOOC problem: monitoring for participation where students don’t hit, or need to hit, the traditional markers. It also makes it difficult to monitor influence or effect.

“In contrast, one could emphasize the importance of individual differences and consider all learners to be unique in their interactions with the platform. But whereas the monolithic view overgeneralizes, this individualist perspective overcomplicates. In this paper, we seek to strike a balance by identifying a small yet meaningful set of patterns of engagement and disengagement.”

 

This idea of overcomplication hadn’t occurred to me. I had started to theorise, without data, possible categories for differing engagement patterns: professional curiosity (from educators who want a hands-on experience, from ed-tech designers and professionals, from techies, from entrepreneurs and businesspeople looking to understand the phenomenon from a commercial perspective), students with specific, limited needs and therefore very focused or limited desires for engagement, and self-improvement motivations.

“We define learner trajectories as longitudinal patterns of engagement with the two primary features of the course – video lectures and assessments. We uncover four prototypical categories of engagement consistently across three MOOCs by clustering on engagement patterns. We focus on interactions with course content, because learning is a process of individual knowledge construction that emerges in a dynamic process of interactions among learners, resources, and instructors”

This is a pattern mirrored by Kop – focus on the interactions (live seminar participants, seminar replays, number of tweets, blog posts, comments, the network of interactions between participants both on prescribed/suggested platforms, but also on platforms that evolved). In xMOOCs, this is probably more direct. There may be a closed LMS system; the content is closed, clear, and linear, and easy to monitor.

 

“The first step in our methodology is to generate a rough description of each student’s individual engagement in a course. For each assessment period, all participants are labeled either “on track” (did the assessment on time), “behind” (turned in the assessment late), “auditing” (didn’t do the assessment but engaged by watching a video or doing a quiz), or “out” (didn’t participate in the course at all). These labels were chosen because they could be easily collected, and would make sense in any MOOC that is based on videos and assessments, regardless of content area or the pedagogical strategies of the course.”

This seems like it would work for an xMOOC (and probably for a task-based one too), but not a cMOOC. cMOOC artefacts are already loose, and loosely defined, and the choose-your-own-path aspect means that not engaging with a given artefact is not necessarily a participation issue. Also, as the artefacts are crowdsourced and informally shared, many artefacts that may be popular, or somewhat endemic, could be missed, or be difficult to track.

Here are the four engagement patterns, and, notably, they orient around the set artefacts. Also, video watching counts as participation of a type, but they don’t seem to have measured interaction across the student body (oops, that comes later, in the forums part…)
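The per-period labeling scheme quoted above is simple enough to sketch in code. This is a rough illustration only, not the authors’ implementation; the function names and the activity flags are my assumptions:

```python
# Sketch of the paper's per-assessment-period labeling:
# "on track" / "behind" / "auditing" / "out".

def label_period(did_assessment, on_time, watched_video_or_quiz):
    """Assign one of the four labels for a single assessment period."""
    if did_assessment and on_time:
        return "on track"
    if did_assessment:
        return "behind"
    if watched_video_or_quiz:
        return "auditing"
    return "out"

def trajectory(periods):
    """Map a learner's per-period activity records to a label sequence."""
    return [label_period(p["assessed"], p["on_time"], p["viewed"]) for p in periods]

# Hypothetical learner: submits on time, then late, then only watches, then vanishes.
learner = [
    {"assessed": True,  "on_time": True,  "viewed": True},
    {"assessed": True,  "on_time": False, "viewed": True},
    {"assessed": False, "on_time": False, "viewed": True},
    {"assessed": False, "on_time": False, "viewed": False},
]
print(trajectory(learner))  # ['on track', 'behind', 'auditing', 'out']
```

It’s this label sequence per learner, rather than raw clickstream data, that the clustering then operates on.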

 

“Forum activity (Figure 4) varies significantly between engagement trajectories with medium to large effect sizes, with Completing learners participating at significantly higher rates than learners in other engagement trajectories (p<.001). For example, in the GS-level course, Completing learners exhibit significantly higher levels of activity on the discussion board compared to Auditing (d=.721, mean=.536), Disengaging (d=.480, mean=1.98), and Sampling learners (d=1.97, mean=.09). The significance of these differences is preserved when controlling for the different durations of these learners’ participation in the course. On average, Completing learners write 1.71 posts and comments in the UG-level, .788 in the HS-level, and 7.18 in GS-level course”
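The d values in that quote are Cohen’s d effect sizes (standardized mean differences). As a reminder of what that statistic measures, here is a minimal sketch using made-up post counts, not the paper’s data:

```python
# Cohen's d: difference in group means divided by the pooled
# (sample) standard deviation. Toy data for illustration only.
from statistics import mean, variance

def cohens_d(a, b):
    """Standardized mean difference between two samples."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

completing = [8, 6, 7, 9, 5]  # hypothetical forum post counts
auditing = [1, 0, 2, 1, 0]
print(round(cohens_d(completing, auditing), 2))
```

Roughly, d around .2 is a small effect, .5 medium, .8 large, which is why the paper describes its forum-activity differences as medium to large.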

“1. ‘Completing’: learners who completed the majority of the assessments offered in the class. Though these participants varied in how well they performed on the assessment, they all at least attempted the assignments. This engagement pattern is most similar to a student in a traditional class.

2. ‘Auditing’: learners who did assessments infrequently if at all and engaged instead by watching video lectures. Students in this cluster followed the course for the majority of its duration. No students in this cluster obtained course credit.

3. ‘Disengaging’: learners who did assessments at the beginning of the course but then have a marked decrease in engagement (their engagement patterns look like Completing at the beginning of the course but then the student either disappears from the course entirely or sparsely watches video lectures). The moments at which the learners disengage differ, but it is generally in the first third of the class.

4. ‘Sampling’: learners who watched video lectures for only one or two assessment periods (generally learners in this category watch just a single video). Though many learners “sample” at the beginning of the course, there are many others that briefly explore the material when the class is already fully under way”

 

The courses are at three levels: HS is high school, UG is undergraduate, and GS is graduate student. Interestingly, the HS course had by far the highest Completing cluster, and also the highest Disengaging; Auditing (viewing videos but not doing assignments) was similar across all three. Sampling varied hugely.

Course   Auditing   Completing   Disengaging   Sampling
HS       6%         27%          28%           39%
UG       6%         8%           12%           74%
GS       9%         5%           6%            80%
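The clustering that produces these four groups operates, per the paper, on the per-period label sequences. A minimal k-means sketch of the idea follows; the numeric encoding of the labels and the toy trajectories are my assumptions (the paper’s exact encoding and distance choices may differ):

```python
# Toy k-means over numerically encoded engagement-label sequences.
import random

# Assumed ordinal encoding of the four per-period labels.
ENCODING = {"on track": 3, "behind": 2, "auditing": 1, "out": 0}

def encode(seq):
    return [ENCODING[label] for label in seq]

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # Recompute each centroid as the mean of its cluster.
        centroids = [
            [sum(dim) / len(cl) for dim in zip(*cl)] if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# One caricature of each prototypical trajectory over four periods.
trajectories = [
    ["on track"] * 4,                        # Completing-like
    ["auditing"] * 4,                        # Auditing-like
    ["on track", "on track", "out", "out"],  # Disengaging-like
    ["auditing", "out", "out", "out"],       # Sampling-like
]
points = [encode(t) for t in trajectories]
centroids, clusters = kmeans(points, k=4)
print([len(c) for c in clusters])
```

With real data there would be thousands of points per course, and the interesting result is that the same four centroid shapes emerge across all three courses.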

 

“Understanding who learners are, why they enroll in the course, and other activities in the course is a first step towards illuminating potential influences on the self-selection of learners into these engagement patterns. Differences in the distribution of particular features across clusters may indicate that these demographic variables or learning processes affect learners’ engagement decisions.”

This seems key, and, though analytics of this type won’t catch all of that picture, a picture without the analytics is also incomplete.

 

“Since MOOCs are relatively new to most learners, it is reasonable to hypothesize that users are going to adapt over time to better take advantage of free material. As a result we predict that learner patterns of engagement will also change – a trend which could be explored through clustering engagement over present and future offerings of the same course.”

 

This is an interesting point. Institutions, organisers, companies and participants will all, perhaps, alter their engagement with the phenomenon as it, and they, evolve in response to each other.

 

“Auditing appears to be an alternative engagement pathway for meeting learner needs, with Auditing learners reporting similarly high levels of overall experience to Completing learners in two of three courses. This implies different underlying preferences or constraints for Auditing and Completing learners, and points to an opportunity to design features to actively support these engagement trajectories. Auditors could be identified via self-report or based on a predictive model that should be developed for the early detection of engagement patterns.”

In cMOOCs, for example, where active connection is considered to be the metric, Kop reports that seminars were viewed in replay mode far more than in live sessions, that lurkers made up a significant part of the participants, and that some lurkers reported on the utility of the course.

 

“From the surveys, the most prominent reasons that learners across the three courses selected on a list of reasons for disengaging were: personal commitment(s), work conflict, and course workload. While the personal constraints reflected in the first two reasons may be unassailable, the three together can be interpreted to mean that some of these learners may have been better served by a course that was offered at a slower pace or even entirely self-paced.”

In other MOOCs (I must track down the stats), learners report disengaging because the workload was too high or too low, or was spontaneously adjusted (this is on the MOOC-completion data-visualisation site I favourited and need to read), or because course requirements were not explicitly laid out in advance (Kop). There are parallels here.

 

“More qualitative or survey-based data should be gathered on why learners choose to leave these courses. This data should be combined with more fine-grained analytics to develop a predictive model for when learners are likely to disengage in the future and what category of disengagement their choice falls into.”

Excellent idea. And an argument in favour of an LMS based course – on open platforms with no, or few, centralised resources, this data is not easy to get, or, at times, is impossible.

 

“For example, forum activity is a behavioral measure excluded from the clustering algorithm that differentiates the trajectories and deepens our understanding of the activities of highly engaged learners. Completing learners exhibit the highest level of activity on the forum; notably, this rate is much higher than that of Disengaging learners, who are initially assessment-oriented and then disengage from the course. While to some extent this is a reflection of Completing learners’ high level of engagement with the course overall, we may hypothesize that participation on the forum creates a positive feedback loop for some learners, as they are provided with social and informational inputs that help them stay on their trajectory towards completion. This hypothesis can be tested using encouragement designs, such as reputation systems, or by leveraging social influence by displaying participation levels or contributions of other learners”

 

They cite R. Kraut, P. Resnick, S. Kiesler, Y. Ren, Y. Chen, M. Burke, N. Kittur, J. Riedl, and J. Konstan. Building Successful Online Communities: Evidence-Based Social Design. The MIT Press, 2012 – which I need to get.

“Platform designers should also consider building other community-oriented features to promote pro-social behavior, such as text or video chat, small-group projects, or facilitated discussions. Linking these community features more strongly to the content in the course–for example, “situated” discussions linked to a point in a video or other resource–may further promote learning.”

“Interventions demonstrated to counteract stereotype threat among women taking math tests include presenting the test as one where there are no gender differences associated with results – one where “everyone can achieve” – or a test that is described as designed to help you learn, not one that is diagnostic of your current skills [26]. Both of these devices could be used to frame assessments to counteract instances of stereotype threat in MOOCs”

This is interesting. They are talking about stereotype threat as a possible problem here – the fear that one will fail at a particular type of task where there is a stereotype of failure (e.g. that women are bad at quantitative tasks), resulting in lower rates of success for the stereotyped group despite both groups being equal in ability. The answers in terms of motivation are useful, and come from:

S. Spencer, C. Steele, and D. Quinn. Stereotype threat and women’s math performance. Journal of Experimental Social Psychology, 35(1):4–28, 1999.

“Much commentary has focused on the role that MOOCs can play in credentialing and opportunities for job (re)training. While acquiring new skills, along with the certification of those skills, is certainly important to many participants, there are far more who are driven by the intellectual stimulation offered by the courses. MOOCs are evidently playing an important role in providing opportunities for engaging in lifelong learning outside of the confines of an institution, and can potentially serve as a powerful means of harnessing the “cognitive surplus” [24] that has emerged in a post-industrial age”

 

Discussing completion rates without knowing about motivation is useless. Here, a key point is illuminated: completion may not always be a motivation.

“The second trend concerns the promise that MOOCs hold for global access to education. Though there are many exceptions, it is notable that the learners in all three courses tend to be well-educated professionals from high-HDI countries.

Moreover, the majority are male. These facts are partially an artefact of the technical nature of these courses.

The awareness of MOOCs is also likely much higher among learners from the US, which dominates the enrollment of the three courses under analysis. But broadband access is likely to be a factor as well, as many learners in low- and medium-HDI countries are faced by intermittent, slow, or metered bandwidth that would make it a challenge to fully engage with video-heavy courses. MOOC designers should consider decreasing videos or offering only the audio version of the lecture, two strategies that would also have implications for pedagogy and learning. The skew in geographical distribution is a clear call to action for those in the MOOC space who are focused on issues of access and equity, and explanations for this phenomenon should be pursued in order to develop more culturally sensitive and accommodating MOOCs.”

 

 
