I do. A Question of Engagement.

2012. The year of the Summer Olympics. The Diamond Jubilee of the Queen of England. And the year that the worlds of higher education and learning and development focused strongly on Massive Open Online Courses. It was dubbed ‘the year of the MOOC’.

2012 also saw the start of the obsession with completion data in online courses. I have absolutely no issue with collecting data on this metric for analysis, but it shouldn’t be the be-all-and-end-all in measuring the success of a course. Nowhere else in learning does this metric carry such weight: university student registrations, hard-copy book sales, YouTube videos, iTunes U downloads, Amazon Kindle downloads, Audible sales and so on are all measured by initial sign-up, views or downloads, not by who completed them.

So why am I discussing an eight-year-old metric in 2020?

Due to the pandemic there has been a sudden surge of interest in online courses, and the 2012 discussions around the importance of completion data, which may previously have passed some by, need to be reopened.

For the entirety of my doctorate I focused on engagement in online courses, analysing 800,038 enrollers, 425,972 learners and 120,842 survey responses. My findings supported the belief that I have held for years: engagement is more important than completion.

Some readers right now will be throwing their hands up in the air, shutting down this blog article and declaring I don’t know what I’m talking about. But after 20 years in L&D, 15 in blended and online learning, producing literally thousands of pieces of content, including over 300 MOOCs, and researching the largest single-source dataset in MOOCs to date, I have a pretty good handle on things. So hear me out…

Courses only work if the participant needs to learn everything within them. Otherwise participants zone out or gloss over what they already know, which makes it harder to engage with the parts they don’t. We have all been there: clicking through a course or sitting in a workshop, glazing over while the course or the facilitator covers a subject or a skill we are already aware of; then, when it gets to the part we have no prior knowledge of, we don’t suddenly sit bolt upright, all ears, because we are no longer engaged. So the participant may get to the end, they may complete the course, but did they engage with it? Did they learn the parts they had no prior knowledge of to the level you wished? Will your workplace or organisation’s bottom line reap the benefits? No. So why is this metric more important? It’s not.

Completion data was created in 2012 for one purpose: monetary value. MOOC platforms based their income generation on certificates sold to learners who completed. For years the pedagogical success of a course was based solely on its completion data, never on engagement within the course. As someone who places learning experience at the heart of human-centred learning, I find metrics on engagement with content highly valuable. In a previous blog post I spoke about the concept of modular learning and the removal of sequential learning. I originally wrote that post as part of my doctoral student blog years ago, and it is still relevant today.

So why is this important in a pandemic?

With the rapid movement to online, it is really important to define your metrics early on. What is it that matters to you: participants achieving learning outcomes, or completion metrics? If you are tracking the application of learning outcomes through KPIs (e.g. customer satisfaction, an increase in sales, a shortening of the gestation period for contract completion, etc.), does a click-through completion metric really matter?
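
To illustrate the difference in plain terms, here is a minimal sketch (with an entirely hypothetical data structure and made-up numbers, not how any particular platform reports these figures) contrasting an engagement rate, the proportion of available content each learner actually interacted with, against the binary completion flag most platforms report:

```python
from dataclasses import dataclass

@dataclass
class LearnerRecord:
    learner_id: str
    steps_visited: int      # course steps the learner interacted with
    steps_total: int        # course steps available
    marked_complete: bool   # the platform's binary completion flag


def engagement_rate(records: list[LearnerRecord]) -> float:
    """Mean proportion of available steps each learner engaged with."""
    if not records:
        return 0.0
    return sum(r.steps_visited / r.steps_total for r in records) / len(records)


def completion_rate(records: list[LearnerRecord]) -> float:
    """Proportion of learners flagged as having completed."""
    if not records:
        return 0.0
    return sum(r.marked_complete for r in records) / len(records)


if __name__ == "__main__":
    # Hypothetical cohort: two learners engage deeply, one only dips in.
    cohort = [
        LearnerRecord("a", steps_visited=18, steps_total=20, marked_complete=False),
        LearnerRecord("b", steps_visited=20, steps_total=20, marked_complete=True),
        LearnerRecord("c", steps_visited=5, steps_total=20, marked_complete=False),
    ]
    print(f"Engagement rate: {engagement_rate(cohort):.0%}")   # ~72%
    print(f"Completion rate: {completion_rate(cohort):.0%}")   # ~33%
```

The point of the sketch is simply that the two numbers can tell very different stories about the same cohort: a course can look like a failure on completion alone while learners are in fact engaging with most of its content.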

With companies moving rapidly to online, and the change in direction for a number of organisations, the concept of completing a lengthy course needs to be reviewed – can it be chunked into bite-size content? Are all the components of the course necessary? Is completion really more important than how a learner engages?

As you can see, there are a number of factors that are open to debate, and they have been fiercely debated for the last eight years. Now that L&D is mainstreaming into the world of online, hopefully a number of elements of this debate can be reviewed internally for discussion. The key take-home here is that one size does not fit all, but at least make sure that what you select is an appropriate fit for the organisation that you are a part of.

*Due to a lengthy battle with tonsillitis, this blog post was severely delayed.
