Clicking Links in an Online Course and Student Engagement


[Figure: pie chart of overall LMS tool use, via blackboard.com]



Blackboard's data science team has done a study on the data from all that student clicking in its learning management system, aggregating data from 70,000 courses at 927 colleges and universities in North America during the spring 2016 semester. That's big data.

But the results (reported this week on their blog) are not so surprising. In fact, their own blog post title on the results - "How successful students use LMS tools – confirming our hunches" - implies that we shouldn't be very surprised.

Let us look at the four LMS tools they found most important in predicting student grades. Having taught online for fifteen years, it makes sense to me that these four are also the tools used most frequently.

On top was the gradebook - not the actual grades, but the finding that students who frequently check their grades throughout the semester tend to get better marks than those who look less often. "The most successful students are those who access MyGrades most frequently; students doing poorly do not access their grades. Students who never access their grades are more likely to fail than students who access them at least once. There is a direct relationship at every quartile of use – and at the risk of spoiling results for the other tools, this is the only tool for which this direct trend exists. It appears that students in the middle range of grades aren’t impacted by their use of the tool."
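To make that quartile comparison concrete, here is a minimal sketch (mine, not Blackboard's actual pipeline) of how per-student access counts might be binned into quartiles and compared against final grades. The column names and numbers are made up for illustration; only standard pandas qcut/groupby calls are used.

```python
import pandas as pd

# Hypothetical per-student data: MyGrades access count and final grade (0-100).
# These names and values are illustrative, not Blackboard's schema.
df = pd.DataFrame({
    "mygrades_accesses": [0, 0, 2, 5, 9, 14, 22, 31, 47, 60],
    "final_grade":       [52, 58, 61, 70, 74, 78, 83, 86, 90, 92],
})

# Keep students who never opened MyGrades as their own group,
# then split the rest into quartiles of use.
never = df[df["mygrades_accesses"] == 0]
used = df[df["mygrades_accesses"] > 0].copy()
used["use_quartile"] = pd.qcut(used["mygrades_accesses"], 4,
                               labels=["Q1", "Q2", "Q3", "Q4"])

# Average final grade for non-users vs. each quartile of users.
print("Never accessed:", never["final_grade"].mean())
print(used.groupby("use_quartile", observed=True)["final_grade"].mean())
```

A comparison like this only shows an association between use and grades within a made-up table; it says nothing about causality, which is exactly the caveat raised further down.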

Next was their use of course content. That makes sense. Actually, I would have thought it would be the number one predictor of success. Their data science group reports, "An interesting result was that after the median, additional access is related to a decline in student grade; students spending more than the average amount of time actually have less likelihood of achieving a higher grade!" That's not so surprising. Students spending more time (slow or distracted readers, or ones who skimmed and need to return to the material repeatedly) are probably having problems rather than being more thorough. The student who spends an hour on a problem that should take 15 minutes is not showing grit.

This is followed by assessments (tests etc.) and assignments. "If students don’t complete quizzes or submit assignments for a course, they have lower grades than those who do so. This was not a surprising finding. What was surprising to me is that this wasn’t the strongest predictor of a student’s grade." Why is that surprising? Because it is what we use to evaluate and give those grades.

Digging a bit deeper into that data, Blackboard concludes that time is a factor, noting a "...strong decline in grade for students who spend more than the average amount of time taking assessments. This is an intuitive result. Students who have mastered course material can quickly answer questions; those who ponder over questions are more likely to be students who are struggling with the material. The relationship is stronger in assessments than assignments because assessments measure all time spent in the assessment, whereas assignments doesn’t measure the offline time spent creating the material that is submitted. Regardless, this trend of average time spent as the most frequent behavior of successful students is consistent across both tools, and is a markedly different relationship than is found in other tools."

The fifth tool was discussion. I have personally found discussions to be very revealing of a student's engagement in the course. I also find that level of engagement/participation correlates with final grades, but that may be because I include discussions in the final grade. I know lots of instructors who don't require discussions, don't grade them, or give them only a small weight in the final grade.

An article on The Chronicle of Higher Education website is a bit unsure about the value of all this big data. "But it’s hard to know what to make of the click patterns. Take the finding about grade-checking: Is it an existential victory for grade-grubbers, proving that obsessing over grades leads to high marks? Or does it simply confirm the common-sense notion that the best students are the most savvy at using things like course-management systems?"

And John Whitmer, director of learning analytics and research at Blackboard, says "I’m not saying anything that implies causality."

Should we be looking at the data from learning-management systems with an eye to increasing student engagement? Of course. But learning science is a new term and a new field, and I don't think we are far enough past the data-collection stage to have a clear learning path or solid course adjustments to recommend.

Measuring clicks on links in an LMS can easily be deceiving, as can measuring the time spent on a page or in the course. If you are brand new to the LMS, you might click twice as much as an experienced user. Spending 10 minutes on a page versus 5 minutes doesn't mean much either, since we don't know whether that time was spent reading, rereading, or going out to get a coffee.
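As a rough illustration of that last point, here is a sketch (my own, not from any LMS) of how time-on-page is usually inferred from a click log: each page gets credited with the gap until the next click, so a coffee break looks just like careful rereading unless long gaps are capped at some arbitrary cutoff.

```python
from datetime import datetime

# Hypothetical click log for one student: (timestamp, page). Illustrative only.
events = [
    (datetime(2016, 3, 1, 10, 0),  "content/week5"),
    (datetime(2016, 3, 1, 10, 4),  "content/week5"),  # reread four minutes later
    (datetime(2016, 3, 1, 10, 31), "quiz/week5"),     # 27-minute gap: reading? coffee?
]

CAP_MINUTES = 15  # arbitrary guess at "probably walked away"

def time_on_page(events, cap=CAP_MINUTES):
    """Estimate minutes per page from the gaps between consecutive clicks."""
    totals = {}
    for (start, page), (nxt, _) in zip(events, events[1:]):
        gap_minutes = (nxt - start).total_seconds() / 60
        # The cap is a guess, not ground truth about what the student was doing.
        totals[page] = totals.get(page, 0) + min(gap_minutes, cap)
    return totals

print(time_on_page(events))  # {'content/week5': 19.0} -- the 27-minute gap was capped at 15
```

Note that the last click gets no duration at all, and the cap is pure convention; two different cutoffs give two different pictures of the same student.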

It's a start, and I'm sure more will come from Blackboard, Canvas, MOOC providers (who will have even greater numbers, though in a very different setting) and others.

