Competency and Mastery: Part 2 - How to Measure It

Yesterday, I wrote about differentiating mastery and competency in light of movements such as competency-based education and degree programs.

The Mozilla Open Badge project and other initiatives have tried to standardize the use of badges for documenting learning. I like the idea but I don't see that badges have made any serious entry into educational institutions.

Badges have been used to mark what a person knows or what they can do. Proponents say using them is more student-centered and more about real student learning. It's certainly more real than using seat time and time on task as a measurement. That a student has completed 9 credit hours proves nothing, and increasingly we hear from employers that earning an "A" for those 9 credits doesn't prove any mastery or competency either. Enter competency-based or evidence-based approaches to learning.

I still think about the merit badges I earned in scouting when this topic comes up. The badges were extrinsic motivators and they worked for me and most of my fellow scouts. You wanted to get them. I liked the ceremonial awarding of them at meetings and the recognition. My mom and my "den mother" were pretty conscientious about signing off that I had completed the requirements to earn them. But much of the work I had to do was on the "honor system," and I'm sure I cut corners on some things and got away with it.

If I earned a badge for "climbing" (as in rock and mountains), would you say I was competent at the sport? Would you say I had mastered it? I don't think I'd be comfortable saying either one of those things. I had learned about it and I had done some actual activities involved with it. I had not mastered it and I'm not sure a real climber would say I was competent enough to do it on my own or very seriously.

As Bernard Bull and others have pointed out, this same critique can be leveled at letter grades. Do both make school about "earning instead of learning"?

We also associate badges with video games and in the gamification of learning they play an important role. In the pure gaming environment, earning badges, points, power pills or whatever tokens are given sometimes does take precedence over learning. Then again, some games aren't much interested in learning.

It's better to think of badges as markers, milestones of progress rather than as a goal. 

The Mozilla project and others have tried to build more trust in badges as credentials and educational currency. Education has always valued tests, scores and credits as evidence of learning, even though we have been arguing about them for hundreds of years and continue to do so.

If the organization awarding the badge is credible, then the real concern is what evidence is being used to determine the completion. As with the goals and objectives we now hold as important in schools, some things are more easily measured.

Want to earn the "Miler" badge? Then run a mile in under 5 minutes and have it verified by the teacher or coach. Want to earn the "Team Player" or "Leadership" badges? Then... play on a team... be the captain...  Hmmm. Those are tougher things to measure.

Students, teachers and schools have talked for a long time about trying to get away from a reliance on just grades, but grades persist. Portfolio assessment and other movements have made a dent in some instances, but the quantifiable test score still wins the day. That stopwatch on the mile runner is easily validated. Today there is more testing and data being used and more complaints about its use.

Learning Beyond Letter Grades was a course offered last year that examined why so many schools use and rely on letter grades. "Where did they come from? What do they tell us and fail to tell us about the learners? What is the relationship between letter grades, student learning, and assessment?" That's a lot to ask in a six-week course, but it comes from this desire many of us have to consider authentic and alternative assessments, peer assessment, self-assessment and badges.

Some badges set an expiration date, meaning the badge bearer will need to return for more training or provide updated evidence to keep the badge.  That's an idea from the world of professional development, licensing and credentials. If you earned a computer programming or phlebotomy badge in 2001, should it still be valid today? Perhaps not.
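The expiration idea maps neatly onto how digital badges can be checked in software: the Open Badges data model allows an assertion to carry an expiration timestamp. Here is a minimal sketch of a validity check; the dictionary structure and the `"expires"` key below are my own illustration of the idea, not the exact spec schema:

```python
from datetime import datetime, timezone

def badge_is_valid(badge, now=None):
    """Return True if the badge has no expiration date, or has not yet expired.

    `badge` is a plain dict standing in for a badge assertion; the optional
    "expires" key holds an ISO 8601 timestamp.
    """
    now = now or datetime.now(timezone.utc)
    expires = badge.get("expires")
    if expires is None:
        return True  # no expiration date: the badge never lapses
    return datetime.fromisoformat(expires) > now

# A hypothetical phlebotomy badge earned in 2001 that expired after five years:
old_badge = {"name": "Phlebotomy",
             "issued": "2001-06-01T00:00:00+00:00",
             "expires": "2006-06-01T00:00:00+00:00"}
print(badge_is_valid(old_badge))  # the 2001 badge fails the check today
```

The design choice mirrors professional licensing: absence of an expiration means permanent validity, while an expired timestamp sends the bearer back for retraining or updated evidence.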

Perhaps the most difficult hurdle in launching a competency or mastery-based program might be how to assess/validate learning. We have been hitting that one back and forth for centuries.

Competency vs. Mastery: Part 1 - A Matter of Definitions?

In 2014, I started seeing more articles about Competency-Based Education (CBE) as the new approach to higher education degrees. In 2013, I think that "mastery" might have been the buzzier word. Mastery got a big push last year from things like Khan Academy and founder Sal Khan's belief that mastery of a skill or concept before moving on was what was lacking in American education overall.

A simplified explanation of the difference, perhaps from the view of an employer, is measuring what they know (mastery) versus what they can do (competency).

Is competency the new mastery? I did some searching and turned up a piece called "Competency vs. Mastery" by John F. Ebersole (president of Excelsior College) on the Inside Higher Ed site that compares these two approaches to "validating" learning.

He suggests that “competency” could be akin to subject matter “mastery” and might be measured in traditional ways - examinations, projects and other forms of assessment.

Ask that hypothetical woman on the street whether she would rather hire someone who had mastered a skill or someone who was merely competent, and I suspect mastery would win out. Of course, that person's ability to apply what they have mastered in practice might still be in question.

It may be semantics, but considering someone to be "competent" sounds to many people like "adequate." That article gives as an example those instructors we have experienced as students who had "complete command of their subjects, but who could not effectively present to their students. The mastery of content did not extend to their being competent as teachers."

What would you say a subject matter exam measures? Mastery? Might an undergraduate have mastered subject matter or skills but still not be competent in her chosen field?

Looking online at the available books on competency-based education and training, most of them are in healthcare and clinical supervision, which are also the programs discussed in the article. Does the CBE approach work with other disciplines?

Some interest in CBE comes from that often-heard idea that employers don't view new college graduates as ready to do the job. They expect to have to further train the new hire who has "mastered content, but [is] not demonstrating competencies."

Ebersole says that "To continue to use 'competency' when we mean 'mastery' may seem like a small thing. Yet, if we of the academy cannot be more precise in our use of language, we stand to further the distrust which many already have of us."

In Part 2, I will consider measuring mastery and competency.

A Ben Franklin Resolution for the New Year


“An investment in knowledge pays the best interest.” — Ben Franklin

If you're looking for a resolution for the new year, you might look to Ben Franklin as a model.

Benjamin Franklin's father dreamt of sending him to Harvard University. But he had a big family to support and was not wealthy. With 17 children, Josiah and Abiah Franklin could only afford two years of schooling for Benjamin. Instead, they made him work, and when he was 12 he became an apprentice to his brother James who was a printer in Boston.

The printing business gave Benjamin the opportunity to read books and pamphlets, which were his Internet. He read everything, and taught himself every skill and discipline one could absorb from text.

Ben wrote later that he was determined to fix this lack of education by investing several hours each day in reading and self-education. Of course, self-learning is not schooling, but long-term it can have a significant effect.

Long before "massive open online courses," the idea of 10,000 hours for mastery and "personal learning networks," Franklin calculated that by reading an hour or more daily in a chosen field, he would read approximately one book per week. One book per week translates into roughly 50 per year, and that would make him "expert" in a field within 3 years.
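Franklin's arithmetic is easy to check. Here is a back-of-the-envelope sketch, assuming a typical book of about 200 pages and a reading pace of about 30 pages per hour; both of those numbers are my assumptions, not Franklin's:

```python
# Franklin's reading plan, as rough arithmetic
pages_per_hour = 30    # assumed reading pace
hours_per_day = 1      # Franklin's minimum daily commitment
pages_per_book = 200   # assumed length of a typical book

pages_per_week = pages_per_hour * hours_per_day * 7   # 210 pages a week
books_per_week = pages_per_week / pages_per_book      # about one book, as Franklin figured
books_per_year = books_per_week * 52                  # roughly his 50 per year
books_to_expertise = books_per_year * 3               # the 3-year path to "expert"

print(f"{books_per_week:.2f} books/week, about {books_per_year:.0f}/year, "
      f"about {books_to_expertise:.0f} books over 3 years")
```

Even with more conservative assumptions, the plan lands in the same neighborhood: one book a week compounds quickly.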

You may not be seeking expertise and you might choose a field where expertise requires more than just "book learning." And you might want a more "liberal arts" education, and so you might read in a number of fields instead of one. Still, Ben's plan is not a bad one to undertake in the new year.

“Tell me and I forget, teach me and I may remember, involve me and I learn.” - Benjamin Franklin


Most-Popular Wired Campus Articles of 2014

The Chronicle of Higher Education's Wired Campus column posted its ten most popular stories with readers from 2014. I don't see one trend dominating their technology-on-campus world: there are stories on pedagogy, publishing, interacting with parents (surprising, in that we associate that more with K-12 teaching), libraries, online security, and tools and trends. Wired Campus points out that their articles often cross over into mainstream press coverage of technology because technology itself is so mainstream. Most major news outlets in print, on TV or on the Net have technology reporters. Based on reader clicks, these were the top 10 articles:

  1. Are Courses Outdated? MIT Considers Offering ‘Modules’ Instead

  2. Taking Notes by Hand Benefits Recall, Researchers Find

  3. Why One Professor Thinks Academics Should Write ‘BuzzFeed-Style Scholarship’

  4. 6 Technologies Will Change Colleges in Coming Years, Experts Say

  5. 5 Things Researchers Have Discovered About MOOCs

  6. The ‘Heartbleed’ Bug and How Internet Users Can Protect Themselves

  7. This Guy Drew a Cat. You Won’t Believe What Happened 4 Centuries Later

  8. Video: Tech Tools Students Say They Can’t Live Without

  9. How Streaming Media Could Threaten the Mission of Libraries

  10. U. of Tennessee at Martin Encourages Helicopter Parents to Hover

Big-Data Scientists Face Ethical Challenges After Facebook Study

By Paul Voosen

Jeffrey Hancock, a Cornell U. professor who teamed up with Facebook on a controversial study of emotion online, says the experience has led him to think about how to continue such collaborations "in ways that users feel protected, that academics feel protected, and industry feels protected."

From the article: "Last summer the technologists discovered how unaware everyone else was of this new world.

After Facebook, in collaboration with two academics, published a study showing how positive or negative language spreads among its users, a viral storm erupted. Facebook "controls emotions," headlines yelled. Jeffrey T. Hancock, a Cornell University professor of communications and information science who collaborated with Facebook, drew harsh scrutiny. The study was the most shared scientific article of the year on social media. Some critics called for a government investigation.

Much of the heat was fed by hype, mistakes, and underreporting. But the experiment also revealed problems for computational social science that remain unresolved. Several months after the study’s publication, Mr. Hancock broke a media silence and told The New York Times that he would like to help the scientific world address those problems"