Digital wallets are tools for collecting workers’ learner and employment records. They are not new and have gone through different names and conceptualizations. In 2018, I was working with "badges," but the idea wasn't new even then. I had worked with the Mozilla Foundation, which was developing an Open Badges Infrastructure in 2012 (around the time that MOOCs exploded onto the learning scene).
Open Badges is still around, and on its site it claims to be "the world's leading format for digital badges. Open Badges is not a specific product or platform, but a type of digital badge that is verifiable, portable, and packed with information about skills and achievements. Open Badges can be issued, earned, and managed by using a certified Open Badges platform. Want to build new technologies to issue, display, or host Open Badges? The Open Badges standard is a free and open specification available for adoption."
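To make "verifiable, portable, and packed with information" concrete, here is a minimal sketch, in Python, of what an Open Badges 2.0 assertion looks like. The field names follow the published specification; the URLs and salt value are hypothetical placeholders, not real endpoints. Note how the recipient's email is salted and hashed rather than stored in the clear, which is what lets a badge be shared publicly while still being checkable against a known identity.

```python
import hashlib
import json
from datetime import datetime, timezone


def hashed_identity(email: str, salt: str) -> str:
    # Per the Open Badges spec, the identity is the hash of the plaintext
    # email with the salt appended, prefixed with the algorithm name.
    digest = hashlib.sha256((email + salt).encode("utf-8")).hexdigest()
    return "sha256$" + digest


def make_assertion(email: str, badge_url: str, salt: str = "s3cr3t") -> dict:
    # A minimal hosted Assertion. All URLs here are made-up examples.
    return {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "id": "https://example.org/assertions/123",  # hypothetical URL
        "recipient": {
            "type": "email",
            "hashed": True,
            "salt": salt,
            "identity": hashed_identity(email, salt),
        },
        "badge": badge_url,  # points at the issuer's BadgeClass JSON
        "verification": {"type": "hosted"},
        "issuedOn": datetime.now(timezone.utc).isoformat(),
    }


assertion = make_assertion(
    "learner@example.org", "https://example.org/badges/writing-101"
)
print(json.dumps(assertion, indent=2))
```

A verifier who knows the learner's email can recompute the hash with the published salt and confirm the badge was issued to that person, without the issuer ever exposing the address itself.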
The idea of digital wallets is being talked about again amid the trend of skills-based hiring. If you have read that companies are more likely to hire based on skills rather than degrees, then some way - such as a wallet - for individuals to collect and share verifiable records of their schooling, work, training programs, military service, and other experience is necessary. This is a work in progress, though you might expect that an idea that has been around for at least ten years would have gotten further.
There is a push for common technical standards among wallet developers to allow importing data from a variety of sources and sharing that via employers’ applicant-tracking systems.
When I was exploring badges a decade ago, I was also looking at Competency-Based Education (CBE) and mastery as related to higher education degrees. A simplified explanation of the difference from the view of an employer: MASTERY measures what someone knows. COMPETENCY measures what they can do. Formal education has always been more focused on mastery than competency. Employers have those priorities reversed.
As the year comes to an end, you see many end-of-year summary articles and also a glut of predictions for the upcoming year. I'm not in the business of predicting what will happen in education and technology, but I do read those predictions - with several grains of salt.
“A good science fiction story should be able to predict not the automobile but the traffic jam,” wrote sci-fi author Frederik Pohl.
Many of the education and technology predictions I see predict things rather than the impact those things will have. Here are some that I am seeing repeated, so that you don't have to read them all, but can still have an informed conversation at the holiday party at work or that first department meeting in January.
Consider what the folks at higheredexperts.com are planning for 2019 just in the area of higher ed analytics.
Is "augmented analytics" coming to your school? This uses machine learning (a form of artificial intelligence) to augment how we develop, consume and share data.
And IT analyst firm Gartner is known for its top trends reports. For 2019, one that made the list is "immersive user experience." This concept concerns what happens when human capabilities mix with augmented and virtual realities. Looking at the impact of how that changes the ways we perceive the real and digital world is what interests me.
We are still at the early stages of using this outside schools (which are almost always behind the world in general). You can point to devices like the Amazon Alexa being used in homes to turn on music, lights, appliances or tell us a joke. This is entry-level usage. But vocal interaction is an important change. A few years ago it was touch screen interactions. A few decades before it was the mouse, and before that the keyboard. A Gartner video points at companies using remote assistance for applications such as an engineer working with someone in a remote factory to get a piece of equipment back online.
Will faculty be able to do augmented analytics using an immersive user experience? Imagine being able to talk to the LMS you use to teach your course and ask, using a natural language interface, "Which students in this new semester are most likely to have problems with writing assignments?" The system scans the appropriate data sets, examines different what-if scenarios and generates insights. Yes, predictive analytics is already here, but it will be changing.
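Behind a question like that, real systems would run trained models over many signals. A toy sketch of the underlying idea - entirely hypothetical function, data, and cutoff - is just filtering a gradebook for students whose early scores fall below a threshold:

```python
def flag_at_risk(gradebook: dict[str, list[float]],
                 threshold: float = 70.0) -> list[str]:
    """Return students whose average early writing-assignment score
    falls below the threshold. A made-up heuristic, not a real model."""
    return sorted(
        student
        for student, scores in gradebook.items()
        if scores and sum(scores) / len(scores) < threshold
    )


# Hypothetical early-semester scores for three students.
early_scores = {
    "Avery": [62.0, 68.0],
    "Blake": [88.0, 91.0],
    "Casey": [70.0, 65.5],
}
print(flag_at_risk(early_scores))  # → ['Avery', 'Casey']
```

The natural-language interface would be a layer on top, translating the instructor's question into a query like this; the prediction itself comes from whatever model or heuristic the institution has chosen.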
But are IT trends also educational technology trends? There is some crossover.
Perhaps a more important trend for educators to watch next year is changing our thinking from individual devices (and the already too many user interfaces we encounter) to a multimodal and multichannel experience.
Multimodal connects us to many of the edge devices around us. It is your phone, your car, your appliances, your watch, the thermostat, your video doorbell, the gate in a parking lot and devices you encounter at work or in stores.
Multichannel mixes your human senses with computer senses. This is when both are monitoring things in your environment that you already recognize, like heat and humidity, but also things we can't sense, like Bluetooth, RF or radar. This ambient experience means the environment will become the computer.
One broad category is "autonomous things," some of which are already around us and using AI. There are autonomous vehicles. You hear a lot about autonomous cars and trucks, but you may be more likely to encounter an autonomous drone. Will learning become autonomous? That won't be happening in 2019.
AI-driven development is its own trend. Automated testing tools and model generation are here, and AI-driven automated code generation is coming.
Of course, there is more - from things I have never heard of (digital twins) to things that I keep hearing are coming (edge computing) and things that have come and seem to already have mixed reviews (blockchain).
Data governance grows - that should be true if the higheredexperts.com survey has accurately found that 35% of responding institutions don't even have a data governance policy - a common set of rules for collecting, accessing and managing data.
Finding the ROI for AI and VR may be what is necessary to overcome the cost barrier to full-scale implementation of virtual and augmented reality. AI has made more inroads in education than VR. An example is Georgia State University's Pounce chatbot.
Their fourth prediction is institutions learning how to use the blockchain. The potential is definitely there, but implementation is challenging.
Predictions. I wrote elsewhere about Isaac Newton's 1704 prediction of the end of the world. He's not the first or last to predict the end. Most have been proven wrong. Newton - certainly a well-respected scientist - set the end at or after 2060, but not before that. So we have at least 41 years to go.
Using some strange mathematical calculations and the Bible's Book of Revelation, this mathematician, astronomer and physicist came to believe that his most important work would be deciphering ancient scriptures.
I use that famous line from "The Treasure of the Sierra Madre" but I'm talking about badges used for learning. The film says "need" but I say that in matters of learning we don't seem to "want" any badges.
It seemed like badges used to show learning progress were going to be a big thing. That was especially true with online learning, and then when MOOCs exploded onto the scene around 2011. But badges still have not made significant inroads in education.
They didn't make any impact in credit-bearing courses, but they should have had more impact with lifelong learning, MOOCs, alternative education and non-credit learning opportunities.
I would compare their lack of acceptance to some of the reasons why MOOCs never really changed higher education. Badges and MOOCs are really great for non-credit learning, but when the movement to garner college credit from them started, there was no acceptance from higher education. Colleges saw both as threats.
Similarly, some thought badges would allow learners to get "credit" for their learning with employers, either to advance or get a job. But employers also did not take to them. I don't think they felt threatened. It was more that they weren't convinced that the learning was legitimate. I suppose that idea of validating the learning was also a factor for colleges, though the threat of lost tuition was much greater.
So, the problem is still the same as it was years ago: we need a way to design badges so that, at completion, a school or employer will be confident that the learner has actually mastered the skill for that badge.
I wrote earlier about a project by the Education Design Lab that tried to involve employers who committed to consider badges in their hiring of recent college graduates. But I don't see much evidence on their site of progress.