
Soft Skills and the Reset on Employer Degree Requirements

Photo: fauxels

Job-market studies of more than 15 million job postings nationally between 2016 and 2021 found that more than a third of the top 20 skills specified for the average job had changed, and one in five of those top skills was entirely new to the work. Some people are calling it a reset, one accelerated by the pandemic. Employers are resetting degree requirements across a wide range of roles, dropping the requirement for a bachelor’s degree in many middle-skill and even some higher-skill positions. This trend has implications for how employers find talent, and it opens up opportunities for the two-thirds of Americans without a college education. At least one report projected that an additional 1.4 million jobs could open to workers without college degrees over the next five years.

Is higher education out of the game? No, it still has a role to play, but these kinds of reports found that employers’ demand for bachelor’s and postgraduate degrees was “starting to decrease perceptibly.” Almost half of middle-skill jobs and nearly a third of high-skill occupations showed significant reductions in degree requirements between 2017 and 2021.

This is a good thing if you are concerned with equity in hiring: those 1.4 million projected openings would go to workers who have the requisite skills but no degree.

I wrote here recently about seeing more emphasis on skills-based hiring, competency, and mastery, and less on degrees. Are employers lowering their standards? Not really: when they drop degree requirements, job postings become more specific about skills, and they are now very likely to be specific about soft skills. The assumption that an applicant with a college education will arrive with skills such as writing, communication, and attention to detail is not a good assumption.

I have never liked the term "soft skills," which seems to undervalue those skills. A clear majority of employers say soft skills play a critical role in their hiring decisions.

ZipRecruiter compiled some of the most in-demand soft skills on its platform. Here are the top skills on that list, along with the number of jobs on the site listing each skill as a requirement.

Communication skills: 6.1 million
Customer service: 5.5 million
Scheduling: 5 million
Time management skills: 3.6 million
Project management: 2.8 million
Analytical thinking: 2.7 million
Ability to work independently: 2 million
Flexibility: 1.3 million

Having studied, worked, and taught a lot around communications, I am not surprised that remote and hybrid work arrangements have increased the need for good communication skills. That need ranges from how you respond to an email to how you make a presentation to a live or virtual audience.

Flexibility is a broad skill and hardly "soft." Multitasking, shifting to virtual work, learning new tools, and a host of other situations follow the adage that the only constant is change.

Will the skills-based hiring trend continue? Some major employers, like IBM and Accenture, have publicly altered their hiring practices. But several tech companies that made big announcements about favoring skills over degrees in hiring for IT jobs still haven’t eliminated degree requirements from their job descriptions.

Digital Wallets

Image by Gerd Altmann from Pixabay

Digital wallets are tools for collecting workers’ learning and employment records. They are not new and have gone through different names and conceptualizations. In 2018, I was working with "badges," but the idea wasn't new even then: I had worked with the Mozilla Foundation when it was developing its Open Badges Infrastructure in 2012 (around the time that MOOCs exploded onto the learning scene).

Open Badges is still around, and on its site it claims to be "the world's leading format for digital badges. Open Badges is not a specific product or platform, but a type of digital badge that is verifiable, portable, and packed with information about skills and achievements. Open Badges can be issued, earned, and managed by using a certified Open Badges platform. Want to build new technologies to issue, display, or host Open Badges? The Open Badges standard is a free and open specification available for adoption."
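
To make that concrete, here is a minimal sketch of what a hosted Open Badges assertion looks like as data, based on my reading of the v2 specification; the URLs, dates, salt, and hash below are placeholders, not a real issuer's records.

```python
import json

# A minimal sketch of an Open Badges v2 "Assertion" (the record a learner earns),
# based on my reading of the specification. All values are placeholders.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.org/assertions/123",  # where this assertion is hosted
    "recipient": {
        "type": "email",
        "hashed": True,
        "salt": "placeholder-salt",
        # A sha256 hash of (email + salt), so the badge can be verified
        # without publishing the earner's address in plain text.
        "identity": "sha256$0000000000000000000000000000000000000000000000000000000000000000",
    },
    "badge": "https://example.org/badges/communication-skills",  # the BadgeClass being awarded
    "issuedOn": "2023-05-01T00:00:00Z",
    "verification": {"type": "hosted"},  # verified by fetching the hosted JSON
}

print(json.dumps(assertion, indent=2))
```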

The idea of digital wallets is being talked about again because of the trend toward skills-based hiring. If companies really are more likely to hire based on skills rather than degrees, then individuals need some way, such as a wallet, to collect and share verifiable records of their schooling, work, training programs, military service, and other experience. This is still a work in progress, though you might expect that an idea that has been around for at least ten years would have gotten further by now.

There is a push for common technical standards among wallet developers so that wallets can import data from a variety of sources and share it through employers’ applicant-tracking systems.
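
Stripped of any particular standard, the wallet idea is essentially an aggregation layer: pull records in from schools, employers, training programs, and military service, keep them in one place, and export them in a form an applicant-tracking system can ingest. Here is a purely illustrative sketch of that shape; the record fields, names, and export format are hypothetical and not tied to any real wallet product or standard.

```python
from dataclasses import dataclass, field

# Purely illustrative: a toy model of a learner's digital wallet.
# The record fields and "export" format are hypothetical, not any real standard.
@dataclass
class Record:
    source: str        # e.g., a college, employer, training program, or military branch
    claim: str         # the skill, credential, or experience being asserted
    issued: str        # date issued
    evidence_url: str  # where a verifier can check the claim

@dataclass
class Wallet:
    owner: str
    records: list[Record] = field(default_factory=list)

    def add(self, record: Record) -> None:
        self.records.append(record)

    def export_for_ats(self) -> list[dict]:
        # Flatten records into plain dictionaries an applicant-tracking system
        # could ingest; a real wallet would rely on a shared standard here.
        return [vars(r) for r in self.records]

wallet = Wallet(owner="Jane Learner")
wallet.add(Record("County College", "Project management certificate",
                  "2022-06-15", "https://example.edu/verify/pm-123"))
wallet.add(Record("Acme Corp", "Two years of customer service experience",
                  "2023-01-31", "https://example.com/verify/cs-456"))
print(wallet.export_for_ats())
```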

When I was exploring badges a decade ago, I was also looking at Competency-Based Education (CBE) and mastery as they relate to higher education degrees. A simplified explanation of the difference from an employer's point of view: MASTERY is measuring what a person knows; COMPETENCY is what they can do. Formal education has always been more focused on mastery than competency. Employers have those priorities reversed.

MORE: https://info.jff.org/digital-wallets


What Is on the Horizon in Higher Education

The annual EDUCAUSE Horizon Report for Higher Education is always interesting to read. The report for 2019 is online now. It is 44 pages, so it would be a full lunchtime read, but as a cheater's guide or preview I offer the two parts that I always look at first.

One is the section on "Key Trends Accelerating Higher Education Technology Adoption." If you look back at past reports, you will see that some trends return for several years. That is partly intentional, since the report sorts trends into "Short-Term" (the next one or two years), mid-term (3-5 years), and long-term (probably 5+ years away).

Of course, there are also trends and tech developments that are almost perennial. We always seem to be rethinking online learning, learning spaces, and assessment. And some developments, such as blockchain and rethinking degrees, have been "on the horizon" for a chunk of years and still don't seem to be making a big difference.

In the short-term, the report lists "Redesigning Learning Spaces" and "Blended Learning Designs."

For Mid-Term Adoption in the next 3-5 years, they list "Advancing Cultures of Innovation" and a "Growing Focus on Measuring Learning." I think the latter should be moved up as a perennial topic.

In the 5+ years category is the rather broad "Rethinking How Institutions Work" and the returning "Modularized and Disaggregated Degrees."

The other section I always jump to is called "Important Developments in Technology for Higher Education." Again, there are predicted "Time-to-Adoption Horizons" given for each. 

The report also considers the challenges in adopting any of these technologies or trends. For example, one that I have been challenged by since I started in higher education tech in 2000 is what they term "The Evolving Roles of Faculty with Ed Tech Strategies."

The report says about that (and I generally agree) that:

"At institutions of any type or size, involving faculty in the selection and implementation of educational technologies can be difficult. Whether an institution is implementing a new courseware platform for the purpose of personalizing learning or building a completely new program by applying a pedagogical approach such as competency-based learning, such efforts face a range of challenges. Identifying learning outcomes and engagement strategies before identifying educational technology solutions creates an advantage by establishing faculty buy-in at the earliest stages of a strategic initiative.

The role of full-time faculty and adjuncts alike includes being key stakeholders in the adoption and scaling of digital solutions; as such, faculty need to be included in the evaluation, planning, and implementation of any teaching and learning initiative. Institutions that address the needs of all faculty through flexible strategic planning and multimodal faculty support are better situated to overcome the barriers to adoption that can impede scale.

...in order for faculty to fully engage in educational technology, training and professional development should be provided to facilitate incorporation of technology... adjunct faculty also need to be considered in professional development...workshops that include both faculty and students could enable learning for both groups of stakeholders."

But I do always bristle when the business of education overrides pedagogy, such as the statement that "frameworks for tech implementation and prioritizing tech that offers high ROI should be a guiding principle for institutional tech adoption for faculty use."

Credit Hours and Personalized Learning

Credit hours still wield a lot of power in education. They play a role in high schools, but they really rule in higher education.

Credit hours were once known as Carnegie Units. The idea goes back to 1906, but it was not designed as a way of measuring learning; it was meant as a method for calculating faculty workloads in order to formulate pensions.

Before that, admission to colleges was by examinations, which varied greatly among colleges and were unreliable. Charles W. Eliot at Harvard University devised a contact-hour standard for secondary education, as well as the original credit-hour standard for colleges. This is where we get our 3-credit course based on 3 contact hours per week. But widespread adoption of the 120-hour secondary standard did not occur until the Carnegie Foundation began providing retirement pensions (now known as TIAA-CREF) for university professors. A stipulation of the pensions was that universities had to enforce the 120-hour secondary standard in their admissions.
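
To spell out the seat-time arithmetic behind those standards (the 15-week semester here is my assumption of a typical term, not something specified by the Foundation):

```python
# Illustrative arithmetic only; the 15-week semester is an assumption, not a rule.
contact_hours_per_week = 3   # a "3-credit" college course meets about three hours a week
weeks_per_semester = 15
seat_time = contact_hours_per_week * weeks_per_semester
print(f"One 3-credit course: about {seat_time} contact hours")   # ~45 hours

# The secondary-school Carnegie Unit is likewise a seat-time measure:
# roughly 120 hours of contact in one subject over a school year.
carnegie_unit_hours = 120
print(f"One Carnegie Unit: about {carnegie_unit_hours} contact hours")
# Both numbers count time in a seat, not what was learned.
```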

It only took four years for nearly all secondary institutions in the United States to use the "Carnegie Unit" as a measure of secondary course work. 

The Carnegie Foundation also established that both high school preparation and college "work" would include a minimum of four years of study. But the Carnegie Foundation did not intend the Units to "measure, inform or improve the quality of teaching or learning."

Unfortunately, the credit hour became the standard way to measure a student's workload and progress through those four years of secondary and higher education. Should these credit units be revised or abandoned?

The Carnegie Foundation said in 2012 that "technology has revealed the potential of personalized learning," and that "it is time to consider how a revised unit, based on competency rather than time, could improve teaching and learning in high schools, colleges and universities."

Personalized learning is sometimes suggested as a way to replace the Carnegie Unit and credit hours because it could be based on competency rather than time.

But what personalized learning means seems to vary by practitioner. Even the term used to describe the practice varies. Personalized learning is sometimes called individualized instruction, differentiated instruction, direct instruction or a personal learning environment. Though they are not all the same things, they are all used to describe education that is adjusted to meet the needs of different students.

Edutopia published an article on several "myths" about personalized learning that are worth considering in any discussion of changing the way we measure workload and progress.

Because many 21st-century efforts at personalized learning involved computers and software that allowed students to work at their own pace, personalized learning has become associated with technology-based instruction, though the approach itself does not require technology.

The "personalized" part of learning is often thought to mean that students work independently. In a class of 25 students it is unlikely that there will need to be 25 distinct learning paths. Students will often work on collaborative competencies along with individual competencies focused on content and skills. Student interests shared with others in the classroom will form affinity groups for group projects and learning experiences.

Personalized learning is about learners moving at their own pace, which is why having students demonstrate mastery of content fits into a competency-based system.

Truly personalized learning also involves learners in setting goals and being involved in the planning and learning process. This may be the most radically different aspect of personalized learning. It is very "student-centered" so learners can select their resources and explore different ways to learn in flexible learning spaces. They may also connect their learning to their interests and passions, and even have a voice in how their learning will be assessed.

What has not changed in most personalized learning settings today are the competencies that must be met.

Personalized learning allows for self-pacing, but when students move through competencies at different speeds, "credit hours" become irrelevant. If one student moves through a course's set of competencies in half the "normal" time, should they receive all of the credit or half of it? Obviously, all of it. What if they move through all the competencies in a program (a degree) in two years? Do they graduate?

Personalized learning is an approach to learning — not a set program. And it is still being formulated and experimented with at different grade levels. But our learning experiments should be combined with experimentation in how we measure movement through learning.