The Highest Paid Majors Demythified

A newsletter from Jeff Selingo pointed to an upcoming piece for The New York Times that he wrote about the biggest myths surrounding the college major.

How did you pick your major? You probably got guidance from school counselors, but also less formally from family and friends, and from news articles and headlines about the "fastest-growing fields" and who gets paid what.

Selingo cites a report that says all that advice on what to study in college perpetuates myths. The Gallup report details that a majority (55%) of U.S. adults with at least some college but no more than a bachelor's degree list their informal social network as a source of advice about their college major, making it the most often-cited source of advice when choosing a major.

The past few decades have seen a push toward STEM fields. The last such push came in the 1950s, when the U.S. was in a space race and seemed to be falling behind other countries. Then the worry was Russia, and later Japan; today most of the talk is about China and India.
 
Yes, STEM fields do generally pay well, and we still have too few students prepared to work in those fields. But the newsletter points to an interactive graphic from Doug Webber at Temple University that shows plenty of overlap between earnings in different fields. There is also a big difference between being an average or below-average earner with an engineering degree and being among the top earners with an English degree.
 
The most popular undergraduate major now is not in STEM but in business. The lifetime earnings of the typical business graduate are $2.85 million, but an English major earns $2.76 million, a psychology major $2.57 million, and even a history major totals $2.46 million.
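As a rough back-of-the-envelope illustration using only the figures cited above (my own arithmetic, not Selingo's or Webber's analysis), the gaps between those majors are smaller than the headlines suggest:

```python
# Lifetime-earnings figures cited above, in millions of dollars.
lifetime_earnings = {
    "business": 2.85,
    "english": 2.76,
    "psychology": 2.57,
    "history": 2.46,
}

# Express each major's gap as a percentage below the business figure.
baseline = lifetime_earnings["business"]
for major, earnings in lifetime_earnings.items():
    gap = (baseline - earnings) / baseline * 100
    print(f"{major:<11} ${earnings:.2f}M  ({gap:.1f}% below business)")
```

Spread over a roughly forty-year career, even the largest gap (business versus history, about $390,000) works out to something on the order of $10,000 a year before taxes.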
 
Perhaps the lesson for high school students is not to pick a major based on projected earnings.

Education and the Gig Economy

I mentioned the Gig Economy to a colleague at a college last week and he said he had never heard of the term. I said that "gig" is a term I associate with musicians who move from job to job, gig to gig. Now, it is being applied to a labor market characterized by the prevalence of short-term contracts or freelance work as opposed to permanent jobs. "But it has nothing to do with education," he commented. That got me thinking. Is it affecting education?

A study by Intuit predicted that by 2020, 40 percent of American workers would be independent contractors. Most discussions of the gig economy talk about job-sharing apps like Uber, Instacart, and TaskRabbit. Short-term, contract, and freelance work has long been part of the labor market, but the kind being done by college graduates is said to have grown by more than 50% over the last decade.

Jeff Selingo referenced studies that contend that all the net job growth since the Great Recession has been in the gig or contract economy, and that 47% of college-age students did some sort of freelance work last year, along with 43% of millennials.

My first thought about gig work in higher education is adjuncts. With colleges using more and more adjuncts (and fewer full-time faculty), many adjuncts put together gigs at several schools. If teaching is your only job, that means trying to get three or more classes every semester: fall, spring, and summer.

I pulled some books off the bookstore shelf this past weekend and looked at what is being written on the topic. The Future Workplace Experience, The Gig Economy, and Thriving in the Gig Economy are examples.

They talk about dealing with disruption in recruiting and engaging employees. A lot of the popular media focus is on the low end of the skill spectrum. Less attention is given to college grads and professionals who have chosen this independent employment route.

I found so many different stats on the size of this gig workforce that I hesitate to link to a source. One book says more than a third of Americans are working in the gig economy. That seems high judging by my own circle of friends and colleagues, but it includes short-term jobs, contract work, and freelance assignments.

I am now officially in retirement - or unretirement, as I prefer to say. I have written elsewhere about unretirement and freelance work, which is part of the gig economy. I take on teaching, web, and instructional design gigs on a very selective basis. I choose things that interest me and allow me the freedom to work when I want to work and from where I want to work. Sometimes the work comes from traditional places. I did a 6-month gig with a nearby community college where I had worked full-time in the past. I have two new web clients for whom I am designing sites and e-commerce stores.

But let's return to what this might have to do with education. Higher education as preparation for a job has always been a topic of debate. "It's not job training," is what many professors would say. Employers have always played a large role in the training and professional development of their workers whether they have degrees or not.

In a gig economy, freelancers have to be self-directed in their learning. They need to decide what knowledge they're missing, where to acquire it, how to fit it into their day, and how to pay for it. Free options (MOOCs and other online opportunities) are very appealing. Do schools that charge tuition and have traditional classes have any appeal to these people?

Certainly, driving for Uber doesn't require a degree, though having some business training in order to be self-employed would be beneficial. But my interest is more with "professional" freelancers. Take, as an example, someone with some college, a certification, or preferably a degree that allows them to promote themselves as an instructional designer or social media manager. I choose those two because I have done both as a freelancer, and I know that if I look right now on a jobs site such as Glassdoor, I will find hundreds of local opportunities in those two areas.

Businesses and colleges save resources in terms of benefits, office space, and training by employing these people. They can also contract for specific projects with experts who would be too high-priced to keep on staff.

For some freelancers I know, the gig economy appeals because it offers them more control over their work-life balance. They are selecting jobs they're interested in, rather than entering the gig economy because they can't find permanent employment and must pick up whatever temporary gigs they can land. The latter is often the case with adjunct faculty.

For someone mixing together short-term jobs, contract work, and freelance assignments, where do they go to find additional professional development?

A book like The Gig Economy - with its appealing subtitle, "The Complete Guide to Getting Better Work, Taking More Time Off, and Financing the Life You Want" - is more interested in real-world corporate examples (Airbnb, Lyft, Uber, Etsy, TaskRabbit, France's BlaBlaCar, China's Didi Kuaidi, and India's Ola) of crowd-based capitalism.

The freelancer may not be much concerned with emerging blockchain technologies, but she is certainly part of the changing future of work.

The future is always a land of questions: Will we live in a world of empowered entrepreneurs who enjoy professional flexibility and independence? Will these gig economy workers become disenfranchised laborers jumping from gig to gig, always looking for work and paying for their own health benefits? How will this affect labor unions, self-regulatory organizations, labor law, and a new generation of retirees who have a more limited social safety net? Are long-term careers at one or two companies a thing of the past?

Robin Chase, founder of Zipcar, the world’s largest car sharing company, said, “My father had one job in his life, I’ve had six in mine. My kids will have six at the same time.”

The one thing all observers seem to agree on is that the way we work is changing.

Jennifer Lachs writes on opencolleges.edu.au about that changing working world and the possible impact it may have on education. I hadn't thought of it as a gig economy job, but of course substitute teachers in K-12 education have long been employed on a freelance basis. The education and training industry is among the top five highest-demand industries for freelance workers due to the high level of specialization and the rise of virtual education.

I know of a dozen or so teachers who do online teaching and tutoring as a way to supplement their income. For decades, professors have done freelance writing and thesis editing and much of that has moved online. My wife and I are currently editing a dissertation via email and shared files along with the occasional phone conference.

The writing center I helped build at a community college has relied on online tutoring for student writing as a way to supplement face-to-face tutoring. Online tutoring appealed to students, but it also offered additional work for some of our part-time tutors and others who added it to their gig list.

Are we preparing students for the gig economy once they graduate? No. 

A friend pointed me to "It's a Project-Based World," a thought leadership campaign by Getting Smart to explore the economic realities of a project-based world. The purpose of the campaign: to promote equity and access to deeper learning outcomes for all students. There are blog posts, podcast interviews, publications, and infographics around preparing students, teachers, and leaders for a project-based world. The focus there seems to be less on obtaining deeper knowledge and more on teaching skills that students will need in the modern working world.

Finally, I think that the gig economy will have a greater impact on traditional education than traditional education will have on the gig economy. It accounts for much of the growth in employment statistics, but secondary and post-secondary schools don't prepare students for this type of work.

 

Workplace Skills Shifting - Are Colleges Shifting Too?

Jeff Selingo has been writing about higher ed for two decades, and lately he has been looking at some of the "big ideas" that colleges and universities should consider, viewed through the lens of the changing workplace.

Whether you are talking about automation or the gig economy and the rise of the virtual (what we used to call freelance) worker, the skills required, or at least desired, have changed in two decades.

In the second part of his paper, "The Future of Work," he shows that more than half of the jobs expected to require cognitive abilities as part of their core skill set in 2020 do not yet do so, or do so only to a small extent.

 

You would think that colleges are always looking at what the workplace wants or demands and changing their courses and programs to offer those things. You would mostly be wrong in that assumption.

Jeff Selingo is the author of three books, the newest of which is There Is Life After College. He is a special advisor and professor of practice at Arizona State University and a visiting scholar at Georgia Tech's Center for 21st Century Universities. More at jeffselingo.com.

The Myth of Digital Natives

When I was fairly new to working in higher education, there was a lot of buzz about the students we were getting being "digital natives." This was around 2001, when educator Marc Prensky coined the term in an essay.

The claim was that these digital natives had a kind of innate facility with technology because they were born into it. The claim was also extended to say they had an increased ability to do things like multitask.

Prensky took it further by saying that educators needed to change their ways to deal with this tech-savvy generation.

But new research (see below) indicates that this digital native theory is false. 

A digital native who is information-skilled simply because they never knew a world that was not digital does not exist. The same goes for any supposed special ability of this generation to multitask. In fact, designing learning around these assumptions hurts rather than helps learning.

We were naive to think that someone could pick up digital skills intuitively. It is also a dangerous fallacy, one that risks leaving young people lacking skills that were assumed to be known and so were never taught or emphasized.

I was never a proponent of this digital natives (and digital immigrants) idea, because I viewed "tech-savvy" as a very superficial kind of knowledge. I found most students in this group to be users of technology, but using a computer or cellphone doesn't impart understanding.

In 2007, I wrote about earlier research that was pointing towards this idea being false. Now it seems definitive that all of this generational separation is a fallacy. It turns out that none of us is good at multitasking. We do it out of necessity, but something always suffers. Some studies have shown that a driver using a cellphone is the equivalent of a drunk driver.

Millennials - the group often labeled as natives - don't necessarily use technology more often and are no better at using basic computer programs than older generations. Researchers also found that, besides educators, Millennials themselves have bought into the myth: twice as many self-identify as digitally proficient as would actually be assessed at that level.

The only aspect of all this that makes sense is that people born into a technology are less likely to hesitate to use it or to fear it. Clearly, a toddler who is playing with a smartphone and using apps at age two will have no problem using it in other, more serious ways in kindergarten. If these young people are better at picking up a totally new technology than a 70-year-old, I would attribute that more to an aging brain than to a birth year.

Read More

blogs.discovermagazine.com/d-brief/2017/07/27/

ecdl.org/policy-publications/digital-native-fallacy

sciencedirect.com/science/

 

When Accepted Students Don't Show Up at College

I had a discussion with some colleagues after listening to an episode of NPR's Hidden Brain podcast about research that shows that between 10% and 40% of the kids who intend to go to college at the time of high school graduation don't actually show up in the fall.

I'm doing some consulting for a community college this summer and I asked if this seemed accurate for that school. It turned out that the previous week staff at the college had been asked to "cold call" students who registered for fall courses but were dropped for non-payment and never re-registered. The college's enrollment is down 10% and it is a big concern.

This phenomenon is sometimes called "summer melt."

It is puzzling why kids who made it through the admissions process, were accepted to a college of their choice, and applied for and received financial aid never showed up for classes.

At my urban community college, financial aid was the most common reason. They registered, but aid did not come through in time to pay the bill. The odd part - the "melt" - was that when their aid did come through, they didn't re-register.

Why? Some had lost interest or felt discouraged by the process. Some reevaluated going to college. Some were just lazy. A few staffers were able to walk students through re-enrolling over the phone, so part of the problem might be a lack of information and support from the college.

In the podcast, Lindsay Page, an education researcher now at the University of Pittsburgh who did research while at Harvard, said "The rate with which kids who are college-intending do not actually get to college in the fall is surprisingly high. In one sample that we looked at in the Boston area, we find that upwards of 20% of kids who at the time of high school graduation say that they're continuing on to college don't actually show up in the fall."

This nationwide loss of seemingly college-intending students is particularly evident for those from low-income backgrounds.

But research has also identified relatively low-cost interventions that can have a significant impact on alleviating the summer melt phenomenon and increasing college enrollment rates.

Page's research at Harvard was published in the "SDP Summer Melt Handbook: A Guide to Investigating and Responding to Summer Melt." In the report, they use “summer melt” to refer to a "different, but related phenomenon: when seemingly college-intending students fail to enroll at all in the fall after high school graduation. 'College-intending' students are those who have completed key college-going steps, such as applying and being accepted to college and applying for financial aid if their families qualify. In other cases, they have concretely signaled their intention to enroll in college on a high school senior exit survey. We consider a student to have “melted” if, despite being college-intending, she or he fails to attend college the following fall."
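As a minimal sketch of how that definition might be turned into a classification over a district's records (the field names here are hypothetical, not taken from the handbook):

```python
# Hypothetical student records; the handbook itself works from exit surveys,
# financial aid applications, and fall enrollment data, not any particular schema.

def is_college_intending(student: dict) -> bool:
    # Completed key college-going steps: accepted to a college and applied
    # for financial aid (if the family qualifies) ...
    completed_steps = student.get("accepted_to_college") and (
        student.get("applied_for_aid") or not student.get("aid_eligible")
    )
    # ... or concretely signaled intent on a senior exit survey.
    return bool(completed_steps or student.get("exit_survey_intends_college"))

def has_melted(student: dict) -> bool:
    # "Melted" = college-intending but not enrolled anywhere the following fall.
    return is_college_intending(student) and not student.get("enrolled_in_fall")

# Toy example of estimating a melt rate for a graduating class.
seniors = [
    {"accepted_to_college": True, "aid_eligible": True,
     "applied_for_aid": True, "enrolled_in_fall": False},
    {"accepted_to_college": True, "aid_eligible": False,
     "applied_for_aid": False, "enrolled_in_fall": True},
    {"exit_survey_intends_college": True, "enrolled_in_fall": False},
]
intending = [s for s in seniors if is_college_intending(s)]
melted = [s for s in intending if has_melted(s)]
print(f"melt rate: {len(melted) / len(intending):.0%}")  # 2 of 3 -> 67%
```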

Some of their interventions go back to students' high school days and records, such as senior exit surveys, and to surveying high school counselors. They also provide examples of summer task lists, both personalized for specific institutions and generic, and sample documents for proactive personal outreach, such as an initial outreach checklist, assessment meeting checklist, intake form, and counselor interaction logs.

Download the report and other resources at sdp.cepr.harvard.edu/summer-melt-handbook 

LISTEN to the Hidden Brain podcast on this topic: npr.org/2017/07/17/537740926/why-arent-students-showing-up-for-college

After the MOOC Revolution

In 2008, when I read about a professor making his course "open learning," I wasn't prescient enough to see the rise of MOOCs or any coming revolution in learning, especially online.

The term MOOC, for Massive Open Online Courses, became the popular terminology for the concept behind that 2008 experiment. Almost everyone was saying it was a revolution that would disrupt universities. Sebastian Thrun, co-founder of Udacity, famously predicted that in 50 years there would be only 10 higher education institutions.  That didn't happen. 

I wrote a book chapter with my wife a few years ago about whether the "MOOC Revolution" was in fact a revolution or rather an evolution of learning, and of learning online. And recently I saw that Jeffrey Young, author of that 2008 piece, posted a piece this year asking "What if MOOCs Revolutionize Education After All?"

His new post and podcast on EdSurge focuses on Barbara Oakley, a professor of engineering at Oakland University, who thinks a lot about how people learn, particularly because she has been teaching a lot of learners in one of the most popular online courses ever. "Learning How to Learn" has had more than 2 million participants, and teaching it has her believing, despite the cooling of the MOOC Revolution hype, that free online courses might still lead to a revolution in higher education.

 

Professor Oakley thinks that MOOCs will enhance classrooms and also serve as competition, which will force schools to jump over a higher bar.

In our chapter on MOOCs, we said that "Most technological change involves massive disruption whereas economic ‘bubbles’, like the trillion-dollar student loan bubble in the U.S., tend to burst, not slowly deflate. Initially, the disruption of the MOOC may have appeared to be a rapid revolution just a few years ago, but it seems more likely to become a gradual evolution over the course of the next decade." I think that prediction is holding true.

Through this blog and a LinkedIn group called "Academia and the MOOC" that I started in 2013, I have met many people from around the world who are using MOOCs. The group is for anyone interested in how MOOCs have impacted education and how they might in the future. It began with members of the MOOC of the same name hosted on the Canvas Network in Spring 2013 and taught by myself, my wife Lynnette Ronkowitz, and Mary Zedeck.

One of the people I have met virtually is Muvaffak Gozaydin. He contacted me last fall about a "crazy idea" he had to provide no-cost graduate degrees using MOOCs. He contacted me again this summer to tell me that his crazy idea was launching. He wants to offer "professional learners" the opportunity to get an MS degree online by selecting from courses already offered by Harvard, Stanford, MIT, Duke, Yale, and other top schools. His site is at mguniversity2017.org, and he has organized it to direct students toward a course catalog for five degrees currently. His project is not accredited in any country, but all the universities and courses offered are accredited, and he hopes that holders of "degrees" from MGU can use them to find or advance in jobs internationally.

Is his idea crazy? He asked me that again this year before he launched his site. It reminded me of something Dhawal Shah, the founder of Class Central, has said recently: "...there’s been a decisive shift by MOOC providers to focus on 'professional' learners who are taking these courses for career-related outcomes. At the recently concluded EMOOCs conference, the then CEO of Coursera, Rick Levin, shared his thoughts on this shift. He thinks that MOOCs may not have disrupted the education market, but they are disrupting the labor market. The real audience is not the traditional university student but what he calls the 'lifelong career learner,' someone who might be well beyond their college years and takes these online courses with the goal of achieving professional and career growth."

That last sentence was one of the conclusions of our book chapter. Maybe the revolution is bigger than disrupting universities. Maybe the revolution is about learning, not only in schools at all grade levels but also in business, industry, and professional learning. All will be disrupted.

Shah, Gozaydin, Thrun, and others have concluded two things about the MOOC revolution: 1) the real audience is the professional learner, someone already working in a field with an undergraduate degree who wants to advance; and 2) there are already plenty of online courses available from top universities and other providers to offer in packages (call them degrees, certificates, mini-degrees, etc.), either free or for a fee smaller than a traditional university's, that carry some evidence of quality and completion.

The biggest issue with truly open and free online courses, massive or not, has from the beginning been using them for advancement, either toward a degree or in a profession. If you are looking to advance your own knowledge and skills without concern for official "credits," the MOOC is ideal.

You can find more than 1,250 free courses listed at openculture.com, but what does a learner do with those courses? Minimally, which is not to say inconsequentially, Gozaydin has done the work of organizing the world's many scattered MOOC offerings into five intelligently planned paths to coursework from leading universities, all on one web page.