Competency, Prior and Lifelong Learning and Letting Go

My thoughts today were triggered by listening to an interview on “life-long learning” (LLL) with Marc Singer of Thomas Edison State College (TESC) in New Jersey. (read/listen at

TESC is a "virtual" college and one of the first schools in the country designed specifically for adult learners. TESC offers degree programs and certificates in more than 100 areas of study. The interview focuses on the Associate in Science in Business Administration (ASBA) degree.

The school partners with Saylor Academy and allows students to take free online courses from Saylor and submit their work for credit evaluation by TESC. This results in a fully-online degree for about $5,000 for fees to the college.

I have written before about how competency-based degrees and credits require a rethinking of the credit hour model that higher ed has used for a very long time. This is also true for assessing prior learning and learning from other sources (including MOOCs), because the measure is no longer how long you spend in a classroom or online, but what you can show you have learned.

Singer is vice provost of the Center for the Assessment of Learning at TESC. The degree is awarded after students take a selection of pre-selected MOOCs and then have their knowledge assessed by TESC. This was a big topic for MOOCs a few years ago, but it has been somewhat lost in the boom (and bust?) of MOOC hype over the past year.

One issue that slowed that acceptance nationally was the lack of alignment between the content of a MOOC and the college curriculum. As the American Council on Education and the National College Credit Recommendation Service connect with providers and review their processes for developing these online courses, acceptance increases. But the assessment and verification of student identity and competencies is still usually left to the crediting institution.

What is not new is the idea of prior learning assessment - sometimes called "experiential learning." There is some adaptation needed here, as assessing learning from a MOOC is not "prior" learning, but it is learning from another source being evaluated by an outside party.

Something that I don't feel should be the number one factor in using and accepting MOOCs is a financial model. But it is high on the list for many colleges. Marc Singer says in the interview: "The first thing people perceive is [granting credit for prior learning is] costing us money. That was an important obstacle for us to address. As it turns out, that’s not the case; I think that particularly as a state institution, where our state (New Jersey) subsidizes some of what we do, we’re not really losing money from this in the way people would expect. I’d also point people through the studies that have been done of students who come to a college, any college, whose credits they’ve acquired through prior learning. Those students tend to be more motivated, more focused on their goals, more self-directed. Because of that, we’ve seen measurable differences in the number of credits they take at an institution like this — they actually take more credits in college, not fewer, because they’re more invested in the process and we’ve validated what they’re bringing to us from outside. Not only that [but] their rates of completion … are much higher than students who don’t bring anything from the outside."

Something else that is not a new issue is the inability of most schools and most faculty in higher ed to move away from the idea that learning is not valid unless they themselves are the source and facilitators of the content given to students.

Google Plus (and minus)

“If content is king, then context is god.” - Gary Vaynerchuk

Google+ is an amazing social media site that allows users to share photos, play games, listen to music and engage in chat.


Google+ is another failed social experiment by Google.

You will get both sides if you talk to users and non-users.

The number of Google+ users continues to grow, if for no other reason than it is tied to all the other Google tools (Gmail, Docs, Search etc.). It surprises people when they learn that YouTube is the second-largest "search engine."  Many Google account holders are on Plus and don't even know it.

If you believe the prognosticators, like Forrester Research, then Google is in a better position than Facebook to bring marketers the “database of affinity.” I hear more frequently that Google+ is a good marketing tool for businesses.  That database of affinity is their ability to collect our interests and preferences and gain insight into each of us as users.

Not that Facebook and others are not trying to gather the kind of big social data that brands want.

Both networks roll out new features like local listings, Google hangouts and verified content.

Google+ has more than 100 million active users and still continues to grow. Facebook has almost 800 million users.

Both are making a play at business. They are in this area to make money. Google+ Communities is a way to tap into prospects and put your business in Google Places for Business. 

What about education? Google Hangouts are a very good way to interact with students, especially if your school uses Gmail and Google apps. Google Circles provide a way to organize classes and groups. How about free HD video broadcasting through Hangouts on Air?

Ripples is another newly introduced Google+ feature that creates an interactive graphic of the public shares of any public post or URL on Google+, showing you how it has "rippled" through the network.

Education needs to look at how business uses Google+ and decide if there are educational applications. If a business integrates Google Maps with its Google+ profile, it can help the business connect with local customers and guide them to its location. Does that help with the marketing of a school?

Right now, Google+ does not have the user frequency that other social media sites have, but it's too big to be ignored and probably "too big to fail." It is time for schools to define their Google+ strategy.

Top Higher Education Blogs

Lists of the "top" of anything are debatable, but we were happy to note (a bit late) that Serendipity35 made the list at

The list is useful in that it will probably alert you to some other higher education blogs that you were not aware existed in blogland.

Here is their stated METHODOLOGY for the selections:

At the end of 2012, we looked at college and university blogs as a key source of new, meaningful information.  In order to identify the most useful resources, we ranked the blogs we came across.  The sites we found cover a wide range of ideas and concepts, but our methodology stayed the same.  To create this list, we audited blogs at two different levels.

Level 1: We aggregated a list of over 200 higher education blogs that were already recommended by other respected sources.  We then analyzed each blog one-by-one, color coding the ones we would be most likely to recommend.
Level 2: Our editors visited the top recommended blogs, assessing them for post frequency, comment volume and engagement with the higher ed reader community.  They also looked at a variety of other factors, including relevancy, helpfulness, insight, design, reputation and more.

The MOOC of One

I have given several presentations in the past six months on MOOCs, and audiences are always interested in the future. After all the hype that MOOCs received in 2012 and 2013, I expected the crash in attention and favor.

I'll have more to say on the future of the form, but in brief, I feel that it will have less traction in academia with formal, credit courses and greater traction with non-credit programs, lifelong learning and professional development.

The slides below seem to be moving in the same direction of thinking. In this talk, Stephen Downes looks at the transition of the massive open online course to applications in the personal learning environment.

He says: "I question what it is to become 'one' - whether it be one course graduate, one citizen of the community, or one educated person. I argue that (say) 'being a doctor' isn't about having remembered the right content, not about having done the right things, not even about having the right feelings, nor about having the right mental representations - being one is about growing and developing a certain way."

He offers audio at

Are We Less Adrift Academically?


It was 3 years ago that I posted about Richard Arum's study of student learning in higher education over a two year period to examine how institutional settings, student backgrounds, and individual academic programs influence how much students learn on campus.  He was measuring "higher order thinking skills" such as problem-solving, critical thinking, analytical reasoning and communication skills.

When he published Academically Adrift: Limited Learning on College Campuses
in 2011, it caused a lot of discussion about higher education and the internal workings of colleges.

One of his overall findings was that many students showed no meaningful gains on key measures of learning during their college years.

Inside Higher Ed posted about two more recent reports that challenge Academically Adrift's underlying conclusions about students' critical thinking gains in college, and especially the extent to which others have seized on those findings to suggest that too little learning takes place in college. The studies by the Council for Aid to Education show that students taking the Collegiate Learning Assessment made an average gain of 0.73 of a standard deviation in their critical thinking scores, significantly more than that found by the authors of Academically Adrift.

So, college does matter?

Richard Arum (NYU) made sure to note methodological differences in how the two sets of data were drawn. (For example, the newer study does not follow the same group of students over time.) He also says that his study, done with Josipa Roksa (UVA), never questioned the contribution that college makes to student learning, although that has been the spin given to their research by the book's champions.

Three years ago, Academically Adrift was noteworthy because it used some new assessment tools that specifically measure the "added value" that colleges impart to their students' learning, by allowing for the comparison of the performance of students over time.

The study was criticized for relying so heavily on the Collegiate Learning Assessment as its way to suggest whether or not students have learned. The reports are full of assessment talk: an average gain of 0.73 of a standard deviation over several test administrations, or maybe it is 0.18, or less than 20 percent of a standard deviation when tracked over two years rather than four, or a gain of 0.47 of a standard deviation, still significantly smaller than, but closer to, the CAE finding. Or maybe the difference is that Arum and Roksa followed the same cohort of students throughout their collegiate careers, and the cross-sectional rather than longitudinal comparison matters, because at most institutions significant numbers of the entering freshmen will have dropped out, and hundreds of research studies over the years have clearly demonstrated that dropouts are not comparable to degree completers.
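For readers unused to effect sizes, a "gain of 0.73 of a standard deviation" just means the difference between the two groups' mean scores, divided by the pooled standard deviation of the scores. A minimal sketch in Python, using made-up score lists (not actual CLA data), shows the arithmetic:

```python
from math import sqrt
from statistics import mean, stdev

def standardized_gain(pre, post):
    """Mean score gain expressed in pooled-standard-deviation units,
    the same kind of figure the CAE and Academically Adrift reports cite."""
    n1, n2 = len(pre), len(post)
    # Pooled sample standard deviation of the two groups
    pooled_sd = sqrt(((n1 - 1) * stdev(pre) ** 2 +
                      (n2 - 1) * stdev(post) ** 2) / (n1 + n2 - 2))
    return (mean(post) - mean(pre)) / pooled_sd

# Hypothetical freshman and senior scores, chosen so the gain works out
# to the 0.73 figure mentioned above:
freshmen = [90, 100, 110]
seniors = [97.3, 107.3, 117.3]
print(round(standardized_gain(freshmen, seniors), 2))  # 0.73
```

The point of the sketch is that the reported number depends entirely on which two groups you compare and over what span, which is exactly where the competing studies diverge.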

My head is spinning.

But are we less adrift than we were three years ago? I see no major movement or changes that would indicate that things are any different. But then, we can't even agree on what they were three years ago.