Are We Less Adrift Academically?

It was three years ago that I posted about Richard Arum's study of student learning in higher education, which followed students over a two-year period to examine how institutional settings, student backgrounds, and individual academic programs influence how much they learn on campus. He was measuring "higher-order thinking skills" such as problem-solving, critical thinking, analytical reasoning, and communication.


When he published Academically Adrift: Limited Learning on College Campuses in 2011, it caused a lot of discussion about higher education and the internal workings of colleges.

One of his overall findings was that many students showed no meaningful gains on key measures of learning during their college years.

Inside Higher Ed posted about two more recent reports that challenge Academically Adrift's underlying conclusions about students' critical thinking gains in college, and especially the extent to which others have seized on those findings to suggest that too little learning takes place in college. The studies by the Council for Aid to Education show that students taking the Collegiate Learning Assessment made an average gain of 0.73 of a standard deviation in their critical thinking scores, significantly more than the gain found by the authors of Academically Adrift.


So, college does matter?

Richard Arum (NYU) made sure to note methodological differences in how the two sets of data were drawn. (For example, the newer study does not follow the same group of students over time.) He also says that his study (done with Josipa Roksa of UVA) never questioned the contribution that college makes to student learning, although that has been the spin given to their research by the book's champions.


Three years ago, Academically Adrift was noteworthy because it used new assessment tools intended to measure the "added value" that colleges impart to their students' learning by comparing the performance of students over time.


The study was criticized for relying so heavily on the Collegiate Learning Assessment as its way of suggesting whether or not students have learned. The reports are full of assessment talk: an average gain of 0.73 of a standard deviation over several test administrations; or maybe it is 0.18; or less than 20 percent of a standard deviation when students are tracked over two years rather than four; or a gain of 0.47 of a standard deviation, still significantly smaller than, but closer to, the CAE finding. And maybe the difference comes down to the fact that Arum and Roksa followed the same cohort of students throughout their collegiate careers, so their comparison is longitudinal rather than cross-sectional, a distinction that matters because at most institutions significant numbers of the entering freshmen will have dropped out, and hundreds of research studies over the years have clearly demonstrated that dropouts are not comparable to degree completers.

My head is spinning.
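For what it is worth, all of those numbers are effect sizes: a gain expressed in standard deviation units rather than in raw test points. Here is a minimal sketch of that arithmetic in Python, assuming a simple standardized-gain calculation (mean score change divided by the spread of the entering scores). The CLA's actual scoring and adjustment methodology is more involved and isn't described in the reports cited above, and the sample scores below are made up for illustration.

    # Illustrative only: assumes a simple standardized gain (an effect size),
    # not the CLA's actual scoring methodology.
    from statistics import mean, stdev

    def standardized_gain(pre_scores, post_scores):
        """Mean score change expressed in units of the pre-test standard deviation."""
        return (mean(post_scores) - mean(pre_scores)) / stdev(pre_scores)

    # Hypothetical freshman and senior scores on the same scale.
    pre = [950, 1100, 1250, 1000, 1200]
    post = [1050, 1220, 1330, 1130, 1320]
    print(round(standardized_gain(pre, post), 2))  # prints 0.86, i.e. a gain of 0.86 of a standard deviation

The point of reporting gains this way is that it lets different tests and cohorts be compared on a common scale, which is also why small differences in how the comparison group is chosen (the same students over time versus different freshmen and seniors) can move the headline number so much.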

But are we less adrift than we were three years ago? I see no major movement or changes that would indicate that things are any different. But then, we can't even agree on what they were three years ago.


