Memory Sculpting

Photo by Rachel Claire from Pexels

I was having a Facebook conversation with a friend about how photos and videos change our memories. Kids who grew up in the past 30 years - and even more so in the age of smartphones and social media - have definitely had their memories sculpted by images of their past. Several times, when I have asked my sons "Do you remember us being there?" they have answered, "I remember the photos of it." Do the photos trigger a memory to return, or is the photo the memory itself?

I am fascinated by how memory works. Research shows that when we describe our memories differently to different audiences it isn't only the message that changes, but sometimes it's also the memory itself. Every time you remember an event from the past, your brain networks change in ways that can alter the later recall of the event. The next time you remember it, you might recall not the original event but what you remembered the previous time. This leads some to say that memory is like the "telephone game."

This sent me back to an article I read in 2017. I did a search and found it again since my memory of this article on memory may not be remembered correctly. It is titled "Facebook is Re-Sculpting Our Memory" by Olivia Goldhill. Facebook is not the only social network or the only place that we share photos and videos, but it is a major place for this sharing.

I have a new granddaughter, and her parents have set up a shared photo album online for relatives. They don't want people (mostly me - the oversharer) to post photos of her on Facebook, Instagram et al. I understand that privacy caution. My granddaughter will have many thousands of photos and videos to look at one day. I have about two dozen black and white photos of my first two years of life. That is probably two 12-exposure rolls of film from that time (the 1950s), which seemed like enough to my parents to chronicle my early life.

Those photos of baby me don't trigger any memories but they are my "memory" of that time along with my mother's narration. "That was your stuffed lamb that was your favorite toy."

I have also kept journals since my teen years. The way to chronicle life once was to write it down. Rereading those journals now is a mixed experience. For some things, the journal is now my memory. Without the entry, I couldn't recall names, places or details from 40 years ago. But for some entries, I know that the version I wrote at age 15 is a kind of augmented reality. I made some things sound better or worse than the actual event. I sculpted the memory. Maybe as my memory degrades, those entries - accurate or not - will become the only memory I have.

Those sculpted memories are not unlike the image of ourselves we put online. Not all, but many people, post almost exclusively the best parts of their lives. Alfred Hitchcock said "Drama is life with the dull bits cut out," and that's true of many virtual lives as portrayed online.

That article references Daniel Schacter, a psychology professor at Harvard University, whose 1990s research first established the effects of photographs on memories. Frighteningly, he showed that it was possible to implant false memories by showing subjects photos of an event that they plausibly could have experienced but never actually did.

Another of his experiments found that while looking at photos triggered and enhanced the memory of that particular event, it also impaired memories of events that happened at the same time and were not featured in the photographs.

This sounds terrible, but Schacter has also found a positive effect: the same weaknesses in our memory help us to think meaningfully about the future.

In our recent discussions about fake news and images and videos that are not accurate, we realize that these weaknesses in memory and the ability to implant memories can be very powerful and also very harmful. "Source information" is a weakness of memory that can be tapped for devious purposes. How often have you heard someone explain that they heard it or read it or saw it "somewhere"? We commonly have trouble remembering just where we obtained a particular piece of information. Though this is true offline as well, online we may recall a "fact" but not the source. This means we could easily misremember a news story from a dubious source as being from a more credible publication.

One phenomenon of memory is now called "retrieval-induced forgetting": repeatedly recalling some events makes related events that we don't revisit harder to retrieve. I spent four years living at my college, but I have a limited number of photographs from that time. Those photos, the ones in yearbooks and some saved campus newspapers, plus my journal entries are primarily what I recall about college life. Related things that I can't review are much harder, if not impossible, to remember.

Social media is certainly sculpting (or perhaps resculpting) our memories. Is this making our ability to remember worse? That hasn't been fully determined. Nicholas Carr wrote a book called The Shallows: What the Internet Is Doing to Our Brains that looked at some neurological science in an attempt to gauge the impact of computers and the Net - certainly related to, but not exactly the same as, memory and images. The controversial part of Carr's book is the idea that the Internet literally and physically rewires our brains, making them more computer-like and better at consuming data. But a surprisingly large section of the book is devoted to the history of the written word and all that it has done to “mold the human mind.”

Facebook, Instagram, TimeHop and other tools are reminding me daily of memories from years past. At times, I think "Oh yes, we were in Prague on this day two years ago." Other times, I say to myself, "I don't remember writing this 4 years ago." I react the same way to my old journals and black and white photos in an album taken a half-century ago.

The Limits of Memory

There is definitely some psychology to design. And UX design is definitely about organization.

There is a principle of organization that comes from psychology and that I have seen written about in terms of product and service design. It is Miller’s Law.

It was put forward in 1956 - long before UX and web design were a thing - by cognitive psychologist George A. Miller. In his well-known paper (at least in psych circles), "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information," he proposed a limit to memory which is now called Miller's Law.

Miller proposed that the number of perceptual ‘chunks’ an average human can hold in working memory (a component of short-term memory) is about seven. He found that memory performance is good with five or six different stimuli but declines after that, so let's say 5-9. If the mind can handle roughly 7 bits of information when completing a task that requires cognitive effort, then designers need to keep that in mind when designing. That would apply to completing forms and surveys. It applies to lists in menus and lots of other tasks that might be presented to users. What happens when a catalog page shows 15 items?

Miller believed that all of us "chunk" information, and that memory is best served when information is organized into categories of no more than 9 - and preferably about 5 - chunks.
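To make that concrete, here is a minimal sketch in TypeScript of how a long list might be split so that no group a user has to scan exceeds Miller's upper bound. The chunkItems helper is hypothetical - an illustration of the principle, not a function from any real design library.

```typescript
// Hypothetical helper: split a long list of menu or catalog items into
// smaller groups so no single group exceeds Miller's upper bound (~7).
function chunkItems<T>(items: T[], maxPerGroup: number = 7): T[][] {
  const groups: T[][] = [];
  for (let i = 0; i < items.length; i += maxPerGroup) {
    groups.push(items.slice(i, i + maxPerGroup));
  }
  return groups;
}

// Example: a catalog page with 15 items becomes three groups of at most 7,
// which is easier to hold in working memory than one flat list of 15.
const catalog = Array.from({ length: 15 }, (_, i) => `Item ${i + 1}`);
const grouped = chunkItems(catalog, 7);
console.log(grouped.map(group => group.length)); // [7, 7, 1]
```

In practice, a designer would group items by meaningful categories rather than by simple position, but the size limit is the point.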

A related finding - which I learned in a writing course - is the primacy and recency effect (also known as the serial position effect). These two terms describe how we best remember items placed at the beginning and end of an experience; if we forget some, they will likely be the ones in the middle. Combine this with Miller's Law and you would say that the more items there are, the more of the middle will be forgotten.
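As an illustration only (the placeBySerialPosition helper below is hypothetical, not an established UX pattern or library function), a designer who took the serial position effect seriously might lay out a menu so the most important actions land at the edges rather than in the easily forgotten middle:

```typescript
// Hypothetical sketch: given items ranked by importance (most important
// first), place them so the top-ranked ones occupy the first and last
// slots, leaving the middle for less critical items.
function placeBySerialPosition<T>(ranked: T[]): T[] {
  const front: T[] = [];
  const back: T[] = [];
  ranked.forEach((item, i) => {
    // Alternate: 1st most important to the front, 2nd to the back, and so on.
    if (i % 2 === 0) front.push(item);
    else back.unshift(item);
  });
  return [...front, ...back];
}

// Example: "Checkout" and "Home" end up at the start and end of the menu.
console.log(placeBySerialPosition(["Checkout", "Home", "Deals", "Blog", "FAQ"]));
// -> ["Checkout", "Deals", "FAQ", "Blog", "Home"]
```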

Originally posted on RonkowitzLLC.com 

Lateral Thinking

Thinking by Magda Ehlers from Pexels

With all the concern about the pandemic this year, moving courses online and making plans for reopening, I'm afraid that what has been set aside is pedagogy. I did graduate work on a doctorate in pedagogy that I never completed, but it exposed me to a lot of ideas on how we might improve our teaching.

One of the things I learned about some decades ago is lateral thinking, developed by Edward de Bono in the 1960s. Lateral thinking fosters unexpected solutions to problems. De Bono believed that we tend to go for the straightforward and obvious solutions to problems. He encouraged seeking out more oblique, innovative answers.

Lateral thinking is sometimes called “horizontal thinking” as contrasted with vertical thinking. The latter might be defined as going for the first good solution that comes to mind and launching into the details.

Lateral thinking encourages a longer brainstorming session in order to enhance creativity and come up with the most innovative solutions.

There are several lateral thinking techniques: awareness, random stimulation, alternatives, and alteration.

For de Bono, we need to cultivate an awareness of how our minds process information. That is a skill that is very rarely part of any curriculum, and yet moving away from established patterns leads to greater innovation.

Random stimulation is something I have been employing during this pandemic year - and I suspect many readers of this have also, probably unconsciously, done it. Normally, we try to shut out all distractions in order to focus on a task. In lateral thinking, problem-solving improves with some "random" input that brings in new information - taking a walk, talking with a colleague or stranger, listening to a podcast, journaling.

At the heart of de Bono's approach is to deliberately consider alternative solutions. That has been described in many ways, including "thinking out of the box." Doing this is not easy for many people. His term "alteration" can mean using several techniques. You might reverse the relationship between parts of a problem. You might deliberately go in the opposite direction of what’s implied as the correct approach. Sometimes breaking a problem or obvious solution into smaller parts can lead to an alternate mindset about individual parts.

It didn't help the spread of de Bono's theories in academia that he was not a fan of extensive research. He had called research “artificial.” For example, he claimed that “nobody has been able to prove that literature, history or mathematics classes have prepared people for society” - though I think we all believe that they have helped prepare people.

Lateral thinking has its critics, but the basics are sound and I have always thought that incorporating them into classroom activities is a good thing. I have never "taught" de Bono to students, preferring to embed it in activities. 

The MOOC Revival

Image by Tumisu from Pixabay

I have been writing a lot about MOOCs since 2012. (Do I still need to explain that a MOOC is a Massive Open Online Course?) That was (as dubbed by The New York Times) the “year of the MOOC.” 

This year, the Times was saying that though MOOCs were "near-death" the COVID-19 crisis has put them back into the "trending" category. Their article is headlined "Remember the MOOCs? After Near-Death, They’re Booming."

Though MOOCs existed prior to 2012, the emergence of online learning networks was something new. While many colleges initially viewed these free online courses as a threat to their tuition systems, within a year many of the most elite colleges began to offer them. It was more than "if you can't beat them, join them." Schools, faculty and students (often on their own) discovered the value of not only MOOCs but online learning in general.

The Times article is negative on the impact of that MOOC revolution saying that "the reality didn’t live up to the dizzying hype." I agree that the hype was truly hype. It was too much. My wife and I wrote a chapter for the book Macro-Level Learning through Massive Open Online Courses (MOOCs): Strategies and Predictions for the Future and we titled it "Evolution and Revolution." The title was not meant as a question. Much of the discussion in 2012 was about the revolutionary nature of MOOCs, but we viewed them through the lens of 2015 and saw them as more evolutionary.  

Fast forward to 2020 - the "year of the pandemic" - and we see schools from kindergarten to graduate schools forced to use online learning in some way. A revolution? No. Again, an evolution - one that should have started for schools a decade ago but clearly had not for the many of them that found themselves unprepared in March 2020 to go fully online.

MOOCs have changed. My many posts here have shown that the open part of mOoc has become far less open both in the ability to reuse the materials and in the no-cost aspect. Companies have been formed around offering MOOC-like courses, certificates and degrees. 

The biggest criticism of MOOCs was probably that most learners (not always traditional students) never completed the courses. Completion rates in free courses of about 10% certainly sounded like a failure. Making students pay even a small fee or offering credit improved that percentage but not enough to make observers feel the revolution had succeeded.

I never worried about the completion rates because my research and my own experiences teaching and learning in these courses made it clear that the majority of students in them never intended to complete all of the coursework. They were there to get what they wanted to learn and get out. They didn't need to take a freshman year of requirements and prerequisites or gain admission to Stanford in order to take a course on artificial intelligence from Stanford.

Of course, as the Times article points out, MOOCs kept going without all the hype. They evolved, and in some ways so did online learning because of them. Platforms and for-profit companies emerged, and certificates, fully online MOOCish degrees, and nanodegrees were offered.

With the spotlight off them, MOOCs were able to evolve into different species - free, for-profit, accredited, for lifelong learning, massive, small, skills training, corporate, for K-12, etc. 

Sheltering and working and learning from home has given another boost to that second "O" in moOc. Providers like Coursera have signed up 10 million new users since mid-March, and edX and Udacity have seen similar surges. And that doesn't even take into account the less visible use of these courses by teachers and students, from big providers (such as Khan Academy) down to small grassroots efforts.

My wife and I are now writing a journal article for this fall about online learning as a solution for some crises in higher education. 2020 has definitely been a time of both crisis and opportunity for online learning. I hope the hype doesn't return to the MOOC. It did not serve it well in the past.

If you have any thoughts on the current state of MOOCs and online learning, contact me.