Learning and Working in the Age of Distraction

There is a lot of talk about distraction these days. The news is full of stories about the Trump administration and the President himself creating distractions to keep the public unfocused on issues they wish would go away (such as the Russia connections), and some people believe the President is too easily distracted by TV news and Twitter.

There are also news stories about the "distraction economy."  So many people are vying for your attention. The average person today is exposed to 1,700 marketing messages during a 24-hour period. Most of these distractions are on screens - TV, computers and phones.  Attention is the new currency of the digital economy.

Ironically, a few years ago I was reading about "second screens," behavioral targeting and social media marketing, and back then it was being called the "attention economy." There is a battle for attention, and the enemy is distraction.

Google estimates that we spend 4.4 hours of our daily leisure time in front of screens. We are still using computers mostly for work/productivity and search. We use smartphones for connectivity and social interactions. Tablets are used more for entertainment. My wife and I are both guilty of "multi-screening." That means we are part of the 77% of consumers watching TV while on other devices. I am on my laptop writing and researching and she is on her tablet playing games and checking mail and messages. It is annoying. We know that.

Of course, the original land of distraction is the classroom. Students have always been distracted. Before the shiny object was a screen full of apps, passing notes was texting, and doodling in your notebook and the cute classmates sitting nearby were the social media. But I have seen four articles on The Chronicle website about "The Distracted Classroom" lately. Is distraction on the rise?

If you are a teacher or student, does your school or your own classroom have a policy on using laptops and phones? If so, is it enforced? Anyone who has been in a grade 6 or higher classroom lately knows that if students have phones or laptops out in class for any reason, they are texting, surfing the web, or posting on social media.

Good teachers try to make classes as interactive as possible. We engage students in discussions, group work and active learning, but distractions are there.

Banning devices isn't a good solution. Things forbidden gain extra appeal.

A few books I have read discuss the ways in which distraction can interfere with learning. In The Distracted Mind: Ancient Brains in a High-Tech World, the authors say that distraction occurs when we are pursuing a goal that really matters and something blocks our efforts to achieve it. The authors, neuroscientist Adam Gazzaley and psychologist Larry D. Rosen, join other researchers who report that our brains aren't built for multitasking. That contrasts with a few decades ago, when being able to multitask was considered a positive skill.

It seems that the current belief is that we don't really multitask; we switch rapidly between tasks. Any distractions and interruptions - including the technology-related ones - act as "interference" to our goal-setting abilities.

But is this a new problem or has our brain always worked this way? Is the problem really more about the number of possible distractions and not our "rewired" brains?

Nicholas Carr sounded an alarm in 2011 with The Shallows: What the Internet Is Doing to Our Brains, arguing that our growing exposure to online media means our brains need to make cognitive changes. The deeper intellectual processing of focused and critical thinking gets pushed aside in favor of faster processes like skimming and scanning.

Carr contends that the changes to the brain's "wiring" are real. Neural activity shifts from the hippocampus's deep thinking to the prefrontal cortex, where we are engaged in rapid, subconscious transactions. Speed substitutes for accuracy. Impulsive decision-making takes priority over deliberate judgment.

In the book Why Don't Students Like School?: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom, the author, Daniel T. Willingham, asks questions such as "Why Do Students Remember Everything That's on Television and Forget Everything I Say?" and "Why Is It So Hard for Students to Understand Abstract Ideas?" and offers some science and suggestions as answers. But these are difficult questions, and simple answers are incomplete answers in many cases.

Some teachers decide to use the tech that has been a distraction to gain attention instead. I tried using a free polling service (Poll Everywhere) which allows students to respond or vote using their laptops or phones. You insert questions into your presentation software, which allows you to track, analyze, and discuss the responses in real time. The problem for me is that all of that needs to be pre-planned and is awkward to do on the fly, and I am very spontaneous in class with my questioning. Still, the idea of using the tech in class rather than banning it is something I generally accept. But that can't be done 100% of the time, so distracted use of the tech is still going to occur.

And the final book on my distraction shelf is The Filter Bubble. The book looks at how personalization - being in our own bubble - hurts the Internet as an open platform for the spread of ideas. The filter bubble puts us in an isolated, echoing world. The author, Eli Pariser, who coined the term “filter bubble,” subtitles the book "How the New Personalized Web Is Changing What We Read and How We Think." The term is another one that has come up in the news in talking about the rise of Donald Trump and the news bubble that we tend to live in, paying attention to a personalized feed of the news we agree with and filtering out the rest.

Perhaps creating a filter bubble is our way of coping with the attention economy and a way to try to curate what information we have to deal with every day.

Then again, there were a number of things I could have done the past hour instead of writing this piece. I could have done work that I actually get paid to do. I could have done some work around my house. But I wrote this. Why? 

Information overload and non-stop media are hurting my/our discipline for focus and self-control.

Michael Goldhaber defined the attention economy in this more economic way: "a system that revolves primarily around paying, receiving and seeking what is most intrinsically limited and not replaceable by anything else, namely the attention of other human beings.” In order for that economy to be profitable, we must be distracted. Our attention needs to be drawn away from the competition.

As a secondary school teacher for several decades, I saw the rise of ADHD. That rise began before the Internet, and lack of attention, impulsivity and boredom were already its symptoms. It worsened after the Internet became widespread, but it was there before the web and all the personal digital devices.

Back in 1971,  Herbert A. Simon observed that “what information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

We are collectively wiser than ever before. We have the wisdom of the world in a handheld computer connected to almost everything. But it is so difficult to filter out the distractions and garbage that we don't have a lot of success translating information into knowledge. People used to say that finding out something on the Internet was like taking a sip from a fire hose. Search filtering has helped that, but so far the only filters for our individual brains are self-created and often inadequate.

 

Machine Learning :: Human Learning

AI - “artificial intelligence” - was introduced at a conference at Dartmouth College in 1956. Back then it was a theory, but in the past few decades it has become far more practice than theory.

The role of AI in education is still more theory than practice.

A goal in AI is to get machines to learn. I hesitate to say "think," but that is certainly a goal too. I am currently reading The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, and in that history there is a lot of discussion of people trying to get machines to do more than just compute (calculate) - to learn from their experiences without requiring a human to program those changes. The classic example is the chess-playing computer that gets better every time it wins or loses. Is that "learning?"
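That game-playing example can be shrunk to a toy you can actually run. The sketch below is my own hypothetical illustration, not anything from the book: an agent repeatedly plays a one-move game with three possible moves whose win rates are hidden from it, and after every win or loss it nudges its value estimate for the move it chose. No move is ever programmed as "best"; the preference emerges from experience.

```python
import random

def learn_from_games(n_games=5000, epsilon=0.1, seed=0):
    """Play a one-move 'game' many times and learn which move wins most.

    The true win probabilities are hidden from the agent; it only sees
    win/loss outcomes and keeps a running value estimate per move."""
    rng = random.Random(seed)
    win_prob = {"a": 0.2, "b": 0.5, "c": 0.8}   # unknown to the agent
    value = {m: 0.0 for m in win_prob}          # learned estimates
    plays = {m: 0 for m in win_prob}
    for _ in range(n_games):
        # Mostly exploit the best-looking move; occasionally explore.
        if rng.random() < epsilon:
            move = rng.choice(list(win_prob))
        else:
            move = max(value, key=value.get)
        won = rng.random() < win_prob[move]
        plays[move] += 1
        # Incremental average: nudge the estimate toward this outcome.
        value[move] += ((1.0 if won else 0.0) - value[move]) / plays[move]
    return value

values = learn_from_games()
best_move = max(values, key=values.get)
```

After enough games the agent settles on the move with the highest hidden win rate - a loose analogue of "getting better every time it wins or loses."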

But has it had an impact on how you teach or how your students learn?

It may have been a mistake in the early days of AI and computers that we viewed the machine as being like the human brain. It is - and it isn't.

But neuroscientists are now finding that they can also discover more about human learning as a result of machine learning. An article on opencolleges.edu.au points to several interesting insights from the machine and human learning research that may play a role in AI in education.

One thing that became clear is that humans learn from the physical environment much more easily than machines do. After a child has started walking or opened a few doors or drawers or climbed a few stairs, she has learned how to do it. Show her a different door, drawer, or a spiral staircase and it doesn't make much of a difference. A robot equipped with some AI will have a much steeper learning curve for these simple things. It also has a poor sense of its "body." Just watch any videos online of humanoid robots trying to do those things and you'll see how difficult it is for a machine.


Then again, it takes a lot longer for humans to learn how to drive a car on a highway safely. And even once driving is learned, our attention, or lack thereof, is a huge problem. AI in vehicles is learning how to drive fairly rapidly, and its attention is superior to human attention. Currently, the systems still fall back to the human driver in most cases, but that will certainly change in a decade or two. I learned to parallel park a car many years ago and I am still lousy at doing it. A car can do it better than I can.

Although computers can do tasks they are programmed to do without any learning curve, for AI to work they need to learn by doing - much like humans. The article points out that AI systems that traced letters with robotic arms had an easier time recognizing diverse styles of handwriting and letters than visual-only systems. 

AI means a machine gets better at a task the more it does it, and it can also apply that learning to similar but not identical situations. You can program a computer to play notes and play a series of notes as a song, but getting it to compose real music requires AI.

Humans also learn from shared experiences. A lot of the learning in a classroom comes from interactions between the teacher and students and student to student. This makes me feel pretty confident in the continued need for teachers in the learning process.

One day, I am sure that machines will communicate with each other and learn from each other. This may be part of the reason that some tech and learning luminaries like Elon Musk have fears about AI.

I would prefer that my smart or autonomous vehicle "talk" to other vehicles on the roads nearby and share information on traffic, obstructions and the vehicles still driven by those quirky humans.

AI built into learning systems, such as an online course, could guide the learning path and even anticipate problems and offer corrections to avoid them. Is that an AI "teacher" or the often-promoted "guide on the side?"

This year on the TV show Humans, one of the human couples goes for marriage counseling with a "synth" (robot). She may be a forerunner of a synth teacher.

The counselor (back to us) can read the husband's body language and knows he does not like talking to a synth marriage counselor.

 

What Is a Modern Learning Experience?


Jane Hart, who I have been following online for many years, is the Director of the Centre for Modern Workplace Learning, which she set up to help organizations and learning professionals modernize their approaches to workplace learning. Reading her online Modern Workplace Learning Magazine has alerted me to trends outside academia and outside the United States.  

She recently posted an article titled "Designing, delivering and managing modern learning experiences" and that made me consider how I would define "modern learning." It would include school experiences for some of us, but for most people today it is more likely an experience that occurs in the workplace and on our own. That itself seems like a big shift from the past. Or is it?

If in 1917, someone had wanted to become a journalist, he could go to college, but he could also get a job without a degree - if he could show he was a good writer. He could do some freelance writing with or without pay to get some experience and samples. Move 50 years to 1967, and the path was more likely to be a school of journalism. What about today?

As Jane points out, the modern learning experience path for the workplace probably includes using: 

  • Google and YouTube to solve their own learning and performance problems
  • social networks like Twitter and LinkedIn to build their own professional network (aka personal learning network)
  • messaging apps on their smartphones to connect with colleagues and groups
  • Twitter to participate in conference backchannels and live chats
  • platforms like Coursera, edX and FutureLearn to participate in online courses (MOOCs)

The modern learning experience is on demand and continuous, not intermittent, and takes place in minutes rather than hours. It occurs on mobile devices more than on desktop computers.

Jane Hart believes it is also more social, with more interacting with people, and that it is more of a personally-designed experience. I don't know if that is true for educational learning. Is it true for the workplace on this side of the pond? Does the individual design the learning, rather than it being an experience designed by someone "in charge"?

Modernizing classroom learning has often been about making learning more autonomous (self-directed, self-organized and self-managed) but that model does not easily fit into the model used for the past few hundred years in classrooms.

Defining Personalized Learning

The term "personalized learning" came up recently in several articles about Facebook founder and CEO Mark Zuckerberg and his pediatrician wife Priscilla Chan investing hundreds of millions of dollars a year in a new vision of “whole-child personalized learning.”

Their recently established Chan Zuckerberg Initiative (CZI) intends to support the development of software that might help teachers better recognize and respond to each student’s academic needs. But they also intend to use a holistic approach to nurturing children’s social, emotional, and physical development. That's a tall order - and hardly a new one.

In the 40 years I have been an educator, I have heard about personalized learning under terms like individualized instruction, personal learning environments, direct instruction, differentiation, and even adaptive learning. All refer to efforts to tailor education to meet the different needs of students.

The use of the term "personalized learning" dates back to at least the early 1960s, but definitions still vary and it is still an evolving term. In 2005, Dan Buckley defined two ends of the personalized learning spectrum: "personalization for the learner", in which the teacher tailors the learning, and "personalization by the learner", in which the learner develops skills to tailor his own learning. This spectrum was adopted in Microsoft's 2006 Practical Guide to Envisioning and Transforming Education and has been updated by Microsoft in other publications.

CZI now has former Deputy U.S. Secretary of Education James H. Shelton as the initiative’s president of education. It is encouraging to me that he said “We’ve got to dispel this notion that personalized learning is just about technology. In fact, it is about understanding students, giving them agency, and letting them do work that is engaging and exciting... Many people have a preconceived notion that ‘personalized learning’ is a kid in the corner alone with a computer. Forget about that.”

CZI will direct 99 percent of their Facebook shares (about $45 billion) to causes related to education and science, through a combination of charitable giving and investment.

Given that he comes from technology, you would expect Zuckerberg to want to put a lot of the money and effort into that area. That's what happened with many of the efforts that the Bill and Melinda Gates Foundation has made in education.

Adaptive learning - which I don't see as the same thing as personalized learning, though some people do - is an educational method that uses computers as interactive teaching devices. The technology allocates both human resources (teachers, tutors, counselors) and mediated resources according to the unique needs of each learner. Computers adapt the presentation of educational material according to students' learning needs. Computer-aided assessment and the responses to questions, tasks and experiences direct the next step for the learner.

Adaptive learning technology encompasses aspects derived from various fields of study including computer science, education, psychology, and brain science. Although this approach is not teacher- or student-centered, it does attempt to transform the learner from passive receptor of information to collaborator in the educational process. Adaptive learning systems have been used in education and also in business training. 
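The "direct the next step" mechanism can be sketched with a deliberately simple rule. This is a hypothetical staircase adjustment of my own, not any real platform's algorithm: a correct answer raises the difficulty of the next item and a miss lowers it, so the learner stays near the edge of their current ability.

```python
def next_difficulty(current, correct, step=1, lo=1, hi=10):
    """Staircase rule: move difficulty up after a correct answer,
    down after a miss, clamped to the allowed range."""
    proposed = current + (step if correct else -step)
    return min(hi, max(lo, proposed))

# A learner who succeeds at easy items but stumbles around level 6
# climbs quickly, then oscillates near that level instead of racing past it.
level = 1
for answered_correctly in [True, True, True, True, True, False, True, False]:
    level = next_difficulty(level, answered_correctly)
```

Real adaptive engines replace this rule with statistical models of the learner, but the loop - assess, adjust, present the next item - has the same shape.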

CZI realizes this personalized learning effort will extend over decades. The initiative began in December 2015, shortly after the birth of the couple's first child.

The Initiative has invested in BYJU’S, an India-based startup behind a popular online-learning app, and Enlearn, a Seattle-based nonprofit that has developed a new adaptive-learning platform. CZI has also partnered with the Bill & Melinda Gates Foundation on a $12 million “venture philanthropy” grant award. 

When I was starting my teaching career in the mid-1970s, the personalization was mostly driven by teachers and rarely used technology. 

But how does this fit into the newest version of the main federal K-12 education law, the Every Student Succeeds Act? Unfortunately, our national plans usually last for only 4 or 8 years (based on administrations), so we never see a cohort of students go through an educational lifetime. The new law does seem to push states and schools to think about more than standardized-test scores when determining what it means to help students thrive.

Do we need a clear and set definition of personalized learning in order to move forward? How does the CZI idea of educating the "whole child" fit into personalizing learning? 

Virtual Reality Education and Flying Cars


The Holodeck

People love to cite the prediction that we would all be using flying cars by the 21st century as an example of a future technology that never happened. Remember how virtual reality and augmented reality were going to change everything? So far, they haven't.

Last summer, Pokemon Go was huge, and even though many people would dismiss it as a silly game, it was AR and seemed like it might change gaming and who knows what else. The promise, or perhaps more accurately the potential, of VR in education is also a popular topic.

We know that the Internet enabled students to access materials from other institutions and to travel to distant places for their research. Virtual reality may one day change the ways in which we teach and learn. That has me thinking about "virtual reality education" - something I imagine to be unbound by time and by physical spaces like classrooms or campuses. That sounds like online learning, but it would go beyond today's online learning.

Remember the "holodeck"? Originally, it was a set from the television series Star Trek where the crew could engage with different virtual reality environments. It came back into my view with Janet Murray's book Hamlet on the Holodeck: The Future of Narrative in Cyberspace. She considered whether the computer might provide the basis for an expressive narrative form in the way that print technology supported the development of the novel and, in the 20th century, film technology supported the development of movies.

And remember virtual worlds like Second Life and Active Worlds? I knew a number of educators and schools that made a real commitment to their use in education. I don't know of any that are still using virtual worlds.

I'm hopeful that VR, AR, or some version of a holodeck or virtual world will some day enhance education, but so far, I'm still operating in Reality Reality.

The Internet As Café

I was quite charmed last year when I made my first visit to Prague in the Czech Republic. I had in my mind a Romanticized version of the city and its famed café culture. In my imagination, it was people sipping coffee at sidewalk tables and talking about art and literature. When my wife and I went for coffee and dessert at the Café Imperial, it was certainly much grander than anything I had imagined.

We did find those little cafés too, so I was able to embrace my Romantic version of the city. There is also the well-documented role of the coffeehouse in the Age of Enlightenment. These informal gatherings of people played an important role in innovation in politics, science, literature and religion.

Next year, I hope to visit the Café de Flore which is one of the oldest coffeehouses in Paris. Located at the corner of Boulevard Saint-Germain and Rue Saint-Benoît, it is known for its history of serving intellectual clientele. At one time, those tables overheard conversations from existentialist philosopher Jean-Paul Sartre,  writer Albert Camus and artist Pablo Picasso.

In science, breakthroughs seem to rarely come from just one person working alone. Innovation and collaboration usually sit at the table together. We are currently in a time when, at least in American politics, collaboration seems nonexistent.

This notion is what caught my attention in an interview I heard with Steven Johnson, who wrote Where Good Ideas Come From.
He writes about "stacked platforms" of ideas that allow other people to build on them. This way of ideas coming together - pieces borrowed from another field or another person and remixed - feels very much like what has arisen in our digital age.

One example he gives is the 1981 record My Life in the Bush of Ghosts by Brian Eno and David Byrne. It is an innovative album for that time in its use of samples well before the practice became mainstream. Eno was inspired by the varied voices and music and advertising on New York AM radio which was so different from the straightforward BBC radio he grew up with in England. He thought about repurposing all that talk into music.

We call that “decontextualizing” now – in this case a sound or words taken out of context and put in a new place. But this borrowing and remixing also occurs with ideas in culture, science and technology.

Unfortunately, ideas are not always free to connect with each other. Things like copyright and intellectual property law get in the way. We often silo innovators in proprietary labs or departments and discourage the exchange of ideas.

I didn’t know that, when he was in England, Ben Franklin had a Club of Honest Whigs that would meet at the London Coffeehouse to hang out and exchange ideas.

Johnson describes these as “liquid networks” – not so much for the coffee, but for the fluidity in the conversation. These informal networks work because they are made up of different kinds of people from different backgrounds and experiences. Diversity is not just necessary as a biological concept but as an intellectual one.

The Internet was built on ideas stacked on top of ideas. A whole lot of code and ideas are underneath this post. At its best, when I write online I am connecting, if only virtually, with other writers, artists and thinkers, and connecting literally through hyperlinks to those ideas.

I know there are “Internet cafés,” but what about Internet as a café?