Hello LX: Learning Experience Design

I have been teaching since 1975. I have done instructional design (ID) since 2000. The job of an instructional designer was not one I knew much about before I started managing a department tasked with doing it at a university. I hired people trained in ID, but I learned it myself along the way.

As others have said, the job of an instructional designer seems mysterious. One suggestion has been to change the title to Learning Experience Designer. Does that better describe the job, and does it also apply to people who work in corporate and training settings?

I have taught courses about UX (user experience), which involves a “person’s behaviors, attitudes, and emotions about using a particular product, system or service” (according to Wikipedia). Part of that study involves UI (user interface), which “includes the practical, experiential, affective, meaningful and valuable aspects” of the interaction as well as “a person’s perceptions of system aspects such as utility, ease of use and efficiency.”

With more online learning, and more blended online and face-to-face learning, more attention is being given to the learner experience (LX). How students interact with learning seems to be more than what “user experience” (UX) entails.

UX was coined in the mid-1990s by Don Norman. He was then VP of advanced technology at Apple, and he used the term to describe the relationship between a product and a human. It was Norman's idea that technology should evolve to put user needs first. That was actually the opposite of how things were done at Apple and most companies. But by 2005, UX was fairly mainstream.

"Learning experience design" was coined in 2007 by Niels Floor, who taught at Avans University of Applied Sciences in the Netherlands.

I wrote earlier here about how some people in education still find the job of an instructional designer to be "mysterious." But call it UX or LX or ID, customizing learning, especially online, is quite an active job category in industry and education. Designers are using new tools and analytics to decode learning patterns.

In higher-education job postings and descriptions, I am seeing more examples of LX design as a discipline. That is why some people have said that Learning Experience Design is a better title than Instructional Design. It indicates a shift away from “instruction” and more to "learning." 

Learning and Working in the Age of Distraction

There is a lot of talk about distraction these days. The news is full of stories about the Trump administration and the President himself creating distractions to keep the public unfocused on issues they wish would go away (such as the Russia connections), and some people believe the President is too easily distracted by TV news and Twitter.

There are also news stories about the "distraction economy."  So many people are vying for your attention. The average person today is exposed to 1,700 marketing messages during a 24-hour period. Most of these distractions are on screens - TV, computers and phones.  Attention is the new currency of the digital economy.

Ironically, a few years ago I was reading about "second screens," behavioral targeting and social media marketing and that was being called the "attention economy." There is a battle for attention, and the enemy is distraction.

Google estimates that we spend 4.4 hours of our daily leisure time in front of screens. We are still using computers mostly for work/productivity and search. We use smartphones for connectivity and social interactions. Tablets are used more for entertainment. My wife and I are both guilty of "multi-screening." That means we are part of the 77% of consumers watching TV while on other devices. I am on my laptop writing and researching and she is on her tablet playing games and checking mail and messages. It is annoying. We know that.

Of course, the original land of distraction is the classroom. Students have always been distracted. Before the shiny object was a screen full of apps, passing notes was the texting, and doodling in your notebook and the cute classmates sitting nearby were the social media. But I have seen four articles on The Chronicle website about "The Distracted Classroom" lately. Is distraction on the rise?

If you are a teacher or student, does your school or your own classroom have a policy on using laptops and phones? If yes, is it enforced? Anyone who has been in a classroom of grade 6 or higher lately knows that if students have phones or laptops out in class for any reason, they are texting, surfing the web, or posting on social media.

Good teachers try to make classes as interactive as possible. We engage students in discussions, group work and active learning, but distractions are there.

Banning devices isn't a good solution. Things forbidden gain extra appeal.

A few books I have read discuss the ways in which distraction can interfere with learning. In The Distracted Mind: Ancient Brains in a High-Tech World, the authors, neuroscientist Adam Gazzaley and psychologist Larry D. Rosen, say that distraction occurs when we are pursuing a goal that really matters and something blocks our efforts to achieve it. They join other researchers who report that our brains aren't built for multitasking. That is a change from a few decades ago, when being able to multitask was considered a positive skill.

It seems that the current belief is that we don't really multitask. We switch rapidly between tasks. Any distractions and interruptions, including technology-related ones, act as "interference" to our goal-setting abilities.

But is this a new problem or has our brain always worked this way? Is the problem really more about the number of possible distractions and not our "rewired" brains?

Nicholas Carr sounded an alarm in 2011 with The Shallows: What the Internet Is Doing to Our Brains, arguing that our growing exposure to online media is forcing cognitive changes on our brains. The deeper intellectual processing of focused and critical thinking gets pushed aside in favor of faster processes like skimming and scanning.

Carr contends that the changes to the brain's "wiring" are real. Neural activity shifts from the hippocampus, where deep thinking happens, to the prefrontal cortex, where we engage in rapid, subconscious transactions. We substitute speed for accuracy and prioritize impulsive decision-making over deliberate judgment.

In the book Why Don't Students Like School?: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom, the author asks questions such as "Why Do Students Remember Everything That's on Television and Forget Everything I Say?" and "Why Is It So Hard for Students to Understand Abstract Ideas?" and gives some science and suggestions as answers. But these are difficult questions, and simple answers are incomplete answers in many cases.

Some teachers decide to use the tech that is the distraction to gain attention. I have tried using a free polling service (Poll Everywhere) that lets students respond and vote using their laptops or phones. You insert questions into your presentation software, which allows you to track, analyze, and discuss the responses in real time. The problem for me is that all of that needs to be pre-planned and is awkward to do on the fly, and I am very spontaneous in class with my questioning. Still, the idea of using the tech in class rather than banning it is something I generally accept. But that can't be done 100% of the time, so distracted use of the tech is still going to occur.

And the final book on my distraction shelf is The Filter Bubble. The book looks at how personalization - being in our own bubble - hurts the Internet as an open platform for the spread of ideas. The filter bubble puts us in an isolated, echoing world. The author, Eli Pariser, who coined the term “filter bubble,” subtitles the book "How the New Personalized Web Is Changing What We Read and How We Think." The term has also come up in the news in talking about the rise of Donald Trump and the news bubble we tend to live in, paying attention to a personalized feed of the news we agree with and filtering out the rest.

Perhaps creating a filter bubble is our way of coping with the attention economy and a way to try to curate what information we have to deal with every day.

Then again, there were a number of things I could have done the past hour instead of writing this piece. I could have done work that I actually get paid to do. I could have done some work around my house. But I wrote this. Why? 

Information overload and non-stop media are hurting our discipline for focus and self-control.

Michael Goldhaber defined the attention economy in this more economic way: "a system that revolves primarily around paying, receiving and seeking what is most intrinsically limited and not replaceable by anything else, namely the attention of other human beings.” In order for that economy to be profitable, we must be distracted. Our attention needs to be drawn away from the competition.

As a secondary school teacher for several decades, I saw the rise of ADHD. That rise was occurring before the Internet, and lack of attention, impulsivity, and boredom were all symptoms. It worsened after the Internet became widespread, but it was there before it and before all the personal digital devices.

Back in 1971,  Herbert A. Simon observed that “what information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

We are collectively wiser than ever before. We have the wisdom of the world in a handheld computer connected to almost everything. But it is so difficult to filter out the distractions and garbage that we don't have a lot of success translating information into knowledge. People used to say that finding out something on the Internet was like taking a sip from a fire hose. Search filtering has helped that, but so far the only filters for our individual brains are self-created and often inadequate.

 

Alternative Postsecondary Learning Pathways

Several bills recently came before the U.S. House of Representatives that would provide funding for people to enroll in alternative postsecondary pathways. As one article on usnews.com points out, this funding comes at the same time as a new study that looks at the quality of these programs and the evidence of their efficacy.

That report, "The Complex Universe of Alternative Postsecondary Credentials and Pathways," authored by Jessie Brown and Martin Kurzweil and published by the American Academy of Arts and Sciences, evaluated alternatives that I have written about here: certificate programs, market-focused training, work-based training, apprenticeships, skills-based short courses, coding bootcamps, MOOCs, online micro-credentials, competency-based education programs, and credentials based on skill acquisition rather than traditional course completion.

The report is wide-ranging and worth downloading if these are educational issues that concern you. If they don't concern you and you plan to work in education for another decade, you should really pay attention.

I'm not at all surprised that the earning power for "graduates" of alternative programs varies widely depending on the subject studied. A computer science certificate program graduate, for example, can expect to earn more than twice what a health care or cosmetology certificate recipient will receive.

Who pursues these programs? Certificate programs, work-based training and competency-based programs tend to attract older, lower-income learners who have not completed a college degree. But 80% of bootcamp enrollees and 75% of MOOC participants already have a bachelor's degree.

What do the authors of this study recommend? Policy changes to collect more comprehensive data on educational and employment outcomes and to enforce quality-assurance standards, and devoting resources to investigating efficacy and return on investment. The U.S. News article also points out that 19 organizations promoted greater federal oversight of career and technical education programs in a June letter to the House of Representatives about the Perkins Act reauthorization.

Information Wants To Be Free - but

This is a brief followup to my previous article about BitTorrent, because this post is about Sci-Hub, which describes itself as "the first pirate website in the world to provide mass and public access to tens of millions of research papers."

Sci-Hub is a website with over 62,000,000 academic papers and articles available for direct download. It bypasses publisher paywalls by allowing access through educational institution proxies. Sci-Hub stores papers in its own repository, and papers downloaded by Sci-Hub are also stored in Library Genesis (LibGen). A Russian graduate student, Alexandra Elbakyan, founded the site in 2011 as a reaction to the high cost of research papers locked behind paywalls.

The site has had its legal problems. In 2015, academic publisher Elsevier filed a complaint against Sci-Hub alleging copyright infringement, and the subsequent lawsuit led to the loss of the original sci-hub.org domain. Depending on where you live, it may not even be legal to access http://sci-hub.io/.

"If Elsevier manages to shut down our projects or force them into the darknet, that will demonstrate an important idea: that the public does not have the right to knowledge," says founder Alexandra Elbakyan in an interview with TorrentFreak.  

Elbakyan has been called a hero and a "spiritual successor to Aaron Swartz" for her creation of Sci-Hub, and compared to Edward Snowden because she is hiding in Russia after having "leaked" files in violation of American law.

Is she a modern-day "Robin Hood of science?"

BitTorrent Reconsidered

This past weekend I was wearing an old BitTorrent t-shirt that has printed on the back: "Give and ye Shall receive." While waiting in a store checkout line, a man behind me said, "BitTorrent? Are you a software pirate?"

To many people, BitTorrent is still synonymous with piracy. BitTorrent was, and probably still is, used for some questionable and illegal file transfers, but it is also used for many legitimate tasks.

A programmer, Bram Cohen, designed the protocol and released the first available version in July 2001, and it quickly became the preferred way to share large files, especially movies. In the public mind, it is blurred together with other file sharing programs like Napster, which was used to share music (mp3) files.

Like HTTP, which your browser uses to communicate with websites, BitTorrent is just a protocol. People were sharing pirated files of all kinds before BitTorrent using anonymous peer-to-peer networks, but this new protocol made it much faster and more efficient.

The BitTorrent protocol uses client computers to share individual pieces of a file. After the initial pieces transfer from the seed, pieces are exchanged directly from client to client, so the original seeder only needs to send out one complete copy of the file for every client to end up with a copy.
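That piece-exchange idea can be sketched as a toy simulation. This models only the high-level strategy, not the real BitTorrent wire protocol, and the function and parameter names are my own:

```python
# Toy model of BitTorrent-style piece exchange (not the real wire protocol).
# One seed holds every piece; clients start with none. Each round, every
# client obtains one missing piece, preferring a fellow client that already
# has it, so the seed uploads each piece as few times as possible.

def simulate_swarm(num_pieces=8, num_clients=5):
    seed = set(range(num_pieces))
    clients = [set() for _ in range(num_clients)]
    seed_uploads = 0
    rounds = 0
    while any(c != seed for c in clients):
        rounds += 1
        for i, c in enumerate(clients):
            wanted = seed - c
            if not wanted:
                continue
            piece = min(wanted)
            if not any(piece in other for j, other in enumerate(clients) if j != i):
                seed_uploads += 1  # no client has it yet; it must come from the seed
            c.add(piece)
    return rounds, seed_uploads

rounds, uploads = simulate_swarm()
print(uploads)  # 8: the seed sent each of the 8 pieces exactly once,
                # yet all 5 clients received the full 8-piece file
```

Real clients use smarter piece selection (such as rarest-first), but the economics are the same: once one full copy is out, the swarm can finish distributing the file among itself.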

BitTorrent Sync is an application comparable to Dropbox, the popular file-sharing service. But unlike Dropbox, Sync doesn't store your files on a centralized server online. It syncs them between computers you own or computers your friends own. It allows easy file sharing, and you can sync an unlimited number of files as long as you have the space for them on your computers. (Dropbox offers that extra space, which many of us need.)

The most recent version of BitTorrent was released in 2013 and BitTorrent clients are available for a variety of computing platforms and operating systems including an official client released by BitTorrent, Inc.

What are some of the current legal uses? 

Some game companies use it for game updates and downloads. For example, Blizzard Entertainment uses its own BitTorrent client to download World of Warcraft, Starcraft II, and Diablo III. When you legally purchase one of these games and download it, you're actually using a BitTorrent client, and the game's launcher automatically downloads updates for you.

Facebook uses the BitTorrent protocol for propagating large files over a large number of different servers.

It also has educational users. Florida State University uses BitTorrent to distribute large scientific data sets to its researchers. Many universities that have BOINC distributed computing projects have used the BitTorrent functionality of the client-server system to reduce the bandwidth costs of distributing the client-side applications used to process the scientific data. The developing Human Connectome Project uses BitTorrent. 

The popular Internet Archive uses the protocol to make its public domain content downloadable.

In 2010, the UK government released several large data sets showing how public money was being spent that were offered via BitTorrent to save on bandwidth costs and speed the process.

NASA has also used BitTorrent to make a 2.9GB picture of the Earth available.

Like Napster, which rebranded and reinvented itself after all the lawsuits into a "legitimate" music service, the official BitTorrent website has a list of "bundles" of music and videos. Artists make them freely available to hook fans, just as radio was once used to provide free music to large audiences in hopes that listeners would attend live shows and buy albums.

If we got rid of BitTorrent, another similar protocol would need to emerge. 

Machine Learning :: Human Learning

AI - “artificial intelligence” - was introduced as a term at a science conference at Dartmouth College in 1956. Back then it was a theory, but in the past few decades it has become far more practice than theory.

The role of AI in education is still more theory than practice.

A goal in AI is to get machines to learn. I hesitate to say "think," but that is certainly a goal too. I am currently reading The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, and in that history there is a lot of discussion of people trying to get machines to do more than just compute (calculate) - to learn from their experiences without requiring a human to program the changes. The classic example is the chess-playing computer that gets better every time it wins or loses. Is that "learning?"
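In that spirit, here is a minimal sketch of a program that improves by playing. It is not a chess engine; the game, its moves, and their hidden win probabilities are invented for illustration. The program tries moves, records wins and losses, and gradually favors whatever has worked:

```python
import random

def play(move, rng):
    # Hypothetical game: each move has a hidden probability of winning.
    win_prob = {"a": 0.2, "b": 0.5, "c": 0.8}
    return rng.random() < win_prob[move]

def learn(rounds=2000, epsilon=0.3, seed=1):
    rng = random.Random(seed)
    moves = ["a", "b", "c"]
    wins = dict.fromkeys(moves, 0)
    plays = dict.fromkeys(moves, 0)

    def win_rate(m):
        return wins[m] / plays[m] if plays[m] else 0.5  # optimistic default

    for _ in range(rounds):
        # Occasionally explore a random move; otherwise exploit the best so far.
        move = rng.choice(moves) if rng.random() < epsilon else max(moves, key=win_rate)
        plays[move] += 1
        wins[move] += play(move, rng)  # True counts as 1, False as 0
    return max(moves, key=win_rate)

print(learn())  # after enough games, the program settles on the strongest move: "c"
```

Nothing here was told which move is best; the preference for "c" emerges purely from accumulated experience, which is the sense in which the chess computer "learns."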

But has it had an impact on how you teach or how your students learn?

It may have been a mistake in the early days of AI and computers that we viewed the machine as being like the human brain. It is - and it isn't.

But neuroscientists are now finding that they can also discover more about human learning as a result of machine learning. An article on opencolleges.edu.au points to several interesting insights from the machine and human learning research that may play a role in AI in education.

One thing that became clear is that the physical environment is something humans learn more easily than machines do. After a child has started walking or opened a few doors or drawers or climbed a few stairs, she learns how to do it. Show her a different door, drawer, or a spiral staircase and it doesn't make much of a difference. A robot equipped with some AI has a much steeper learning curve for these simple things. It also has a poor sense of its "body." Just watch any videos online of humanoid robots trying to do those things and you'll see how difficult it is for a machine.


Then again, it takes a lot longer for humans to learn how to drive a car on a highway safely. And even once driving is learned, our attention, or lack thereof, is a huge problem. AI in vehicles is learning how to drive fairly rapidly, and its attention is superior to human attention. Currently, the human is still the fallback in most cases, but that will certainly change in a decade or two. I learned to parallel park a car many years ago and I am still lousy at it. A car can do it better than I can.

Although computers can do tasks they are programmed to do without any learning curve, for AI to work they need to learn by doing - much like humans. The article points out that AI systems that traced letters with robotic arms had an easier time recognizing diverse styles of handwriting and letters than visual-only systems. 

AI means a machine gets better at a task the more it does it, and it can also apply that learning to similar but not identical situations. You can program a computer to play notes and play a series of notes as a song, but getting it to compose real music requires AI.

Humans also learn from shared experiences. A lot of the learning in a classroom comes from interactions between the teacher and students and student to student. This makes me feel pretty confident in the continued need for teachers in the learning process.

One day, I am sure that machines will communicate with each other and learn from each other. This may be part of the reason that some tech and learning luminaries, like Elon Musk, have fears about AI.

I would prefer that my smart or autonomous vehicle "talk" to other vehicles on the roads nearby and share information on traffic, obstructions, and those quirky human drivers.

AI built into learning systems, such as an online course, could guide the learning path and even anticipate problems and offer corrections to avoid them. Is that an AI "teacher" or the often-promoted "guide on the side?"
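As a hypothetical sketch of what "guiding the learning path" might look like in code (the topics, thresholds, and function names here are invented for illustration, not taken from any real learning system):

```python
# Hypothetical adaptive-path rule: choose a learner's next activity from
# recent quiz scores, stepping back to a weak prerequisite before a
# predicted failure rather than after it. All thresholds are illustrative.

def next_step(history, current_topic, prerequisites):
    """history maps topic -> list of quiz scores in [0.0, 1.0]."""
    scores = history.get(current_topic, [])
    if not scores:
        return ("start", current_topic)
    recent = sum(scores[-3:]) / len(scores[-3:])

    def avg(topic):
        s = history.get(topic, [])
        return sum(s) / len(s) if s else 0.0  # never assessed = weakest

    if recent < 0.5:
        return ("review", min(prerequisites, key=avg))  # shore up the weakest prerequisite
    if recent < 0.8:
        return ("practice", current_topic)  # solid but not yet fluent
    return ("advance", current_topic)       # ready to move on

print(next_step({"fractions": [0.4, 0.3, 0.45], "counting": [0.9]},
                "fractions", ["counting", "number-line"]))
# -> ('review', 'number-line'): the learner is struggling with fractions and
#    has never been assessed on the number line, so the system reviews that first
```

A real system would learn its thresholds and prerequisite weights from data rather than hard-coding them, but the shape of the decision - anticipate the problem, then redirect - is the same.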

This year on the TV show Humans, one of the human couples goes for marriage counseling with a "synth" (robot). She may be a forerunner of a synth teacher.

The counselor (back to us) can read the husband's body language and knows he does not like talking to a synth marriage counselor.