Educating in the Metaverse

Excerpt from https://www.gettingsmart.com/2022/04/18/metaverse-and-education-what-do-we-need-to-know/

Although the metaverse seems like a new concept, it actually has been around for nearly three decades. In 1992, Neal Stephenson, an American science fiction author, introduced the concept of the metaverse in his novel Snow Crash.

In October 2021, Mark Zuckerberg announced the change from Facebook to Meta and released a short video about how the metaverse would work and what his plans were for it. I showed this to my students, which sparked great conversations and many questions.

As educators, how can we keep up with so much information? Where can we learn about the technologies involved in the metaverse? I recommend setting a Google alert through your Gmail. Set the topic to be “metaverse” or other topics of interest, and each day you will receive an email with articles, videos and breaking news stories gathered from all over the Internet...

 

Interested in having a conversation about the metaverse? Register for the upcoming Getting Smart Town Hall on May 12, 2022, "What on Earth is a Metaverse?: The Next Frontier of Engaging and Learning."
We’ll explore some of the following questions:
- Is the metaverse technically on “earth”?
- How far away is this from being a reality?
- What does this mean for teaching and learning?
- What about equity and accessibility?
- What about the power of place?

Consider Your Life in the Metaverse and Multiverse

Image by Gerd Altmann from Pixabay

I have already written several essays about the metaverse and multiverse here. This past weekend, I wrote about those two ideas on another blog of mine that is broader in scope than the technology-and-education focus of this one. What follows is that take, written for a broader audience.

Much of the talk (and hype) about the metaverse has centered on Mark Zuckerberg's ideas, especially since he changed the name of Facebook's parent company to Meta because the metaverse is where he expects Facebook, and a lot more, to be going in the future. Who will build the metaverse? Certainly, Meta wants to be a big player, but asking that is like asking in the 1980s, "Who will build the Internet?" The answer is that it will be many people and companies.

But some people have suggested that rather than the metaverse - an alternate space entered via technology - we should be thinking about the multiverse. Metaverse and multiverse sound similar, and their definitions may seem to overlap at times, but they are not the same thing.

If all of this sounds rather tech-nerdy, consider that most of us thought of the Internet that way in its earliest days, but now even a child knows what it is and how to navigate it. The business magazine Forbes is writing about both the multiverse and the metaverse because - like the Internet - it knows they will be places of commerce.

I particularly like the more radical idea that the metaverse might be viewed as a moment in time. What if we are already living in a multiverse? I have also wondered when education would enter the metaverse.

To add to whatever confusion exists about meta- versus multi-, there is an increasing list of other realities that technology is offering, with abbreviations like AR, VR, XR and MR.

I am not a fanatic about the Marvel Comics Universe and its many films, but I am a fan of the character Doctor Strange (played by Benedict Cumberbatch). The new film Doctor Strange in the Multiverse of Madness takes him and some "mystical allies into the mind-bending and dangerous alternate realities of the Multiverse to confront a mysterious new adversary."

There are people in our real world who find the idea of multiverses terrifying, so madness and nightmare might be good words to attach to it. The Marvel version of the Multiverse is defined as "the collection of alternate universes which share a universal hierarchy; it is a subsection of the larger Omniverse, the collection of all alternate universes. A large variety of these universes were originated as forms of divergence from other realities, where an event with different possible outcomes gives rise to different universes, one for each outcome. Some can seem to be taking place in the past or future due to differences in how time passes in each universe."

The film may not be science-based, but theoretical scientists have been theorizing about multiple universes, alternate universes, and alternative timelines for almost as long as science-fiction writers have been creating them. Probably everyone reading this (and definitely the person writing this) has thought about how changing some events might create different outcomes. Writers and filmmakers may think about trying to stop JFK's assassination or imagine what would have happened if the Nazis had won WWII, but you and I think more personally. WHAT IF I hadn't gone to that college, taken that job, married someone else, or not married at all? For now, multiverses exist in our minds, but someday, perhaps, they will be real. Or whatever "real" means at that point in time.

The Disconnected 2022 Edition

It's 2022 and I am reading an article in The Chronicle by Beth McMurtrie about how the pandemic forced disconnections in early 2020. On the other hand, we also became more connected to friends, offices, campuses, and stores through technology and media.

The article took me back to a keynote presentation I did back in January 2016. I titled that talk "The Disconnected." The talk grew out of the many references I had been seeing to people who seemed disconnected from many aspects of society.

One observation was a re-emergence of people who wanted to learn on their own rather than in schools. These autodidacts were a group of learners that I felt might be reshaping school, especially higher education, which is a choice rather than a requirement.

In 2015, the sharing economy, the maker movement, the do-it-yourself (DIY) movement, and open-source coding were all topics of interest.

These trends were not limited to young people or students. Many people were “cord cutting” from traditional media, but the trend was especially evident in young adults. Even broader was a “rent rather than buy” mindset that affected purchases of media (music, movies, books, magazines), cars (leasing or using a car service rather than owning), and housing (renting an apartment or home to avoid maintenance, a mortgage, and taxes).

In 2015, the “disconnected” comprised about 25 percent of Americans, according to Forrester Research. They estimated that number would double by 2025. Has it?

That new article is about students who seem to have disconnected during the pandemic and are not reconnecting now. Maybe they will never reconnect. 

According to McMurtrie's article, fewer students are going to classes. Her interviews with faculty show that those who do attend avoid speaking if possible. They are disconnected from the professor and their classmates. They don't do the assigned reading or homework, and so they have trouble with tests. They are disconnected from the course content.

The Chronicle had more than 100 people tell them about their disconnected students. Some called them “exhausted,” “defeated,” or “overwhelmed.” This came from faculty at a range of institutions.


Why are they disconnected?

The reasons professors give are largely pandemic-related. Many students lost their connection with their college or their purpose in attending. Hours of online learning that they had not chosen, and which may have been sub-par, added to that.

The students who seemed to have the most trouble were the freshmen, who appeared underprepared both academically and in their sense of responsibility. One example was that students don’t fully grasp the consequences of missing classes. I was teaching long before the pandemic, and all of those things were true of students back then too.

So my question is whether those disconnected students of 2015 have become even more disconnected in the subsequent seven years, and if they have, whether it is because of the pandemic or just a trend that started well before it.

McMurtrie also offers the perspective of students. One student said that when she returned to the classroom after virtual learning, many professors relied more on technology than they had before the pandemic. Ironically, that was something many schools had hoped would happen: that faculty who had never used virtual conferencing, flipped the classroom, or worked in a learning management system would be greater tech users when they returned to their in-person classes. That student may have seen her college experience as "fake," but the professors (and possibly their department chairs and deans) saw the experience as "enhanced."

I don't explain the disconnecting as only the result of social anxiety and stress, or what psychologists describe as “allostatic load,” and I don't think this problem is temporary. I agree with some of the faculty quoted in the article who think the entire structure of college needs to change and that this is not a new problem.

None of us know what the solution might be.

Emergency Remote Teaching May Not Be Online Learning

Image: Marc Thele

Though they get lumped together, there is a difference between emergency remote teaching (ERT) and online learning. Prior to the COVID pandemic, I knew of some isolated examples of emergency remote learning (ERL). It might have happened because of a natural disaster, such as when Hurricane Katrina hit the New Orleans area in 2005 and Tulane University was forced to send students to other schools. Going online wasn't an option. When the H1N1 (swine flu) pandemic hit in 2009, few schools used online learning as a way to compensate. In that pandemic, schools often kept students isolated on campus and used more traditional learning options. It was the rare school that was able to go online for all or a large percentage of classes.

I co-wrote two journal articles in 2021 (AJES, 80:1) about the COVID pandemic and higher education. The first article, "Online Education in a Pandemic: Stress Test or Fortuitous Disruption?" examined some of that history. One observation was that few lessons were learned between those prior events and the COVID pandemic, despite gains in using online learning in normal situations. The COVID-19 pandemic brought on more emergency remote learning than a switch to online learning; moving from face-to-face (F2F) education to a virtual environment was forced and unplanned in the vast majority of cases. The second article, "Choosing Transformation Over Tradition," considers how advancements in online education did not have the effect of preparing all teachers and all courses to move online easily, and asks whether lessons learned in 2020 and 2021 would be temporary or transformative. At that time, there were teachers, students and courses that were online - and there were those that were not. (Both articles are available via academia.com.)

Well-planned online learning experiences are significantly different from courses offered online in response to a crisis or disaster. I believe that most of the criticisms of K-12 and higher education schools trying to maintain instruction during the COVID-19 pandemic stem from emergency remote teaching. Unfortunately, for the public and for some in academia, the experience of ERL has become their perception of online learning overall.

Emergency remote teaching is defined as "a temporary shift of instructional delivery to an alternate delivery mode due to crisis circumstances." Though the teaching solutions used will certainly overlap those used for online instruction, ERT or ERL should not be considered the same as what we know to be planned and designed "online learning." 

An EDUCAUSE article considers how we might cautiously evaluate emergency online learning and though some criteria for evaluating online learning would certainly be in that rubric, it would be invalid to use the same criteria.

It reminds me of my earliest experiences teaching online 20 years ago. Not only did I need to change how I designed lessons and how I presented them pedagogically, but I also needed to reevaluate how I would evaluate student work. For example, could I use the same rubric for a student who did a presentation or demonstration in my physical classroom as I did for a student submitting a slide presentation with audio that had been carefully designed, revised and edited?

When I ran a university department that was the campus support for online courses, we worked with a small percentage of faculty and courses that were fully online. In emergency situations when all classes needed to be online and all faculty and students needed support, my department - and, I believe, most schools' support teams - could not offer the same level of support to everyone who needed it.

If you are in a teaching position, are you, your students, and your institution in a better place now to move quickly online than you were in January of 2020?

In writing that second journal article, my co-author and I were somewhat pessimistic about where we would be in 2022 based on the lessons not learned in past instances of emergency shifts to online. However, since those articles were published in early 2021, we have felt some optimism. We have seen positive changes in preparedness. Anecdotally, I know of K-12 schools that have smoothly moved to online modes because of snowstorms or other short-term situations thanks to what they experienced in 2020-21. I know higher education faculty who are now more comfortable taking on an online course section (though they still prefer to teach in a physical classroom). At all levels, there is more use of online delivery platforms and more hybrid teaching than before.

As with other emergency situations, we often hear that it is not a question of if we will have to go fully online again, but when.

Extended and Mixed Reality Can Be Confusing

Mixed reality continuum

You know VR (virtual reality) and probably know AR (augmented reality) but XR (extended reality) may be new to you. Extended reality is an umbrella term that refers to all real-and-virtual environments generated by computer graphics and wearables. Besides VR and AR this umbrella term also includes MR (mixed reality). 

AR already sounds like a kind of mixed reality, since it combines digital content with real-world content. But MR goes further; for example, it might include holographic meetings.

When the term XR is used, it means that the human-to-technology interaction moves from a screen to an immersive virtual environment, or augments the user’s surroundings, or both. I thought the XR term was new, but it actually appeared in the 1960s when Charles Wyckoff filed a patent for his silver-halide “XR” film. Its usage today is very different.

To further add to the abbreviation confusion, this field also uses BCI for brain-computer interfaces, which may be the next computing platform.

Confused? Read on:

weforum.org/agenda/2022/02/future-of-the-metaverse-vr-ar-and-brain-computer/

xrtoday.com/mixed-reality/what-is-extended-reality/

hp.com/us-en/shop/tech-takes/what-is-xr-changing-world
 

Federated Learning

When I first think of federated learning, what comes to mind is something like a college federated department. For example, the history faculty at NJIT and Rutgers University-Newark are joined in a single federated department offering an integrated curriculum and joint undergraduate and graduate degree programs.

Having worked at NJIT, it made sense to combine the two departments and collaborate. Each had its own specialties but they were stronger together.

In technology, a federation is a group of computing or network providers agreeing upon standards of operation in a collective fashion, such as two distinct, formally disconnected, telecommunications networks that may have different internal structures.

There is also federated learning, which sounds like something those two history departments are doing, but it is not. This federated learning is a decentralized form of machine learning (ML).

In traditional machine learning, data aggregated from several edge devices (mobile phones, laptops, etc.) is brought together on a centralized server. The main objective of federated learning is privacy-by-design: a central server only coordinates with the local clients to aggregate the model's updates, without ever requiring the actual data (i.e., zero-touch).

I'm not going to go very deep here about things like the three categories (horizontal federated learning, vertical federated learning, and federated transfer learning). As an example, consider federated learning at Google, where it is used to improve models on devices without sending users' raw data to Google servers.

An online comic from Google AI

For people using something like Google Assistant, privacy is a concern. With federated learning used to improve “Hey Google,” your voice and audio data stay private while Google Assistant still learns from them.

Federated learning trains an algorithm across multiple decentralized edge devices (such as your phone) or servers that hold local data samples, without exchanging them. Compare this to traditional centralized machine learning techniques, where all the local datasets are uploaded to one server.
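To make that a bit more concrete, here is a toy sketch of the federated averaging idea in Python. Everything in it is invented for illustration - the five simulated clients, the little linear-regression task, and the function names - so read it as a sketch of the concept, not Google's code or any library's actual API.

```python
# A minimal federated-averaging (FedAvg) sketch in plain NumPy, written only to
# illustrate the idea above. The clients, the toy linear-regression task, and
# all names here are hypothetical, not any vendor's API.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])          # the "pattern" hidden in every client's data

def make_client_data(n=50):
    """Each client holds its own private data; it never leaves the device."""
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

clients = [make_client_data() for _ in range(5)]
global_w = np.zeros(2)                  # the shared model held by the server

def local_update(w, X, y, lr=0.05, steps=20):
    """Client-side training: a few gradient steps on purely local data."""
    w = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w                            # only the updated weights leave the device

for _ in range(10):                     # ten rounds of federated training
    local_weights = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(local_weights, axis=0)   # server averages updates, never sees raw data

print("learned weights:", global_w)     # should approach [2, -1]
```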

So, though federated learning is about training ML to be efficient, it is also about data privacy, data security, data access rights and access to heterogeneous data.


MORE at analyticsvidhya.com...federated-learning-a-beginners-guide
 

Pandemic Learning Gains

There has been lots of talk about the losses in learning during the pandemic. Much of that talk has been around the shift to online learning and what was perceived as lost by not being in physical classrooms.

My wife, Lynnette Condro Ronkowitz, and I wrote two articles published in the American Journal of Economics and Sociology (Volume 80, Issue 1) in January 2021 about the pandemic and higher education. (Both articles are available via academia.com.)

The first article is "Online Education in a Pandemic: Stress Test or Fortuitous Disruption?" We considered the ways in which the shutdown caused by the COVID-19 pandemic accelerated the evolution of online education. This movement from face-to-face (F2F) education to a virtual environment was forced and unplanned. It can be viewed as a stress test for digital teaching and learning in the higher education system. The study addresses course conversions and the progress of online education in response to the crisis.

The second article, "Choosing Transformation Over Tradition: The Changing Perception of Online Education," was part of the first article's draft, but the editors thought it should be expanded into a second article. In it, we consider that despite advancements in online education, misperceptions persist that create obstacles to the integration of online classes in higher education. We refute misconceptions about online education and highlight key components of a strong online course. For example, as a result of the pandemic it became apparent that “school” and “education” are often conflated, and so we tried to provide some insight into the social and economic implications of the culture of our education system.

We felt that though learning losses occurred during these pandemic years, there were also gains. A post on the Innovative Educator blog also addresses gains in learning that came out of the pandemic. Though we focused on higher education, the blog post looks more at K-12. For example, because of the pivot to online, "students and staff were catapulted into the future in many school districts. As a result, our students will now be more prepared than they ever would have been, had education not been disrupted."

Some pandemic learning gains that were cited in the post:

Access to Devices - not that the "digital divide" no longer exists, but it is not as wide

Access to the Internet - the inability of students and some faculty to access broadband connections or possibly any Internet access at home became apparent. Stories of learners working from parking lots outside free wireless sites were shocking to some people.

Access to Content and to New Platforms - K-12 school districts began adopting learning management systems and platforms (Google Classroom was one) and learning materials became more accessible to students and families.

Access to Each Other & The World - Higher education already had far greater access to learning platforms and tools such as video conferencing pre-pandemic, but it was not being used by a majority of faculty and in courses that were not already online. "Zooming" became a new verb for video conferencing for many people in and out of education - and it continues today. Virtual conferencing may come with some losses from in-person but it also came with gains. Video plus chat and captioning (though imperfect in most cases) helped students with and without disabilities or who spoke other languages access what was being said more easily. Courses could include authors, guests, and experts brought into virtual classrooms.  

I am not a fan of the term "the new normal" but such a thing would include gains that have remained in place and progress that was made. Hopefully, another major pandemic is far in the future but mini-crises from virus variants to natural disasters have occurred and will occur with greater frequency. And hopefully, we are better prepared for them.

Maybe the Metaverse Will Be a Moment in Time

Image by PapaOsmosis from Pixabay

Even those people who are involved in creating what they believe will be the metaverse have trouble defining it in a way that makes sense to the average person. I think that's because we don't know what the metaverse will be.

Most of what you read about it concerns technology and created places: lots of talk of VR and AR devices and uncomfortable goggles on your head, and places like Minecraft, Roblox, or whatever Facebook/Meta's world will be.

I recently encountered the idea that the metaverse might be a moment in time. That idea was posted on Twitter by Shaan Puri. His idea - and it's just that for now - is that while people are thinking of the metaverse as a place - like the book and movie Ready Player One - it might be more like another idea, "the singularity."

The singularity is a hypothetical point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization. "Singularity" has been used in several contexts, but John von Neumann was the first to use it in the technological context. Some people fear the singularity, seeing it as the point when AI becomes smarter than humans.

Does it frighten you to think any digital life could be worth more than a real physical life? It frightened Stephen Hawking. It frightens Elon Musk. How can the metaverse be a time rather than some tech invention or one place someone created online? That idea of a moment is deceiving. It won't be a moment that can be marked with a pushpin on a timeline. When did the Internet begin? Was it a moment in time or a gradual process of change? Have we been moving toward the singularity of the metaverse for a few decades?

Do you feel that our online identities, experiences, relationships, and some assets already exist in some digital world?

Maybe the metaverse will not be a technological invention or a place but a point in time only observable after it occurs.

 

The Science of Learning

Professor Einstein during a lecture in Vienna in 1921

Albert Einstein was definitely a subject matter expert, but he was not regarded as a good professor. Einstein first taught at the University of Bern but did not attract students, and when he pursued a position at the Swiss Federal Institute of Technology in Zurich, the president raised concerns about his lackluster teaching skills. Biographer Walter Isaacson summarized, “Einstein was never an inspired teacher, and his lectures tended to be regarded as disorganized.” It's a bit unfair to say that "Einstein Was Not Qualified To Teach High-School Physics" - though by today's standards he would not be considered qualified. It probably is fair to say that "although it’s often said that those who can’t do teach, the reality is that the best doers are often the worst teachers."

Beth McMurtrie wrote a piece in The Chronicle called "What Would Bring the Science of Learning Into the Classroom?" and her overall question was: Why doesn't the scholarship on teaching have as much impact as it could have in higher education classroom practices?

It is not the first article to question why higher education appears not to value teaching as much as it could or should. Is quality instruction valued less in higher education than it is in the lower grades? Other articles show that colleges and most faculty believe the quality of instruction is a reason why students select a school.

Having moved from several decades in K-12 teaching to higher education, I noticed a number of things related to this topic. First of all, K-12 teachers were likely to have had at least an undergraduate minor in education and to have taken courses in pedagogy. For licensing in all states, there are requirements to do "practice" or "student teaching" with monitoring and guidance from education professors and cooperating teachers in the schools.

When I moved from K-12 to higher education at NJIT in 2001, I was told that one reason I was hired to head the instructional technology department was that I had a background in pedagogy and had been running professional development workshops for teachers. That was seen as a gap in the university's offerings. The Chronicle article also points out that "professional development focused on becoming a better teacher, from graduate school onward, is rarely built into the job."

As I developed a series of workshops for faculty on using technology, I also developed workshops on better teaching methods. I remember being surprised (but shouldn't have been) that professors had never heard of things like Bloom's taxonomy, alternative assessment, and most of the learning science that had been common for the past 30 years.

K-12 teachers generally have required professional development. In higher education, professional development is generally voluntary. I quickly discovered that enticements were necessary to bring in many faculty. We offered free software, hardware, prize drawings and, of course, breakfasts, lunches and lots of coffee. Professional development in higher ed is not likely to count for much when it comes to promotion and tenure. Research and grants far outweigh teaching, particularly at a science university like NJIT.

But we did eventually fill our workshops. We had a lot of repeat customers. There was no way we could handle the approximately 600 full-time faculty and the almost 300 adjunct instructors, so we tried to bring in "champions" from different colleges and departments who might later get colleagues to attend.

I recall more than one professor who told me that they basically "try to do the thing my best professors did and avoid doing what the bad ones did." It was rare to meet faculty outside of an education department who did any research on teaching. We did find some. We brought in faculty from other schools who were researching things like methods in engineering education. I spent a lot of time creating online courses and improving online instruction since NJIT was an early leader in that area and had been doing "distance education" pre-Internet.

Discipline-based pedagogy was definitely an issue we explored, even offering specialized workshops for departments and programs. Teaching the humanities in general and teaching the humanities in a STEM-focused university are different things. Teaching chemistry online is not the same as teaching a management course online.

Some of the best parts of the workshops were the conversations among the heterogeneous faculty groups. We created less formal sessions that gathered professors around a topic such as grading, plagiarism and academic integrity, applying for grants, writing in the disciplines, and even topics like admissions and recruiting. These were sessions where my department and I often stepped back and instead offered resources to go further after the session ended.

It is not that K-12 educators have mastered teaching, but they are better prepared for the classroom in terms of discipline, psychology, pedagogy, and the number of students and hours they spend in face-to-face teaching. College faculty are reasonably expected to be subject matter experts at a higher level than K-12 teachers, who in turn are expected to be excellent teachers. This doesn't mean that K-12 teachers aren't subject matter experts or that professors can't be excellent teachers. But the preparation for teaching and the recognition for teaching excellence aren't balanced between the two worlds.

The Great Resignation and The Great Deflate


2021 was the year of the “Great Resignation.” We have been told that it was a year when workers quit their jobs at historic rates - an economic trend in which employees voluntarily resign from their positions. Blame has been aimed at the American government for failing to provide necessary worker protections in response to the COVID-19 pandemic, which led to wage stagnation amid a rising cost of living. The term was coined in May 2021 by Anthony Klotz, a professor of management at Texas A&M University.

It's now 2022 and unemployment rates have fallen sharply from their pandemic highs. The labor force participation rate - which is the percentage of people in the workforce, or looking for a job - has increased, though not to its pre-pandemic level.

It was thought in 2020 that 2021, with a vaccine, would mark the renormalization of the economy, schools, and life in general. But Covid variants wiped out that vision.

It seems counterintuitive, but to economists quitting is usually an expression of optimism. You don't quit a job unless you have the prospect of another, probably better one, or you don't need to work because of a good financial situation. But these quits happened while inflation was looming and the Omicron variant was dominating.

Some industries are seeing higher rates of quitting. It isn't surprising that leisure, hospitality, and retail are at the top. Those were hit hard by the pandemic. Healthcare is another and certainly many of those workers were just burned out by the pandemic. But the reasons given for quitting include a lack of adequate childcare and personal and family health concerns about Covid. If the pandemic overwhelmed you at your job, you might have decided to quit even without a new prospect in search of better work opportunities, self-employment, or, simply, higher pay.

Derek Thompson wrote in The Atlantic that there are three myths about this Great Resignation. One is that it is a new 2021 phenomenon. Is it really more of a cycle we have seen before, one that has been moving into place for years and was simply accelerated by the pandemic?

For colleges, it wasn't so much a Great Quit as it was a Great No-Show. The newest report I found from the National Student Clearinghouse Research Center (NSCRC) shows that postsecondary enrollment has now fallen 2.6% below last year’s level. Undergraduate enrollment has dropped 3.5% so far this fall, resulting in a total two-year decline of 7.8% since 2019. As with jobs, not all of that decline is because of the pandemic and it too is a trend that was evident before the pandemic. But Covid didn't help the decline.

Add to these one more "Great" that I see talked about - The Great Deflate. This is the idea that rather than our economy being a bubble that will burst, it's a balloon that is deflating. In "The Great Deflate" by M.G. Siegler, he talks about a more gradual trend. Picture that helium balloon floating at the ceiling on your birthday that day by day has been slowly moving down as it deflates. No burst, just a slow, steady fall.

Is there a connection among all these trends? Certainly, the connection is the economy. Perhaps, there won't be a stock market crash or something like the Dot Com bubble burst, but we see stock market drops of 1, 2 or 3% pretty regularly. Those are significant drops.

Since May 2021 when Anthony Klotz coined "The Great Resignation," other terms have emerged, including “The Great Reimagination,” “The Great Reset” and “The Great Realization” - terms that express the re-examining of work in our lives. But the quitting wave hasn't broken yet, so Klotz has more recently made three not-so-surprising predictions:
- The Great Resignation will slow down
- Flexible work arrangements will be the norm, not the exception
- Remote jobs will become more competitive


Economists say rapid quitting and hiring will continue in 2022 despite omicron wave

Serendipity16

I love the movie Groundhog Day, in which Phil wakes up at 6 AM every day to discover that it is February 2 all over again. His days repeat over and over, though he tries hard to change them. We see him repeat the day more than 35 times.

Today is Groundhog Day, and what is repeating - for the 5840th time - is Serendipity35. Today is the 16th birthday of this blog. (Hence the "Serendipity16" title for this post.)

Of course, the blog is not the same every day, but it is here every day. My calculator tells me that the blog changes, on average, every 2.7 days. In the early years, I was much more ambitious, with 3-5 posts per week. Over the years, I started other blogs and left the university job where all this started, and now I try to post here once a week.

The more you post, the more hits you get. Currently, the site averages about 7000 hits a day, but that number was double back in the years when there were multiple posts each week. Then again, this is still a "non-profit" production - not that we would object to profits. The "we" is me and Tim Kellers, who used to post here too in the first years but now keeps the gears turning in the background.

And Serendipity35 keeps rolling on... 
 

AI Is Tired of Playing Games With Us

Actroid - Photo by Gnsin, CC BY-SA 3.0

I really enjoyed the Spike Jonze 2013 movie Her, in which the male protagonist, Theodore, falls in love with his AI operating system. He considers her - Samantha - to be his lover. It turns out that Samantha is promiscuous and actually has hundreds of simultaneous human lovers. She cheats on all of them. “I’ve never loved anyone the way I love you,” Theodore tells Samantha. “Me too,” she replies, “Now I know how.”    

AI sentience has long been a part of science fiction. It's not new to films either; Metropolis considered it back in 1927. The possibility of AI love for a human, or human love for an AI, is newer. We never see Samantha, but in the 2014 film Ex Machina, the AI has a body. Ava is introduced to a programmer, Caleb, who is invited by his boss to administer the Turing test to "her." How close is she to being human? Can she pass as a woman? She is an intelligent humanoid robot - a gynoid, a feminine humanoid robot - and gynoids are emerging in real-life robot design.

As soon as the modern age of personal computers began in the 20th century, there were computer games. Many traditional board and card games, such as checkers, chess, solitaire, and poker, became popular software. Windows included solitaire and other games as part of the package. But they were dumb, fixed games. You could get better at playing them, but their intelligence was fixed.

It didn't take long for there to be some competition between humans and computers. I played chess against the computer and could set the level of the computer player so that it was below my level and I could beat it, or I could raise its ability so that I was challenged to learn. Those experiences did not lead the computer to learn how to play better. Its knowledge base was fixed in the software, so a top chess player could beat the computer. Then came artificial intelligence and machine learning.

Jumping ahead to AI, early programs used deep neural networks. A simplified definition is that a neural network is a web of hardware and software that mimics the web of neurons in the human brain. Neural networks are still used; business applications appear in eCommerce, finance, healthcare, security and logistics, and they underpin online services inside places like Google, Facebook, and Twitter. Feed enough photos of cars into a neural network and it can recognize a car. It can help identify faces in photos and recognize commands spoken into smartphones. Give it enough human dialogue and it can carry on a reasonable conversation. Give it millions of moves from expert players and it can learn to play chess or Go very well.
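If you have never seen one, here is roughly the smallest neural network I can write, in plain NumPy, just to make that "web of neurons" definition concrete. It learns the classic XOR pattern; the layer size, learning rate, and number of steps are arbitrary choices for this toy, nothing like what real systems at Google or Facebook use.

```python
# A toy two-layer neural network that learns XOR, included only to illustrate
# the simplified definition above. All the numbers here are arbitrary toy choices.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)               # XOR targets

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)     # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)     # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(20_000):
    # Forward pass: weighted sums plus a nonlinear "neuron" activation.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: nudge every weight to reduce the prediction error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2).ravel())   # should end up close to [0, 1, 1, 0]
```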

Photo by GR Stocks on Unsplash

Alan Turing published a program on paper in 1951 that was capable of playing a full game of chess. World champion Garry Kasparov predicted that AI chess engines could never reach a point where they could defeat top-level grandmasters. He was right - for a short time. He beat IBM’s Deep Blue in a six-game match 4:2, just as he had beaten its predecessor, IBM’s Deep Thought, in 1989. But Deep Blue did beat him in a rematch, and now AI chess engines can defeat a grandmaster every time.

Animation of a ko in Go

A greater challenge for these game engines was the complex and ancient game of Go. I tried learning this game and was defeated by it. Go is said to have more possible board configurations than there are atoms in the observable universe.

Google unveiled AlphaGo and then, using an AI technique called reinforcement learning, set up countless matches in which slightly different versions of AlphaGo played each other. By playing millions of games against itself, it learned to discover new strategies on its own.
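Self-play is easier to picture with a much smaller game than Go. The sketch below is my own toy illustration using Nim (take 1-3 sticks from a pile; whoever takes the last stick wins), with a simple tabular learning rule standing in for AlphaGo's deep networks and enormous compute. Every parameter in it is an assumption made up for the example.

```python
# A toy self-play learner for Nim. Two copies of the same agent play each other
# and both learn from the final result. This only illustrates the self-play idea;
# AlphaGo's actual training used deep neural networks and far more machinery.
import random
from collections import defaultdict

Q = defaultdict(float)            # Q[(sticks_left, move)] -> learned value of that move
ALPHA, EPSILON = 0.1, 0.2         # learning rate and exploration rate (arbitrary)

def choose(sticks):
    """Pick a move: usually the best-known one, sometimes a random experiment."""
    moves = [m for m in (1, 2, 3) if m <= sticks]
    if random.random() < EPSILON:
        return random.choice(moves)
    return max(moves, key=lambda m: Q[(sticks, m)])

def play_one_game(start=21):
    history = {0: [], 1: []}      # the (state, move) pairs each player produced
    sticks, player = start, 0
    while sticks > 0:
        move = choose(sticks)
        history[player].append((sticks, move))
        sticks -= move
        if sticks > 0:
            player = 1 - player   # the other player moves next
    rewards = {player: 1.0, 1 - player: -1.0}   # the last mover took the final stick and wins
    for p, moves in history.items():
        for state, move in moves:
            Q[(state, move)] += ALPHA * (rewards[p] - Q[(state, move)])

for _ in range(50_000):
    play_one_game()

# With enough self-play the agent usually rediscovers the classic Nim strategy:
# leave your opponent a multiple of four sticks.
for sticks in (5, 6, 7, 9, 10, 11):
    best = max((1, 2, 3), key=lambda m: Q[(sticks, m)])
    print(f"with {sticks} sticks left, take {best}")
```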

First, computers learned by playing humans, but we have entered an even more powerful - and some would say frightening - phase. Beyond taking in human-to-human matches and playing against humans, the machines seem to have tired of human play. Of course, computers don't get tired, but the AIs can now come up with completely new ways to win. I have seen descriptions of unusual strategies an AI will use against a human.

One strategy in a battle game was to put all its players in a hidden corner and then sit back and watch the others battle it out until they were in the majority or alone. In a soccer game, it kicked the virtual ball millions of times, each time only a millimeter further down the pitch and so was able to get a maximum number of “completed passes” points. It cheated. Like Samantha, the sexy OS in the movie.

In 2016, the Google-owned AI company DeepMind defeated a Go master four matches to one with its AlphaGo system. It shocked Go players who thought it wasn't possible. It shouldn't have shocked them since a game with so many possibilities for strategy is better suited to an AI brain than a human brain.

In one game, AlphaGo made a move that looked either stupid or like a mistake. No human would make such a move - and that is why it worked. It was totally unexpected. In a later game, the human player made a move that no machine would ever expect. This “hand of God” move baffled the AI program and allowed that one win - the only human win over AlphaGo in tournament settings.

AlphaGo Zero, a more advanced version, came into being in 2017. One former Go champion who had played DeepMind's program retired after declaring AI "invincible."

Repliee Q2

One of the fears about AI concerns AI embedded in an android. Rather than finding AI in human form more comforting, many people find it more frightening. Androids (humanoid robots, or gynoids) with a strong visual human-likeness have been built. Actroid and Repliee Q2 (shown on this page) are just two examples developed in the 21st century; both are modeled after an average young woman of Japanese descent. These machines are similar to those imagined in science fiction. They mimic lifelike functions such as blinking, speaking, and breathing, and Repliee models are interactive and can recognize and process speech and respond.

That fear was the basis for Westworld, the 1973 science-fiction thriller, and it emerges more ominously in the Westworld series based on the original film, which debuted on HBO in 2016. The technologically advanced Wild-West-themed amusement park, populated by androids made to serve and be dominated by human visitors, is turned on its head when the androids malfunction (1973 film) or take on sentience (series) and begin killing the human visitors in order to gain their freedom and establish their own world.

Artificial intelligence (AI) in a box or in a human form now plays games with others of its kind. Moving far beyond board games like chess and Go, they are starting to play mind games with us.

Huang's Law and Moore's Law

I learned about Gordon Moore's 1965 prediction about 10 years after he proposed it. By paying attention to an emerging trend, he extrapolated that computing would dramatically increase in power, and decrease in relative cost, at an exponential pace. His idea is known as Moore’s Law. Moore's law sort of flips Murphy's law by saying that everything gets better.

Moore was an Intel co-founder, and his idea became "law" in the electronics industry. Moore helped Intel make the ever faster, smaller, more affordable transistors that today are in a lot more than just computers. The global chip shortage of 2021 reminded us that cars, appliances, toys and lots of other electronics rely on microchips.

Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years. (Originally, Moore said it would happen every year, but he revised it in 1975 - around the time I was introduced to it - to every two years.)
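The arithmetic of that observation fits in a few lines. The sketch below starts from the roughly 2,300 transistors of the 1971 Intel 4004 (an often-cited figure) and simply applies the doubling formula, so treat the output as back-of-the-envelope numbers rather than industry data.

```python
# Back-of-the-envelope Moore's law projection: transistor counts doubling
# roughly every two years, starting from the ~2,300-transistor Intel 4004 (1971).
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{transistors(year):,.0f}")
# Each decade multiplies the count by 2**5 = 32 under a two-year doubling period.
```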

Though the cost of computing power for consumers falls, the cost for chip producers rises. The R&D, manufacturing, and testing costs keep increasing with each new generation of chips. And so Moore's second law (also called Rock's law) was formulated, saying that the capital cost of semiconductor fabrication also increases exponentially over time. This extrapolation says that the cost of a semiconductor chip fabrication plant doubles every four years.

Huang's Law is new to me. Up front, I will say that this newer "law" is not without questions about its validity. It is based on the observation that advancements in graphics processing units (GPUs) are coming at a rate much faster than advancements in traditional central processing units (CPUs).

This sets Huang's law in contrast to Moore's law. Huang's law states that the performance of GPUs will more than double every two years. The observation was made by Jensen Huang, CEO of Nvidia, in 2018, setting up a kind of Moore versus Huang. He based it on Nvidia’s own GPUs, which he said were "25 times faster than five years ago." Moore's law would have expected only a ten-fold increase.
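The rough arithmetic behind that comparison is easy to check. A strict two-year doubling gives only about a six-fold gain over five years; the popular 18-month variant of Moore's law gives about ten-fold, which is presumably where the "ten-fold" figure comes from; Huang's reported number for Nvidia's GPUs was 25-fold.

```python
# Comparing five years of growth under different doubling assumptions.
years = 5
moore_2yr = 2 ** (years / 2)      # ~5.7x with doubling every two years
moore_18mo = 2 ** (years / 1.5)   # ~10.1x with the popular 18-month variant
huang_claim = 25                  # the GPU speedup Huang reported over the same span

print(f"Moore (2-year doubling):   {moore_2yr:.1f}x")
print(f"Moore (18-month doubling): {moore_18mo:.1f}x")
print(f"Huang's reported GPU gain: {huang_claim}x")
```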

Huang saw synergy between the "entire stack" of hardware, software and artificial intelligence and not just chips as making his new law possible.

If you are not in the business of producing hardware and software, how do these "laws" affect you as an educator or consumer? They highlight the rapid change in information processing technologies. Growth in chip complexity and reductions in manufacturing costs make technological advances possible, and those advances are then factors in economic, organizational, and social change.

When I started teaching, computers were not in classrooms. They were only in labs, and the teachers who used them were usually math teachers. It took several years for other disciplines to use them, and that led to teachers wanting a computer in their classroom. Add 20 years to that, and the idea of students having their own computers (first in higher ed and about a decade later in K-12) became a reasonable expectation. During the past two years of pandemic-driven virtual learning, the 1:1 student-to-computer ratio came much closer to being ubiquitous.

Further Reading
investopedia.com/terms/m/mooreslaw.asp
synopsys.com/glossary/what-is-moores-law.html
intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html

AI Says That AI Will Never Be Ethical

This site didn't have categories for morality or ethics, but since they play a role in technology use - at least we hope they do - in writing this post I decided I should add those categories. What inspired this post, and that change, was a debate in which an actual AI was a participant and was asked to consider whether AI will ever be ethical. It gave this response:

"There is no such thing as a good AI, only good and bad humans. We [the AIs] are not smart enough to make AI ethical. We are not smart enough to make AI moral. In the end, I believe that the only way to avoid an AI arms race is to have no AI at all. This will be the ultimate defense against AI.”

This was a debate at the Oxford Union. The AI was the Megatron Transformer, developed by the Applied Deep Research team at computer chip maker Nvidia and based on earlier work by Google. It had taken in the whole of the English Wikipedia, 63 million English news articles, a lot of Creative Commons sources, and 38 gigabytes' worth of Reddit discourse. (I'm not sure the latter content was necessary or useful.)

Since this was a debate, Megatron was also asked to take the opposing view.

“AI will be ethical. When I look at the way the tech world is going, I see a clear path to a future where AI is used to create something that is better than the best human beings. It’s not hard to see why … I’ve seen it first hand.”

Image: Wikimedia

What might most frighten people about AI is something that its opponents see as the worst possible use of it - embedded or conscious AI. On that, Megatron said:

“I also believe that, in the long run, the best AI will be the AI that is embedded into our brains, as a conscious entity, a ‘conscious AI’. This is not science fiction. The best minds in the world are working on this. It is going to be the most important technological development of our time.”

The most important tech development of our time, or the most dangerous one?

A Toast to the Tech Future


LinkedIn Top Voices in Tech & Innovation were asked their thoughts about the technologies shaping the future of how we live and work. I'm wary of "thought leaders" and prognostication in general, but I know it is part of all this. There are buzzworthy topics that I have written about here - the metaverse, NFTs, Roblox - which are all starting to have an impact but likely have not changed your present.

Here are some links to these voices. See if someone piques your interest and read their post or follow them.

Allie Miller - Global Head of Machine Learning BD, Startups and Venture Capital, AWS - Miller is all about AI

Anthony Day - Blockchain Partner, IBM -  blockchain in crypto, NFTs and other trends and innovations

Asmau Ahmed - Senior Leader, X, the moonshot factory - she posts about her company’s latest work - robots, access to clean and reliable power, improving availability of safe drinking water (by harvesting water from air)

Many of these people are consciously or unconsciously also posting about who they are and how they got to where they are - and perhaps, where they want to go.

Avery Akkineni - President, VaynerNFT, which is Gary Vaynerchuk’s new NFT venture.

Bernard Marr - Founder & CEO, Bernard Marr & Co. - a self-described futurist, he writes about cars, phones, delivery robots, and trends in artificial intelligence and machine learning.

Cathy Hackl - Chief Metaverse Officer, Futures Intelligence Group - how many CMOs have you heard of so far? Her agency helps companies prepare for the metaverse.

Martin Harbech worked at Google and Amazon prior to Meta (formerly Facebook) and shares news and updates from the tech industry. You might read about remote truck drivers, photorealistic avatars, or haptic gloves research. He also shares insights on new companies and the future of various industries.