I Am In a Strange Loop

"The Treachery of Images" by René Magritte says "This is not a pipe." A strange loop.

I got a copy of Douglas Hofstadter's book, Gödel, Escher, Bach: An Eternal Golden Braid, when I started working at NJIT in 2000. It was my lunch reading, taken in almost-daily spurts, and I often had to reread passages because it is not light reading.

It was published in 1979 and won the 1980 Pulitzer Prize for general non-fiction. It is said to have inspired many a student to pursue computer science, though it's not really a CS book. Its cover further describes it as a "metaphorical fugue on minds and machines in the spirit of Lewis Carroll." In the book itself, Hofstadter says, "I realized that to me, Gödel and Escher and Bach were only shadows cast in different directions by some central solid essence. I tried to reconstruct the central object, and came up with this book."

I had not finished the book when I left NJIT and it went on a shelf at home. This summer I was trying to thin out my too-many books and I came upon it again with its bookmarker glowering at me from just past the halfway point in the pages. So, I went back to reading it. Still, tough going, though very interesting.

I remembered writing a post here about the book (it turned out to be from 2007) when I came upon a new book by Hofstadter titled I Am a Strange Loop. That "strange loop" was something he originally proposed in the 1979 book. This post is a rewrite and update on that older post.

The earlier book is a meditation on human thought and creativity, mixing the music of Bach, the artwork of Escher, and the mathematics of Gödel. In the late 1970s, when he was writing it, interest in computers was high and artificial intelligence (AI) was still more of an idea than a reality. Reading Gödel, Escher, Bach exposed me to some abstruse math (undecidability, recursion, and those strange loops). Each chapter also has a dialogue between the Tortoise and Achilles and other characters to dramatize the concepts; this is where Lewis Carroll's "What the Tortoise Said to Achilles" gets referenced, though some of you will say the pairing really goes back to Zeno's paradox of Achilles and the tortoise. Allusions to Bach's music and Escher's paradox-loving art are used as well, along with other mathematicians, artists, and thinkers. Gödel's Incompleteness Theorem serves as his central example for describing the unique properties of minds.

That newer book, I Am a Strange Loop, focuses on the "strange loop" idea he first proposed in 1979. I haven't read it, but since I made it through the earlier volume (albeit over 18 years), I may give Strange Loop a try.

From what I have read about the author, he was disappointed with how Gödel, Escher, Bach (GEB) was received. It certainly got good reviews, and a Pulitzer Prize, but he felt that readers and reviewers missed what he saw as the central theme. I have an older edition, but in the 20th-anniversary edition he added that the theme was "a very personal attempt to say how it is that animate beings can come out of inanimate matter. What is a self, and how can a self come out of stuff that is as selfless as a stone or a puddle?"

I Am a Strange Loop focuses on that theme. In both books, he addresses "self-referential systems." (see link at bottom)

One thing that stuck with me from my first attempt at GEB is his use of "meta," which he defines as meaning "about." Some people might say that it means "containing." Back in the early part of this century, I thought about that when I first began using Moodle as a learning management system. When you set up a new course in Moodle (and in other LMSs since then), it asks if this is a "metacourse." In Moodle, that means a course that "automatically enrolls participants from other 'child' courses." Metacourses (AKA "master courses") share all or part of the same content, customized to the enrollments of the other sections.

This was a feature used in big courses like English or Chemistry 101. In my courses, I thought more about having things like meta-discussions or discussions about discussions. My metacourse might be a course about the course. Quite self-referential.

It can get loopy when you start saying that if we have a course x, the metacourse X could be a course for talking about course x but would not include course x within itself. Though I suppose it could.
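To make the loop concrete, here is a minimal sketch in Python (the class and course names are my own invention, not anything from Moodle) of a metacourse that discusses another course and then, in the truly loopy case, ends up discussing itself:

```python
# A toy sketch of "meta" as self-reference; nothing here is real Moodle code.

class Course:
    def __init__(self, title):
        self.title = title
        self.topics = []          # other courses this course talks about

    def discusses(self, other):
        self.topics.append(other)

    def is_strange_loop(self):
        # A course becomes a strange loop when, by following what it
        # discusses, you eventually arrive back at the course itself.
        seen, stack = set(), list(self.topics)
        while stack:
            course = stack.pop()
            if course is self:
                return True
            if id(course) not in seen:
                seen.add(id(course))
                stack.extend(course.topics)
        return False

english101 = Course("English 101")
meta = Course("A course about English 101")
meta.discusses(english101)
print(meta.is_strange_loop())   # False: the metacourse only points "downward"

meta.discusses(meta)            # now the metacourse includes itself as a topic
print(meta.is_strange_loop())   # True: we have gone loopy
```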

Have I lost you?

Certainly, metatags are quite common on web pages and photos, and for cataloging, categorizing, and characterizing content objects. Each post on Serendipity35 is tagged with one or more categories and a string of keyword tags that help readers find similar content and help search engines make the post searchable.

A brief Q&A with Hofstadter about the newer book, published in Wired in March 2007, says that the central question for him is "What am I?"

His examples of "strange loops" include Escher's piece, "Drawing Hands," which shows two hands drawing each other, and the sentence, "I am lying."

Hofstadter gets spiritual in his further thinking and finds at the core of each person a soul, which he describes as "an abstract pattern." Because he feels that soul is strong in mammals and weaker in insects, it brought him to vegetarianism.

He was long considered an AI researcher, but he now thinks of himself as a cognitive scientist.

Reconsidering GEB, he decides that another mistake in that book's approach may have been not seeing that the human mind and smarter machines are fundamentally different. He has less of an interest in computers now and claims that he always thought his writing would "resonate with people who love literature, art, and music" more than with tech people.

If it has taken me much longer to finish Gödel, Escher, Bach than it should have, that makes sense if we follow the strange loop of Hofstadter's Law: "It always takes longer than you expect, even when you take into account Hofstadter's Law."
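The Law is itself a tidy strange loop, since its correction refers back to the Law. Here is a throwaway sketch of how that self-reference defeats any finite estimate (the starting estimate, the correction factor, and the cutoff are all made up for illustration):

```python
# Hofstadter's Law as a (necessarily truncated) recursion: the corrected
# estimate must itself be corrected by the same law, so we stop somewhere.

def hofstadter_estimate(naive_weeks, corrections=10, factor=1.5):
    """Apply 'it always takes longer' to an estimate that has already
    applied 'it always takes longer', as many times as we can stand."""
    if corrections == 0:
        return naive_weeks
    return hofstadter_estimate(naive_weeks * factor, corrections - 1, factor)

print(hofstadter_estimate(4))   # a "4-week" task, ten corrections later: ~230 weeks
```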



End Note: 
A self-referential situation is one in which the forecasts made by the human agents involved serve to create the world they are trying to forecast (http://epress.anu.edu.au/cs/mobile_devices/ch04s03.html). Social systems are self-referential systems based on meaningful communication (http://www.n4bz.org/gst/gst12.htm).

Are You Prometheus or Zeus?

Prometheus Brings Fire by Heinrich Friedrich Füger, 1817

 

Do you know the myth of Prometheus and his argument with Zeus? I am reading Stephen Fry's retellings of the myths of Ancient Greece, Mythos and the companion volume Heroes, and he has suggested that we are approaching a similar moment in our own history.

I don't know if you can see yourself as Jason aboard the Argo questing for the Golden Fleece, or as Oedipus solving the riddle of the Sphinx. But I think we might divide ourselves into two groups by deciding which side we stand on when it comes to AI as "personified" by a robot with a human appearance and advanced artificial intelligence.

The myth that applies is the story of Prometheus and his argument with Zeus.

In Greek mythology, Prometheus, whose name means "forethought," is credited with the creation of man from clay and is also the one who defies Zeus by stealing fire and giving it to humanity.

To humans, his theft is heroic. Fire, perhaps our first technology, enabled progress, civilization and the human arts and sciences.

Prometheus believed that humans needed and deserved fire. Zeus did not.

In Hesiod's version of the story of Prometheus and the theft of fire, it is clear that Zeus withheld from humanity not only fire but also "the means of life." Zeus feared that if humans had fire and all that it would lead them to, they would no longer need the gods.

Fry writes that “The Titan Prometheus made human beings in clay. The spit of Zeus and the breath of Athena gave them life. But Zeus refused to allow us to have fire. And I think fire means both literal fire – to allow us to become Bronze Age man, to create weapons and to cook meat. To frighten the fierce animals and become the strongest, physically and technically. But also the internal fire of self-consciousness and creativity. The divine fire. Zeus did not want us to have it. And Prometheus stole fire from heaven and gave it to man.”

If we think about a modern Prometheus, perhaps we can make him into a scientist who has created a very powerful android.

It is fitting that the word "android" was coined from the Greek andr-, meaning "man" (male, as opposed to anthrōp-, meaning human being), and the suffix -oid, meaning "having the form or likeness of." (We use "android" to refer to any human-looking robot, but a robot with a female appearance can also be referred to as a "gynoid.")

Our Prometheus, the AI scientist, is ready to give his android to the world. But his boss, Mr. Zeus, is opposed. "What will happen when the androids become sapient?" Zeus asks. Sapience is the ability of an organism or entity to act with judgment. "And what if these androids also become sentient?" Zeus asks. Sentience is the capacity to feel, perceive, or experience subjectively.

Stephen Fry takes up that argument:

"In a hundred years time, we can guarantee there will be sapient beings on this earth that have been intelligently designed. You could call them robots, you could call them compounds of augmented biology and artificial intelligence, but they will exist. The future is enormous, it has never been more existentially transformative.

"Will the Prometheus who makes the first piece of really impressive robotic AI – like Frankenstein or the Prometheus back in the Greek myth – have the question: do we give it fire? Do we give these creatures self-knowledge, self-consciousness? An autonomy that is greater than any other machine has ever had and would be similar to ours? In other words: shall we be Zeus and deny them fire because we are afraid of them? Because they will destroy us? The Greeks, and the human beings, did destroy the gods. They no longer needed them. And it is very possible that we will create a race of sapient beings who will not need us."

So, are you like Prometheus wanting mankind to have these highly evolved robots? Or do you agree with Zeus that they will eventually destroy us?

 

Here is an excerpt concerning this idea from an interview Stephen Fry did in Holland.
(Full interview at https://dewerelddraaitdoor.bnnvara.nl/nieuws/de-twee-kanten-van-stephen-fry)

The AI of Job Search

"Human-centered design paired with human-centric AI is key to the future of work," says Sara Ortloff Khoury, Director of UX Design at Google Together. That is a group that considers the "critical user journeys of what people do every day." That often means looking at tasks that humans don’t want to do, but automation can do. That might mean that an invite in your email is added to your calendar automatically or a contact you email frequently is moved up in priority, and now it also has to do with looking for a job or looking for a new employee.

As part of "designing the future of work," Khoury's team developed three foundational human-centered design principles for enterprise AI:
1. enhance what people can achieve at work
2. anticipate what people need at work
3. reduce bias and increase opportunities

I keep reading predictions by technologists and educators that, by 2030, one-third of jobs will require skills that aren't common today or don't yet exist.

What the Google folks did was put job search directly on Google Search. That means a search will produce up-to-date job descriptions as well as information about companies, salaries, commute times, and more. I tested it by simply searching on "jobs blogger," and it returned 100 jobs with the ones nearest me at the top.

They have also made Cloud Talent Solution, which offers plug-and-play access to Google's AI and search capabilities so that large companies can find talent. (They report that job boards like CareerBuilder and employers like Johnson & Johnson already use it.)
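For the curious, here is a rough sketch of what a query against Cloud Talent Solution might look like with its Python client (google-cloud-talent). The project and tenant IDs are placeholders, and the exact request fields should be checked against Google's current documentation rather than taken from me:

```python
# A hedged sketch of a Cloud Talent Solution job search; the project,
# tenant, and query values below are placeholders, not a working setup.
from google.cloud import talent

client = talent.JobServiceClient()
parent = "projects/YOUR_PROJECT_ID/tenants/YOUR_TENANT_ID"

request = talent.SearchJobsRequest(
    parent=parent,
    # Metadata about the searcher helps the service tune its ranking.
    request_metadata=talent.RequestMetadata(
        user_id="visitor-1", session_id="session-1", domain="example.com"
    ),
    job_query=talent.JobQuery(query="blogger"),
)

response = client.search_jobs(request=request)
for match in response.matching_jobs:
    print(match.job.title)
```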

About a year ago, they also launched Hire, a recruiting app that integrates with G Suite and is aimed at small and medium-sized businesses.

New Jobs in an AI Machine Learning World

You can argue about the good and bad of AI, but there is no argument that artificial intelligence is here and it is affecting jobs. Elon Musk says he fears where AI will ultimately lead, yet he uses AI in his Tesla vehicles.

I keep hearing that AI will free humans of boring drudgery jobs and give us more free time. Then, we can do human work rather than machine work. I also hear that for all the jobs lost to AI there will be at least half as many new ones created.

The book Human + Machine: Reimagining Work in the Age of AI examines organizations that deploy AI systems, from machine learning to computer vision to deep learning. The authors found that AI systems are augmenting human capabilities and enabling people and machines to work collaboratively, changing the very nature of work and transforming businesses.

A full symbiosis between humans and machines is not here yet, but it is already being called the third wave of business transformation, and it centers on adaptive processes. (The first wave was standardized processes; the second was automated processes.)

Even when AI has advanced and humans and machines are symbiotic partners, humans will be needed. In this book, they identify three broad types of new jobs in the "missing middle" of the third wave. 

Trainers will be needed to teach AI systems how they should perform, helping natural-language processors and language translators make fewer errors and teaching AI algorithms how to mimic human behaviors.

Explainers will be needed to bridge the gap between technologists and business leaders, explaining the inner workings of complex algorithms to nontechnical professionals. 

The third category of jobs will involve sustainers, who will ensure that AI systems are operating as designed. They might be in roles such as context designers, AI safety engineers, and ethics compliance managers. For example, a safety engineer will try to anticipate and address the unintended consequences of an AI system, and an ethics compliance manager acts as an ombudsman for upholding generally accepted human values and morals.

And for education? These jobs will require new skills. The skills the authors describe all sound unfamiliar, as I suppose they should. Are we ready to teach Rehumanizing Time, Responsible Normalizing, Judgment Integration, Intelligent Interrogation, Bot-Based Empowerment, Holistic Melding, Reciprocal Apprenticing, and Relentless Reimagining?

Their labels may be unfamiliar, but the skills can also be seen as extensions or advancements of more familiar ones in a new context.

For example, "Judgment Integration" is needed when a machine is uncertain about what to do or lacks necessary business or ethical context in its reasoning model. A human ready to be smart about sensing where, how and when to step in means that human judgment will still be needed even in a reimagined process. 

Imagine an autonomous vehicle approaching, at high speed, a deer and a child in the road ahead. It needs to swerve, but it will also hit one of them in its avoidance maneuver. Which would it choose? The decision will not be based on how we feel about a child versus a wild animal, unless a human has been involved in the process earlier.


Read more in the book and in "What Are The New Jobs In A Human + Machine World?" by Paul R. Daugherty and H. James Wilson on forbes.com.