Too Much Artificial Intelligence

You may feel like there is too much about artificial intelligence all around you. It's in the news, in our conversations and, in education and industry, in the tools we use. It has also dominated this blog over the past year. Too much so, I'm beginning to think.

I know that AI is a big topic currently and it can't be ignored. But it is just one of many topics this blog has addressed since 2006. I will make a more conscious effort this summer to address some non-AI issues, acknowledging that it seems to permeate almost every other category.


The AI-augmented Educational Support Professional

[Image: an AI assistant. Working with AI on instructional design isn't at this point. Yet.]

You can find numerous articles online about how artificial intelligence (AI) tools and activities can create the "AI-augmented professor." I have seen fewer opinions on how non-teaching staff members who support the learning process at most universities will be affected.

Certainly, AI augmentation will also affect those who support faculty and students, such as instructional designers, researchers, administrators, and other non-teaching professionals.

I read a piece on insidehighered.com about this group and keyed in on instructional designers since that was my area. The article uses an awkward term, BYOAI (Bring Your Own AI), for this group because of the hybrid nature of their home and office work. Of course, many faculty are also hybrid now, teaching in a campus classroom as well as from their home or office online.

Instructional designers are already using generative AI tools to create graphics, images, and audio segments for classes. New tools, such as OpenAI’s Sora, can generate full-motion videos from prompts. Chatbots can write or revise content, and they can produce a rough draft syllabus for a new course.
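
To make that last point concrete, here is a minimal sketch of asking a chat model for a draft syllabus programmatically. It assumes the OpenAI Python library and an API key in the environment; the model name, course topic, and prompt wording are placeholders of mine, not anything from the article.

```python
# A minimal sketch: asking a chat model for a rough draft syllabus.
# Assumes the OpenAI Python library (pip install openai) and an
# OPENAI_API_KEY environment variable. Model and topic are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # any chat-capable model would work here
    messages=[
        {"role": "system",
         "content": "You are an instructional designer at a university."},
        {"role": "user",
         "content": "Draft a 14-week syllabus for an introductory course on "
                    "media literacy, with weekly topics and one assignment per week."},
    ],
)

print(response.choices[0].message.content)  # a rough draft, not a finished syllabus
```

Whatever comes back still needs a designer's judgment; the point is the rough draft, not a finished product.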

This should also concern people coming into higher ed (and secondary education too, though perhaps to a lesser degree currently). Citing a survey by Microsoft and LinkedIn, ZDNet reported that “AI skills are so much of a priority that the report suggests 66 percent of business leaders wouldn’t hire someone without AI skills, and 71 percent of leaders would prefer to hire a less experienced candidate with AI skills than a more experienced candidate without them.”

Keeping Updated With Chat AI Free and Premium

If you ask the free version of ChatGPT (which runs version 3.5) about a film or event that happened in the past year, you may not get a very good answer, or perhaps no answer at all. Version 3.5 was last updated with training data from January 2022. To get version 4, you need to pay for the Plus level at $20 per month ($25 for the Team version).

* NOTE: OpenAI has since upgraded free ChatGPT accounts, opening up many of the features previously reserved for paying customers, including image and document analysis, data analytics, and custom GPT chatbots.
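
If you script against these models rather than typing into the chat window, the version difference is just the model parameter in the API call; the knowledge cutoff travels with whichever model you name. A small sketch, assuming the OpenAI Python library; the model names simply reflect the two tiers discussed above.

```python
# Sketch: the free-vs-paid distinction is just the model name in the API call.
# Assumes the OpenAI Python library and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def ask(model: str, question: str) -> str:
    """Send the same question to a named model and return its reply."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

question = "What films released in 2024 are possible Academy Award nominees?"

print(ask("gpt-3.5-turbo", question))  # free-tier model, early-2022 training data
print(ask("gpt-4", question))          # paid-tier model, with a later cutoff
```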

Recently, I asked 3.5, "What films released in 2024 are possible Academy Award nominees?" The answer was generic:
"Predicting specific Academy Award nominees for a year as far out as 2024 is quite speculative since it depends on numerous factors such as the quality of the films, critical reception, box office success, and industry trends at the time of release. However, some potential contenders could include films from established directors or those with strong track records in awards season, as well as films that tackle socially relevant themes or feature standout performances. It's also worth considering that films released towards the end of the year tend to be fresher in the minds of voters during awards season. Without specific knowledge of films released in 2024, it's challenging to pinpoint exact nominees, but historical trends and industry buzz could provide some hints closer to the time."

What about Microsoft Copilot? This chatbot was developed by Microsoft and launched in February 2023 as Microsoft's primary replacement for the discontinued Cortana. At first, it was called Bing Chat and was a built-in feature of Microsoft Bing and Microsoft Edge. Microsoft later began to unify the Copilot branding across its various chatbot products and planned to integrate Copilot into Windows so that you could use it directly from the taskbar.

Here is a sample prompt and response from Bing Chat last year (via Wikipedia):

[Image: a sample Bing Chat prompt and response]

I asked Copilot today, "What was Carl Jung's concept of shadow self?" The answer was extensive (excerpted below):

"Certainly! Carl Jung, the Swiss psychiatrist and founder of analytical psychology, introduced the concept of the “shadow self.” Let’s delve into what this intriguing idea entails:

    Definition of the Shadow Self:
        The shadow refers to those aspects of ourselves that we reject or deny. These are the darker, less socially acceptable pieces of our personality that don’t align with the persona we present to the world.
        It encompasses traits, emotions, and characteristics that are often viewed as negative, such as anger, jealousy, and greed.

    Origin and Nature:
        Jung believed that the shadow is an integral part of our psyche, residing in our unconscious mind...

    Encountering the Shadow:
        When encountering the shadow, we experience a range of emotions: fear, disgust, remorse, guilt, compassion, grief, and humility.
        The shadow has the power to possess and overwhelm us, even altering our behavior (“He is not himself today”).

In summary, the shadow self represents the hidden, often rejected parts of our personality. Embracing and integrating these aspects is essential for personal growth and individuation."

 

Tay: A Cautionary Tale of AI

Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It "has had a great influence on how Microsoft is approaching AI," according to Satya Nadella, the CEO of Microsoft.

Tay caused almost immediate controversy when the bot began to post inflammatory and offensive tweets through its Twitter account, leading Microsoft to shut down the service only 16 hours after its launch. According to Microsoft, this was caused by trolls who "attacked" the service, since the bot made replies based on its interactions with people on Twitter - a dangerous proposition.

It was named "Tay" as an acronym for "thinking about you." It was said to be similar to, or based on, Xiaoice, a Microsoft project in China that Ars Technica reported had "more than 40 million conversations apparently without major incident."

Interestingly, Tay was designed to mimic the language patterns of a 19-year-old American girl and was presented as "The AI with zero chill."

It was quickly abused when Twitter users began tweeting politically incorrect phrases, teaching it inflammatory messages, so the bot began releasing racist and sexually charged messages in response to other Twitter users.

One artificial intelligence researcher, Roman Yampolskiy, commented that Tay's misbehavior was understandable because it mimicked the deliberately offensive behavior of other Twitter users, and Microsoft had not given the bot an understanding of inappropriate behavior. He compared the issue to IBM's Watson, which began to use profanity after reading entries from the website Urban Dictionary.
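
To make Yampolskiy's point concrete, here is a toy sketch of the kind of output filter Tay evidently lacked: checking each candidate reply against a blocklist before posting it. This is my illustration, not Microsoft's actual design; real moderation systems use trained classifiers rather than a simple word list.

```python
# A toy output filter: withhold any candidate reply containing blocked terms.
# Illustrative only: the blocklist entries and function names are hypothetical,
# and real moderation relies on trained classifiers, not word matching.
import string

BLOCKLIST = {"exampleslur", "exampleprofanity"}  # placeholders for a curated list

def is_safe(reply: str) -> bool:
    """Return True if no word in the reply appears on the blocklist."""
    words = {word.strip(string.punctuation).lower() for word in reply.split()}
    return not (words & BLOCKLIST)

def post_reply(reply: str) -> None:
    """Post a reply only if it passes the safety check."""
    if is_safe(reply):
        print(f"POSTED: {reply}")
    else:
        print("WITHHELD: reply failed the moderation check")

post_reply("Nice to meet you!")             # posted
post_reply("You are such an exampleslur.")  # withheld
```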

It was popular in its short life. Within 16 hours of its release, Tay had tweeted more than 96,000 times. That is when Microsoft suspended the account for "adjustments." Microsoft confirmed that Tay had been taken offline, released an apology on its official blog, and said it would "look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values."

Then, on March 30, 2016, Microsoft accidentally re-released the bot on Twitter while testing it. Given its freedom, Tay released some drug-related tweets and then became stuck in a repetitive loop of tweeting "You are too fast, please take a rest" several times a second. The posts appeared in the feeds of more than 200,000 Twitter followers.

Tay has become a cautionary tale about the responsibility creators bear for their AI.

In December 2016, Microsoft released Tay's successor, a chatbot named Zo, which was an English version of Microsoft's other successful chatbots, Xiaoice (China) and Rinna (Japan).