ELIZA and Chatbots

When I first encountered a chatterbot, it was ELIZA on the Tandy/Radio Shack computers in the first computer lab of the junior high school where I taught in the 1970s.

ELIZA is an early natural language processing program that came into being in the mid-1960s at the MIT Artificial Intelligence Laboratory. The original was by Joseph Weizenbaum, but there are many variations on it.

This was very early artificial intelligence. ELIZA is still out there, and I have seen a little spike in interest because she was featured in an episode of the TV show Young Sheldon. The episode, "A Computer, a Plastic Pony, and a Case of Beer," may still be available at www.cbs.com. Sheldon and his family become quite enamored of ELIZA, though the precocious Sheldon quickly realizes it is a very limited program.

ELIZA was created to demonstrate how superficial human-to-computer communication was at that time, but that didn't mean that when it was put on personal computers, humans didn't find it engaging. Sure, kids had fun trying to trick it or cursing at it, but after a while you gave up when it started repeating responses.

The program, in all the various forms I have seen, still uses a pattern matching and substitution methodology. She (as people often personified ELIZA) gives canned responses based on keywords in your input. If you say "Hello," she has a ready response. If you say "friend," she has several ways to respond depending on what other words you used. Early users felt they were talking to "someone" who understood their input.
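If you want to see how little machinery is involved, here is a toy version of that keyword matching and substitution in Python. The rules and wording are my own invented examples, not Weizenbaum's original script:

```python
import random
import re

# A toy ELIZA-style bot: keyword patterns mapped to canned responses.
# These rules are illustrative stand-ins, not the original DOCTOR script.
RULES = [
    (re.compile(r"\bhello\b", re.I),
     ["How do you do. Please state your problem."]),
    (re.compile(r"\bI need (.+)", re.I),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bfriend\b", re.I),
     ["Tell me more about your friends.", "Why do friends come to mind?"]),
]
# Fallbacks when no keyword matches - the repetition users eventually notice.
DEFAULT = ["Please go on.", "I see.", "Can you elaborate on that?"]

def respond(text):
    for pattern, responses in RULES:
        match = pattern.search(text)
        if match:
            # Substitute any captured phrase back into the canned reply.
            return random.choice(responses).format(*match.groups())
    return random.choice(DEFAULT)

print(respond("Hello there"))  # How do you do. Please state your problem.
print(respond("I need a vacation"))
```

Saying "I need a vacation" gets your own phrase echoed back ("Why do you need a vacation?"), which is most of the illusion of understanding.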

ELIZA was one of the first chatterbots (later clipped to "chatbot") and an early candidate for the Turing Test. That test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human is not one ELIZA can pass by today's standards. ELIZA fails very quickly if you ask her a few complex questions.

The program is limited by the scripts in its code. The more responses you give her, the more variety there will be in her answers. ELIZA was originally written in MAD-Slip, but modern versions are often in JavaScript or other languages. Many variations on the original scripts were made as amateur coders played around with the fairly simple code.

One variation, called DOCTOR, was made to be a crude Rogerian psychotherapist who likes to "reflect" questions back at the patient. This was the version my students found fascinating when I taught middle school, and my little programming club decided to hack the code and make their own versions.
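That "reflection" trick is mostly a word-for-word pronoun swap, which is why the code was simple enough for middle schoolers to hack. A minimal sketch (the swap table here is my own abbreviated one, not the original script's):

```python
# DOCTOR-style "reflection": swap pronouns so a patient's statement
# can be turned back at them. A hypothetical minimal table.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "i", "your": "my", "are": "am",
}

def reflect(text):
    """Swap each pronoun in the input so it reads back at the speaker."""
    words = text.lower().split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

print(reflect("I am worried about my exams"))
# you are worried about your exams
```

Wrap the reflected text in a template like "Why do you say that ... ?" and you have the core of a Rogerian chatbot.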

Are chatbots useful to educators? They have their uses, though I don't find most of those applications to be things that will change education in ways I want to see it change. I would like to see them used for things like e-learning support and language learning.

If you want to look back at an early effort, you can try a somewhat updated version of ELIZA that I used in class at my NJIT website. See what ELIZA's advice for you turns out to be.

 

Edge Computing

I learned about edge computing a few years ago. It is a method of getting the most from data in a computing system by performing the data processing at the "edge" of the network. The edge is near the source of the data, not at a distance. By doing this, you reduce the communications bandwidth needed between sensors and a central datacenter. The analytics and knowledge generation are right at or near the source of the data.
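As a rough sketch of that bandwidth saving (the readings, window size, and summary format here are invented for illustration): instead of streaming every raw sensor reading to a central data center, an edge node can reduce a window of readings to a small summary and transmit only that:

```python
# Edge-style preprocessing: aggregate raw sensor readings locally and
# send only a compact summary upstream. Values are made-up examples.
def summarize_window(readings):
    """Reduce a window of raw readings to one small summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# One window of temperature samples collected at the sensor.
raw = [20.1, 20.3, 19.8, 35.0, 20.2]
summary = summarize_window(raw)

# Only `summary` (four numbers) crosses the network, not every sample.
print(summary)
```

In a real deployment the edge node might also flag anomalies locally (that 35.0 spike) and only then escalate raw data to the central system.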

The cloud, laptops, smartphones, tablets and sensors may be new things but the idea of decentralizing data processing is not. Remember the days of the mainframe computer?

The mainframe is/was a centralized approach to computing. All computing resources are at one location. That approach made sense once upon a time, when computing resources were very expensive - and big. The first mainframe, in 1943, weighed five tons and was 51 feet long. Mainframes allowed for centralized administration and optimized data storage on disk.

Access to the mainframe came via "dumb" terminals or thin clients that had no processing power. These terminals couldn't do any data processing, so all the data went to, was stored in, and was crunched at the centralized mainframe.

Much has changed. Yes, a mainframe approach is still used by businesses like credit card companies and airlines to send and display data via fairly dumb terminals. And it is costly. And slower. And when the centralized system goes down, all the clients go down. You have probably been in some location that couldn't process your order or access your data because "our computers are down."

It turned out that you could even save money by setting up a decentralized, or "distributed," client-server network. Processing is distributed between servers that provide a service and clients that request it. The client-server model needed PCs that could process data and perform calculations on their own in order for applications to be decentralized.

[Photo: Google co-founder Sergey Brin shows U.S. Secretary of State John Kerry the computers inside one of Google's self-driving cars - a data center on wheels. June 23, 2016. State Department photo / Public Domain]

Add faster bandwidth, the cloud, and a host of other technologies (wireless sensor networks, mobile data acquisition, mobile signature analysis, cooperative distributed peer-to-peer ad hoc networking and processing) and you can compute at the edge. Terms like local cloud/fog computing, grid/mesh computing, dew computing, mobile edge computing, cloudlets, distributed data storage and retrieval, autonomic self-healing networks, remote cloud services, augmented reality, and more that I haven't encountered yet have all come into being.

Recently, I heard a podcast on "Smart Elevators & Self-Driving Cars Need More Computing Power" that got me thinking about the millions of objects (Internet of Things) connecting to the Internet now. Vehicles, elevators, hospital equipment, factory machines, appliances and a fast-growing list of things are making companies like Microsoft and GE put more computing resources at the edge of the network. 

This is computer architecture for things, not people. In 2017, there were about 8 billion devices connected to the net. It is expected that by 2020 that number will be 20 billion. Do you want the sensors in your car that are analyzing traffic and environmental data to be sending it to some centralized resource - or processing it in your car? Milliseconds matter in avoiding a crash. You need the processing to be done at the edge. Cars are "data centers on wheels."

Remember the early days of the space program? All the computing power was on Earth. You have no doubt heard the comparison that the iPhone in your pocket has hundreds or even thousands of times the computing power of those early spacecraft. Keeping all the intelligence on the ground was dangerous, but it was the only option. Now, much of the computing power is at the edge - even if the vehicle is also at the edge of our solar system. And things that are not as far off as outer space - like a remote oil pump - also need to compute at the edge rather than connect at a distance to processing power.

Plan to spend more time in the future at the edge.

The Trends at the NJEDge Conference 2018

There have been so many posts these past few weeks about trends for education and technology, but one way of seeing what trends may emerge this year is by looking at the tracks, presentations, and keynote speakers at EdTech conferences.

I'll be moderating a track next week at NJEdge.Net's Annual Conference, NJEdgeCon2018, "DIGITAL LEADERSHIP & ENTERPRISE TRANSFORMATION," January 11 & 12, 2018, in New Jersey. My track is, naturally, Education and Technology, which has presentations on best practices, innovations, and the effectiveness associated with current LMS and online learning tools, effective infrastructure, resources, sustainability models, and integrated assessment tools.

But if you look at the other tracks offered, you can see that INFORMATION technology outweighs instructional technology here. Other tracks at the conference are Big Data & Analytics, Networking & Data Security, Customer Support & Service Excellence, Aligning Business & Technology Strategies, and Transformation Products & Services.

Amber Mac (as in MacArthur) will talk about adaptation and the accelerating pace of corporate culture in the digital economy.

I have followed her career for a decade from her early tech TV and podcast venture to her current consulting business. She helps companies adapt to, anticipate, and capitalize on lightning-quick changes—from leadership to social media to the Internet of Things, from marketing to customer service to digital parenting and beyond. It’s not about innovation, she says; it’s about adaptation.

When it comes to teachers and technologies, the battle cry of Virginia Tech professor John Boyer is embrace, not replace. In his talk, he presents his view that the best teachers will embrace technologies that help them better communicate with students, and need not fear them, because those technologies will never replace human-to-human interaction. But blending the best communicators with the best technology has to offer will produce some amazing and unpredictable opportunities.


Wayne Brown, CEO and Founder of Center for Higher Ed CIO Studies (CHECS), will talk in his session on longitudinal higher education CIO research and the importance of technology leaders aligning technology innovations and initiatives with the needs of the higher education institution. His two-part survey methodology enables him to compare and contrast multiple perspectives about higher education technology leaders. The results provide essential information regarding the experiences and background an individual should possess to serve as a higher education CIO. In collaboration with NJEdge, Wayne will collect data from NJEdge higher education CIOs and will compare the national results with those of the NJ CIOs.

Timothy Renick (a man of many titles: Vice President for Enrollment Management and Student Success, Vice Provost, and Professor of Religious Studies at Georgia State University) is talking about "Using Data and Analytics to Eliminate Achievement Gaps." The student-centered and analytics-informed programs at GSU have raised graduation rates by 22% and closed all achievement gaps based on race, ethnicity, and income level. GSU now awards more bachelor's degrees to African Americans than any other college or university in the nation. Through a discussion of innovations ranging from chatbots and predictive analytics to meta-majors and completion grants, the session covers lessons learned from Georgia State's transformation and outlines several practical and low-cost steps that campuses can take to improve outcomes for underserved students.

Greg Davies' topic is "The Power of Mobile Communications Strategies and Predictive Analytics for Student Success and Workforce Development." The technology that has been used to transform, to both good and bad ends, most other major industries can connect the valuable resources available on campus to the students who need them most with minimal human resources. Technology has been used to personalize the digital experience in such industries as banking, retail, information and media, and others by reaching consumers via mobile technology. Higher Education has, in some cases, been slow to adapt innovative and transformative technology. Yet, its power to transform the student engagement and success experience has been proven. With the help of thought leaders in industry and education, Greg discusses how the industry can help achieve the goal of ubiquity in the use of innovative student success technologies and predictive data analytics to enable unprecedented levels of student success and, as a consequence, workforce development.

Learning and Working in the Age of Distraction

There is a lot of talk about distraction these days. The news is full of stories about the Trump administration and the President himself creating distractions to keep the public unfocused on issues they wish would go away (such as the Russia connections), and some people believe the President is too easily distracted by TV news and Twitter.

There are also news stories about the "distraction economy." So many people are vying for your attention. The average person today is exposed to 1,700 marketing messages during a 24-hour period. Most of these distractions are on screens - TV, computers, and phones. Attention is the new currency of the digital economy.

Ironically, a few years ago I was reading about "second screens," behavioral targeting and social media marketing and that was being called the "attention economy." There is a battle for attention, and the enemy is distraction.

Google estimates that we spend 4.4 hours of our daily leisure time in front of screens. We are still using computers mostly for work/productivity and search. We use smartphones for connectivity and social interactions. Tablets are used more for entertainment. My wife and I are both guilty of "multi-screening." That means we are part of the 77% of consumers watching TV while on other devices. I am on my laptop writing and researching and she is on her tablet playing games and checking mail and messages. It is annoying. We know that.

Of course, the original land of distraction is the classroom. Students have always been distracted. Before the shiny object was a screen full of apps, passing notes was texting, and doodling in your notebook and the cute classmates sitting nearby were the social media. But I have seen four articles on The Chronicle website about "The Distracted Classroom" lately. Is distraction on the rise?

If you are a teacher or student, does your school or your own classroom have a policy on using laptops and phones? If yes, is it enforced? Anyone who has been in a classroom of grade 6 or higher lately knows that if students have phones or laptops out in class for any reason, they are texting, surfing the web, or posting on social media.

Good teachers try to make classes as interactive as possible. We engage students in discussions, group work and active learning, but distractions are there.

Banning devices isn't a good solution. Things forbidden gain extra appeal.

A few books I have read discuss the ways in which distraction can interfere with learning. In The Distracted Mind: Ancient Brains in a High-Tech World, the authors say that distraction occurs when we are pursuing a goal that really matters and something blocks our efforts to achieve it. Written by a neuroscientist, Adam Gazzaley, and a psychologist, Larry D. Rosen, the book joins other research reporting that our brains aren't built for multitasking. This contrasts with a few decades ago, when being able to multitask was considered a positive skill.

It seems that the current belief is that we don't really multitask; we switch rapidly between tasks. Distractions and interruptions, including technology-related ones, act as "interference" with our goal-setting abilities.

But is this a new problem or has our brain always worked this way? Is the problem really more about the number of possible distractions and not our "rewired" brains?

Nicholas Carr sounded an alarm with The Shallows: What the Internet Is Doing to Our Brains, arguing that our growing exposure to online media means our brains must make cognitive changes. The deeper intellectual processing of focused and critical thinking gets pushed aside in favor of faster processes like skimming and scanning.

Carr contends that the changes to the brain's "wiring" are real. Neural activity shifts from the hippocampus, home of deep thinking, to the prefrontal cortex, where we are engaged in rapid, subconscious transactions. We substitute speed for accuracy and prioritize impulsive decision-making over deliberate judgment.

In the book Why Don't Students Like School?: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom, the author, Daniel Willingham, asks questions such as "Why do students remember everything that's on television and forget everything I say?" and "Why is it so hard for students to understand abstract ideas?" and gives some science and suggestions as answers. But these are difficult questions, and simple answers are incomplete answers in many cases.

Some teachers decide to use the tech that is causing the distraction to gain attention instead. I tried using a free polling service (Poll Everywhere), which allows students to respond and vote using their laptops or phones. You insert questions into your presentation software, which lets you track, analyze, and discuss the responses in real time. The problem for me is that all of that needs to be pre-planned and is awkward to do on the fly, and I am very spontaneous in class with my questioning. Still, the idea of using the tech in class rather than banning it is something I generally accept. But that can't be done 100% of the time, so distracted use of the tech is still going to occur.

And the final book on my distraction shelf is The Filter Bubble. The book looks at how personalization - being in our own bubble - hurts the Internet as an open platform for the spread of ideas. The filter bubble puts us in an isolated, echoing world. The author, Eli Pariser, subtitles the book "How the New Personalized Web Is Changing What We Read and How We Think." Pariser coined the term "filter bubble." The term is another one that has come up in the news in talking about the rise of Donald Trump and the news bubble we tend to live in, paying attention to a personalized feed of the news we agree with and filtering out the rest.

Perhaps creating a filter bubble is our way of coping with the attention economy and a way to try to curate what information we have to deal with every day.

Then again, there were a number of things I could have done the past hour instead of writing this piece. I could have done work that I actually get paid to do. I could have done some work around my house. But I wrote this. Why? 

Information overload and non-stop media is hurting my/our discipline for focus and self-control.

Michael Goldhaber defined the attention economy in this more economic way: "a system that revolves primarily around paying, receiving and seeking what is most intrinsically limited and not replaceable by anything else, namely the attention of other human beings." In order for that economy to be profitable, we must be distracted. Our attention needs to be drawn away from the competition.

As a secondary school teacher for several decades, I saw the rise of ADHD. That was occurring before the Internet and lack of attention, impulsivity and boredom were all symptoms. It worsened after the Internet was widespread, but it was there before it and all the personal digital devices.

Back in 1971, Herbert A. Simon observed that "what information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."

We are collectively wiser than ever before. We have the wisdom of the world in a handheld computer connected to almost everything. But it is so difficult to filter out the distractions and garbage that we don't have a lot of success translating information into knowledge. People used to say that finding out something on the Internet was like taking a sip from a fire hose. Search filtering has helped that, but so far the only filters for our individual brains are self-created and often inadequate.