The Augmented Reality of Pokémon Go

People have been out searching for creatures and running down their phone batteries since Pokémon Go was released this month.
Is there any connection between this technology and education, Ken? Let's see.

First off, Pokémon Go is a smartphone game that uses your phone’s GPS and clock to detect where and when you are in the game and make Pokémon creatures appear around you on the screen. The objective is to go and catch them.
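
To make that more concrete, here is a minimal sketch of how a location-based game might combine GPS and the clock to decide which creatures appear near a player. This is not Niantic's actual logic, which is not public; the spawn table, the 70-meter radius and the "nocturnal" rule are all made up for illustration.

```python
# A hypothetical spawn check for a location-based AR game (illustrative only).
import math
from datetime import datetime

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6371000  # Earth's radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def visible_creatures(player_lat, player_lon, spawn_points, radius_m=70):
    """Return creatures close enough to appear on screen right now.
    'Nocturnal' creatures only show up between 8 pm and 6 am (invented rule)."""
    hour = datetime.now().hour
    night = hour >= 20 or hour < 6
    return [sp["name"] for sp in spawn_points
            if haversine_m(player_lat, player_lon, sp["lat"], sp["lon"]) <= radius_m
            and (night or not sp["nocturnal"])]

# A made-up spawn table near one street corner
spawns = [
    {"name": "Pidgey", "lat": 40.7580, "lon": -73.9855, "nocturnal": False},
    {"name": "Gastly", "lat": 40.7582, "lon": -73.9851, "nocturnal": True},
]
print(visible_creatures(40.7581, -73.9853, spawns))
```

The real game layers a camera view and animations on top, but this where-and-when logic is the part that makes it "augmented" rather than purely virtual.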

This combination of a game and the real world interacting is known as augmented reality (AR). AR is often confused with VR - virtual reality. VR creates a totally artificial environment, while augmented reality uses the existing environment and overlays new information on top of it.

The term augmented reality goes back to 1990 and a Boeing researcher, Thomas Caudell, who used it to describe the use of head-mounted displays by electricians assembling complicated wiring harnesses.

A commercial application of AR technology that most people have seen is the yellow "first down" line on televised football games, which, of course, is not on the actual field.

Google Glass and "heads-up" displays in car windshields are other consumer AR applications. There are many more uses of the technology in industries like healthcare, public safety, gas and oil, tourism and marketing.

Back to the game... My son played the card game and handheld video versions 20 years ago, so I had a bit of Pokémon education. I read that it is based on the hobby of bug catching, which is apparently popular in Japan, where the games originated. Like bug catching or birding, the goal is to capture bugs, bird sightings or virtual Pokémon creatures and add them to your life list. The first generation of Pokémon games began with 151 creatures and has since expanded to 700+, but so far only the original 151 are available in the Pokémon Go app.

I have seen a number of news reports about people doing silly, distracted things while playing the game, along with more sinister tales of people being lured somewhere by someone using a creature, or riding a bike or driving while playing. (The app has a feature that tries to stop you from using it while moving quickly, as in a car.)

Thinking about educational applications for the game itself doesn't yield anything for me. Although it does require you to explore your real-world environment, the objective is frivolous. So what we should consider is the use of AR in education beyond the game, while appreciating that the gaming aspect of the app is what drives its appeal and could serve as a motivator for more educational uses.

The easiest use of AR in college classrooms is to make use of the apps already out there in industry. Students in an engineering major should certainly be comfortable with understanding and using the AR tools of their field. For example, one piece of software (metaio Engineer) allows someone to see an overlay visualization of future facilities within the current environment. Another application is having work and maintenance instructions appear directly on a component when it is viewed.
Augmented reality can even extend into a virtual world or an MMO game. Over the past year we have heard more about virtual reality and VR headsets and goggles (like Oculus Rift), which are more immersive, but also more awkward to use. This immersiveness is an older concept, and some readers may recall the use of the term "telepresence."

Telepresence referred to a set of technologies that allowed a person to feel as if they were present, to give the appearance of being present, or to have some impact at a place other than their true location. Telerobotics does this, but more commonly it was the move from videotelephony to videoconferencing. Those applications have been around since the end of the last century, and we have come a good way forward from traditional videoconferencing to doing it with hand-held mobile devices, enabling collaboration independent of location.

In education, we experimented with these applications and with the software for MMOs, mirror worlds, augmented reality, lifelogging, and products like Second Life. Pokémon Go is Second Life, but now there is no avatar to represent us. We are in the game, and the game is the world around us, augmented as needed. The world of the game is the world.

Deep Text

What is "deep text"? It may have other meanings in the future, but right now Deep Text is an AI engine that Facebook is building. The goal of Deep Text is big: to understand the meaning and sentiment behind all of the text posted by users to Facebook. The engine is also intended to help identify content that people may be interested in, and to weed out spam.

The genesis of Deep Text goes back to an AI paper published last year, "Text Understanding from Scratch," by Xiang Zhang and Yann LeCun.

Facebook pays attention to anything you type on the network, not just "posts" but also comments on other people's posts. Facebook researchers say that 70% of us regularly type something and then decide not to post it. They are interested in this self-censorship. Men are more likely than women to self-censor their social network posts. Facebook tracks what you type, even if you never post it.

Why does Facebook care? If they know what your typing is about, they can show it to other people who care about that topic, and, of course, they can better target ads to you and others.

This is not easy if you want to get deeper into the text. If you type the word "Uber," what does that mean? Do you need a ride? Are you complaining about or complimenting the Uber service? What can Facebook know if you type "They are the Uber of food trucks"?

This is a deep use of text analysis. With 400,000 new stories and 125,000 comments on public posts shared every minute on Facebook, the engine needs to analyze several thousand items per second across 20 languages (525,000 items per minute works out to roughly 8,750 per second). A human might manage a few per minute, so this is obviously far beyond human capability. The intelligence needs to be artificial.

A piece on slate.com talks about how Facebook Messenger might use Deep Text for chat bots that talk more like a human, interpreting text rather than giving programmed replies to anticipated queries. Saying "I need a new phone" is very different from "I just bought a new phone." The former opens the opportunity to suggest a purchase. The latter might mean you are open to writing a review.

Along with filtering spam, it could also filter abusive comments and generally improve search within Facebook.

Parsing text with software has been going on for decades. Machine scoring of writing samples has been ongoing and controversial since it began. Ellis Batten Page discussed the possibility of scoring essays by computer back in 1966, and in 1968 he published his successful work with a program called Project Essay Grade™ (PEG™). But the technology of that time didn't allow computerized essay scoring to be cost-effective. Today, companies like Educational Testing Service offer these services, but what Facebook wants to do is quite different and in some ways more sophisticated.

Deep Text leverages several deep neural network architectures. These are things that are beyond my scope of knowledge - convolutional and recurrent neural nets, word-level and character-level learning - but I will be using it, and so I want to understand what is going on behind the page.
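
To get a rough feel for what "character-level learning" means, here is a minimal sketch of a character-level convolutional classifier in the spirit of the Zhang and LeCun paper. It is not Facebook's Deep Text code (which is not public); the alphabet, the layer sizes and the two phone sentences from above are my own illustrative choices, and it assumes PyTorch is installed.

```python
# A toy character-level ConvNet for text classification (illustrative sketch).
import torch
import torch.nn as nn

ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789 .,!?'"
MAX_LEN = 140  # pad or truncate every input to a fixed length

def encode(text):
    """One-hot encode a string at the character level: (alphabet, length)."""
    x = torch.zeros(len(ALPHABET), MAX_LEN)
    for i, ch in enumerate(text.lower()[:MAX_LEN]):
        j = ALPHABET.find(ch)
        if j >= 0:
            x[j, i] = 1.0
    return x

class CharCNN(nn.Module):
    """Convolutions read raw characters; no word lists or grammar rules."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(len(ALPHABET), 64, kernel_size=7), nn.ReLU(),
            nn.MaxPool1d(3),
            nn.Conv1d(64, 64, kernel_size=3), nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),  # global max pool over the sequence
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):  # x: (batch, alphabet, length)
        return self.classifier(self.features(x).squeeze(-1))

model = CharCNN()  # untrained, so its output scores are meaningless here
batch = torch.stack([encode("I need a new phone"),
                     encode("I just bought a new phone")])
print(model(batch).shape)  # torch.Size([2, 2]): one score per class per sentence
```

Trained on enough labeled examples, a model like this can learn to separate intents such as "wants to buy" from "just bought" without ever being given a dictionary, which, as the paper argues, is part of what lets the approach work across languages and misspellings.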

If you read about Microsoft's Tay chatbot, you know that it was artificial intelligence that was supposed to learn how to talk like a teenager. Users gamed the software and "taught" it to talk like a Nazi, which became big news on social media. The bot was created by Microsoft's Technology and Research and Bing divisions and named "Tay" as an acronym for "thinking about you."

Facebook is testing Deep Text in limited and specific scenarios.

Listening to Wikipedia


Screenshot of the Hatnote visualization of Wikipedia edits



There is a wonderful STEAMy mashup application online that lets you listen to the edits being made right now on Wikipedia. How? It assigns sounds to the edits being made. If someone makes an addition, it plays a bell. If someone makes a subtraction from an entry, you'll hear a string plucked. The pitch changes according to the size of the edit - the larger the edit, the deeper the note.
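
To get a sense of how such a mapping might work, here is a small sketch of an edit-size-to-pitch function. Hatnote's actual scaling lives in their source code on GitHub; the frequency range and the logarithmic curve here are my own guesses.

```python
# A guess at how edit size could map to pitch: bigger edit, deeper note.
import math

def edit_to_pitch_hz(bytes_changed, low_hz=110.0, high_hz=880.0, max_bytes=10000):
    """Map an edit's size in bytes to a frequency in hertz."""
    size = min(abs(bytes_changed), max_bytes)
    # Logarithmic scaling, since edit sizes span several orders of magnitude
    t = math.log1p(size) / math.log1p(max_bytes)  # 0.0 (tiny) to 1.0 (huge)
    return high_hz - t * (high_hz - low_hz)

print(round(edit_to_pitch_hz(10)))    # a small tweak: a high note (about 680 Hz)
print(round(edit_to_pitch_hz(8000)))  # a big rewrite: a deep note (about 129 Hz)
```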

The result is a pleasantly tranquil, random but musical composition that reminds me of some music from Japan and China.

You can also watch recent changes. At the top of the page, green circles show edits being made by unregistered contributors, and purple circles mark edits performed by automated bots. White circles are brought to you by registered users.

If you hear a swelling string sound, it means that a new user has joined the site. (You can welcome him or her by clicking the blue banner and adding a note on their talk page.)

You can select a language version of Wikipedia to listen to. When I selected English Wikipedia edits at midday ET, there were about 100 edits per minute, resulting in a slow but steady stream of sound. You can select multiple languages (refresh the page first) if you want to create a cacophony of sounds. You can listen to the very quiet sleeping side of the planet or the busier awake and active side. The developers say that there is something reassuring about knowing that every user makes a noise, every edit has a voice in the roar.

The site is at listen.hatnote.com, and the notes there tell us that Hatnote grew out of a 2012 WMF Group hackathon. It is built using D3 and HowlerJS and is based on BitListen by Maximillian Laumeister. The source code is available on GitHub. It was built by Stephen LaPorte and Mahmoud Hashemi of Hatnote.

Audiation is a term used to refer to comprehension and internal realization of music, or the sensation of an individual hearing or feeling sound when it is not physically present. Musicians previously used terms such as aural perception or aural imagery to describe this concept, though aural imagery would imply a notational component while audiation does not necessarily do so. Edwin Gordon suggests that "audiation is to music what thought is to language," and his research is based on similarities between how individuals learn language and how they learn to make and understand music.

As the Hatnote site points out, Wikipedia is a top-10 website worldwide with hundreds of millions of users. It includes more than a dozen related projects, including Wiktionary, Commons and Wikibooks, and it is available in more than 280 languages. Perhaps more amazingly, it has only about 200 employees and relies mostly on community support for content, edits - and donations. Compare that business model to other top-100 websites worldwide.

 


Damn Algorithms


In this information age steeped in computers, the engine humming under the surface of this website, and of much of the technology we use, is full of mathematics and computer science. That means it uses algorithms. I was reading a story about how Facebook again (probably constantly) tweaked its algorithms for what we see in our news feeds. What is all this about and where did it come from?

Without getting too complicated, an algorithm is a self-contained, step-by-step set of operations to be performed. Algorithms can perform calculations, process data and automate reasoning.
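
The classic textbook example is Euclid's algorithm for finding the greatest common divisor of two numbers - a short, self-contained sequence of steps that is guaranteed to finish. Here it is in a few lines of Python, just to make the definition concrete.

```python
def gcd(a, b):
    """Euclid's algorithm: repeat one simple step until done."""
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a

print(gcd(48, 36))  # 12
```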

The concept, and the word itself, goes back centuries. The words 'algorithm' and 'algorism' come from the name of al-Khwarizmi (c. 780-850), by way of Algoritmi, the Latin form of his name. He was a Persian mathematician, astronomer, geographer, and scholar. The importance of his contributions to mathematics can also be seen in the word "algebra," which is derived from al-jabr, one of the two operations he used to solve quadratic equations. His name is also the origin of the Spanish guarismo and of the Portuguese algarismo, both meaning "digit."

So, why do I damn algorithms? Because they are used to control so much of what we do online. They are hiding behind all of this. You can't read a financial article without hearing about someone changing the algorithms in order to do flash trading or something devious on the stock market. They control the ads you see and what offers come to you and what Amazon recommends that you buy.

The audio below is about how Facebook's stock could benefit from a new Instagram algorithm. Facebook acquired Instagram, which is currently a hot social media property, and their intent is to increase ad revenue rather than increase your pleasure in using the networks.

Some people are very worried about artificial intelligence and robots taking over. The nasty little algorithms are already here.

*  If you want to know how to compute the RSA algorithm shown partially at the top of this post, including how to select values for d, e, n, p, q, and φ (phi), you can watch this video. It also will help insomniacs.
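
For anyone who would rather read code than watch a video, here is the same computation as a toy Python script, using the tiny textbook primes p = 61 and q = 53. Real RSA keys use primes hundreds of digits long; this is only meant to show how d, e, n, p, q and φ relate to each other. (The modular inverse via pow(e, -1, phi) needs Python 3.8 or later.)

```python
from math import gcd

p, q = 61, 53            # two small primes, chosen for illustration only
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient of n: 3120

e = 17                   # public exponent: must be coprime with phi
assert gcd(e, phi) == 1

d = pow(e, -1, phi)      # private exponent: modular inverse of e mod phi (2753)

message = 65
ciphertext = pow(message, e, n)          # encrypt: m^e mod n -> 2790
assert pow(ciphertext, d, n) == message  # decrypt: c^d mod n recovers m
print(n, phi, d, ciphertext)             # 3233 3120 2753 2790
```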
 

Teach Online, Even If Your School Doesn't Offer a Platform

If you have never had the opportunity to teach online and have wondered what it's like, here's a chance to find out. Canvas offers you a chance to try out their learning management system (LMS) for free. They offer two options: take Canvas for a test drive with a free, two-week trial account that is pre-loaded with course content, so that you can explore without having to build from scratch; or, even better, actually teach your existing class on Canvas for free, forever. "You bring the content and students. We'll provide the awesome platform," says Canvas.

Sure, this is an offer meant to help market the platform and entice you to recommend it at your institution, but take advantage of it. That is especially true if you have never taught online and want to give it a try. Perhaps your school doesn't even offer the option to supplement your face-to-face class with an online section. Though I am more involved in how an LMS like Canvas is used in higher education, this is probably even more applicable to pre-college teaching. (Look at how the platform is being used in K-12 education.)

I have designed online learning and taught in a number of learning management systems over the years - WebBoard, WebCT, Blackboard, eCollege, Sakai, Moodle and Canvas. My first experience with Canvas was when I taught a MOOC on the Canvas Network back in 2013. That was a meta-MOOC called "Academia and the MOOC," intended to attract teachers as well as others in academic roles (instructional designers, support staff, administrators and students).

I found Canvas easy to use, but it seemed like a work-in-progress at the time. It lacked many of the tools I was used to having built in (equation editor, whiteboard, blog, wiki and journal features, etc.). But here are some interesting things that came out of that experience.

Teaching that MOOC led me to connect with many other online instructors. Some had taken my "course" (which was more of a large conversation) as much to try out Canvas as to learn about MOOCs.

While I was facilitating the MOOC, I was contacted by two other New Jersey colleges that were considering moving to Canvas. The instructional designers at both schools separately reported the same phenomenon at their colleges. The instructional design staff felt as I had when I encountered Canvas - it seemed "underpowered." But their faculty really liked it for pretty much the same reason: it was clean and simple and didn't have all those "tools we never use." Both colleges now use Canvas.

I think that anyone currently teaching at any level should have experienced being a student and being a teacher in an online setting. There is just no getting around the fact that it is and will continue to be a part of what learning has become and how it is offered.

Dip your toe into the online water - or just jump in with your whole course. It's not as scary as it looks.