People have been searching for creatures and running down their phone batteries since Pokémon Go was released this month.
Is there any connection between this technology and education, Ken? Let's see.
First off, Pokémon Go is a smartphone game that uses your phone’s GPS and clock to detect where and when you are in the game and make Pokémon creatures appear around you on the screen. The objective is to go and catch them.
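To illustrate that location mechanic, here is a minimal sketch of how a game could use two GPS fixes to decide whether a creature is close enough to appear on screen. The function names and the 70-meter visibility radius are my own assumptions for illustration, not Niantic's actual values.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def creature_visible(player, creature, radius_m=70):
    """A creature 'appears' only when the player is within radius_m of it."""
    return haversine_m(*player, *creature) <= radius_m

# Example: one degree of latitude is about 111 km, so 0.00045 degrees is ~50 m
player = (40.7128, -74.0060)
near = (40.71325, -74.0060)   # ~50 m north of the player
far = (40.7228, -74.0060)     # ~1.1 km north of the player
```

The point is simply that the game only needs coarse phone GPS plus a distance check; everything else is presentation.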
This interaction between a game and the real world is known as augmented reality (AR). AR is often confused with virtual reality (VR). VR creates a totally artificial environment, while augmented reality uses the existing environment and overlays new information on top of it.
The term augmented reality goes back to 1990 and a Boeing researcher, Thomas Caudell, who used it to describe the use of head-mounted displays by electricians assembling complicated wiring harnesses.
A commercial application of AR technology that most people have seen is the yellow "first down" line on televised football games which, of course, is not on the actual field.
Google Glass and heads-up displays in car windshields are other consumer AR applications. There are many more uses of the technology in industries like healthcare, public safety, oil and gas, tourism and marketing.
Back to the game... My son played the card game and handheld video versions 20 years ago, so I had a bit of Pokémon education. I read that it is based on the hobby of bug catching, which is apparently popular in Japan, where the games originated. Like bug catching or birding, the goal is to capture creatures (actual bugs, sighted birds, or virtual Pokémon) and add them to your life list. The first generation of Pokémon games began with 151 creatures and has expanded to more than 700, but so far only the original 151 are available in the Pokémon Go app.
I have seen a number of news reports about people doing silly, distracted things while playing the game, along with more sinister tales of people being lured somewhere by someone using a creature, or riding a bike or driving while playing. (The app has a feature that tries to stop you from using it while you are moving quickly, as in a car.)
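That speed lock can be sketched very simply: estimate the average speed between two timestamped GPS fixes and disable play above some threshold. The 10 m/s cutoff and the function names below are illustrative guesses; the app's real rule is not public.

```python
# Assumed, illustrative threshold: ~36 km/h is faster than any sprint,
# so the player is probably in a vehicle.
SPEED_LOCK_MPS = 10.0

def estimated_speed(dist_m, t0_s, t1_s):
    """Average speed (m/s) between two GPS fixes taken at times t0_s and t1_s."""
    dt = t1_s - t0_s
    return dist_m / dt if dt > 0 else 0.0

def gameplay_allowed(dist_m, t0_s, t1_s):
    """Lock catching when the player appears to be moving vehicle-fast."""
    return estimated_speed(dist_m, t0_s, t1_s) < SPEED_LOCK_MPS

# 300 m covered in 20 s is 15 m/s: likely driving, so the game would lock.
# 30 m in 20 s is 1.5 m/s: a walking pace, so play continues.
```

A real implementation would smooth over several fixes, since single GPS readings jitter enough to fake a brief burst of speed.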
Thinking about educational applications for the game itself doesn't yield anything for me. Although it does require you to explore your real-world environment, the objective is frivolous. So, what we should consider is the use of AR in education beyond the game, while appreciating that the gaming aspect of the app is what drives its appeal and could serve as a motivator for more educational uses.
The easiest use of AR in college classrooms is to make use of the apps already out there in industry. Students in an engineering major should certainly be comfortable understanding and using AR applications from their field. In the illustration above, software (metaio Engineer) allows someone to see an overlay visualization of future facilities within the current environment. Another application has work and maintenance instructions appear directly on a component when it is viewed.
Augmented reality can be part of a virtual world, even an MMO game. The past year we have heard more about virtual reality and VR headsets and goggles (like Oculus Rift), which are more immersive, but also more awkward to use. This immersiveness is an older concept, and some readers may recall the use of the term "telepresence."
Telepresence referred to a set of technologies that allowed a person to feel as if they were present, to give the appearance of being present, or to have some effect at a place other than their true location. Telerobotics does this, but more commonly it was the move from videotelephony to videoconferencing. Those applications have been around since the end of the last century, and we have come a good way from traditional videoconferencing to doing it with hand-held mobile devices, enabling collaboration independent of location.
In education, we experimented with these applications and with the software for MMOs, mirror worlds, augmented reality, lifelogging, and products like Second Life. Pokémon Go is Second Life but now there is no avatar to represent us. We are in the game and the game is the world around us, augmented as needed. The world of the game is the world.
Two years ago, I wrote about the prediction that your ever-smarter phone will be smarter than you by 2017. We are halfway there, and I still feel superior to my phone - though I admit that it remembers things that I can't seem to retain, like my appointments, phone numbers, birthdays and such.
The image I used in that post was a watch/phone from The Jetsons TV show, which today might make you think of the Apple Watch, connected to that ever-smarter phone.
But the idea of cognizant computing is more about a device having knowledge of, or being aware of, your personal experiences and using that in its calculations. Smartphones will soon be able to predict a consumer's next move or next purchase, or interpret actions based on what they know, according to Gartner, Inc.
These inferences will be based on an individual's data, gathered using cognizant computing - "the next step in personal cloud computing."
“Smartphones are becoming smarter, and will be smarter than you by 2017,” said Carolina Milanesi, Research Vice President at Gartner. “If there is heavy traffic, it will wake you up early for a meeting with your boss, or simply send an apology if it is a meeting with your colleague."
The device will gather contextual information from your calendar, its sensors, your location and all the personal data you allow it to gather. You may not even be aware of some of that data it is gathering. And that's what scares some people.
When your phone became less important for making phone calls and added apps, a camera, locations and sensors, the lines between utility, social, knowledge, entertainment and productivity got very blurry.
But does it have anything to do with learning?
Researchers at Pennsylvania State University have already announced plans to test the classroom usefulness of eight Apple Watches this summer.
Back in the 1980s, there was much talk about Artificial Intelligence (AI). Researchers were going to figure out how we (well, really how "experts") do what we do and reduce those tasks to a set of rules that a computer could follow. The computer could be that expert. The machine would be able to diagnose disease, translate languages, even figure out what we wanted but didn't know we wanted.
AI got lots of VC dollars thrown at it. But it was not much of a success.
Part of the (partial) failure can be attributed to a lack of computer processing power at the right price to accomplish those ambitious goals. The increase in power, the drop in prices and the emergence of the cloud may have brought AI's time closer.
Still, I am not excited when I hear that this next phase will allow "services and advertising to be automatically tailored to consumer demands."
Gartner released a newer report on cognizant computing that continues the idea of it being one of "the strongest forces in consumer-focused IT" over the next few years.
Mobile devices, mobile apps, wearables, networking, services and the cloud are going to change educational use too, though I don't think anyone has any clear predictions.
Does more data make things smarter? Sometimes.
Will the Internet of Things and big data converge with analytics and make things smarter? Yes.
Is smarter better? When I started in education 40 years ago, I would have quickly answered "yes," but my answer is less certain these days.
Your smartphone will be smarter than you by the year 2017. That is the claim of an analysis from market research firm Gartner. It won't have much to do with hardware; it will come from the data and computational ability in the cloud. Phones will appear smarter than you - if you equate smarts with being able to recall information and make inferences. It was part of a discussion of smart devices at Gartner Symposium/ITxpo 2013, November 10-14 in Barcelona.
What made mobile phones smartphones was new tech and apps. Cameras, location services and sensors, tied into apps and social interactions, have been the biggest trend of the past five years. The easier things are already in place - scheduling, sending out reminders, letting you know what friends are doing or where they are, alerting you to things in your vicinity.
A newer trend is having phones that predict your next action based on personal data already gathered. This is called cognizant computing and many people see it as the next step in personal cloud computing.
Carolina Milanesi, research vice president at Gartner, says “If there is heavy traffic, it will wake you up early for a meeting with your boss, or simply send an apology if it is a meeting with your colleague. The smartphone will gather contextual information from its calendar, its sensors, the user’s location and personal data.”
Of course, allowing your phone to do these things is part of the equation. And not everyone is okay with granting permissions to apps, opening up their data and feeling confident in allowing apps and services to take control of aspects of their lives.
This idea of cognizant computing is said to occur in four phases. Those phases (according to Gartner) are sync me, see me, know me and be me.
Sync me is familiar to users and probably appreciated: store copies of digital assets and sync them across devices. So, my iPhone knows what my iPad knows and my cloud documents are on all my devices, including several laptops.
See me is here in its early stages and means devices can track history and context. The phone knows where I am now and where I have been.
Using the data from those two phases (which many of us have granted permissions for), phones can move to phases 3 and 4. That's when things get a bit scary for some people. When my phone "knows me," it can act proactively. Do I want to purchase something now based on my earlier spending habits?
And, taking it a step further, how much do I want my device to "be me" and act on my behalf? It will pay my bills. It will send selected friends and relatives birthday greetings and pick out a gift. (After all, I have tied my wife's purchases to my account, and it knows where she likes to shop and what she likes to buy.)
Scary? Or are you happy to let that little package of power make your life "easier"?
I still haven't gotten my jetpack or flying car, but I might get some cousin of The Jetsons' Rosie that can slip into my pocket - and into my life - quite easily.
Blended learning is not a new course design concept. It refers to a mixing of different learning environments. Usually, that means blending traditional face-to-face (F2F) classroom methods and class time with online and computer-mediated activities.
There is not one definition of blended learning. In fact, I hear the terms "blended," "hybrid," and "mixed-mode" used interchangeably.
In blended learning, technology always plays a bigger role than in the traditional classroom. As schools "allow" and actually encourage the use of smartphones and tablets, these devices allow the F2F experience to overlap with the experiences outside the classroom.
Students bringing their own devices to campus (known as BYOD) changes things. It changes technology policies and it lowers the cost of technology for blended learning. Statistics are always changing, but at least 75 percent of teens now own cellphones, according to a Pew Research Center report. Is there a socio-economic or racial "gap" with mobile technology, as there was in the early days of personal computers and Internet access? Another Pew study reported that African Americans and English-speaking Hispanics are slightly more likely than whites to own a cell phone.
Remember 1:1 computing? Mobile devices, particularly smartphones, bring us much closer to that as a reality. And that also makes blended learning more viable.
The real blending may occur when students don't see a big difference between the experience in a classroom and the experience outside.
Mobile doesn't eliminate all the issues with blended courses - and many of those issues have been around since the earliest days of online learning. Some things are harder, maybe impossible, to do online. Assessments on mobile devices require considerations of academic integrity.
But anyone considering designing or teaching in a blended setting needs to make mobile part of the design.
If you look across all mobile platforms, nearly 90% of all app store downloads this year will be free apps.
Add to that information from a Gartner report that says 90 percent of the apps users pay for will cost less than $3. The report is "Market Trends: Mobile App Stores, Worldwide, 2012."
App downloads in 2011 were at the 24.94 billion mark from Google Play/Android Market, the Apple App Store, and others. Of those, 88.4 percent (about 22 billion) were for free apps, while about 2.9 billion were for paid apps.
This year? The forecast is for 83 percent growth, with annual growth projected at 50 to 79 percent each year through 2016.
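For readers who want to check the arithmetic, the reported figures hang together. Here is a quick back-of-the-envelope calculation; note that the 2013-2016 totals are my own extrapolation of the stated 50-79 percent growth band, not Gartner figures.

```python
total_2011 = 24.94e9                 # all app-store downloads in 2011 (Gartner)
free_2011 = total_2011 * 0.884       # 88.4% free -> about 22 billion
paid_2011 = total_2011 - free_2011   # the remainder -> about 2.9 billion

# 83% growth forecast for 2012
total_2012 = total_2011 * 1.83       # about 45.6 billion

# then 50-79% annual growth through 2016 gives a wide range
low, high = total_2012, total_2012
for year in range(2013, 2017):
    low *= 1.50
    high *= 1.79

print(f"2011 free: {free_2011 / 1e9:.1f}B, paid: {paid_2011 / 1e9:.1f}B")
print(f"2016 projected range: {low / 1e9:.0f}B to {high / 1e9:.0f}B downloads")
```

Even the low end of that compounding, roughly a tenfold increase over five years, explains why per-app prices were heading toward free.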
What does this mean for educators? As mobile continues to move into classrooms, sometimes only because students bring it there, we will find ourselves using more and more free and cheap apps rather than traditional, expensive software.
That frees up money in school budgets and in students' budgets that could be used in other ways. It also opens the door to using more mobile technology without treating software cost as a critical factor.
Read more: Free Apps To Make Up 89 Percent of Mobile Downloads This Year
That's not surprising. It took longer to get computers and then the Internet into classrooms than all the prognosticators were saying 25 years ago.
Students, especially at the higher levels, are bringing their own devices to class. That's enough of a trend in itself that a search on BYOD will turn up lots of results. As is often the case with technology, the business world has already been dealing with BYOD issues (such as usage policies) before schools gave it any serious thought. BYOD has a Wikipedia entry too, so it's official.
Students bringing their own technology (smartphones, tablets, and laptops) is moving down from higher ed to K-12 education. The model has always been that schools provided the technology students would need. Via the BYOD trend, some of that tech "funding" is being passed on to students and parents without schools even asking. This has also reduced a school's responsibility for support and upgrades.
But one thing that hasn't changed much in 25 years is deciding what software should be used. Schools or teachers still have most of the control over content and oftentimes that also means the software.
In 1990, there may have been dozens of software titles in an academic area and it was difficult to preview, review and test them. With the rise of apps on mobile devices, there are hundreds or thousands of titles to sift through to find ones with good educational uses.
Most educators don't have the time to go through the process. More and more, textbook companies drive adoption by bundling software with textbooks. Hopefully, educators can begin to use the filters, curation and recommendations of peers, aided by sites (and even apps), and contribute their own reviews for others.
I find many more sites with a K-12 focus rather than higher ed, so far. Here are a few samples:
IEAR (I Education Apps Review) - app reviews, school spreadsheets of apps, student reviews
SNapps4Kids - reviews with an embedded list of the skills addressed in each app (very important in K-12's world of objectives and assessment)
Scoop.it - Recommended Educational App Lists - you can join or just browse the reviews
Apps in Education - a blog that includes apps for music, math, English, special needs and more
App Advice is interesting because it is a website and also an app itself. The appadvice app is $1.99.
Have you found other reliable sources?