I first encountered the term "Internet of Things" (IoT) in 2013. It is the idea that "things" (physical devices) would be connected in their own network(s). The talk was that things in your home, office and vehicles would be wirelessly connected because they were embedded with electronics, software, sensors, actuators, and network connectivity. Things would talk to things. Things would collect and exchange data.
Some of the early predictions seemed rather silly. Taking a tagged carton of milk out of the refrigerator and not putting it back would tell my food ordering device (such as an Amazon Echo) that I was out of milk. My empty Bluetooth coffee mug would tell the Keurig coffeemaker to make me another cup.
But the "smart home" - a concept that pre-dates the Internet - where the HVAC knows I am almost home and adjusts the temperature from the economical setting to my comfort zone, and maybe turns on the front light and starts dinner, is rather appealing.
In 2014, the EDUCAUSE Learning Initiative (ELI) published its “7 Things You Should Know About the Internet of Things.” The Internet of Things (and its annoying abbreviation, IoT) sounded rather ominous as I imagined these things proliferating across our social and physical landscapes. The ELI report said “the IoT has its roots in industrial production, where machine-to-machine communication enabled the manufacture of complex items, but it is now expanding in the commercial realm, where small monitoring devices allow such things as ovens, cars, garage doors, and the human heartbeat to be checked from a computing device.”
Some of the discussion has also turned to values, ethics and ideology, especially when you consider the sharing of the data gathered.
As your watch gathers data about your activity, food intake and heart rate, it accumulates valuable data about your health. I do this with my Fitbit and its app. Perhaps you share that data with an online service (as with the Apple Watch and Apple itself) in order to get further feedback about your health and fitness, and even recommendations about how to improve them. If you want a really complete analysis, you will (hopefully) be asked to share your medications, health history and so on. Now, what if that is shared with your medical insurer and your employer?
Might we end up with a Minority Report of predictive analytics that tell the insurance company and your employer whether or not you are a risk?
Okay, I made a leap there, but not a huge one.
This summer, EDUCAUSE published a few articles on IoT in higher education and the collaboration required for the IoT to work. I don't see education at any level making significant use of IoT right now, though colleges are certainly gathering more and more data about students. That data might be used to improve admissions. Perhaps your LMS gathers data about student activity and inactivity and can use it to predict which students need academic interventions.
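The kind of LMS-driven prediction I have in mind could be sketched very simply. This is only an illustration with made-up field names and thresholds, not the API of any real learning management system:

```python
# Illustrative sketch: flag students who may need an academic intervention
# based on simple LMS activity signals. Field names and thresholds are
# hypothetical, not taken from any real LMS.

def needs_intervention(student, min_logins=3, min_avg_score=70.0):
    """Return True if a student's recent activity suggests they need help."""
    low_activity = student["logins_last_week"] < min_logins
    scores = student["quiz_scores"]
    low_scores = (sum(scores) / len(scores) < min_avg_score) if scores else True
    return low_activity or low_scores

students = [
    {"name": "Avery", "logins_last_week": 5, "quiz_scores": [88, 92]},
    {"name": "Blake", "logins_last_week": 1, "quiz_scores": [55, 60]},
]

flagged = [s["name"] for s in students if needs_intervention(s)]
print(flagged)  # Blake is flagged: low logins and a low average score
```

A real system would weigh many more signals, but even a toy rule like this shows how routine activity data becomes a prediction about a student.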
Right now, finding educational uses for the IoT is more of an academic exercise.
History Lesson: Way back in 1988, Mark Weiser talked about computers embedded into everyday objects and called this third wave "ubiquitous computing." Pre-Internet, this was the idea of many computers for one person, not just the one on your desk. Add a decade: in 1999, Kevin Ashton posited a fourth wave, which he called the Internet of Things.
Connection was the key to both ideas. It took another decade until cheaper and smaller processors and chipsets, growing coverage of broadband networks, Bluetooth and smartphones made some of the promises of IoT seem reasonable.
Almost any thing could be connected to the Internet. We would have guessed at computers of all sizes, cars and appliances. I don't think things such as light bulbs would have been on anyone's list.
Some forecasters predict 20 billion devices will be connected by 2020; others put the number closer to 40-100+ billion connected devices by that time.
And what will educators do with this?
Two years ago, I wrote about the prediction that your ever-smarter phone will be smarter than you by 2017. We are halfway there, and I still feel superior to my phone - though I admit that it remembers things I can't seem to retain, like appointments, phone numbers and birthdays.
The image I used on that post was a watch/phone from The Jetsons TV show, which today might make you think of the Apple Watch, connected to that ever-smarter phone.
But the idea of cognizant computing is more about a device having knowledge of, or being aware of, your personal experiences and using that in its calculations. According to Gartner, Inc., smartphones will soon be able to predict a consumer's next move or next purchase, or interpret actions based on what the device knows.
This insight will be based on an individual's data, gathered using cognizant computing - "the next step in personal cloud computing."
“Smartphones are becoming smarter, and will be smarter than you by 2017,” said Carolina Milanesi, Research Vice President at Gartner. “If there is heavy traffic, it will wake you up early for a meeting with your boss, or simply send an apology if it is a meeting with your colleague."
The device will gather contextual information from your calendar, its sensors, your location and all the personal data you allow it to gather. You may not even be aware of some of that data it is gathering. And that's what scares some people.
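Gartner's traffic-and-meeting example above boils down to very ordinary arithmetic once the contextual data is in hand. Here is a toy sketch of that "cognizant" behavior; all of the inputs, numbers and rules are invented for illustration:

```python
# Toy sketch of "cognizant" behavior: adjust a wake-up alarm using calendar
# and traffic context. All inputs and rules here are invented for illustration.

from datetime import datetime, timedelta

def adjusted_wake_time(meeting_start, commute_minutes, traffic_delay_minutes,
                       prep_minutes=45):
    """Work backward from the meeting time, padding for current traffic."""
    total = timedelta(minutes=prep_minutes + commute_minutes + traffic_delay_minutes)
    return meeting_start - total

meeting = datetime(2016, 9, 12, 9, 0)   # 9:00 AM meeting with the boss
wake = adjusted_wake_time(meeting, commute_minutes=30, traffic_delay_minutes=25)
print(wake.strftime("%H:%M"))  # 07:20 - woken earlier because of heavy traffic
```

The hard part, of course, is not the subtraction; it is that the device already knows your calendar, your commute and the traffic - the quiet data-gathering that scares people.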
When your phone became less about making phone calls and gained apps, a camera, location services and sensors, the lines between utility, social, knowledge, entertainment and productivity got very blurry.
But does it have anything to do with learning?
Researchers at Pennsylvania State University have already announced plans to test the classroom usefulness of eight Apple Watches this summer.
Back in the 1980s, there was much talk about Artificial Intelligence (AI). Researchers were going to figure out how we (well, really how "experts") do what they do and reduce those tasks to a set of rules that a computer could follow. The computer could be that expert. The machine would be able to diagnose disease, translate languages, even figure out what we wanted but didn’t know we wanted.
AI got lots of VC dollars thrown at it. But it was not much of a success.
Part of the (partial) failure can be attributed to a lack of computer processing power at the right price to accomplish those ambitious goals. The increase in power, drop in prices and the emergence of the cloud may have made the time for AI closer.
Still, I am not excited when I hear that this next phase will allow "services and advertising to be automatically tailored to consumer demands."
Gartner released a newer report on cognizant computing, which it continues to describe as among "the strongest forces in consumer-focused IT" for the next few years.
Mobile devices, mobile apps, wearables, networking, services and the cloud are going to change educational use too, though I don't think anyone has any clear predictions.
Does more data make things smarter? Sometimes.
Will the Internet of Things and big data converge with analytics and make things smarter? Yes.
Is smarter better? When I started in education 40 years ago, I would have quickly answered "yes," but my answer is less certain these days.