What Happened to the Internet of Things?

After writing here about how the Internet and websites are not forever, I started looking at some old posts that perhaps should be deleted or updated. With 2200+ posts here since 2006, that seems like an overwhelming and unprofitable use of my time. Plus, maybe an old post has some historical value. But I do know that there are older posts that have links to things that just don't exist on the Internet anymore.

The last post I wrote here labeled "Internet of Things" (IoT) was in June 2021. IoT was on Gartner's trends list in 2012, and my first post about IoT here was in 2009, so I thought a new update was due.

When I wrote about this in 2014, there were around 10 billion connected devices. In 2024, the number has increased to over 30 billion devices, ranging from smart home gadgets (e.g., thermostats, speakers) to industrial machines and healthcare devices. Platforms like Amazon Alexa, Google Home, and Apple HomeKit provide hubs for connecting and controlling a range of IoT devices.

The past 10 years have seen the IoT landscape evolve from a collection of isolated devices to a more integrated, intelligent, and secure ecosystem. Advancements in connectivity, AI, edge computing, security, and standardization have made IoT more powerful, reliable, and accessible, with applications transforming industries, enhancing daily life, and reshaping how we interact with technology. The number of connected devices has skyrocketed, with billions of IoT devices now in use worldwide. This widespread connectivity has enabled smarter homes, cities, and industries.

IoT devices have become more user-friendly and accessible, with smart speakers, wearables, and home automation systems becoming commonplace in households. If you have a washing machine or dryer that reminds you via an app about its cycles, or a thermostat that knows when you are in rooms or on vacation, then IoT is in your home, whether you use that term or not.

Surveying the topic online turned up a good number of things that have pushed IoT forward or that IoT has pushed forward. Most recently, I would say that the 3 big things that have pushed IoT forward are 5G and advanced connectivity, the rise of edge computing, and AI and machine learning integration:

Technological improvements, such as the rollout of 5G networks, have greatly increased the speed and reliability of IoT connections. This has allowed for real-time data processing and more efficient communication between devices.

Many IoT devices now incorporate edge computing and AI to process data locally, reducing the reliance on cloud-based servers. This allows faster decision-making, less latency, and improved security by limiting the amount of data transmitted. IoT devices have increasingly incorporated AI and machine learning for predictive analytics and automation. This shift has allowed for smarter decision-making and automation in various industries, such as manufacturing (predictive maintenance), healthcare (patient monitoring), and agriculture (smart farming).
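As a rough illustration of what "processing data locally" can mean, here is a minimal sketch (the sensor names and thresholds are invented, and this is not any vendor's actual API) of an edge device that checks vibration readings against a rolling baseline and only contacts the cloud when something looks wrong. That is the general pattern behind cutting latency and network traffic at the edge.

```python
from collections import deque
from statistics import mean, stdev

# Hypothetical edge-side check: keep a short history of vibration readings
# and flag values that drift far from the recent baseline. Only flagged
# readings would be sent upstream, instead of streaming every sample.
WINDOW = 50          # samples kept on the device
THRESHOLD = 3.0      # "3 standard deviations" rule of thumb

history = deque(maxlen=WINDOW)

def process_reading(value: float) -> bool:
    """Return True if this reading should be reported to the cloud."""
    if len(history) >= 10:  # need some baseline data first
        baseline, spread = mean(history), stdev(history)
        anomalous = spread > 0 and abs(value - baseline) > THRESHOLD * spread
    else:
        anomalous = False
    history.append(value)
    return anomalous

# Example: a stream of normal readings with one spike
for v in [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.1, 1.0, 0.9, 1.0, 5.0, 1.0]:
    if process_reading(v):
        print(f"anomaly detected locally, reporting {v} upstream")
```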

The integration of big data and advanced analytics has enabled more sophisticated insights from IoT data. This has led to better decision-making, predictive maintenance, and personalized user experiences.

One reason why I have heard less about IoT (and written less about it) is that it has expanded beyond consumer devices to industrial applications. I discovered a new term, the Industrial Internet of Things (IIoT), which covers smart manufacturing, agriculture, healthcare, and transportation, improving efficiency and productivity.

There are also concerns that have emerged. As IoT devices proliferate, so have concerns about security. Advances in cybersecurity measures have been implemented to protect data and ensure the privacy of users. The IoT security landscape has seen new protocols and encryption standards being developed to protect against vulnerabilities, with an emphasis on device authentication and secure communication.
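Device authentication and secure communication sound abstract, so here is one small, hedged illustration (not a specific IoT standard): a device and a server that share a secret key can use an HMAC so the server can verify that a message really came from that device and was not altered in transit. The key handling here is deliberately simplified.

```python
import hmac
import hashlib

# Shared secret provisioned on both the device and the server
# (in practice it would live in secure storage, not in source code).
SECRET_KEY = b"example-device-key"

def sign(message: bytes) -> str:
    """Device side: attach an HMAC-SHA256 tag to the outgoing message."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Server side: recompute the tag and compare in constant time."""
    expected = hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

reading = b'{"sensor": "thermostat-42", "temp_c": 21.5}'
tag = sign(reading)
print(verify(reading, tag))              # True: authentic and unmodified
print(verify(b'{"temp_c": 99.0}', tag))  # False: tampered payload
```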

The rollout of 5G has enhanced IoT capabilities by providing faster, more reliable connections. This has enabled more efficient real-time data processing for smart cities, autonomous vehicles, and industrial IoT applications, which can now operate at a larger scale and with lower latency.

IoT devices are now able to use machine learning and AI to learn from user behavior and improve their performance. For example, smart thermostats can learn a household’s schedule and adjust settings automatically, while security cameras can differentiate between human and non-human motion.
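To make the thermostat example concrete, here is a toy sketch of "learning a schedule": the device logs, hour by hour, whether anyone was home, and after a few days it heats only for the hours that are usually occupied. Real products are far more sophisticated; the names and thresholds here are invented.

```python
from collections import defaultdict

# occupancy_log[hour] is a list of True/False observations for that hour,
# accumulated over previous days by a presence sensor.
occupancy_log = defaultdict(list)

def record(hour: int, someone_home: bool) -> None:
    occupancy_log[hour].append(someone_home)

def target_temperature(hour: int, comfort=21.0, setback=16.0) -> float:
    """Heat to comfort temperature only for hours that are usually occupied."""
    history = occupancy_log[hour]
    if not history:
        return setback                      # no data yet: save energy
    usually_home = sum(history) / len(history) >= 0.5
    return comfort if usually_home else setback

# A few days of observations: away mid-day, home in the evening
for day in range(5):
    record(9, False)
    record(13, False)
    record(19, True)

print(target_temperature(13))   # 16.0 -- usually away at 1pm
print(target_temperature(19))   # 21.0 -- usually home at 7pm
```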

Edge computing has allowed IoT devices to process data locally rather than relying solely on cloud-based servers. This reduces latency and bandwidth usage, making it especially beneficial for time-sensitive applications like healthcare monitoring, industrial automation, and smart grids.

Despite the growth, the IoT market faces challenges such as chipset supply constraints, economic uncertainties, and geopolitical conflicts.


Synergy

Synergy is one of those words that caught fire with the general public in the late 20th century, especially in tech-related fields. In general, it is taken to mean the interaction of two or more things (organizations, substances, products, fields, etc.) that produces a greater effect when combined than separately. For example, two colleges working jointly on a project, or the cooperation among pharmaceutical researchers in developing the COVID-19 vaccines.

But the word synergy is not a recent addition to the language. It appeared in the mid-19th century, mostly in the field of physiology, concerning the interaction of organs. It comes from the Greek sunergos, meaning "working together," which comes from sun- ‘together’ + ergon ‘work’.

It has been used in diverse ways. In Christian theology, it was said that salvation involves synergy between divine grace and human freedom. I received a wedding engagement announcement that talked about the synergy between the two people. (They do both work in tech fields.)

Informational synergies, which can also be applied in media, involve compressing the time needed to transmit, access, and use information, with the flows, circuits, and means of handling information based on a complementary, integrated, transparent, and coordinated use of knowledge. [32]

Walt Disney is given as an example of pioneering synergistic marketing. Back in the 1930s, the company licensed dozens of firms the right to use the Mickey Mouse character in products and ads. These products helped advertise their films. This kind of marketing is still used in media. For example, Marvel films are not only promoted by the company and the film distributors but also through licensed toys, games and posters. 

Shifting to tech, synergy can also be defined as the combination of human strengths and computer strengths. The use of robots and AI offers clear synergies. If you read into information theory, you will find discussions of synergy when multiple sources of information taken together provide more information than the sum of the information provided by each source alone.
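That information-theory sense of synergy has a classic toy example: if Y is the XOR of two fair coin flips X1 and X2, then neither input alone tells you anything about Y, but the two together determine it completely. A short, self-contained calculation (using the standard definition of mutual information; everything here is just the textbook example, not anything from a particular paper) shows it:

```python
from itertools import product
from math import log2

# Joint distribution: X1, X2 fair and independent, Y = X1 XOR X2
outcomes = [(x1, x2, x1 ^ x2) for x1, x2 in product([0, 1], repeat=2)]
p = 1 / len(outcomes)   # each outcome has probability 1/4

def mutual_information(index_a, index_b):
    """I(A;B) in bits, where A and B are tuples of outcome positions."""
    def marginal(indices):
        dist = {}
        for o in outcomes:
            key = tuple(o[i] for i in indices)
            dist[key] = dist.get(key, 0) + p
        return dist
    pa, pb, pab = marginal(index_a), marginal(index_b), marginal(index_a + index_b)
    return sum(pr * log2(pr / (pa[k[:len(index_a)]] * pb[k[len(index_a):]]))
               for k, pr in pab.items())

print(mutual_information((0,), (2,)))      # I(X1;Y)    = 0.0 bits
print(mutual_information((1,), (2,)))      # I(X2;Y)    = 0.0 bits
print(mutual_information((0, 1), (2,)))    # I(X1,X2;Y) = 1.0 bit
```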

In education, synergy can be when schools and colleges, departments, disciplines, and researchers work together and produce more than any of them could alone.

Strong and Weak AI

Image by Gerd Altmann from Pixabay

Ask several people to define artificial intelligence (AI) and you'll get several different definitions. If some of them are tech people and the others are just regular folks, the definitions will vary even more. Some might say that it means human-like robots. You might get the answer that it is the digital assistant on their countertop or inside their mobile device.

One way of differentiating AI that I don't often hear is by the two categories of weak AI and strong AI.

Weak AI (also known as “Narrow AI”) simulates intelligence. These technologies use algorithms and programmed responses and generally are made for a specific task. When you ask a device to turn on a light or what time it is or to find a channel on your TV, you're using weak AI. The device or software isn't doing any kind of "thinking" though the response might seem to be smart (as in many tasks on a smartphone). You are much more likely to encounter weak AI in your daily life.
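Under the hood, "simulating intelligence" for a narrow task can be surprisingly mundane. As a caricature (not how any particular assistant is actually built), a voice command handler might amount to matching a recognized phrase against a table of canned actions:

```python
from datetime import datetime

# A caricature of narrow ("weak") AI: no understanding, just a lookup
# that maps recognized phrases to hard-coded actions.
def handle_command(text: str) -> str:
    command = text.lower().strip()
    if "turn on the light" in command:
        return "OK, turning on the light."        # would call a smart-plug API
    if "what time is it" in command:
        return datetime.now().strftime("It is %I:%M %p.")
    if "find" in command and "channel" in command:
        return "Switching to that channel."
    return "Sorry, I can't help with that."       # anything outside its narrow task

print(handle_command("Hey, what time is it?"))
print(handle_command("Explain quantum gravity"))  # outside the narrow task
```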

Strong AI is closer to mimicking the human brain. At this point, we could say that strong AI is “thinking” and "learning" but I would keep those terms in quotation marks. Those definitions of strong AI might also include some discussion of technology that learns and grows over time which brings us to machine learning (ML), which I would consider a subset of AI.

ML algorithms are becoming more sophisticated, and it might excite or frighten you as a user that they are getting to the point where they are learning and executing based on the data around them. This is called "unsupervised ML," meaning the algorithm finds patterns in data without being explicitly programmed with rules or given labeled examples. In the sci-fi nightmare scenario, the AI no longer needs humans. Of course, that is not even close to true today, as the AI requires humans to set up the programming and supply the hardware and its power. I don't fear the AI takeover in the near future.
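"Unsupervised" just means the algorithm is never told the right answers; it has to find structure in the data on its own. A minimal example (a one-dimensional k-means written from scratch, so it needs no libraries) groups unlabeled temperature readings into two clusters without ever being told which readings are "normal" and which are "hot":

```python
# Minimal 1-D k-means: an unsupervised algorithm that groups unlabeled
# readings into k clusters with no labels or explicit rules provided.
def kmeans_1d(values, k=2, iterations=20):
    centers = sorted(values)[:: max(1, len(values) // k)][:k]  # crude initial guesses
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
    return centers, clusters

readings = [20.1, 19.8, 20.5, 21.0, 35.2, 34.8, 20.3, 36.1]
centers, clusters = kmeans_1d(readings)
print(centers)    # roughly one center near 20 and one near 35
print(clusters)   # the readings grouped without any labels
```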

But strong AI and ML can go through the huge amounts of data they are connected to and find useful patterns. Some of those are patterns and connections that it is unlikely a human would find. Recently, you may have heard of the attempts to use AI to find a coronavirus vaccine. AI can do very tedious, data-heavy, and time-intensive tasks in a much faster timeframe.

If you consider what your new smarter car is doing when it analyzes the road ahead, the lane lines, objects, your speed, the distance to the car ahead and hundreds or thousands of other factors, you see AI at work. Some of that is simpler weak AI, but more and more it is becoming stronger. Consider all the work being done on autonomous vehicles over the past two decades, much of which has found its way into vehicles that still have drivers.

Of course, cybersecurity and privacy become key issues when data is shared. You may feel more comfortable allowing your thermostat to learn your habits, or your car to learn how and where you drive, than you are about letting the government know that same data. Consider the level of data we share online doing financial transactions, or even just visiting sites, making purchases, and searching, and you'll find your level of paranoia rising. I may not know who you are as you read this article, but I suspect someone else knows and is more interested in knowing than I am.

Event-Based Internet

Event-based Internet is going to be something you will hear more about this year. Though I had heard the term used, the first real application of it that I experienced was a game. But don't think this is all about fun and games. Look online and you will find examples of event-based Internet biosurveillance and event-based Internet robot teleoperation systems and other very sophisticated uses, especially connected to the Internet of Things (IoT).

What did more than a million people do this past Sunday night at 9pm ET? They tuned in to HQ Trivia, a live game show, on their phones.

For a few generations that have become used to time-shifting their viewing, this real-time game is a switch. 

The HQ app has had early issues in scaling to the big numbers with game delays, video lag and times when the game just had to be rebooted. But it already has at least one imitator called "The Q" which looks almost identical in design, and imitation is supposed to be a form of flattery.

This 12-question trivia quiz has money prizes. Usually, the prize is $2000, but sometimes it jumps to $10K or $20K. But since multiple players survive all 12 questions and split the prize, the winnings are often less than $25 each.

Still, I see the show's potential. (Is it actually a "show"?) Business model? Sponsors, commercial breaks, and product placement in the questions, answers, and banter between questions.

The bigger trend here is that this is a return to TV "appointment viewing."  Advertisers like that and it only really occurs these days with sports, some news and award shows. (HQ pulled in its first audience of more than a million Sunday during the Golden Globe Awards, so...) 

And is there some education connection in all this?  Event-based Internet, like its TV equivalent, is engaging. Could it bring back "The Disconnected" learner?  

I found a NASA report on "Lessons Learned from Real-Time, Event-Based Internet Science Communications."  This report is focused on sharing science activities in real-time in order to involve and engage students and the public about science.

Event-based distributed systems are being used in areas such as enterprise management, information dissemination, finance, environmental monitoring, and geo-spatial systems.
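The common thread in these systems is the publish/subscribe pattern: producers emit events, and consumers react when, and only when, something happens, instead of polling on a schedule. A bare-bones sketch of that pattern (no real message broker, just an in-memory event bus, with event names I made up for the HQ-style example) looks like this:

```python
from collections import defaultdict

# A tiny in-memory event bus: subscribers register interest in an event type,
# publishers emit events, and handlers run only when something happens.
class EventBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self._handlers[event_type]:
            handler(payload)

bus = EventBus()
bus.subscribe("question.opened", lambda q: print(f"Show question: {q}"))
bus.subscribe("question.opened", lambda q: print("Start 10-second timer"))

# Nothing runs until an event actually occurs.
bus.publish("question.opened", "Which planet is closest to the sun?")
```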

Education has been "event-based" for hundreds of years. But learners have been time-shifting learning via distance education and especially via online learning for only a few decades. Event-based learning sounds a bit like hybrid or blended learning. But one difference is that learners are probably not going to tune in and be engaged with just a live lecture. Will it take a real event and maybe even gamification to get live learning? 

In all my years teaching online, I have never been able to have all of a course's students attend a "live" session, either because of time zone differences, work schedules, or perhaps content that just wasn't compelling enough.

What will "Event-based Learning" look like?