Strong and Weak AI

Image by Gerd Altmann from Pixabay

Ask several people to define artificial intelligence (AI) and you'll get several different answers. If some of them are tech people and the rest are regular folks, the definitions will vary even more. Some might say it means human-like robots. Others might answer that it's the digital assistant on their countertop or inside their mobile device.

One way of differentiating AI that I don't often hear is by the two categories of weak AI and strong AI.

Weak AI (also known as “narrow AI”) simulates intelligence. These technologies use algorithms and programmed responses and are generally made for a specific task. When you ask a device to turn on a light, tell you the time or find a channel on your TV, you're using weak AI. The device or software isn't doing any kind of "thinking," though the response might seem smart (as with many tasks on a smartphone). You are much more likely to encounter weak AI than strong AI in your daily life.
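Those "programmed responses" can be pictured as nothing more than hand-written rules. Here is a toy sketch of a hypothetical assistant (made up for illustration, not any real product's code) - note that nothing in it learns anything:

```python
# A toy "weak AI" assistant: every response is explicitly programmed
# ahead of time, so it only handles the tasks its author anticipated.

import datetime

def respond(command):
    """Match a spoken command against hard-coded keywords."""
    command = command.lower()
    if "light" in command:
        return "Turning on the light."
    if "time" in command:
        return f"It is {datetime.datetime.now():%H:%M}."
    # Anything outside the programmed rules simply fails.
    return "Sorry, I don't understand."

print(respond("Turn on the light"))   # Turning on the light.
print(respond("What time is it?"))    # It is <current time>.
print(respond("Tell me a joke"))      # Sorry, I don't understand.
```

However "smart" the replies feel, the program is just keyword lookup; that gap between seeming intelligent and being intelligent is the point of the weak/strong distinction.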

Strong AI is closer to mimicking the human brain. At this point, we could say that strong AI is “thinking” and "learning," but I would keep those terms in quotation marks. Definitions of strong AI might also include some discussion of technology that learns and grows over time, which brings us to machine learning (ML), which I would consider a subset of AI.

ML algorithms are becoming more sophisticated, and it might excite or frighten you as a user that they are getting to the point where they are learning and acting based on the data around them. In what is called "unsupervised ML," the algorithm finds patterns in data without being given labeled examples - it does not need to be explicitly programmed with the answers. In the sci-fi nightmare scenario, the AI no longer needs humans. Of course, that is not even close to true today: the AI still requires humans to write its programming and to supply the hardware and the power that runs it. I don't fear an AI takeover in the near future.
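To make "finds patterns without labeled examples" concrete, here is a minimal sketch of k-means clustering, a classic unsupervised-ML algorithm. Everything in it (the function name, the sample readings) is invented for illustration; real systems use far richer data and libraries:

```python
# Minimal 1-D k-means sketch: the algorithm is never told which group
# each point belongs to -- it discovers the grouping from the data alone.

def kmeans_1d(points, centers, iterations=10):
    """Cluster 1-D points around the given initial centers."""
    for _ in range(iterations):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Update step: each center moves to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Two obvious groups of readings, e.g. temperatures from a smart thermostat:
data = [20.1, 20.5, 19.8, 30.2, 30.7, 29.9]
centers, clusters = kmeans_1d(data, centers=[0.0, 50.0])
print(sorted(round(c, 1) for c in centers))  # → [20.1, 30.3]
```

No one labeled the readings "daytime" or "nighttime"; the two groups emerge from the data itself, which is the sense in which the learning is unsupervised.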

But strong AI and ML can go through the huge amounts of data they are connected to and find useful patterns - including patterns and connections that a human would be unlikely to find. Recently, you may have heard of attempts to use AI to help find a coronavirus vaccine. AI can do very tedious, data-heavy and time-intensive tasks in a much faster timeframe.

If you consider what your new smarter car is doing when it analyzes the road ahead, the lane lines, objects, your speed, the distance to the car ahead and hundreds or thousands of other factors, you see AI at work. Some of that is simpler weak AI, but more and more it is becoming stronger. Consider all the work being done on autonomous vehicles over the past two decades, much of which has found its way into vehicles that still have drivers.

Of course, cybersecurity and privacy become key issues when data is shared. You may feel more comfortable allowing your thermostat to learn your habits, or your car to learn how and where you drive, than you are about letting the government know that same data. Consider the amount of data we share online doing financial transactions, or even just visiting sites, making purchases and searching, and you'll feel the level of paranoia rising. I may not know who you are as you read this article, but I suspect someone else knows - and is more interested in knowing than I am.

TikTok and ToTok

A few recently banned apps in the news should be of interest to educational institutions, where students may very well be using them - and even some schools themselves may be using them. Here are two summaries from The Newsworthy podcast:

The U.S. Navy banned TikTok from government-issued smartphones, saying the video-sharing app could be a cybersecurity threat. The Navy didn't expand on the reason, but we do know the U.S. has opened a national security review into TikTok’s China-based parent company, ByteDance. TikTok hasn’t commented but has said before that it follows U.S. rules.
Read more: Reuters

And then I read that ToTok (not to be confused with TikTok) has been banned from Google and Apple’s app stores. The messaging app appears to have millions of downloads, but the government of the United Arab Emirates allegedly uses it to track locations and conversations. If you have it, experts say you should uninstall it.
Read more: NYT, Wired

Tracking Your Health Data

The Verge reported that in another move to gain more of our personal data, Google is teaming up with the nation’s second-largest health system, Ascension, in an effort it’s calling Project Nightingale.

Ascension will share the personal health data of tens of millions of patients with Google’s Cloud division in order to develop a search engine for medical records and other new artificial intelligence services for medical providers. That sounds helpful. But the announcement came right after Google announced it was buying the fitness tracker company Fitbit.

We could assume that Google is interested in selling this kind of hardware, but access to Fitbit users' personal data could be an even bigger and more profitable asset. (Fitbit data has already been used in some serious but non-health-related ways, such as police investigations.)

Google is certainly not alone in wanting to gain this type of personal wellness data and to pursue health care-tech collaborations. Apple would like to see its watch (similar to, but more powerful than, most fitness trackers) function as a medical-monitoring device, and its health data-sharing capabilities through Apple HealthKit are being enhanced.

All these companies will point out that the data they obtain is anonymized, but there are many examples of anonymized data being reversed so that it is far less anonymous. Are laws and policies ready for all this?

To Build a Surveillance State

If you wanted to build a surveillance state, what would you do?
...you would have a wiretap in your home listening to your conversations. You'd have cameras on every door seeing who is coming in, and a network of neighbors spying on you... facial recognition capabilities... a system knowing what you read and watch and buy. When you think about it, that's what Amazon offers you. Alexa in our homes is listening. Ring cameras on our doors are watching, and the neighbors' apps are telling on each other. They know what you read through Kindle and what you buy through Amazon, and they're pretty good at predictive analytics. So in some ways Amazon is building a very effective surveillance state - one we would be offended by if the government tried to mandate it, but somehow as consumers we seem okay with giving up this information to a private company.

- Andrew Ferguson, author of The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement, in an interview on Marketplace Tech