It's [not just] an Algorithm's fault

Scales

Forty years ago, when physically absent social interactions first began to invade our lives, with unseen, unheard people sending us messages, services like CompuServe's CB supported the clamoring digital hoi polloi. The fascination of exchanging unspoken ideas with strangers, imposters, and (sometimes) fools was so compelling that participants would spend hours interacting, oblivious to the ever-running connect-time meter.

Groups of online users would form; chat rooms were created where these users could share their thoughts and ideas with others who wanted to contribute to, debunk, or annihilate the discussions. Flame wars would break out over impossibly trivial matters ("Clara Peller was NOT Wendy's mother in the 'Where's the Beef?' ad! What's wrong with you??"). I posted "Don't drink and type!" to those belligerents, which, of course, only enraged them further. By the time CompuServe CB died, these groups had all backed into their own corners and waited for the next-round conflict bell to ring. There was no external source, no online referee that defined the groups or their behaviors. Cooperation and opposition were organic functions.

I own up to my cyber-dyslexia. Every time I see or hear the phrase "social media," my brain immediately translates it to "anti-social" media, but I'm not sure whether the "anti-" part is a result of the way the media is crafted or whether it is part of our individual identity.

Are social media algorithms truly to blame for online polarization, or do our own choices play a bigger role?

Social media algorithms collect and analyze vast amounts of user data, such as browsing history, interests, and interactions. This information is used to tailor content feeds to individual preferences, ensuring that users see posts most relevant to them. Algorithms heavily weigh engagement signals like likes, shares, comments, and watch time. Content that receives higher engagement is more likely to be promoted and shown to a wider audience. Once user data and engagement signals are collected, algorithms rank content based on predicted relevance and interest. Posts that score higher are surfaced at the top of users’ feeds, while less relevant content is shown less frequently or not at all. Ranking is continuous.  Updates happen as soon as new data arrives.
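The collect-score-rank loop described above can be sketched as a toy scoring function. This is a minimal illustration, not any platform's actual formula; the signal names and weights here are my own assumptions:

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# Signal names and weights are illustrative assumptions only.

def engagement_score(post):
    """Combine engagement signals into a single ranking score."""
    weights = {"likes": 1.0, "comments": 2.0, "shares": 3.0, "watch_seconds": 0.05}
    return sum(weights[signal] * post.get(signal, 0) for signal in weights)

def rank_feed(posts):
    """Surface the highest-scoring posts first; re-rank as new data arrives."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "calm-essay", "likes": 120, "comments": 4, "shares": 2},
    {"id": "outrage-bait", "likes": 40, "comments": 90, "shares": 55},
])
print([p["id"] for p in feed])  # the contentious post ranks first
```

Note that the scorer never asks whether the comments are friendly or furious; any activity counts, which is exactly why a divisive post can outrank a well-liked one.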

Like the Wendy's commercial back in the mid-1980s, algorithms can amplify divisive content, but amplification isn't their purpose. They run on interaction. They are more like the force that kept the connect-time clock running on CompuServe CB: engines of profit. These back-end processes do not create the groups. People follow the subjects and experiences that reinforce their own perspectives and join with the like-minded.

Understanding the human element in all of this cyber-selected content is key to understanding, and escaping, algorithmic polarization. Not all engagement signals are beneficial; negative interactions, such as outrage or controversy, can also boost content visibility. Algorithms do not distinguish between positive and negative engagement, focusing instead on overall activity levels. What starts as a personal action on social media can escalate into widespread influence, affecting conversations and shaping public opinion. Recognizing this path highlights the importance of mindful engagement online.

We are all curious; it is part of who we are. We click around on the internet looking for some tidbit of information: "Was Clara Peller really a vegetarian?" [No. Following her popularity, she was presented with a 25-pound hamburger by the cattle industry and was gifted an apron that read "Beef Gives Strength".] And while a search like that may spawn hamburger ads all over your chosen media, be aware that the engine sending the content back to us is not trying to raise your cholesterol, shorten your lifespan, or drive you to some nutrition-oriented political party. It is only there to make money by offering you what you have already looked for.

If you are careful about how you access content, you can change the choices the algorithms present. If you use a privacy-focused tool like DuckDuckGo instead of proprietary apps, your tracks are more difficult to follow. And if you want to see a sudden and startling change in the content offered to you, open a browser and type "Happiness" in the search bar; you may be surprised at what begins to follow you around.

That result is not in their algorithm, but in ourselves. 

Trading Kilowatts for Qubits

It had been in the news in the United States all week: the federal government is moving toward a policy that will require power-hungry data centers to get out of the public energy pool and go swim in a plasma of their own making. Big Tech companies are building and investing in their own energy supplies as they race to meet the huge energy demands of AI computing in 20th-century datacenters. It's estimated that a "traditional" (non-AI) datacenter rack consumes somewhere between 5 and 15 kilowatts of power -- think central A/C units, commercial clothes dryers, banks of EV chargers. That same rack, running AI-capable hardware, will consume roughly ten times as much -- up to around 100 kilowatts.

As robust as the United States power grid is, the demand for electricity to power these clustered artificial-intelligence entities will exceed its current capacity. To feed this need for power, Amazon has begun developing small modular reactors (SMRs), while Oracle and OpenAI are working on half-trillion-dollar natural-gas-fueled electrical plants. These solutions have obvious drawbacks: Amazon's nuclear option will produce long-lived radioactive waste that remains hazardous for thousands of years, and Oracle and OpenAI's investment in huge natural-gas energy sources risks not only accelerating climate rot but exhausting fuel supplies at scale. The risks of expanding the electricity supply on a 20th-century grid are substantial. If the demand for electricity were reduced, those risks would subside.

The development of quantum computing has quietly been on the rise. Superconductor-based quantum supercomputers, which require extreme refrigeration to hold their processors near absolute zero, consume about 25 kilowatts in total. The warmer-weather-loving neutral-atom computers operate around room temperature and use 7 kilowatts or less. When optimization tasks or simulations are sent to their quantum algorithms, these computers can produce solutions orders of magnitude faster while using a tiny fraction of the energy a traditional datacenter would require.
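Taken at face value, the wattage figures above translate into strikingly different annual energy totals. A back-of-the-envelope sketch (the traditional-rack midpoint and the continuous full-load assumption are mine, not the article's; real utilization varies):

```python
# Rough annual-energy comparison using the wattage figures cited above.
# Assumes continuous, full-load operation, which real systems rarely sustain.

HOURS_PER_YEAR = 24 * 365  # 8760

loads_kw = {
    "traditional rack (5-15 kW midpoint)": 10,
    "AI-capable rack": 100,
    "superconducting quantum system": 25,
    "neutral-atom quantum system": 7,
}

# kilowatts x hours / 1000 = megawatt-hours per year
annual_mwh = {name: kw * HOURS_PER_YEAR / 1000 for name, kw in loads_kw.items()}

for name, mwh in annual_mwh.items():
    print(f"{name}: {mwh:.0f} MWh/year")
```

By this crude math, one AI rack draws on the order of fourteen times the annual energy of a neutral-atom machine.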

Quantum computing can now significantly enhance generative AI through its speed. Quantum computers are faster and deeper in data analysis, and they have led to a new class of generative AI called GenQAI (Generative Quantum AI), which uses quantum hardware to iterate on complex problems and generate more human-like reasoning and intuition in AI.

Quantinuum, reported to be one of the world leaders in quantum technology, in November unveiled its Helios system, which has been described as the world's most accurate quantum computer. That quantum instance requires less than 40 kilowatts of power, about the same as a single datacenter rack under an average generative-AI load and configuration. The company announced last week that it was going public and would issue an IPO sometime in the first half of 2026.

With some clairvoyant disruption and a little bit of luck, we'll have frugal quantum-computing cottages humming their 4-dimensional power song before we have natural-gas caverns and poisonous landfill dirges to endure.

The Rite of Privacy

Privacy is a cornerstone of personal freedom, yet its meaning and importance have evolved over centuries.

Aristotle viewed the public sphere, or polis, as the space where true freedom and civic life were possible. For him, public life was about participating in politics and achieving lasting accomplishments, while private life was more concerned with household affairs and personal needs. This distinction meant that privacy was often seen as secondary to public engagement, but it also laid the groundwork for later debates about the value of personal space and autonomy. The Romans likewise drew a line between public and private spheres: public life was where individuals could gain honor and recognition, while private life was associated with family, home, and personal matters.

Fast-forward a millennium or two, and thinkers like Rousseau saw privacy as a retreat from the pressures of society, a necessary space for self-reflection and authenticity. Hannah Arendt later argued that privacy is essential for forming personal identity and exercising political rights. In 1890, Samuel Warren and Louis Brandeis published an essay on the right to privacy in the Harvard Law Review, and by the early part of the 20th century, courts began interpreting the U.S. Constitution to protect an expanded notion of privacy that included personal freedom and dignity.

The history of privacy reveals that it has always been closely tied to personal liberty and the boundaries between the individual and society. From ancient debates about public and private life to modern legal protections, the concept of privacy has continually evolved in response to new challenges. Privacy remains a vital issue today, shaping debates about technology, freedom, and the rights of individuals in a rapidly changing world. As concerns escalated, privacy was recognized as a fundamental human right, and laws and regulations were created to address the risks posed by the spread of computers, data collection, and data storage.

Then came Edward Snowden.

The scale and scope of government surveillance was exposed. The global privacy debate merged with concerns about personal data security. A full five years after those surveillance and data-collection revelations, the European Union's General Data Protection Regulation (GDPR) claimed to set a new global standard for data protection and user rights. Even California, with its trove of data-driven companies, took the GDPR seriously and enacted the California Consumer Privacy Act.

Personal data has become a valuable commodity in the digital economy. Companies collect, analyze, and sell user information to drive advertising, product development, and business strategies.

This shift has made privacy a key economic issue, as individuals must navigate the trade-offs between convenience and control over their data. 

As surveillance and data collection become more widespread, concerns about personal liberty and autonomy have grown. When every action can be tracked, individuals may feel less free to express themselves or make independent choices. These issues are at the heart of modern privacy debates, reminding us that protecting privacy is essential for maintaining freedom in a digital society. As personal information becomes more valuable and more vulnerable, understanding how privacy has evolved is crucial for protecting autonomy and freedom.

Privacy is not just a right of the past—it’s a challenge for the future. We all must stay vigilant and informed. Freedom depends on it.