1970s Computer Clubs

Apple I

The Apple I as displayed at the Computer History Museum

On March 5, 1975, the Homebrew Computer Club first met in a garage near Menlo Park in Silicon Valley, California.

On that day, I was across the country in my last semester at Rutgers. I had taken one course in computer programming, using Fortran, which had been around in earlier forms since the late 1950s. We used a box of punch cards to create a program. I took the class as an auditor, for no credit and with nothing on my transcript, because I had talked to the professor after an information session he gave, and he was curious to see what an English major would do in his class.

My after-school and vacation job in high school was doing printing for a liquor distributor. They had a room with huge computers using tape drives and punch cards, and I would sometimes wander in there and talk to the operator. Of course, I understood nothing about what he was doing. He was in a unique position because no one in the company understood what he was doing except him and his one assistant. And yet those computers printed all the invoices, which I would later have to box up and file in the warehouse. Though they used the computer to print them all, no one could access that data from a desktop, so if someone wanted a copy of an invoice, they had to dig through a file cabinet.

That 1970 computer was certainly not for personal use, and no one had a personal computer because personal computers did not exist. Most of my fellow students couldn't imagine we would ever have a computer in our homes. Computers were gigantic, easily taking up an entire room, and they were very, very expensive, costing about a million dollars each. Not even the computer engineers and programmers who made a living working on them had access to a personal computer.

So this California club served a real need for tech-minded people, many of whom wanted to build personal computers for fun. They decided to start a hobbyist club to trade circuit boards and information and share their enthusiasm. Among the early members were high school friends Steve Jobs and Steve Wozniak. Eventually, they would design and build what they called the Apple I and Apple II computers and bring them to the club to show them off. Lee Felsenstein and Adam Osborne were also members and would go on to create the first mass-produced portable computer, the Osborne 1.

Wozniak would later write: "The theme of the club was 'Give to help others.' Each session began with a 'mapping period,' when people would get up one-by-one and speak about some item of interest, or a rumor, and have a discussion. Somebody would say, 'I've got a new part,' or somebody else would say he had some new data or ask if anybody had a certain kind of teletype."

I started teaching in a junior high school in the fall of 1975, and shortly thereafter the school got a terminal that was connected to a mainframe at some university in New Jersey. It was first used by one of the math teachers for a kind of computer club. I went to his classroom a few times just to see how it worked, but I saw no connection to what I had learned about programming in college.

It would be a few years before the first personal computers appeared in the school. We had a lab that was used for the first actual computer class: a classroom full of standalone TRS-80s. TRS stood for Tandy Radio Shack, though the machines were later nicknamed "Trash-80s." I took a professional development class using those computers where we learned to program in BASIC. I created a vocabulary flashcard program that I was able to use with a few of my English classes during periods when the lab was not being used by the math teacher. The program was crude and the graphics were basically nonexistent, but the kids and I found it very interesting.
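For the curious, the logic of that kind of BASIC flashcard drill can be sketched in a few lines of modern Python. This is a hypothetical reconstruction, not the original program; the word list and function names are invented for illustration.

```python
# A hypothetical reconstruction of a simple vocabulary flashcard drill,
# in the spirit of the original BASIC program (the original is lost).
import random

WORDS = {  # sample vocabulary; the real word list is invented here
    "laconic": "using very few words",
    "ephemeral": "lasting a very short time",
    "candid": "truthful and straightforward",
}

def quiz(words, rng=random):
    """Ask each word once, in random order; return the number correct."""
    score = 0
    items = list(words.items())
    rng.shuffle(items)
    for word, definition in items:
        answer = input(f"Define '{word}': ")
        if answer.strip().lower() == definition:
            score += 1
            print("Correct!")
        else:
            print(f"Not quite. It means: {definition}")
    return score

# Example (interactive):
#   total = quiz(WORDS)
#   print(f"You got {total} of {len(WORDS)} right.")
```

On a TRS-80 the same loop would have been numbered BASIC lines with `INPUT` and `PRINT` statements, but the structure, prompt, compare, and score, is identical.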

I remember one teacher in the professional development class saying that we would all have to learn to program in the future. I was sure she was wrong. I had no doubt that computers would play a role in our teaching future, but I was also sure that other people would be writing the programs and we would only be users.

Apple IIe

The first computer I had in my classroom was an Apple IIe. Since I had some computer background, and more so because I had some interest in learning more, I became the computer coordinator for the building. That meant my computer had two disk drives so that I could copy software that we had purchased and were allowed to copy. MECC was a big source of classroom software back then.

The first computer I bought for home use was the same as the one in my classroom, which made sense because then I could use the software at home too. This hardware was expensive. I paid more for the Apple dot-matrix printer than I paid for my laptop last year.

We remained an Apple school, and an Apple family, for a few years until a new person moved into the position of district computer coordinator. He swapped out all the Apple computers for what we called IBM clones, early machines equipped with Windows 95. When I bought my next computer, it was one using Windows 95.

When I left teaching secondary school in 2000 and went to work at NJIT, all the computers used Windows except for the school of architecture, which was an Apple Mac building. They were their own little tech world. And so I lost contact with the Apple world in those days when even TV commercials and print ads would argue about whether you were a Windows or Mac kind of person. I remember one professor saying to me that he was surprised I was not using a Mac because I seemed like "a creative type."

Valentine's Day with Artificial Intelligence

valentine card kids
When love was easy. Or at least easier.

Since my dating days were before dating became an online thing and literally before online was a thing, I haven't really kept up with dating and technology. 

I have friends who got divorced, dipped back into dating, and used online dating apps. Over 300 million people use dating apps worldwide, according to a 2023 report by Business of Apps. To put that figure in perspective, it's almost the entire population of the U.S., or half the population of Europe.

Tinder is an online dating and geosocial networking application launched in 2012. On Tinder, users "swipe right" to like or "swipe left" to pass on other users' profiles. A profile includes a user's photos, a short bio, and some of their interests. Tinder uses a "double opt-in" system, also called "matching," in which two users must like each other before they can exchange messages. In 2022, Tinder had 10.9 million subscribers and 75 million monthly active users.
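The "double opt-in" mechanic is simple to sketch in code. Here is a minimal illustration in Python; all class and method names are invented for this sketch and have nothing to do with Tinder's actual (far more complex) implementation. The key idea is that a message channel opens only once both users have liked each other.

```python
# Minimal sketch of a "double opt-in" match system.
# All names here are invented for illustration; this is not Tinder's API.

class MatchService:
    def __init__(self):
        self.likes = set()    # directed (liker, liked) pairs
        self.matches = set()  # unordered pairs of mutually matched users

    def swipe_right(self, user, other):
        """Record a like; create a match only if the like is mutual."""
        self.likes.add((user, other))
        if (other, user) in self.likes:
            self.matches.add(frozenset((user, other)))

    def can_message(self, user, other):
        """Messaging opens only after both users have opted in."""
        return frozenset((user, other)) in self.matches

svc = MatchService()
svc.swipe_right("alice", "bob")
print(svc.can_message("alice", "bob"))  # False: only one side has liked
svc.swipe_right("bob", "alice")
print(svc.can_message("alice", "bob"))  # True: mutual like -> match
```

Storing matches as unordered pairs (`frozenset`) is what makes the opt-in symmetric: neither user can message until the pair exists.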

Renate Nyborg was Tinder's first female CEO, but she recently left the popular dating app and launched Meeno, which is described as a relationship-advice app rather than a dating app. For example, you might ask it for advice about dealing with your boss. Meeno uses artificial intelligence (AI) to help solve relationship problems. Nyborg predicts that the future will be less about online dating and more about real-life encounters.

The numbers for online dating are huge, but Nyborg and others see a trend, particularly among Gen Z (18-to-25-year-olds), toward meeting people organically.

When she left Tinder, she said she wanted to use tech to "help people feel less lonely," and dating is only a part of that. According to a 2023 report on loneliness commissioned by the European Commission, at least 10% of European Union residents feel lonely most of the time. A Pew Research study found that 42% of U.S. adults surveyed said they had felt lonely during the COVID-19 pandemic. So Meeno is intended to be your mentor, distinct from a virtual girlfriend, boyfriend, clinical therapist, or coach.

What can AI do in all this? Broadly, AI can speed up the processing behind all these apps. It can quickly analyze user behavior patterns and large datasets to identify potential matches based on shared interests, values, and preferences. It can filter profiles for inappropriate content, such as nudity or hate speech. And it can analyze a user's swiping patterns, interests, answers to questions, and personality results to generate tailored recommendations.
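As a toy illustration of the matching idea, and not any real app's algorithm, here is how a recommender might rank candidates by the overlap of their declared interests using Jaccard similarity (the size of the shared set divided by the size of the combined set). The names and interest lists are invented.

```python
# Toy interest-based matching using Jaccard similarity.
# Real dating apps combine many more signals (behavior, location, etc.).

def jaccard(a, b):
    """Similarity of two interest sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_matches(user_interests, candidates):
    """Return (name, score) pairs sorted by interest overlap, best first."""
    scored = [(name, jaccard(user_interests, interests))
              for name, interests in candidates.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

me = {"hiking", "jazz", "cooking"}
others = {
    "pat":  {"jazz", "cooking", "travel"},
    "sam":  {"gaming", "travel"},
    "alex": {"hiking", "jazz", "cooking", "film"},
}
print(rank_matches(me, others))  # alex ranks first, then pat, then sam
```

A production system would weight signals (recent swipes more than stale ones, say) rather than treat all interests equally, but the shape of the computation, score then rank, is the same.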

There are other apps, like Blush, Aimm, Rizz, and Teaser AI, that use personality tests and physical-type analysis to train AI-powered systems. Some apps use machine learning algorithms to scan for attraction and then suggest images of real people that the app thinks the user might find attractive. These are more for "dating" than for the everyday relationships that are Meeno's current target.

This post first appeared in a different format on Weekends in Paradelle

The Soft Skills of AI

workers talking
Communication is a rising soft skill

AI, especially its subset, generative AI, seems to be changing everything including the workplace. As machines become adept at tasks once considered uniquely human, what does this mean for the workforce, and which worker skills will become more important? For some jobs, AI will simply be complementary to the job, but the prevailing belief is that about half of all jobs will be significantly disrupted by AI.

I have never been a fan of the terms "hard skills" and "soft skills," since the labels make the "soft" ones seem less important. Still, some historically "hard" skills will drop lower on the list of hiring credentials.

An article on www.fastcompany.com highlighted some soft skills that will be important in an AI world.

SOCIAL INTERACTION SKILLS, such as listening to others in meetings or collaborating with teammates under pressure, will remain in the human domain. A working paper from the National Bureau of Economic Research showed that almost all job growth since 1980 has been in jobs that are social-skill intensive, while jobs that require minimal social interaction have been in decline.

CREATIVITY, especially in using AI. One study found that knowledge workers who used ChatGPT-4 completed 12.2% more tasks, 25.1% faster, and with 40% higher quality than those who did not use AI in their work. That's astonishing data, especially the figure on increased quality. Human workers who leverage AI and who demonstrate a combination of strong creativity and critical thinking skills will fare the best.

CRITICAL THINKING SKILLS have never really been off the skills list for jobs. Critical thinking must be applied to evaluate AI responses since (as you may have discovered already) not all responses will be valid, unbiased, factual, or error-free. AI can generate vast amounts of data, analyses, and potential solutions at unprecedented speed, but the veracity and applicability of generative AI's responses are not guaranteed. Thinking critically remains a uniquely human skill.

CURIOSITY is the innate drive to explore, understand, and seek information about the world around us. AI is not curious unless it is told to be or programmed to seek out information. Curious workers ask questions, probe into things, challenge assumptions, and delve deeper.

Yes, the rise of AI will fundamentally alter which skills are deemed crucial in the workplace. While some hard skills and some jobs will disappear, certain soft skills will remain human-only and therefore become more important, perhaps "harder," than ever.

Begin. End. The Waning Days of Coding

code on screen

A piece in The New Yorker (not exactly a technology magazine) titled "A Coder Considers the Waning Days of the Craft," set me thinking about what tech careers will be lost in the near and far future. Yes, artificial intelligence plays into this, but there are other factors too. Coding seems to be a likely candidate for being on the decline.

The author, James Somers, says that "Coding has always felt to me like an endlessly deep and rich domain. Now I find myself wanting to write a eulogy for it." With his wife pregnant, he wonders whether "...by the time that child can type, coding as a valuable skill might have faded from the world."

It is an interesting read. Kind of a memoir of a coder.

Schools still teach coding. Coders are still working. The question is: for how long? Should a student in middle school think about it as a career? I used to tell my middle school students that many of them would go into careers with titles that don't exist today. Who can predict?

Somers concludes:

"So maybe the thing to teach isn’t a skill but a spirit. I sometimes think of what I might have been doing had I been born in a different time. The coders of the agrarian days probably futzed with waterwheels and crop varietals; in the Newtonian era, they might have been obsessed with glass, and dyes, and timekeeping. I was reading an oral history of neural networks recently, and it struck me how many of the people interviewed—people born in and around the nineteen-thirties—had played with radios when they were little. Maybe the next cohort will spend their late nights in the guts of the A.I.s their parents once regarded as black boxes. I shouldn’t worry that the era of coding is winding down. Hacking is forever."

The future of coding is likely to be affected by all of these factors:

Artificial Intelligence and Automation: AI is already influencing coding through tools that assist developers in writing code, debugging, and optimizing algorithms. As AI continues to advance, it may take on more complex coding tasks, allowing developers to focus on higher-level design and problem-solving.

Low-Code/No-Code Development: The rise of low-code and no-code platforms is making it easier for individuals with limited programming experience to create applications. This trend could democratize software development, enabling a broader range of people to participate in creating digital solutions.

Increased Specialization: With the growing complexity of technology, developers are likely to become more specialized in particular domains or technologies. This could lead to a more segmented job market, with experts in areas like AI, cybersecurity, blockchain, etc.

Remote Collaboration and Distributed Development: Remote work has become more prevalent, and this trend is likely to continue. Tools and practices for collaborative and distributed development will become increasingly important.

Ethical Coding and Responsible AI: As technology plays a more central role in our lives, the ethical considerations of coding will become more critical. Developers will need to be mindful of the societal impact of their creations and consider ethical principles in their coding practices.

Continuous Learning: The pace of technological change is rapid, and developers will need to embrace a mindset of continuous learning. Staying updated with the latest tools, languages, and methodologies will be crucial.

Quantum Computing: While still in its early stages, quantum computing has the potential to revolutionize certain aspects of coding, particularly in solving complex problems that are currently intractable for classical computers.

Augmented Reality (AR) and Virtual Reality (VR): As AR and VR technologies become more widespread, developers will likely be involved in creating immersive experiences and applications that leverage these technologies.

Cybersecurity Emphasis: With the increasing frequency and sophistication of cyber threats, coding with a focus on security will be paramount. Developers will need to incorporate secure coding practices and stay vigilant against emerging threats.

Environmental Sustainability: As concerns about climate change grow, there may be a greater emphasis on sustainable coding practices, including optimizing code for energy efficiency and reducing the environmental impact of data centers.

How do I know this? Because I asked a chatbot to tell me the future of coding.