The Soft Skills of AI

[Image: workers talking]
Communication is a rising soft skill

AI, especially its subset generative AI, seems to be changing everything, including the workplace. As machines become adept at tasks once considered uniquely human, what does this mean for the workforce, and which worker skills will become more important? For some jobs, AI will simply be complementary, but the prevailing belief is that about half of all jobs will be significantly disrupted by it.

I have never been a fan of the terms "hard skills" and "soft skills," since the labels make the "soft" ones seem less important. Still, some historically "hard" skills will drop lower on lists of hiring credentials.

An article on www.fastcompany.com featured some soft skills that will be important in an AI world.

SOCIAL INTERACTION SKILLS, such as listening to others in meetings or collaborating with teammates under pressure, will remain in the human domain. A working paper from the National Bureau of Economic Research showed that almost all job growth since 1980 has been in jobs that are social-skill intensive, while jobs that require minimal social interaction have been in decline.

CREATIVITY, especially in using AI. One study found that knowledge workers who used ChatGPT-4 completed 12.2% more tasks, finished them 25.1% faster, and produced 40% higher quality work than those who did not use AI. That's astonishing data, especially the figure on quality. Human workers who leverage AI and who combine strong creativity with critical thinking skills will fare the best.

CRITICAL THINKING SKILLS have never been off the skills list for jobs. They must be applied to evaluate AI responses, since (as you may have discovered already) not all responses will be valid, unbiased, factual, or error-free. AI can generate vast amounts of data, analyses, and potential solutions at unprecedented speed, but the veracity and applicability of generative AI's responses are not guaranteed. Thinking critically remains a uniquely human skill.

CURIOSITY is that innate drive to explore, understand, and seek information about the world around us. AI is not curious unless it is told or programmed to seek out information. Curious workers ask questions, probe, challenge assumptions, and dig deeper.

Yes, the rise of AI will fundamentally alter which skills are deemed crucial in the workplace. While some hard skills and some jobs will disappear, certain soft skills will remain human-only and therefore become more important, perhaps "harder," than ever.

Begin. End. The Waning Days of Coding

[Image: code on screen]

A piece in The New Yorker (not exactly a technology magazine) titled "A Coder Considers the Waning Days of the Craft" set me thinking about which tech careers will be lost in the near and far future. Yes, artificial intelligence plays into this, but there are other factors too. Coding seems a likely candidate for decline.

The author, James Somers, says that "Coding has always felt to me like an endlessly deep and rich domain. Now I find myself wanting to write a eulogy for it." With his wife pregnant, he wonders whether "...by the time that child can type, coding as a valuable skill might have faded from the world."

It is an interesting read, a kind of coder's memoir.

Schools still teach coding. Coders are still working. The question is: for how long? Should a student in middle school consider it as a career? I used to tell my middle school students that many of them would go into careers with titles that don't exist today. Who can predict?

Somers concludes:

"So maybe the thing to teach isn’t a skill but a spirit. I sometimes think of what I might have been doing had I been born in a different time. The coders of the agrarian days probably futzed with waterwheels and crop varietals; in the Newtonian era, they might have been obsessed with glass, and dyes, and timekeeping. I was reading an oral history of neural networks recently, and it struck me how many of the people interviewed—people born in and around the nineteen-thirties—had played with radios when they were little. Maybe the next cohort will spend their late nights in the guts of the A.I.s their parents once regarded as black boxes. I shouldn’t worry that the era of coding is winding down. Hacking is forever."

The future of coding is likely to be affected by all of these factors:

Artificial Intelligence and Automation: AI is already influencing coding through tools that assist developers in writing code, debugging, and optimizing algorithms. As AI continues to advance, it may take on more complex coding tasks, allowing developers to focus on higher-level design and problem-solving.

Low-Code/No-Code Development: The rise of low-code and no-code platforms is making it easier for individuals with limited programming experience to create applications. This trend could democratize software development, enabling a broader range of people to participate in creating digital solutions.

Increased Specialization: With the growing complexity of technology, developers are likely to become more specialized in particular domains or technologies. This could lead to a more segmented job market, with experts in areas such as AI, cybersecurity, and blockchain.

Remote Collaboration and Distributed Development: Remote work has become more prevalent, and this trend is likely to continue. Tools and practices for collaborative and distributed development will become increasingly important.

Ethical Coding and Responsible AI: As technology plays a more central role in our lives, the ethical considerations of coding will become more critical. Developers will need to be mindful of the societal impact of their creations and consider ethical principles in their coding practices.

Continuous Learning: The pace of technological change is rapid, and developers will need to embrace a mindset of continuous learning. Staying updated with the latest tools, languages, and methodologies will be crucial.

Quantum Computing: While still in its early stages, quantum computing has the potential to revolutionize certain aspects of coding, particularly in solving complex problems that are currently intractable for classical computers.

Augmented Reality (AR) and Virtual Reality (VR): As AR and VR technologies become more widespread, developers will likely be involved in creating immersive experiences and applications that leverage these technologies.

Cybersecurity Emphasis: With the increasing frequency and sophistication of cyber threats, coding with a focus on security will be paramount. Developers will need to incorporate secure coding practices and stay vigilant against emerging threats.

Environmental Sustainability: As concerns about climate change grow, there may be a greater emphasis on sustainable coding practices, including optimizing code for energy efficiency and reducing the environmental impact of data centers.

How do I know this? Because I asked a chatbot to tell me the future of coding.

Jobs and Bots

[Image: ChatGPT on a phone]
Workers are already using bots to help them work. Will that AI replace them?

On the same day, I saw three articles about artificial intelligence that made me view AI in different ways. One article was about how an AI chatbot passed exams at a U.S. law school after writing essays on law topics. Another was about a company developing AI for warfare that said it would only sell it to "democratic nations." The third was about how AI makes translating difficult "dead" languages and interpreting medical tests faster and more accurate.

Jonathan Choi, a professor at the University of Minnesota Law School, gave ChatGPT the same test his students faced: 95 multiple-choice questions and 12 essay questions. He reported that the bot scored a C+ overall.

In my own essay testing, I have found that the bot can produce a "C" paper, or the start of a better one, in seconds. It is impressive, but it is not like a really good student's work. So far.

But many of the AI bot stories in the media are about jobs that are likely to be replaced by AI. One popular story at cbsnews.com supposes that AI can handle the work of computer programmers and of administrative workers doing what it terms "mid-level writing." The latter category includes tasks like writing emails and human resources letters, producing advertising copy, and drafting press releases. Of course, there is always the possibility that such a worker could be freed from those tasks, moved to higher-level work, and actually benefit from the AI.

I have seen positive and negative results from using AI in media work and law. Some of the negative examples seem to come from users expecting too much of AI at this stage in its development.

I don't think we know today what AI and bots will change in the world of work by next year, but it is certainly an area that demands attention from individuals and from those who can shape the broader culture.

Productivity Paranoia

The term "productivity paranoia” was a new one for me when I encountered it in a conversation. I had to admit ignorance and ask the speaker for a definition. I was told that this is when some bosses fear that remote employees aren’t working enough despite data showing just the opposite. He said, "Yeah, they get the work done, but I suspect they are also walking the dog, running errands and watching their kids during what should be 'working hours'."

Defined by Microsoft as a scenario “where leaders fear that lost productivity is due to employees not working, even though hours worked, number of meetings, and other activity metrics have increased,” productivity paranoia is mostly associated with remote/virtual and hybrid workers.

Productivity paranoia is prevalent enough that some companies have invested in expensive technology to monitor their employees' whereabouts and active time online. Tracking software, surveillance cameras, and GPS data are all possibilities, and in one survey, 97% of business leaders believed such tools had increased workers' productivity.

[Image: surveillance]
Image: StockSnap from Pixabay

But couldn't this level of tracking, bordering on surveillance, have negative effects on workers and perhaps on their productivity?

Some articles say that workers who are monitored this way tend to be less loyal and more distrustful of their employers. It certainly makes for a more stressful work environment.

Another article says that "the average adult’s focused attention span is between 90 and 120 minutes and peaks at about 45 minutes" and that "taking a 10-minute break between a working interval of up to 90 minutes can help reset your attention span and keep cognitive momentum going."

For me, that is too long a span. As I am an almost entirely virtual worker now, I have found myself using the "Pomodoro method."

When you start a task (not a whole project, but a piece of it), set a timer and work on that task for 25 minutes. Then take a short break (3-5 minutes). Work on the task for another 25 minutes, and repeat until it's completed. Not only is that short break good for your brain and concentration, but it is also physically important to get out of your chair and move.
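If you want to try the method without buying an app, the timing loop is simple enough to script yourself. Here is a minimal sketch in Python; the 25-minute and 5-minute interval lengths are just the defaults described above, so adjust them to taste, and stop the script with Ctrl+C when the task is done.

```python
# Minimal Pomodoro timer: alternating work intervals and short breaks.
# A sketch of the method described above, not a polished tool.
import time

WORK_MINUTES = 25   # one focused interval on a single task
BREAK_MINUTES = 5   # short break: stand up and move around

def countdown(minutes, label):
    """Count down the given number of minutes, printing once per minute."""
    for remaining in range(minutes, 0, -1):
        print(f"{label}: {remaining} minute(s) left")
        time.sleep(60)

if __name__ == "__main__":
    cycle = 1
    while True:  # stop with Ctrl+C when the task is finished
        countdown(WORK_MINUTES, f"Pomodoro {cycle} (work)")
        print("Break time. Get out of the chair!")
        countdown(BREAK_MINUTES, "Break")
        cycle += 1
```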