Blockchain on Campus

Blockchain is sometimes described as a secure public ledger. I wrote last year about blockchain and its possible uses on campus, but I have not seen evidence of its application on the campuses I have visited. Of course, it is possible that it is being used behind the scenes, since this is a technology that would not be evident to end users.

I read an article about Oral Roberts University's recent conference, intended to educate schools about the technology and to persuade them to test it out and collaborate. Their CIO, Michael Mathews, believes blockchain will be as important to transforming education as the Internet was, and that early adopters will benefit the most.

The first blockchain was theorized by Satoshi Nakamoto in 2008 and applied the following year as a key component of the digital currency Bitcoin. That connection to an alternative currency with a still-unclear reputation may have let some of that negative reputation rub off on blockchain. In fact, it is a technology that adds levels of trust, authentication and recordkeeping. As a public ledger of transactions, it uses a peer-to-peer network (another idea that picked up a bad reputation through the pirating of software and music) to build a decentralized, distributed database. (A more detailed definition here.) Blockchain offers an unalterable (for now, at least) public record (one that can also be made only semi-public) of digital transactions.
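
To make the ledger idea concrete, here is a minimal Python sketch of hash-linked blocks. It illustrates only the chaining principle (each block carries the previous block's hash, so any later alteration is detectable); real blockchains like Bitcoin's add proof of work and a peer-to-peer network on top.

```python
# A minimal sketch of a hash-linked ledger, not a production blockchain.
import hashlib
import json
import time

def make_block(data, previous_hash):
    """Bundle data with a timestamp and the previous block's hash."""
    block = {
        "timestamp": time.time(),
        "data": data,
        "previous_hash": previous_hash,
    }
    # The block's own hash covers all of its fields, so any edit is detectable.
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify_chain(chain):
    """Recompute each hash and check the links; True if nothing was altered."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block({"event": "ledger created"}, previous_hash="0" * 64)
chain = [genesis]
chain.append(make_block({"event": "credit transferred"}, genesis["hash"]))
print(verify_chain(chain))  # True; tamper with any field and it flips to False
```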

Though financial transactions are blockchain's main use today, for a school the immediate applications would likely be student application processing, transcript evaluations and articulation agreements.
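
For transcripts, one plausible pattern (my own sketch, not any school's deployed system) is for the registrar to record a cryptographic fingerprint of each issued document on the shared ledger, so a receiving school or employer can verify a copy without ever phoning the registrar. The transcript contents below are invented for illustration.

```python
import hashlib

def fingerprint(document_text):
    """A tamper-evident fingerprint a registrar could record on a ledger."""
    return hashlib.sha256(document_text.encode("utf-8")).hexdigest()

# Hypothetical transcript contents, for illustration only.
issued = fingerprint("Jane Doe | B.A. English | GPA 3.7 | 2019")

# Later, the receiving school recomputes the hash from the copy it was
# given and compares it to the value recorded on the ledger.
received = fingerprint("Jane Doe | B.A. English | GPA 3.7 | 2019")
print(received == issued)  # True only if the copy is unchanged
```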

The conference program may be correct that blockchain is not only the future business model for supply chains, but can also be applied to the larger education value chain.

This post first appeared on LinkedIn.

You Don't Need a College Degree: New Edition

The argument that a college degree is or isn't the path to a job surfaces regularly. Many studies show that having a degree ultimately leads to greater lifetime earnings, and colleges love to see that research out there. But in the past few decades, you find more stories in the news about people succeeding in the workplace without degrees.

This year, I am seeing two trends: more vocational training in high schools, and companies not requiring degrees for some jobs that once did require a degree.

An article on wsj.com discusses the direct ties between some big companies and local high schools to prepare students for jobs. Volkswagen is working with schools in Tennessee to modernize their engineering programs. Tesla is partnering with Nevada schools on an advanced manufacturing curriculum. Fisheries in Louisiana have created courses for students to train for jobs in “sustainability.”

There have long been high school career education programs, and the U.S. has had specialized vocational schools for a century, but this is a shift. The idea that not all students need a degree (and especially not a liberal-arts degree) in order to get a good job is gaining strength through these new relationships with employers.

Billionaires like Bill Gates and Mark Zuckerberg, who never finished college, are outliers, and their examples did not convince students - or parents - that skipping college was the right path. When I was an undergrad back in the 1970s, we all knew that with good grades from a decent college in almost any major you could get some kind of job. I had art history friends who ended up in banking, education majors who went into publishing, etc. It was early enough in the computer era that you could get in on the ground floor without a degree. I knew people who got training in network administration at post-secondary vocational schools and did very well.

But there was also a time a bit later when if you wanted a job at Google you had better have a degree, and really a doctorate from Stanford. That is less true today.

The job-search site Glassdoor compiled a list of 15 top employers that have said they no longer require applicants to have a college degree; it includes companies like Google, Apple and IBM.

These companies are not saying they don't want any college graduates, and this doesn't apply to all of their positions, but it does apply to many more than before. Dropping the degree requirement for some positions is probably, in part, a reaction to the tight labor market and mounting concerns about student debt.

For example, Apple is considering and hiring people without degrees for positions such as Genius (in their stores), Design Verification Engineer, Engineering Project Manager, iPhone Buyer, Apple Technical Specialist, AppleCare at Home Team Manager, Apple TV Product Design Internship, Business Traveler Specialist, and Part Time Reseller Specialist.

Google lists these positions as open to non-graduates: Product Manager, Recruiter, Software Engineer, Product Marketing Manager, Research Scientist, Mechanical Engineer, Developer Relations Intern, UX Engineer, SAP Cloud Consultant, Administrative Business Partner.

Do these companies penalize someone with a computer science or marketing degree who applies? That would be foolish. But they do seriously consider people without degrees who would not have made the first round of interviews ten years ago.

Threats to the college degree in the past 30 years have been many: tuition costs, online learning, MOOCs and OER have all been viewed as things that would take down the traditional degree, and perhaps the traditional college itself. We would have Education 2.0, as we had Web 2.0. Still, students apply, take courses, study, party, attend sporting events and graduate. But do they get jobs in their field of study? Sometimes. Do they discover on the job that much of those 120 credits seems to play no constructive role in their work? Sometimes.

Tech Design for Seniors

In my previous post, I wrote about andragogy, the theories behind adult learning. Today, I'm writing about what might seem like an extension of andragogy, especially when dealing with technology.

Many (too many) people assume that learning about and using technology is very different for older adults. I am a "Baby Boomer," one of the large group born between 1946 and 1964. I consider myself very well versed in technology, but after all, I have been using and teaching with and about technology for 40 years. Many of my peers are not so comfortable with it and often come to me for recommendations and help.

Some companies have realized that businesses generally, and probably educational institutions too, are underinvesting in, and underserving, older adults. On the educational side, this is a great disservice to a large group of people. But on the marketing side, companies (and colleges?) have discovered a large market and opportunity in the growing over-65 population.

There are approximately 46 million people aged 65 and over living in the United States, and that number is projected to more than double to 98 million by 2060.

This group grew up with 20th century technology that has radically changed in their lifetime. Think of the present-day automobile or phone. They adapted to banking via an ATM and cooking with a microwave — though they may still prefer a teller and a gas oven.

Looking back on those andragogical principles and moving the adult learner's age up 44 years, some seem particularly relevant. For example, learning works best when the content and processes have a meaningful relationship to the learner's past experience.

Designers and technology entrepreneurs are most often in their 20s, 30s or 40s. They are not thinking about older generations. But they should. 


Being Secure on Chrome

The Chrome browser’s “not secure” warning is meant to help you understand when the connection to the site you're on isn’t secure. It is also a bit of a shaming motivation for the site's owner to improve its security. But the process of getting to an HTTPS site is not really easy in some cases, especially for non-technical site owners.

Google announced the warning nearly two years ago, and since then there has been an increase in the number of secured sites. Chrome started by marking only unencrypted pages that collect passwords or credit card info. Then it began showing the “not secure” warning in two additional situations: when people enter data on an HTTP page, and on all HTTP pages visited in Incognito mode.

Their goal is to make it so that the only markings you see in Chrome are when a site is not secure, and the default unmarked state is secure. They will start removing the “Secure” wording in September 2018, and in October 2018, they will start showing a red “not secure” warning when users enter data on HTTP pages.

Source: https://www.blog.google/products/chrome/milestone-chrome-security-marking-http-not-secure/
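
If you run a site and are not sure where you stand, one quick test is whether plain HTTP requests get redirected to HTTPS. Here is a minimal Python sketch of that check, assuming the third-party requests library and using example.com as a placeholder domain; a real audit would also look at certificates and mixed content.

```python
# A rough check of whether a domain would draw Chrome's warning.
import requests

def https_status(domain):
    """Report whether plain HTTP is redirected to HTTPS for a domain."""
    try:
        # requests follows redirects by default, so r.url is the final URL.
        r = requests.get(f"http://{domain}", timeout=10)
    except requests.exceptions.SSLError:
        return "HTTPS misconfigured (certificate problem)"
    except requests.exceptions.RequestException as exc:
        return f"unreachable: {exc}"
    if r.url.startswith("https://"):
        return "redirects to HTTPS; Chrome shows no warning"
    return "served over plain HTTP; Chrome will mark it 'not secure'"

print(https_status("example.com"))
```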

Digital Darwinism and the Age of Assistance

It is an evolution that I have been following, and I have written about how AI and machine learning are pushing us closer to that new age. I jokingly said that I was accepting resumes for a digital assistant. And most recently, the amazing and also frightening Google Duplex demo made me wonder whether we need a reverse Turing test for AI.

It has been suggested that in this era where technology and society are evolving faster than businesses and schools can naturally adapt, the mantra of “adapt or die” comes into play. You can react to change, be disrupted by it, or adapt. Brian Solis and others have referred to this as "Digital Darwinism."

As with the more established Darwinism, the digital version is pretty indiscriminate when it comes to which products or companies survive, thrive or fade.

I suppose we are still officially in the Information Age, but I think we may be evolving into an Age of Assistance. There is some evidence of this when you hear people say things like "Google, dim the bedroom lights" or "Alexa, play music by James Taylor."

As in nature, we need experimentation and adaptations, and even new species, to survive. And some will have to go extinct. Goodbye Blockbuster, Circuit City, Borders Books, Tower Records, Pontiac, Saturn, and Palm. Hello Netflix, which then had to evolve (and is still doing so) from a discs-in-the-mail service to a streaming one.

This pruning is clearly happening in business. Have any colleges fallen aside via Digital Darwinism? (A colleague answered that question by only half-sarcastically saying "Trump University.")

Artificial intelligence and machine learning are big drivers of Digital Darwinism. Is it true that Digital Darwinism has pushed the door to an Age of Assistance open a bit wider?

That push comes from artificial intelligence combined with speech recognition.

Though smartphones and standalone devices with Siri, Alexa et al. have put this assistance in front of consumers, digital speech recognition didn't start with those devices. The IBM Shoebox, launched in 1961 - almost 20 years before the introduction of the first IBM Personal Computer - was shown to the general public at the 1962 Seattle World's Fair. It was able to recognize 16 spoken words and the digits 0 to 9.

Most speech recognition systems require some "training" from users, and with AI they learn to respond more accurately and efficiently. We have moved away from the earlier "speaker dependent" systems, which had to learn a particular user's speech patterns, accent, pronunciation and so on. Newer systems tend to be speaker independent, aggregating patterns from the many users connected to them rather than focusing on one user. It's not your Alexa. It's everyone's Alexa.
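
To see the speaker-independent model from a few lines of code, here is a minimal Python sketch using the third-party SpeechRecognition package (microphone input also requires PyAudio). The only local step is a brief noise calibration; the recognition itself happens in a shared cloud model, not in anything trained on your voice.

```python
# A minimal speech-to-text sketch; the recognition runs in a shared,
# speaker-independent cloud service -- the "everyone's Alexa" approach.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    # Brief calibration for room noise -- not per-speaker training.
    recognizer.adjust_for_ambient_noise(source)
    print("Say something...")
    audio = recognizer.listen(source)

try:
    # Google's free web API aggregates patterns from many users.
    print("Heard:", recognizer.recognize_google(audio))
except sr.UnknownValueError:
    print("Could not understand the audio.")
except sr.RequestError as exc:
    print("Service unavailable:", exc)
```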

The rise of smart speakers like Google Home and Amazon Echo in the past year (sales have more than tripled) has made some humans more reliant on voice commands.

Google has been talking about and offering voice search for a few years, and we certainly sometimes use voice to search on our phones. But if you're like me, you still find that technology lacking in most instances. Still, the voice search revolution continues to be predicted, and I don't doubt that it will occur, though perhaps more evolutionarily than revolutionarily.

The sophistication of voice-recognition systems is improving rapidly. Microsoft’s Cortana voice recognition software (which doesn't seem to get as much attention) now has an error rate of 5.1 percent, which puts it up there with its human counterparts. With Microsoft's large installed base of Windows-based personal computers, smartphones and smart speakers, the company will certainly be a player in this area.
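
That 5.1 percent figure is a word error rate: substitutions, insertions and deletions, divided by the number of words actually spoken. It is simple to compute with the classic edit-distance algorithm; the sketch below is my own Python illustration with made-up sentences, not Microsoft's benchmark.

```python
# Word error rate (WER): edit distance over words, divided by the
# length of the reference transcript -- the metric behind figures like 5.1%.
def wer(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance, word by word.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# Two word errors out of five reference words: WER = 0.4
print(wer("turn on the kitchen lights", "turn on a kitchen light"))
```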

Google Assistant is a virtual assistant primarily available on mobile and smart home devices, and (unlike Google Now) it can engage in two-way conversations.

What I find on this topic online is primarily about marketing, but I believe it also applies to education.

Consumers are researching just about everything they buy and want relevant results. They want assistance. In an Age of Assistance, we are finding that assistance in what is being referred to as mobile-first “micro-moments.”

This is way beyond classical marketing. Currently, I don't see many examples in education of schools pursuing these new opportunities for “assisted” engagement. Is your help desk using voice recognition and AI? Are other student support services? Can a student in an online course or a classroom use voice to ask a question of a digital assistant at 1 AM when they are stuck on a problem?
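
The core of such an assistant is not exotic. Here is a toy Python sketch of keyword-based FAQ matching; the questions and answers are invented for illustration, and a real system would put speech recognition in front and a much smarter matcher behind.

```python
# A toy FAQ matcher: the kernel of an after-hours help desk assistant.
# The questions and answers are invented placeholders.
FAQ = {
    "reset my password": "Use the 'Forgot password' link on the login page.",
    "submit an assignment": "Open the course site and use the Assignments tab.",
    "drop a course": "File the drop form in the student portal before the deadline.",
}

def answer(question):
    """Return the FAQ answer whose keywords best overlap the question."""
    words = set(question.lower().replace("?", "").split())
    best = max(FAQ, key=lambda key: len(words & set(key.split())))
    if not words & set(best.split()):
        return "Sorry, I don't know. I've logged your question for a human."
    return FAQ[best]

print(answer("How do I reset my password at 1 AM?"))
# -> Use the 'Forgot password' link on the login page.
```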

Those few schools that are adapting and experimenting with these technologies might be safer down the evolutionary road when Digital Darwinism starts to make programs or colleges digital Dodo birds.


Digital Darwinism: Survival of the Fittest in the Age of Business Disruption

Digital Darwinism: Branding and Business Models in Jeopardy



FURTHER READING

thinkwithgoogle.com/intl/en-154/insights-inspiration/thought-leadership/marketing-age-assistance/

thinkwithgoogle.com/intl/en-gb/consumer-insights/search-in-the-age-of-assistance/

forbes.com/sites/briansolis/2017/10/11/wtf-whats-the-future-of-marketing-in-the-age-of-assistance/

business2community.com/tech-gadgets/the-age-of-assistance