Ethical Tech

Reading the latest newsletter from Amber Mac, a topic that caught my educator's eye is ethical tech. I hope educational use of tech always stresses ethical use, but is this also a topic that is being taught?

At the end of 2018, The New York Times posted an article titled "Yes, You Can Be an Ethical Tech Consumer. Here’s How" by Brian Chen, which notes that the products we enjoy continue to create privacy, misinformation and workplace issues. The article makes some recommendations, ranging from Boycott and Shame (not so radical if you consider the 2018 #DeleteFacebook campaign, which I don't think was all that successful) to paths that mean we Give Up Convenience for Independence - something about as easy as keeping that resolution to diet and exercise.

Of course, I am on the side of educating the public and our students at all grade levels about the ethical use and applications of technology. Students are largely consumers of the tech, but they will be the creators. Did Mark Zuckerberg ever have a course or lesson on the ethical use of technology?

I know that at NJIT, where I taught, there are a number of courses that touch on ethical issues. In the management area, there is "Legal and Ethical Issues: Explores the legal and ethical responsibilities of managers. Analyzes extent to which shareholders should be allowed to exercise their legitimate economic, legal, and ethical claims on corporate managers; extent of regulation of a particular industry, individual rights of the employee and various corporate interests, and corporate responsibility to consumers, society, and conservation of natural resources and the environment." Of course, you have to get to the graduate level for that course.

In my own humanities area of Professional and Technical Communication, we start addressing ethics in communications in the foundation courses - but it is only one topic in a busy curriculum, along with usability analysis, visual information, global diversity and communication concerns, and communicating with new technologies.

In computer science, "Computers, Society and Ethics" is a 300-level course that examines the historical evolution of computer and information systems and explores their implications in the home, business, government, medicine and education. The course includes discussions of automation and job impact, privacy, and legal and ethical issues. Obviously, ethical use needs to be a part of many courses at a science and technology school, as well as being the subject matter of entire courses.

Amber says in her newsletter that, looking ahead, "We will also continue to see social responsibility expand beyond the consumer. For example, let's think about investment dollars into new technologies. In the US alone, according to PitchBook, venture capital investment in US companies hit $100B in 2018. If we dig into these dollars, there are very few memorable headlines about ethical investments, but that is bound to change - especially as executives at large tech companies set new standards."

Engineers, designers, technical communicators and managers need to be better prepared for the world they are entering professionally. I proposed a course at NJIT on Social Media Ethics and Law that has yet to be approved or offered.

In terms of momentum on ethical use within companies, Amber points to software giant Salesforce as a leader. As CNBC reported, the company will have its first Chief Ethical and Humane Use Officer in 2019. And she points to a company that prides itself on being ethical and sustainable, Patagonia, as "the north star of ethical business practices" and suggests that tech CEOs like Mark Zuckerberg should take a long look at Patagonia's many years of dedicated corporate responsibility. Patagonia announced it will donate the $10M the company saved via GOP tax cuts to environmental groups. Amber points out that Patagonia has a long history of providing consumers with access to its supply chain footprint, and she asks if that might be the kind of thing that Gen Z may demand from the companies from whom they purchase. They might - if they are properly educated on the ethical use of technology.

Setting a Course in Rhizomatic Learning

A literal rhizome appears on plants. It is not a root, but more like a stem that sends out shoots and roots from its nodes. "Nodes" may make readers of this blog think of a network and that is one reason why the word was used to describe a kind of learning. As a gardener, I think of the plants (especially weeds and invasive species) that spread with vast networks of roots and will even shoot up new plants at a distance from the original.

[Image: grass rhizome]

This method of spreading appealed to two French philosophers, Gilles Deleuze and Felix Guattari, in writing their book, A Thousand Plateaus. Rhizomatic learning is actually a variety of pedagogical practices that has more recently been identified as a methodology for net-enabled education.

This theory of learning is not like the goal-directed and hierarchical approaches that have been traditional in classrooms. In the rhizomatic approach, learning is most effective when it allows participants to react to evolving circumstances. That means the task or goal is fluid and continually evolving.

That is a structure where the "community is the curriculum," and it upends teaching, learning and instructional design. Most educators and students are primed for pre-existing objectives. There is comfort in knowing where we are headed and then knowing that we have arrived there.

Dave Cormier's introduction/preface/prologue for an upcoming edited book on rhizomatic learning is online as a long post. Cormier avoids a hard definition as he finds that when we define "particularly in writing, we necessarily exclude some of the nuance of the meaning. We leave out the chance that the definition can get better. We leave out another’s perspective." But people want definitions.

It is no surprise that Dave Cormier first came to worldwide educational attention as one of the early users and pioneering formulators of the Massive Open Online Course (MOOC). Those original MOOCs were often rhizomatic in structure in that the learning path, the goals and objectives of learners, and so the course itself, were not set in stone in a syllabus.

Cormier found that, in his own teaching with new technologies, his students' work "became more diverse and more individualized, and, at the same time, I had lost some control over the teaching process." That can feel either exciting or frightening to a teacher.

And yet, like most of us, Cormier found in his research reading that "students were ‘most successful’ when they had a clear expectation of what success could look like." Having clear goals for each learning event did not match up with what he was seeing in his teaching.

Curriculum that is textbook-driven (as far too many of our courses are "designed") supports a highly structured, linear approach to learning. Add to that structure assignments that come from the content, and answers to those assignments that are clearly stated (perhaps in the Teacher’s Copy), and you have a very un-rhizomatic growth pattern. This is growth restricted by borders, walls and planters, and possibly even prevented from moving outside the structure by educational "chemicals" designed to kill off stray rhizomes, roots and shoots.

It seems that what gave rise to the current rhizomatic learning growth spurt was the Internet, but Cormier's piece goes back much further.

First, he looks to Marcus Tullius Cicero and Gaius Julius Caesar. Then he jumps to the year 1270 and the University of Toulouse, and then to Switzerland in 1800. On that last stop in his history, Johann Heinrich Pestalozzi decides that in order to teach the entire country to read (this is before public school programs and before teacher education programs) he needs standardization. His method is the textbook. It is a way to make 10,000 identical copies of content that all will use.

Pestalozzi was using the new technology of his time - the printing press. It allowed him to scale the learning process to more people. But his efforts and ones to follow not only sought to standardize the content, but also the process and the path to learning.

Cormier argues that following that path may have led us to believe that simply following the path means that learning is occurring. He also believes that, underneath the technology, rhizomatic learning was always happening. As a simple example, he points at the citations in an academic article that thread back rhizomatically to sources.

The Wikipedia entry of rhizomatic learning notes that educational researcher Terry Anderson has criticized the way in which advocates of rhizomatic learning seem to attack the idea of formal education as a whole. And one of Cormier's fellow MOOC pioneers, George Siemens, has questioned the usefulness of the rhizomatic metaphor: "I don’t see rhizomes as possessing a similar capacity (to networks) to generate insight into learning, innovation, and complexity... Rhizomes then, are effective for describing the structure and form of knowledge and learning...[h]owever, beyond the value of describing the form of curriculum as decentralized, adaptive, and organic, I’m unsure what rhizomes contribute to knowledge and learning."

If this approach to learning is truly rhizomatic, it should be difficult to stop from spreading. 

 

2019 on Serendipity35

Welcome to another year. This year will mark the start of my 13th year blogging on Serendipity35.

As I type this post, the visitor counter says there have been more than 104 million visits overall to the site. Wow. At the end of 2017, we were at 97,123,654 visits and at the end of 2018 the count was 104,587,893 - so we had an amazing 7,464,239 hits on pages for the year. That is actually down from years past when we were closer to a million a month. Maybe blogs are not as popular as they once were. Maybe we lost some faithful followers. Probably it is because I used to write several posts a week but now, in my unretirement, I'm only averaging 1.7 posts per week on Serendipity35. (But I am posting on 8 other sites, so it's not like I am not busy!)

The other counter that visitors don't see is the one that tracks how many posts I have written. That one tells me that early in 2019 we will pass the 2,000-post mark. At one time, Tim Kellers would also write on the blog, but for the past few years he has been busy in his academic IT world and keeping the server side of Serendipity35 running.

A Google search on "Serendipity35" brings up mostly posts from this site at the top, along with someone's defunct Twitter account, and a wall mount electric fireplace named "Serendipity 35 inch."

It is early in the year for academic readers. The K-12 teachers around here are back in the classroom tomorrow. College professors get started around the third week of January. I'll start posting again this week and maybe some readers will have some free time to read some thoughts on education for the new semester and year.

Thanks for following the blog.

Serendipity35 Holiday

My colleges are ready to take their winter breaks. People are using up some personal and vacation days to extend the break before and after Christmas and the New Year. And I will take a break from writing here too until the new year. 

Here's wishing all my readers a happy and healthy holiday season and a great new year.

If you can put aside education and technology for a day or week, do it. Refresh your brain. 

[Image: 2019, via pixabay.com]

This Business of Predicting: EdTech in 2019

As the year comes to an end, you see many end-of-year summary articles and also a glut of predictions for the upcoming year. I'm not in the business of predicting what will happen in education and technology, but I do read those predictions - with several grains of salt.

“A good science fiction story should be able to predict not the automobile but the traffic jam,” wrote sci-fi author Frederik Pohl.

Many of the education and technology predictions I see predict things rather than the impact those things will have. Here are some that I am seeing repeated, so that you don't have to read them all, but can still have an informed conversation at the holiday party at work or that first department meeting in January.

Take a look at what the folks at higheredexperts.com are planning for 2019 just in the area of higher ed analytics.

Is "augmented analytics" coming to your school? This uses machine learning (a form of artificial intelligence) to augment how we develop, consume and share data. 

And IT analyst firm Gartner is known for their top trends reports. For 2019, one that made the list is "immersive user experience." This concept concerns what happens when human capabilities mix with augmented and virtual realities. Looking at the impact of how that changes the ways we perceive the real and digital world is what interests me.

We are still at the early stages of using this outside of schools (and schools are almost always behind the world in general). You can point to devices like the Amazon Alexa being used in homes to turn on music, lights and appliances, or to tell us a joke. This is entry-level usage. But vocal interaction is an important change. A few years ago it was touch screen interactions. A few decades before, it was the mouse, and before that the keyboard. A Gartner video points at companies using remote assistance for applications such as an engineer working with someone in a remote factory to get a piece of equipment back online.

Will faculty be able to do augmented analytics using an immersive user experience? Imagine you can talk to the LMS you use to teach your course and ask, using a natural language interface, "Which students in this new semester are most likely to have problems with writing assignments?" The system scans the appropriate data sets, examines different what-if scenarios and generates insights. Yes, predictive analytics is already here, but it will be changing.
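To make the idea a bit more concrete, here is a minimal sketch (in Python, using pandas and scikit-learn) of the kind of model that might sit behind a question like that. To be clear, the feature names, the data and the whole setup are my own invention for illustration only; no actual LMS exposes exactly this interface.

```python
# Hypothetical sketch: flag students likely to struggle with writing assignments
# from LMS activity data. All names and numbers are made up for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical historical data: activity features plus whether the student
# ended up struggling with writing assignments (1 = struggled).
history = pd.DataFrame({
    "logins_per_week":        [5, 1, 4, 0, 6, 2, 3, 1],
    "draft_submissions":      [3, 0, 2, 0, 4, 1, 2, 0],
    "avg_quiz_score":         [88, 52, 75, 40, 92, 60, 70, 45],
    "struggled_with_writing": [0, 1, 0, 1, 0, 1, 0, 1],
})

features = ["logins_per_week", "draft_submissions", "avg_quiz_score"]
model = LogisticRegression().fit(history[features], history["struggled_with_writing"])

# New-semester students: same features, no outcome yet.
new_students = pd.DataFrame({
    "student":           ["Ana", "Ben", "Chloe"],
    "logins_per_week":   [4, 1, 2],
    "draft_submissions": [2, 0, 1],
    "avg_quiz_score":    [80, 48, 65],
})

# The "insight": rank students by estimated probability of struggling.
new_students["risk"] = model.predict_proba(new_students[features])[:, 1]
print(new_students.sort_values("risk", ascending=False)[["student", "risk"]])
```

A real system would draw on far richer data and hide all of this behind the conversational interface, but the underlying idea is the same: rank students by estimated risk so an instructor can step in early.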

But are IT trends also educational technology trends? There is some crossover.

Perhaps a more important trend for us to watch as educators next year is changing our thinking from individual devices (and the already too many user interfaces we encounter) to a multimodal and multichannel experience.

Multimodal connects us to many of the edge devices around us. It is your phone, your car, your appliances, your watch, the thermostat, your video doorbell, the gate in a parking lot and the devices you encounter at work or in stores.

Multichannel mixes your human senses with computer senses. This is when both are monitoring things in your environment that you already recognize, like heat and humidity, but also things we can't sense, like Bluetooth, RF or radar. This ambient experience means the environment will become the computer.

One broad category is "autonomous things," some of which are already around us and using AI. There are autonomous vehicles. You hear a lot about autonomous cars and trucks, but you may be more likely to encounter an autonomous drone. Will learning become autonomous? That won't be happening in 2019.

AI-driven development is its own trend. Automated testing tools and model generation are here, and AI-driven automated code generation is coming.

Of course, there is more - from things I have never heard of (digital twins) to things that I keep hearing are coming (edge computing) and things that have come and seem to already have mixed reviews (blockchain).

EducationDive.com has its own four edtech predictions for colleges: 

Digital credentials gain credibility - I hope that's true, but I don't see that happening in 2019.  

Data governance grows - that should be true if their survey has accurately found that 35% of responding institutions said they don't even have a data governance policy - a common set of rules for collecting, accessing and managing data.

Finding the ROI for AI and VR may be what is necessary to overcome the cost barrier to full-scale implementation of virtual and augmented reality. AI has made more inroads in education than VR. An example is Georgia State University's Pounce chatbot.

Their fourth prediction is institutions learning how to use the blockchain. The potential is definitely there, but implementation is challenging. 

Predictions. I wrote elsewhere about Isaac Newton's 1704 prediction of the end of the world. He's not the first or last to predict the end. Most have been proven wrong. Newton - certainly a well-respected scientist - set the end at or after 2060, but not before that. So we have at least 41 years to go.

Using some strange mathematical calculations and the Bible's Book of Revelation, this mathematician, astronomer and physicist came to believe that his really important work would be deciphering ancient scriptures.

I'm predicting that Newton was wrong on this prediction. He shouldn't feel too bad, though, because I guesstimate that the majority of predictions are wrong. But just in case you believe Isaac, you can visualize the end in this document from 1486.