Parental Control of Technology

Image: kids on tech (photo by Andrea Piacquadio)

As the new school year begins for all students this week, Mozilla (Firefox) has published a series titled "Parental Control" about ways to empower parents to handle some technology challenges. That sounds like a good thing, but parental control has cons along with pros, particularly when it applies to schools.

Many digital platforms offer parental control settings. The most common and most popular setting lets parents shield young people from “inappropriate” content. But deciding what counts as "mature" or "inappropriate" content takes us into controversial territory. Who defines what should be restricted? Mozilla admits that "the way platforms identify what that means is far from perfect."

YouTube, for example, apologized after its family-friendly “Restricted Mode” blocked videos by gay, bisexual, and transgender creators, sparking complaints from users. Restricted Mode is an optional parental-control feature that users can activate to avoid content that has been flagged by an algorithm.

That example takes me back to the earliest days of the Internet in K-12 schools when filters would block searches for things like "breast cancer" because "breast" was on the list of blocked words.
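Just to illustrate how crude those early filters were, here is a minimal sketch in Python (hypothetical code, not based on any actual filtering product) of a keyword blocklist check. Because it only looks for the word "breast" with no sense of context, a search for "breast cancer" gets blocked right along with genuinely inappropriate queries.

    # Hypothetical sketch of a naive keyword blocklist, the kind early
    # K-12 web filters relied on. Not any real product's code.
    BLOCKED_WORDS = {"breast"}  # illustrative entry on the blocklist

    def is_blocked(query: str) -> bool:
        """Return True if any word in the query appears on the blocklist."""
        return any(word in BLOCKED_WORDS for word in query.lower().split())

    # A legitimate health search is blocked because the filter has no context:
    print(is_blocked("breast cancer screening"))  # True  (blocked)
    print(is_blocked("school lunch menu"))        # False (allowed)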

Limiting screen time is another strategy that is within a parent's control, but it is certainly controversial within a family. Kids don't like having their screen time limited.

Mozilla actually had questions for itself about what to call the series. They quote Jenny Radesky, an MD and Associate Professor of Pediatrics-Developmental/Behavioral at the University of Michigan, as saying that “Parental mediation is [a better] term, parental engagement is another – and probably better because it implies meaningful discussion or involvement to help kids navigate media, rather than using controlling or restricting approaches.” She pointed to research that suggests letting children manage their own media consumption may be more effective than parental control settings offered by apps.

The internet has risks, but so do parental controls. Many kids in the LGBTQI+ community can be made vulnerable by tech monitoring tools.

Sensitive information about young people can be exposed to teachers and campus administrators through the school devices they use.

As parents and educators, we want to protect students, especially the youngest ones. We also want, as a society, to instill in younger generations why privacy matters.

RESOURCES

Electronic Frontier Foundation https://www.eff.org/search/site/parents

Mozilla https://blog.mozilla.org/en/internet-culture/deep-dives/parental-controls-internet-safety-for-kids/

Free Online Educational Summit July 28-29

Course Hero is a website where educators (and students) can find free resources. You need to create a free Verified Educator account, which will allow you to access sample assignments, case studies, lectures, labs, syllabi, and more. These resources have been shared by higher education faculty and students, and more than 80,000 college faculty use the site to find learning resources and inspiration for their own teaching.

You gain the ability to download resources by uploading your own original study materials. You'll earn free unlocks for sharing your knowledge: 5 unlocks for every 10 documents submitted.

They are hosting their free, two-day, 5th annual Education Summit July 28–29 from 9 a.m. to 4 p.m. PT. Their belief is that teaching is a shared practice and that the rapidly shifting educational landscape requires us to lean into our roles as both instructors and learners. You can join thousands of fellow educators, research experts, and instructional designers to unpack the latest in learning and pedagogy.

I first used their website when I was building a new course and was curious if any other teachers had uploaded their syllabi for a similar course. I found half a dozen that I was able to use to get started, including ones with links to readings I might also use.

Make That Informational Resource Educational

Image: resources (by Manfred Steger from Pixabay)

A recent post by David Wiley is titled "The Difference Between an Informational Resource and an Educational Resource" (Creative Commons Attribution 4.0 International). He wrote: "Recently I’ve been thinking about the difference between an informational resource and an educational resource. I’ve had the sense that an educational resource is an informational resource with a little something extra and have enjoyed coming back to this thought again and again over the last several weeks, trying to reduce this “something extra” to its simplest form."

Thinking about that myself, I recalled how many times as a young person I was told that a book, movie, or even a TV show was "educational." But were they? A book can certainly be informational, but can a book be educational just by reading it? Wiley gives an encyclopedia as an example. It is certainly informational if read, and it has the characteristics Wiley sees as essential: it is comprehensive, accurate, and well-organized. What would need to be added to make it an educational resource? Is Wikipedia educational because it has interactivity built into it?

Another example that Wiley ponders is creating an open textbook. A book - especially a textbook - is clearly meant to be informational, but I think many of us (especially in academia) probably also consider a textbook to be "educational."

You could say that what takes a resource from informational to educational is what you do with it. When a reader takes information about how to paint a watercolor and then starts to paint one, the experience seems to have moved beyond the informational. But that doesn't make the resource itself educational, does it? What can a creator do in the creation of a resource to better ensure that it can be educational?

Wiley's suggestion for that open textbook is to consider what would result if the writer (possibly a faculty member) partnered with an instructional designer. How might the book be written if, along the way, you are thinking about how it will be used by a teacher and a student? For example, adding practice (not a new textbook component) with feedback might be one thing that moves a resource into the realm of the educational.

I have a category on this blog called RESOURCES, and looking over the posts there (including this one) I realize they are informational, not educational. That's not a bad thing, but it is something to consider during creation. If you had asked me earlier whether my resources were educational, I think I would have casually replied that they were. You might read this post, click the link, and read Wiley's original post. Good information. You might even go on to make your next informational creation (a lecture, a handout, a textbook) educational by designing it with additional elements. That would be good, but it would not make my post or Wiley's educational. Kind of a humbling consideration.

Is Technology Destructive By Design?

Technology is good. Technology is bad. Both are true. 

High tech has transformed the world. It has changed our culture, made information accessible to many more people, and altered businesses, education, and the economy.

I recently came across the book Terms of Disservice: How Silicon Valley is Destructive by Design by Dipayan Ghosh. Ghosh was a Facebook public policy adviser who went on to government work in President Obama's White House.

The book's title is a play on the terms of service that products offer and that users often don't even read. Though you could view this book as negative on the effects of technology, it actually offers ideas for using technology in positive ways, such as to create a more open and accessible world. That was part of the original plan (or dream) for the Internet. The extra level of service Ghosh sees as lacking is consumer and civilian protections.

Ghosh is a computer scientist turned policymaker, so much of the focus in the book is on industry leaders and policymakers. Technology has done a lot of good, but it has also exacerbated social and political divisions. This year we are again hearing about how technology, in the form of social media and cyberterrorism, has influenced elections. Civilians have, wittingly and unwittingly, given private information to American companies, and that information has, wittingly and unwittingly, been passed on to terrorist groups and foreign governments.

We have heard this on an almost daily basis, and yet it seems that nothing is being done to stop it.

In an interview with the LA Review of Books, Ghosh was asked about what a broader “digital social contract” would look like. He answered, in part:

"If we can agree that this business model is premised on uninhibited data collection, the development of opaque algorithms (to enable content curation and ad targeting), and the maintenance of platform dominance (through practices that diminish market competition, including raising barriers to entry for potential rivals), then three basic components of possible intervention stand out. First, for data collection and processing, all the power currently lies within corporate entities. For now, Google can collect whatever information it desires. It can do whatever it wants with this data. It can share this information basically with whomever.

Europe’s GDPR has begun to implement some better industry norms. But to truly resolve these problems, we’ll need to transfer more power away from private firms...

We also need more transparency. Basic awareness of how this whole sector works should not be treated as some contrived trade secret. Individual consumers should have the right to understand how these businesses work, and shouldn’t just get opted in by default through an incomprehensible terms-of-service contract. We likewise need much better transparency on how platform algorithms and data-processing schemes themselves work.

And finally, we need to improve market competition. We need data-portability arrangements, interoperability agreements — and most importantly, a serious regulatory regime to contend realistically with monopolistic concentration."

One of the takeaways from this book is that these institutions are destructive by design. It reminds me of the belated revelations about the American tobacco industry: that it knew its products were addictive and caused health problems, and designed those products to increase addiction while it ignored and even covered up the health concerns. Can the same be said of technology products?