The Science of Learning

Professor Einstein during a lecture in Vienna in 1921

Albert Einstein was definitely a subject matter expert, but he was not regarded as a good professor. He first taught at the University of Bern but did not attract students, and when he pursued a position at the Swiss Federal Institute of Technology in Zurich, the president raised concerns about his lackluster teaching skills. Biographer Walter Isaacson summarized: "Einstein was never an inspired teacher, and his lectures tended to be regarded as disorganized." It's a bit unfair to say that "Einstein Was Not Qualified To Teach High-School Physics," though by today's standards he would not be considered qualified. It probably is fair to say that "Although it's often said that those who can't do teach, the reality is that the best doers are often the worst teachers."

Beth McMurtrie wrote a piece in The Chronicle called "What Would Bring the Science of Learning Into the Classroom?" Her overall question: Why doesn't the scholarship on teaching have more impact on classroom practice in higher education?

It is not the first article to ask why higher education appears not to value teaching as much as it could or should. Is quality instruction simply valued less in higher education than in the lower grades? Other articles show that colleges, and most faculty, believe the quality of instruction is a reason why students select a school.

Having moved from several decades in K-12 teaching to higher education, I noticed a number of things related to this topic. First of all, K-12 teachers were likely to have had at least an undergraduate minor in education and to have taken courses in pedagogy. For licensing in all states, there are requirements to do "practice" or "student teaching" with monitoring and guidance from education professors and cooperating teachers in the schools.

When I moved from K-12 to higher education at NJIT in 2001, I was told that one reason I was hired to head the instructional technology department was that I had a background in pedagogy and had been running professional development workshops for teachers. It was seen as a gap in the university's offerings. The Chronicle article also points out that "professional development focused on becoming a better teacher, from graduate school onward, is rarely built into the job."

As I developed a series of workshops for faculty on using technology, I also developed workshops on better teaching methods. I remember being surprised (though I shouldn't have been) that professors had never heard of things like Bloom's taxonomy, alternative assessment, and most of the learning science that had been common knowledge in K-12 for the past 30 years.

K-12 teachers generally have required professional development. In higher education, professional development is generally voluntary. I quickly discovered that enticements were necessary to bring in many faculty. We offered free software, hardware, prize drawings and, of course, breakfasts, lunches and lots of coffee. Professional development in higher ed is not likely to count for much when it comes to promotion and tenure. Research and grants far outweigh teaching, particularly at a science university like NJIT.

But we did eventually fill our workshops. We had a lot of repeat customers. There was no way we could handle the approximately 600 full-time faculty and the almost 300 adjunct instructors, so we tried to bring in "champions" from different colleges and departments who might later get colleagues to attend.

I recall more than one professor who told me that they basically "try to do the thing my best professors did and avoid doing what the bad ones did." It was rare to meet faculty outside of an education department who did any research on teaching. We did find some. We brought in faculty from other schools who were researching things like methods in engineering education. I spent a lot of time creating online courses and improving online instruction since NJIT was an early leader in that area and had been doing "distance education" pre-Internet.

Discipline-based pedagogy was definitely an issue we explored, even offering specialized workshops for departments and programs. Teaching the humanities in general and teaching the humanities in a STEM-focused university are different things. Teaching chemistry online is not the same as teaching a management course online.

Some of the best parts of the workshops were the conversations among the heterogeneous faculty groups. We created less formal sessions that gathered professors around a topic: grading, plagiarism and academic integrity, applying for grants, writing in the disciplines, and even topics like admissions and recruiting. These were sessions where my department and I often stepped back and instead offered resources to go further after the session ended.

It is not that K-12 educators have mastered teaching, but they are better prepared for the classroom in terms of discipline, psychology, pedagogy, and the number of students and hours they spend in face-to-face teaching. College faculty are reasonably expected to be subject matter experts at a higher level than K-12 teachers, who in turn are expected to be excellent teachers. This doesn't mean that K-12 teachers aren't subject matter experts or that professors can't be excellent teachers. But the preparation for teaching and the recognition for teaching excellence aren't balanced between the two worlds.

Huang's Law and Moore's Law

I learned about Gordon Moore's 1965 prediction about 10 years after he proposed it. By paying attention to an emerging trend, he extrapolated that computing would dramatically increase in power, and decrease in relative cost, at an exponential pace. His idea is known as Moore's law. Moore's law sort of flips Murphy's law by saying that everything gets better.

Moore was an Intel co-founder, and his idea became "law" in the electronics industry. Moore helped Intel make the ever faster, smaller, more affordable transistors that are in a lot more than just computers today. The global chip shortage of 2021 reminded us that cars and appliances and toys and lots of other electronics rely on microchips.

Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years. (Originally, Moore said it would happen every year, but in 1975, around the time I was introduced to it, he revised the estimate to every two years.)

Though the cost of computer power for consumers falls, the cost for chip producers rises. R&D, manufacturing, and testing costs keep increasing with each new generation of chips. And so Moore's second law (also called Rock's law) was formulated, saying that the capital cost of semiconductor fabrication also increases exponentially over time. This extrapolation says that the cost of a semiconductor chip fabrication plant doubles every four years.
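Both observations are simple exponentials, and a back-of-the-envelope sketch makes the compounding visible. This toy model is mine (with made-up starting values of 1), not anything from Moore's or Rock's papers:

```python
# Toy exponential models of Moore's law and Rock's law.

def moores_law(transistors_now: float, years: float) -> float:
    """Transistor count doubles roughly every 2 years."""
    return transistors_now * 2 ** (years / 2)

def rocks_law(fab_cost_now: float, years: float) -> float:
    """Fab capital cost doubles roughly every 4 years."""
    return fab_cost_now * 2 ** (years / 4)

for years in (2, 4, 10, 20):
    print(f"{years:>2} yrs: transistors x{moores_law(1, years):.1f}, "
          f"fab cost x{rocks_law(1, years):.1f}")
# 20 yrs: transistors x1024.0, fab cost x32.0
```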

Huang's Law is new to me. Up front, I will say that this newer "law" is not without questions about its validity. It is based on the observation that advancements in graphics processing units (GPUs) are coming at a rate much faster than with traditional central processing units (CPUs).

That puts Huang's law in contrast to Moore's law. Huang's law states that the performance of GPUs will more than double every two years. The observation was made by Jensen Huang, CEO of Nvidia, in 2018, setting up a kind of Moore versus Huang. He based it on Nvidia's own GPUs, which he said were "25 times faster than five years ago." Moore's law would have expected only a ten-fold increase.
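Running those numbers myself (my arithmetic, not Nvidia's): a 25x gain in five years implies a doubling time of about 13 months. The often-quoted ten-fold Moore expectation assumes the 18-month doubling variant; the two-year version predicts even less:

```python
import math

years, observed = 5, 25  # Huang's claim: GPUs 25x faster over five years

# Doubling time implied by the observed gain
implied = years * math.log(2) / math.log(observed)
print(f"Implied doubling time: {implied:.2f} years")    # ~1.08 (about 13 months)

# What Moore's law would predict over the same span
print(f"18-month doubling: x{2 ** (years / 1.5):.1f}")  # ~10.1 (the ten-fold figure)
print(f"2-year doubling:   x{2 ** (years / 2):.1f}")    # ~5.7
```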

Huang credited the synergy between the "entire stack" of hardware, software and artificial intelligence, not just chips, with making his new law possible.

If you are not in the business of producing hardware and software, how do these "laws" affect you as an educator or consumer? They highlight the rapid change in information processing technologies. Growth in chip complexity and falling manufacturing costs make new technological advances possible, and those advances are then factors in economic, organizational, and social change.

When I started teaching, computers were not in classrooms. They were only in labs. The teachers who used them were usually math teachers. It took several years for other disciplines to use them, and that led to teachers wanting a computer in their classroom. Add 20 years to that, and the idea of students having their own computers (first in higher ed and about a decade later in K-12) became a reasonable expectation. During the past two years of pandemic-driven virtual learning, the 1:1 student-to-computer ratio came much closer to being ubiquitous.

Further Reading
investopedia.com/terms/m/mooreslaw.asp
synopsys.com/glossary/what-is-moores-law.html
intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html

An Instagram Kids App Is On Hold

Instagram logos
Image by Gerd Altmann from Pixabay

Facebook has been getting a lot of critical press the past month. The Wall Street Journal's "Facebook Files" series has focused attention on how Facebook Inc. knows from internal research that its three platforms allow content that causes harm, and that the actions it has taken have not been effective.

When Facebook announced this summer that there was a project to develop a version of Instagram aimed at children younger than 13, there was an outcry in the media. Concerns about privacy, screen time, mental health and safety were all aired.

This week Facebook announced it is suspending plans to build the Instagram Kids app. Facebook has owned Instagram since 2012. The platform is largely a photo-sharing application, though it has the commenting and likes common to most social sites. The Wall Street Journal series covered how Instagram is known by Facebook to sometimes negatively affect teenage girls in particular.

This suspension is not an end to the project; the company plans to take some time to work with parents, experts, policymakers and regulators, and then move forward. Introducing the next generation to the platform would be advantageous to the company. Though Facebook had said that the Kids app would be ad-free, it would introduce kids to what may become, in their adult lives, the Facebook "metaverse."

Facebook/Instagram/WhatsApp is certainly not alone in wanting new and younger users and is competing with other platforms such as TikTok and Snapchat.

It may seem somewhat ironic that the WSJ used against the company the results of an internal study that Facebook conducted to determine how its apps affect users. In fact, the WSJ did compliment Facebook on doing the research; its criticism came from what Facebook did or did not do as a result of the studies.

Facebook is scheduled to address these issues this Thursday before the Senate Subcommittee on Consumer Protection, Product Safety and Data Security.

https://www.washingtonpost.com/business/2021/09/27/facebook-instagram-kids/
https://www.wsj.com/articles/facebook-pauses-instagram-kids-project-11632744879
https://www.engadget.com/facebook-is-pausing-work-on-instagram-kids-app-124639135.html

Probability

I took one course in statistics. I didn't enjoy it; though the ideas in it could have been interesting, the presentation of them was not.

I came across a video by Cassie Kozyrkov that asks "What if I told you I can show you the difference between Bayesian and Frequentist statistics with one single coin toss?" Cassie is a data scientist and statistician. She founded the field of Decision Intelligence at Google, where she serves as Chief Decision Scientist. She has another one of those jobs that didn't exist in my time of making career decisions.

Most of us probably had some math teacher use a coin toss to illustrate simple probability. I'm going to toss this quarter. What are the odds that it is heads-up? 50/50. The simple lesson is that even if it has come up tails 6 times in a row, the odds for toss 7 are still 50/50.

But after she tosses it and covers it, she asks: what is the probability that the coin in her palm is heads-up now? She says that the answer you give in that moment is a strong hint about whether you're inclined towards Bayesian or Frequentist thinking.

The Frequentist: “There’s no probability about it. I may not know the answer, but that doesn’t change the fact that if the coin is heads-up, the probability is 100%, and if the coin is tails-up, the probability is 0%.”

The Bayesian: “For me, the probability is 50% and for you, it’s whatever it is for you.”
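Here's the split as a toy simulation (my sketch, not Kozyrkov's): the frequentist's 50% describes the long-run frequency over repeated tosses, while the Bayesian's 50% is a belief about this one settled coin, a belief that collapses to certainty on peeking:

```python
import random

random.seed(1)

# Frequentist: probability as long-run frequency over many tosses.
tosses = [random.choice("HT") for _ in range(100_000)]
print("Long-run frequency of heads:", tosses.count("H") / len(tosses))  # ~0.5

# Bayesian: probability as degree of belief, updated by evidence.
outcome = random.choice("HT")  # the coin has already landed, covered
belief_heads = 0.5             # my belief before peeking
belief_heads = 1.0 if outcome == "H" else 0.0  # my belief after peeking
print("Belief after peeking:", belief_heads)
```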

Cassie's video about this goes much deeper - too deep for my current interests. However, I am intrigued by the idea that if the parameter is not a random variable (Frequentist), you can consider your ability to get the right answer, but if you let the parameter be a random variable (Bayesian), there's no longer any notion of right and wrong. She says, "If there's no such thing as a fixed right answer, there's no such thing as getting it wrong."

I'll let that hang in the air here for you to consider.



If you have an interest in going deeper, try:
Frequentist vs Bayesian fight - your questions answered
An 8 minute statistics intro
Statistical Thinking playlist
Controversy about p-values (p as in probability)

Memory Sculpting

photo wall
Photo by Rachel Claire from Pexels

I was having a Facebook conversation with a friend about how photos and videos change our memories. Kids who grew up in the past 30 years - and more so in the age of smartphones and social media - have definitely had their memories sculpted by images of their past. My sons have said to me several times when I ask them "Do you remember us being there?" that "I remember the photos of it." Do the photos trigger a memory to return or is the photo the memory itself?

I am fascinated by how memory works. Research shows that when we describe our memories differently to different audiences it isn't only the message that changes, but sometimes it's also the memory itself. Every time you remember an event from the past, your brain networks change in ways that can alter the later recall of the event. The next time you remember it, you might recall not the original event but what you remembered the previous time. This leads some to say that memory is like the "telephone game."

This sent me back to an article I read in 2017. I did a search and found it again, since my memory of this article on memory might not be accurate. It is titled "Facebook is Re-Sculpting Our Memory" by Olivia Goldhill. Facebook is not the only social network or the only place where we share photos and videos, but it is a major place for this sharing.

I have a new granddaughter, and her parents have set up a shared photo album online for relatives. They don't want people (mostly me - the oversharer) to post photos of her on Facebook, Instagram et al. I understand that privacy caution. My granddaughter will have many thousands of photos and videos to look at one day. I have about two dozen black and white photos of my first two years of life - probably two 12-exposure rolls of film from that time (the 1950s), which seemed like enough to my parents to chronicle my early life.

Those photos of baby me don't trigger any memories but they are my "memory" of that time along with my mother's narration. "That was your stuffed lamb that was your favorite toy."

I have also kept journals since my teen years. The way to chronicle life once was to write it down. Rereading those journals now is a mixed experience. For some things, the journal is now my memory. Without the entry, I couldn't recall names, places or details from 40 years ago. But for some entries, I know that the version I wrote at age 15 is a kind of augmented reality. I made some things sound better or worse than the actual event. I sculpted the memory. Maybe as my memory degrades, those entries - accurate or not - will become the only memory I have.

Those sculpted memories are not unlike the image of ourselves we put online. Not all, but many people, post almost exclusively the best parts of their lives. Alfred Hitchcock said "Drama is life with the dull bits cut out," and that's true of many virtual lives as portrayed online.

That article references Daniel Schacter, a psychology professor at Harvard University, whose 1990s research first established the effects of photographs on memories. Frighteningly, he showed that it was possible to implant false memories by showing subjects photos of an event that they might have experienced but that they didn’t experience.

Another of his experiments found that while looking at photos triggered and enhanced the memory of that particular event, it also impaired memories of events that happened at the same time and were not featured in the photographs.

This sounds terrible, but he has also found a positive effect: some of these weaknesses in our memory help allow us to think meaningfully about the future.

In our recent discussions about fake news and images and videos that are not accurate, we realize that these weaknesses in memory and the ability to implant memories can be very powerful and also very harmful. "Source information" is a weakness of memory that can be tapped for devious purposes. How often have you heard someone explain that they heard it or read it or saw it "somewhere"? We commonly have trouble remembering just where we obtained a particular piece of information. Though this is true offline too, online it means we could easily misremember a news story from a dubious source as being from a more credible publication.

One phenomenon of memory is now called "retrieval-induced forgetting." I spent four years living at my college, but I have a limited number of photographs from the time. Those photos, along with ones in yearbooks and some saved campus newspapers, plus my journal entries, are primarily what I recall about college life. Related things that I can't review are much harder, if not impossible, to remember.

Social media is certainly sculpting (or perhaps resculpting) our memories. Is this making our ability to remember worse? That's not fully determined as of now. Nicholas Carr wrote a book called The Shallows: What the Internet Is Doing to Our Brains that looked at some neurological science in an attempt to see the impact of computers and the Net; that is certainly related to, but not exactly the same as, memory and images. The controversial part of Carr's book is the idea that the Internet literally and physically rewires our brain, making it more computer-like and better at consuming data. But a surprisingly large section of the book is devoted to the history of the written word and all that it has done to "mold the human mind."

Facebook, Instagram, TimeHop and other tools are reminding me daily of memories from years past. At times, I think "Oh yes, we were in Prague on this day two years ago." Other times, I say to myself, "I don't remember writing this 4 years ago." I react the same way to my old journals and black and white photos in an album taken a half-century ago.

The Subtle Art of Persuasive Design

child smartphone

Image by Andi Graf from Pixabay

Tech companies use "persuasive design" to get us hooked. Some psychologists say it's unethical. Children are particularly susceptible to "hidden manipulation techniques," but lots of adults are also taken in by its use, especially in social media and Internet advertising by companies like Facebook and Twitter.

It is in front of our faces when we are getting notifications on our phone and even when that next episode or video on Netflix or YouTube loads itself as soon as we finish one.

Back in the 1970s, there were plenty of articles and theses written about the dangers of too much television affecting children. Kids now have 10 times the screen time they had in just 2011. Of course, now we are talking about more screens than just the family TV set. They spend an average of 400 minutes using technology, according to Common Sense Media.

Media companies have been using behavioral science for decades to create products that we want to use more and more. Remember how the tobacco companies were sued for the ways they hooked people on cigarettes? Big tech uses persuasive technology, a fairly new field of research based on studying how technology changes the way humans think and act.

Using persuasive design techniques, companies incorporate this research into games and apps. By the time a child begins to move on to Twitter, Facebook, Snapchat, Amazon, Apple, and Microsoft apps, they have been pre-conditioned for specific behaviors.

Apple CEO Tim Cook has warned that algorithms are pushing us toward catastrophic results, though critics will say that Apple itself is not free from using persuasive design.

Social media companies are being targeted for deliberately addicting users to their products for financial gain. Some design features, such as infinite scroll, are seen as highly habit-forming. Along with features that may appear to be a "plus," like notifications, they keep us on our devices, looking at advertising and clicking longer. They encourage the "fear of missing out" (FOMO).

The infinite scroll was a feature designed by Aza Raskin when he was working for Humanized - a computer user-interface consultancy. He now questions its use.

He is not alone. Leah Pearlman, co-inventor of Facebook's Like button, said she had become hooked on Facebook because she had begun basing her sense of self-worth on the number of "likes" she had. But Ms Pearlman said she had not intended the Like button to be addictive, and she also believes that social media use has many benefits for lots of people.

Defenders of persuasive tech say it can have positive effects. There are apps that remind/train people to take medicine on time or develop weight loss habits. But critics are concerned with persuasive design that is not intended to improve lifestyles but to keep people on their devices in order to sell.

A letter signed by 50 psychologists was sent to the American Psychological Association accusing psychologists working at tech companies of using "hidden manipulation techniques" and asking the APA to take an ethical stand on behalf of kids.

Schrodinger's Coin and Quantum Computing

Schrodinger's cat

A cat sits in a box along with some kind of poison that will be released based on the radioactive decay of a subatomic particle. Because these tiny particles are capable of being in multiple states at once (decaying and not decaying at the same time), the poison could simultaneously be released and not released. By extension, the cat could be dead and not dead.

In 1935, Austrian physicist Erwin Schrödinger spun this scenario. He didn't mean that cats can literally be simultaneously dead and alive; his paradox is that, until you opened the box, you'd have to describe the cat as simultaneously dead and alive.

When I first heard it back in high school, I thought of Zen koans or stories that are equally paradoxical and maddening. If a tree falls in the woods and no one is around to hear it, does it make a sound?

Later, I read that Schrödinger was criticizing the "Copenhagen interpretation" which was the prevailing school of thought in quantum mechanics. The Copenhagen interpretation suggested that particles existed in all possible states (different positions, energies, speeds) until they were observed, at which point they collapsed into one set state. But Schrödinger thought that interpretation didn't scale up very well to objects in the visible world.

A clearer analogy for me was when I heard it explained as being like a spinning coin. While it is spinning, it can be heads or tails. We don't know what it is until it falls and stops spinning. No cats are injured in this version. 

I thought about Mr. Schrödinger's cat and about that spinning coin when I was reading something recently about quantum computing. Schrödinger's cat is often used to illustrate the concept of superposition - the ability for two opposite states to exist simultaneously - and unpredictability in quantum physics.

Quantum computing is about harnessing and exploiting quantum mechanics in order to process information. Conventional computers use "bits" that are either zero or one. A quantum computer would have quantum bits (qubits). The freaky Schrödinger's-cat part of quantum computers is that they would perform calculations based on the probability of an object's state before it is measured - not just 1s or 0s. That means they would have the potential to process exponentially more data than traditional computers.
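To make the spinning-coin analogy concrete, here is a toy single-qubit simulation (my own sketch, not any real quantum library): the state is a pair of amplitudes, and measurement collapses it to a definite bit with probabilities given by the squared magnitudes:

```python
import random

# A qubit in equal superposition: amplitudes for the |0> and |1> states.
amp0, amp1 = 2 ** -0.5, 2 ** -0.5

# Born rule: measurement probabilities are the squared magnitudes.
p0, p1 = abs(amp0) ** 2, abs(amp1) ** 2
print(round(p0, 3), round(p1, 3))   # 0.5 0.5 - the coin is still "spinning"

# Measurement collapses the superposition to an ordinary bit.
result = 0 if random.random() < p0 else 1
print("Measured:", result)          # now it is just a 0 or a 1
```

Real quantum hardware involves many entangled qubits and error correction, of course; this only illustrates the superposition-then-collapse idea.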

It has been 85 years but people are still messing around with the whole cat thing. Some physicists have given Schrödinger’s cat a second box to play in. This cat lives or dies in two boxes at once in order to consider quantum entanglement. Entanglement means that observation can change the state of a distant object instantaneously - something that Einstein considered impossible and referred to as “spooky action at a distance.” 

Are we even close to creating a quantum computer? It depends on whom you read.

Here's a leap beyond cats and coins that came to me because I was surfing through channels on the television and saw Christopher Nolan's film Inception.

A character in the film returns home after a long time in the dream world and we are told that a little top that he sets into motion will keep spinning forever if he is still in the dream world. If it stops and falls over, that means he is back in reality. It's like the old pinch yourself to see if you're dreaming.

But the film has a frustrating final shot: the top wobbles, and the film ends before we know whether it falls. That ending was infuriating to many viewers. It was like the finale of The Sopranos. What happened?

Nolan once spoke at a Princeton University graduation ceremony and said that "The way the end of that film worked, Leonardo DiCaprio’s character Cobb — he was off with his kids, he was in his own subjective reality. He didn’t really care anymore, and that makes a statement: perhaps all levels of reality are valid."

Nolan's point to the graduates? Don't chase dreams; chase realities because, unfortunately, "over time, we started to view reality as the poor cousin to our dreams".

Can you prove that you're not dreaming right now?

That "pinch yourself" thing isn't adequate proof. What if this is a dream that you're stuck in?  Does it matter? If it is, this dream is your reality. 

This sounds like philosophical skepticism - that school of thought that I once had to study in school and that also sent my mind running in circles. It argues that we can't really know that anything is real; some skeptics deny the very possibility of knowledge. The side I fell on as a college student was that we couldn't make that judgment of "real" because there isn't enough evidence.

That's enough circles to run around in for today. 


Even cats have been considering what Schrodinger proposed. (image via GIPHY)

Social Listening in Higher Education

Social listening (or social media monitoring) is paying attention to your brand's social media channels for:
- customer feedback
- direct mentions of your brand
- discussions regarding specific keywords, topics
- the same things for your competitors and industry

For higher education institutions, not doing social listening is the equivalent of not listening to what students (both prospective and current), faculty and staff say about your school. Some people refer to this as "conversational research" because it is kind of like listening in on other people's conversations about you - which in real life is hard to resist.

Of course, monitoring alone isn't of much value if it is not followed by an analysis to gain insights that you can act on. Though a starting place can be a simple "vanity search" on the name of your school, most colleges are using social listening tools that can filter the data into more granular groupings of conversations: by geographic location, online channel (Twitter, Facebook, blogs, forums, etc.), sentiment, recency, language, and by specific groups based on sex, age and many other demographics.

Social listening is also a more advanced form of market research that can identify opportunities for courses, majors and new content creation or the amplification of existing content.  For example, comparing the top topics from social listening results to the top topics from a content audit can aid marketers in identifying opportunities to create content that will resonate with their audience.

The search function on networks like Instagram allows searching by hashtag, so my university would monitor #NJIT, but would also follow #highered, #engineering, #architecture, #STEM and other tags.
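The mechanics are simple to illustrate. This bare-bones sketch (hypothetical posts and tag list; real tools wrap the platforms' APIs and layer on sentiment and demographic filters) just counts tracked hashtags in a stream of posts:

```python
from collections import Counter

# Hypothetical tag list and sample posts, for illustration only.
TRACKED = {"#njit", "#highered", "#engineering", "#architecture", "#stem"}

posts = [
    "Touring campus today #NJIT #engineering",
    "Great thread on teaching online #highered",
    "Studio critique week #architecture #NJIT",
]

counts = Counter(
    tag
    for post in posts
    for word in post.split()
    if (tag := word.lower()) in TRACKED
)
print(counts.most_common())  # [('#njit', 2), ('#engineering', 1), ...]
```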

It is estimated that there are about 80 million online sources for mentions. Higher education conversations occur in places like news articles, review sites, Reddit, all the big social networks and also higher ed focused sites like College Confidential which is self-described as "The World's Largest College Forum."

Social listening data about peer institutions is "competitive intelligence" and is considered "brand benchmarking." It is important for admissions marketing, but also for reputation management. This becomes critical when, for example, a campus crisis requires an immediate response.

Birds, Social Media and Scale-free Correlation

I wrote elsewhere about the beauty of the flocking of starlings that is called murmuration. These murmurations look like swirling clouds that pulsate, twist, and get wider and thinner. They are intriguing to watch, but how do the birds do it?

I read online that flocking can be triggered by a threat, such as a raptor nearby, but I have seen them flock while walking in the woods and in my backyard trees without any threat in sight. In fact, I learned many years ago that if they were roosting in trees nearby and I clapped loudly, they would usually take off. Maybe a loud clap sounds like a gun.

Beyond the beauty and wonder of the murmurations, there is interest from scientists who don't normally pay attention to birds: computer scientists and physicists. They are interested in how group behavior spontaneously arises from many individuals acting at once. Schools of fish are another group behavior studied.

Researchers call this "scale-free correlation." The studies indicate that, surprisingly, flocks of birds are never led by a single individual. You have probably seen flocks of geese that seem to have a "leader," but flocking is actually governed by the collective actions of all of the flock members. Compared to the straight-ahead flight of geese in formation, a murmuration seems so fluid that it approaches magic.

Information moving across the flock so quickly and with nearly no degradation is something I might talk about in communication courses as a high signal-to-noise ratio. Other communication terms that enter into murmuration study include scale-free correlation and effective perceptive range. Those terms can be simply explained as the ways that allow a starling on one side of the flock to respond to what others are sensing all the way across the flock.

A study on starling flocks led by George Young at Princeton determined that starlings in large flocks consistently coordinate their movements with their seven nearest neighbors. This immediately made me think of relationships online, especially in social media.
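Before getting to the social media angle, that seven-neighbor rule is easy to express as an algorithm. Here is a boids-style toy of my own (not the Princeton model): each bird repeatedly steers toward the average heading of its seven nearest neighbors, and alignment emerges with no leader:

```python
import math
import random

N_NEIGHBORS = 7  # the number the Princeton study reports starlings track

random.seed(0)
# Each bird has a position (x, y) and a heading in radians.
birds = [{"pos": (random.random(), random.random()),
          "heading": random.uniform(-math.pi, math.pi)} for _ in range(50)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def step(birds):
    """Each bird adopts the average heading of its 7 nearest neighbors."""
    new_headings = []
    for b in birds:
        nearest = sorted((o for o in birds if o is not b),
                         key=lambda o: dist(b["pos"], o["pos"]))[:N_NEIGHBORS]
        # Average the angles via a vector sum to avoid wrap-around errors.
        sx = sum(math.cos(o["heading"]) for o in nearest)
        sy = sum(math.sin(o["heading"]) for o in nearest)
        new_headings.append(math.atan2(sy, sx))
    for b, h in zip(birds, new_headings):
        b["heading"] = h

for _ in range(20):
    step(birds)
# After a few steps the headings converge: coordination without a leader.
```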

One thing that comes to mind is Dunbar's number, which is a suggested cognitive limit to the number of people with whom one can maintain stable social relationships. That number was proposed in the 1990s by British anthropologist Robin Dunbar, who found a correlation between primate brain size and average social group size. By studying primates and using the average human brain size, he proposed that humans can comfortably maintain 100-200 (often averaged out at 150) stable relationships.

That number seems too high to me in real life but it might make sense with social media where words like stable, relationship and friends have other meanings.

That Princeton study also found that the shape of the flock, rather than its size, has the largest effect on that number of seven. It's like playing the game of Telephone. When one person passes a message along to the next person, who repeats it to another and so on, the message degrades as the size of the group increases. The starlings are playing telephone only with their seven nearest neighbors. They have, in effect, changed the shape of their group, despite the large size of the flock.

I wonder if the studies of starlings might be extrapolated to explain social media behaviors. I may have 600 friends on Facebook, but I think that I "shape" my group of close friends much smaller, and that group connects me to other smaller groups within that 600. I use lists on Facebook, for example, one comprised of poets. (Lists is a feature that is not really promoted by Facebook these days.)  I really only have direct and active communication with about a dozen of them, but the group has almost a hundred members.

How starlings achieve such a strong correlation still remains mostly a mystery.  I suspect that social media networks are also researching these kinds of correlations.

Wikipedia, Ants and Stigmergy

Swarming herring


I like to discover new words, new fields of study - new things in general. My new one for today is STIGMERGY. According to Wikipedia (an apt source for the definition, as I will explain), stigmergy is a "mechanism of indirect coordination, through the environment, between agents or actions." That is not a very clear definition.

The concept of stigmergy has been used to analyze self-organizing activities. Those activities cover a wide area: social insects, social media, robotics, web communities, and the wider human society.

One principle of stigmergy is that the trace left in the environment by an action stimulates the performance of a next action, by the same or a different agent. This can explain the way an ant colony operates. It can also explain how Wikipedia articles are created and changed.
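A toy model of that principle (my own sketch, not Grassé's formalism): ants mark the paths they use, the marks slowly evaporate, and later ants prefer the stronger-marked path. The shorter path gets re-marked more often per unit time, so the colony converges on it without any ant communicating directly with another:

```python
import random

random.seed(3)
pheromone = {"short": 1.0, "long": 1.0}   # trail strength on two paths
LENGTH = {"short": 1, "long": 3}          # round-trip time, in steps

for _ in range(200):
    # Each ant picks a path with probability proportional to pheromone.
    total = pheromone["short"] + pheromone["long"]
    path = "short" if random.random() < pheromone["short"] / total else "long"
    # Shorter paths get re-marked more often per unit time.
    pheromone[path] += 1.0 / LENGTH[path]
    # Evaporation: old traces fade, so the environment slowly "forgets."
    for p in pheromone:
        pheromone[p] *= 0.99

print(pheromone)  # the short path ends up far more strongly marked
```

Wikipedia works analogously: a stub or a rough edit is the trace left in the environment that prompts the next editor's action.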

Social insects, like ants and bees, have long been a model of collaboration. Global knowledge sharing through asynchronous collaboration is a newer example. I believe I may have heard this word, or at least the concept, more than a decade ago when "Web 2.0" was a new and much-talked-about idea. Now I hardly ever hear Web 2.0 mentioned - and that's not because we got past it and into Web 3.0.

The word is not all that new. It was coined in 1959 by French biologist Pierre-Paul Grassé in reference to termite behavior, from the Ancient Greek stigma, "mark”, “sign" + ergon "work”, “action."

You might hear the word used in a conversation about swarm intelligence. Swarm intelligence (SI) is the collective behavior of decentralized, self-organized systems, natural or artificial, and it is employed in work on artificial intelligence and applications such as cellular robotic systems. It has been studied in the natural world in ant colonies, bird flocking, hawks hunting, animal herding, bacterial growth, fish schooling and the somewhat scary world of microbial intelligence.

The World Wide Web is the first stigmergic communication medium for humans. The earlier telephone and even email don't count as stigmergic communication, since they are only readable by the people on either end; stigmergic communication means the messages are readable by everyone. And radio and TV don't fit the definition because they were read-only mediums for most people, until the Web emerged and the read/write nature of Web 2.0 took hold.

Wikipedia, with its millions of contributors, is an example of stigmergy. Its editors are a good example of how traces of articles and edits left in the wiki environment stimulate the performance of a next action, by the same person or by different ones.

I discovered (or possibly rediscovered) stigmergy from an episode of the playswellwithothers.org podcast with guests Katherine Maher, the executive director of the Wikimedia Foundation, and Clint Penick, an ant researcher and assistant research professor in the Biomimicry Center at Arizona State University.

FURTHER READING
https://wiki.p2pfoundation.net/Stigmergy
"Stigmergy as a universal coordination mechanism I: Definition and components" 

Our Collective Attention Span Has Fallen

A quick followup to my previous post about very brief presentations of research.

The average human attention span has fallen to eight seconds, below the average attention span of a goldfish. At least, so said a recent wave of since-debunked press coverage from outlets including The New York Times. The factoid, which had no clear source, felt true. New research suggests this may be because a different attention span has shrunk recently - not the individual's, but the collective's.

"Collective attention span" means how long a topic stays popular (or hot, or trending). It is about public conversations.

People study how long news stories, movies, hashtags, etc. hold attention to see when they lose their appeal. Looking at hashtags trending on Twitter from 2013 to 2016 (Twitter being one of the things that gets blamed for reduced attention spans), researchers found that the top 50 hashtags fell from 17.5 hours of trending to 11.9 hours. There was similar shrinkage on Reddit, on Google Books and in movie ticket sales.

Things don't hold our attention as long. At least online and with media. Is anyone studying attention span for real world things, like reading a book, looking at a painting, watching a sporting event?

The researchers say that this is part of "a more general development termed social acceleration, the impact of these changes on the social sphere has more recently been discussed within sociology. In the literature there have been hints of acceleration in different contexts, but so far, the phenomenon lacks a strong empirical foundation."

They created a model that suggests our collective attention span shrank due to growing competition. There is just too much media out there competing for our attention. "Our analysis suggests increasing rates of content production and consumption as the most important driving force for the accelerating dynamics of collective attention."
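A cartoon version of that competition argument (my own toy numbers, not the researchers' model): if the collective pool of attention is roughly fixed while the number of competing topics grows, each topic's share, and so its time in the spotlight, shrinks:

```python
# Fixed pool of collective attention, divided among competing topics.
TOTAL_ATTENTION_HOURS = 1000.0

for topics_per_day in (50, 100, 200, 400):
    share = TOTAL_ATTENTION_HOURS / topics_per_day
    print(f"{topics_per_day:>3} topics/day -> ~{share:.1f} hours of attention each")
# Doubling content production halves each topic's share of attention.
```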

This isn't all that new. Overchoice or choice overload is a cognitive process in which people have a difficult time making a decision when faced with many options. The term was first introduced by Alvin Toffler in his 1970 book, Future Shock.

The Paradox of Choice – Why More Is Less is a 2004 book by American psychologist Barry Schwartz that argues that eliminating consumer choices can greatly reduce anxiety for shoppers.

"Decision Paralysis" is another term to put into this mix. 

I wonder whether, if we were simply presented with fewer choices, our attention spans would increase. Though it is unlikely that we can roll the media content snowball back up the hill, perhaps we can individually limit our choices and improve our personal attention spans. I don't have much hope of lengthening our collective attention span.

Jumping Through the Dissertation Hoop

"Jumping through hoops at Arabian Nights" by Experience Kissimmee, Florida is licensed under CC BY 2.0

Ah, the dissertation. That mammoth writing task that a student needs to complete in order to get that terminal degree. ("Terminal" - interesting term to use)

If you work in higher education, then you have stories to tell about people trying to jump through that academic hoop. I have lots of tales of friends and colleagues who struggled to write, finish and make it through their defense. (Isn't "defense" an interesting word to use for that part of the process?)

One friend of mine was being pushed to extend his writing for an additional semester because his supervising professor needed him to justify not teaching an additional class. The professor actually told him that. When he reacted negatively, the prof replied that "We had to jump through these hoops to get our doctorate, so now we get to make you jump."  Wow. It's the academic equivalent of fraternity hazing.

I could also write a post on "NOT Writing the Dissertation" since that was my own experience. I enrolled in an Ed.D. program in "pedagogy" but lost interest when I was required to take courses on topics such as "school law." I continued to take classes - just not the right ones. Since at that time I was teaching in a public school, when I had amassed 32 credits beyond my MA degree, I "advanced" on the salary guide. Not as far as if I had the Ed.D., but pretty close.

So, I was close to being ABD (All But Dissertation) but really closer to being ADD (Attention Deficit Disorder).

Through my LLC with my wife, we have done editing for academic writing, including some dissertations. When I saw an article by Leonard Cassuto in The Chronicle of Higher Education about the "Value of Dissertation-Writing Groups," it was a good reminder that although we seem to value the solitary work of writing a dissertation, it is - and should be - more of a collaborative writing task.

Cassuto says, "The glorification of solitary labor permeates the imaginary ideal of scholarship in the humanities and humanistic social sciences. Your dissertation is your own work of expertise, your own plot in the great intellectual firmament. And you write it by yourself. Here’s the trouble: It just ain’t so."

The dissertation is supposed to be part of the process of making a student into a scholar and getting research out into the world. Scholarly and academic writing outside of dissertations is especially collaborative. As the article says: "Academic writing done by those who already have the doctorate is very collaborative, as is obvious if you look at all those acknowledgements in scholarly books. There are other academics, librarians, archivists, proofreaders, but also — and crucially — the people who have been reading drafts all along, making suggestions, editing, shaping. The person whose name is on the spine does the largest part of the work, but others share in the labor."

It seems to me that anyone working with graduate students who are doing writing (dissertation or otherwise) should certainly address the collaborative element. One way to do that is to have a dissertation group. The students we worked with on dissertations had all formed groups on their own with classmates, mostly ones who were on a similar path. That is a part of the process that should be formalized, encouraged and - perhaps most importantly - accepted by academia.

Content Curation

Curating a butterfly collection
or curating an archive

Probably your first association with the word "curator" is a person at a museum. The word comes from the Latin cura, meaning "care." The curator of a gallery, museum, library, or archive usually is in charge of an institution's collections. Those collections are probably tangible objects like artwork or historic items. But the term "content curation" is a more recent variation.

Content curation has become a term associated with the online world. Though some people might do this as a job, such as a social media manager, many of us do it for no pay. If you have a Twitter, LinkedIn, Pinterest, Facebook or other social account, you probably retweet and repost/share content. Curation means that someone has seen value in content and so is sharing it with friends and followers – and potentially with the entire online world.

I think that everyone would agree that some people do this curation with more thought and skill than others. A thoughtful curator gathers from a variety of sources, sometimes around a specific topic, and shares the best of what they find. For example, I might follow someone online because they post good information (original or shared) about poetry.

A poor curator probably isn't a curator at all. You probably have come across people who share silly things and inappropriate links, and who may not even vet (make a careful and critical examination of) a link or article before they share it. You might unfriend or unfollow such a person. You might even take the time to try correcting them with a link to snopes.com or some other site that shows their information is incorrect.

And here we get into that term that is so much in the air the past year or two – fake news.

In all my years of teaching, I always had to teach lessons to students from 7th grade to graduate school about how to vet information in doing research. How do you know a source is valid? How do you know that a fact is a fact? Is your information up to date? Can you separate fact from opinion?

I posit that all of us active online need to be good content curators. Just using this blog as an example, I try to be a good curator of the information I put into the online world. I try to follow good curation practices.

I often write original content, but at least half of my content comes from other sources, such as books I am reading, websites, and podcasts. I try to share things that interest me but that I think will interest and help my audience.

Who is “my audience”? After blogging in different places for 12 years, I have learned to look at my statistics and comments for where people come from (geographically) and what content they find most appealing.

As when I taught research, I try to use trustworthy sources. I look for content that is relevant, timely, interesting, useful, and occasionally entertaining.

A good curator gives credit to sources: give a link to the original inspiring article, book, or person. Give readers a way to get additional information if they want to go deeper into a topic.

On the more commercial side of social media that concerns marketing (I do that too), there is the "social media rule of thirds." This rule says that a third of what you share should promote your own brand's content (which might be personal), a third should be content curated from others, and a third should be about the conversations happening on social media.

You are reading this online, so there is a good chance you are a content curator yourself – whether you know you are or not. Are you a good curator? Leave a comment if you have any thoughts about this either on how others do it well or poorly, or about your own practices.

This article first appeared at Weekends in Paradelle

Aligning Learning and Key Performance Indicators

Align your training with KPIs. This is not a mantra I hear in education. A KPI is a Key Performance Indicator: a measurable value that demonstrates how effectively an organization (most commonly a company, it seems) is achieving key objectives.

KPIs are used to evaluate success at reaching targets. Businesses talk a lot about the Return on Investment (ROI) and they are usually talking about dollars and cents. But in educational training and professional development, the ROI probably can't be measured in dollars.

Still, the process may be similar.

Define which metrics are most important to you. These become your key performance indicators. You need to know exactly what you're going to use to judge performance. 

If you want to increase enrollment in a major or program, that provides an easy metric. If a professor wants to increase attendance in her classroom, that is also easily measured.

When I work with faculty designing courses, many professors stumble on setting objectives versus goals. The simple difference is that a goal is a description of a destination, and an objective is a measure of the progress needed to get to that destination. In this context, goals are the long-term outcomes.

Teachers will sometimes tell you objectives that are not measurable. For example, wanting students to "have an appreciation of modern poetry" may be an admirable goal for a poetry course, but how do you measure that?

For an objective to be effective it must be clear, measurable and have a time element. For instance, that objective of increasing class attendance by 10 percent by the end of the semester is clear, measurable and has that time element.
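As a trivial illustration of "clear, measurable, time-bound" (with hypothetical numbers and names of my own), an objective reduces to a metric, a target, and a deadline:

```python
from dataclasses import dataclass

@dataclass
class Objective:
    """A measurable objective: a metric, a target change, and a deadline."""
    metric: str
    baseline: float
    target_pct_increase: float
    deadline: str

    def met(self, observed: float) -> bool:
        return observed >= self.baseline * (1 + self.target_pct_increase / 100)

attendance = Objective(metric="class attendance", baseline=20.0,
                       target_pct_increase=10.0, deadline="end of semester")
print(attendance.met(observed=23))  # True: 23 >= 22.0 (10% over a baseline of 20)
```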

Of course, after you determine those objectives, the really difficult part begins: figuring out how to reach them.

Students Are Still Suffering From Summer Melt

Summer melt is the phenomenon of prospective college students' motivation to attend college "melting" away during the summer between the end of high school and the beginning of college. I wrote about summer melt last summer, and this summer (inspired partially by a week-long heatwave in my part of the country) I decided to check in and see if things have changed. Basically, they have not.

There are some intervention programs at schools that seem to help prevent summer melt, but at the majority of schools, students are still melting away.

This phenomenon is especially prevalent in low-income minority communities, where students who qualify for college and in some cases even register for classes ultimately end up not attending college because they lack resources, support, guidance, and encouragement. The melting is also common for students who are the first in their family to get a chance at college. That was the case for me many decades ago.

I vividly remember trying to fill in the FAFSA forms for financial aid (which was critical to me attending). My father had died three years before, and he was the one who wanted me to attend college and get the opportunities he never had. My mom couldn't provide money for college and couldn't really help in completing the forms. She had no idea what college was all about. I had no one to turn to, so I did it all myself - probably badly, as I didn't get the financial aid that I clearly should have gotten based on our financial situation.

But I persevered and I got to Rutgers College in September. And I hated it. College seemed so much like high school all over again that first semester that if I could have gone to an office and gotten all my money back, I would have quit in October.

I couldn't get a refund and I stuck it out, and by the spring semester my perspective had completely changed. I found my place. I found the places to go when I needed help. I was able to get some additional student loans.

Many students were helped on their college path during their years in high school by counselors and probably a few trusted teachers. But that support is gone in the summer after high school graduation and most colleges are not supporting incoming freshmen until orientation.

These students who melt away are not going to other colleges. They are going nowhere.  

The summer melt rate varies by school but runs about 10-40% of students, according to a study from Harvard University. The general number given in surveys is that about one-third of all students who leave high school with plans to attend college never arrive at any college campus that fall.

That's the problem. What about the interventions and support?

One project I read about targeted 1,422 students and offered them up to two hours of counseling (which is not much) over a five-week period following high-school graduation. About 500 students received assistance through in-person meetings or over the phone. About one in three of them received help filling out financial-aid forms; another third got help with transcripts. One in 10 merely sought emotional support and reassurance to manage pre-college anxiety. This Summer Link program set a budget of $48 per student to cover costs.

But some of the interventions are not costly and may not involve much staff time. I would not let the high schools totally off the hook when it comes to support. Realizing that almost all high school counselors are 10-month employees and off for the summer, high schools can still support their graduates: weekly text reminders to check email, complete financial aid forms and register for classes can go a long way toward keeping students on track. If there is any summer staff, being available for help would be a tremendous intervention, even if it is more of a group session than 1:1 support.

For colleges too, texting programs can make it easy for counselors to reach large numbers of students quickly. Email seems not to be the way to communicate with these students, though many colleges are still using it, and snail mail, as their way to communicate.

Social media should also be used. Having incoming freshmen follow an Instagram account for their particular class (not the general college accounts) that posts photos and brief notes on deadlines, numbers to call, etc. would also be better than email and snail mail.

When feasible, getting those students on campus in July and August is a good thing. These are not the students who visit with mom and dad, take pictures and buy t-shirts and things at the bookstore.

My only visit to Rutgers before orientation was an afternoon when we met with a faculty member to create our schedule. My "advisor" was a new faculty member who taught economics and knew less about my English Education program of study than I did. Plus, I was profoundly disappointed that my first-semester courses seemed to have nothing to do with my goal to be an English teacher. Economics 101?

Some of summer melt certainly comes from the doubts and concerns I felt, and that I think all students feel, about what college will be, how successful they can be, and even whether they've made the right choice. The forms and information colleges ask for, the placement exams that most schools require, and all the deadlines are important. Missing or messing up one of them can really screw up your college path.

When I talked to some friends who are not involved in education about summer melt, they were shocked. They said, "You mean a kid has taken the SATs, been accepted, received financial aid, and she still doesn't show up? That makes no sense." And they're right. It doesn't make sense that colleges aren't doing more to prevent these students from melting away.


Summer Melt: Supporting Low-Income Students Through the Transition to College by Benjamin L. Castleman and Lindsay C. Page