Edge Computing

I learned about edge computing a few years ago. It is a method of getting more from the data in a computing system by performing the data processing at the "edge" of the network, near the source of the data rather than at a distant, centralized facility. By doing this, you reduce the communications bandwidth needed between sensors and a central datacenter. The analytics and knowledge generation happen right at or near the source of the data.
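A minimal sketch of the idea in Python (the sensor feed, the threshold, and the uplink function are all hypothetical stand-ins, not any particular product's API): the edge node summarizes raw readings locally and ships only a compact aggregate upstream, instead of streaming every sample to the datacenter.

```python
import statistics

# Hypothetical edge node: summarize raw sensor readings locally and
# send only a small aggregate upstream, instead of every sample.

def read_sensor_batch():
    # Stand-in for a real sensor driver; returns one batch of samples.
    return [21.4, 21.5, 21.3, 29.9, 21.4]

def process_at_edge(samples, threshold=25.0):
    """Compute a compact summary and flag anomalies near the data source."""
    return {
        "mean": statistics.mean(samples),
        "max": max(samples),
        "anomalies": [s for s in samples if s > threshold],
    }

def send_to_datacenter(summary):
    # In a real deployment this would be an HTTPS or MQTT call; here we print.
    print("uplink:", summary)

if __name__ == "__main__":
    batch = read_sensor_batch()
    send_to_datacenter(process_at_edge(batch))  # a few bytes, not the raw stream
```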

The cloud, laptops, smartphones, tablets and sensors may be new things, but the idea of decentralizing data processing is not. Remember the days of the mainframe computer?

The mainframe was, and still is, a centralized approach to computing: all computing resources are at one location. That approach made sense once upon a time, when computing resources were very expensive and very big. The first mainframe, built in 1943, weighed five tons and was 51 feet long. Mainframes allowed for centralized administration and optimized data storage on disk.

Access to the mainframe came via "dumb" terminals or thin clients that had no processing power. These terminals couldn't do any data processing, so all the data went to, was stored in, and was crunched at the centralized mainframe.

Much has changed. Yes, a mainframe approach is still used by businesses like credit card companies and airlines to send and display data via fairly dumb terminals. And it is costly. And slower. And when the centralized system goes down, all the clients go down. You have probably been in some location that couldn't process your order or access your data because "our computers are down."

It turned out that you could even save money by setting up a decentralized, or “distributed,” client-server network. Processing is distributed between servers that provide a service and clients that request it. The client-server model needed PCs that could process data and perform calculations on their own in order for applications to be decentralized.
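As a toy illustration of that division of labor (the port and the message here are invented for the example), here is a tiny client-server exchange in Python: the server side offers a small "service," and the client side requests it over a socket.

```python
import socket
import threading

# Toy client-server exchange over TCP: the server provides a service
# (upper-casing text) and the client requests it. Illustrative only.

def handle_one(srv):
    conn, _ = srv.accept()           # wait for one client
    with conn:
        data = conn.recv(1024)       # the client's request
        conn.sendall(data.upper())   # the server's "service"

if __name__ == "__main__":
    srv = socket.create_server(("127.0.0.1", 9009))  # bind and listen
    threading.Thread(target=handle_one, args=(srv,), daemon=True).start()

    # The client does its own processing and only asks for the service.
    with socket.create_connection(("127.0.0.1", 9009)) as sock:
        sock.sendall(b"hello from the client")
        print(sock.recv(1024).decode())  # -> HELLO FROM THE CLIENT
    srv.close()
```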

[Photo: Google co-founder Sergey Brin shows U.S. Secretary of State John Kerry the computers inside one of Google's self-driving cars, a data center on wheels. June 23, 2016. State Department photo / Public Domain]

Add faster bandwidth, the cloud and a host of other technologies (wireless sensor networks, mobile data acquisition, mobile signature analysis, cooperative distributed peer-to-peer ad hoc networking and processing) and you can compute at the edge. Terms like local cloud/fog computing, grid/mesh computing, dew computing, mobile edge computing, cloudlets, distributed data storage and retrieval, autonomic self-healing networks, remote cloud services, augmented reality and more that I haven't encountered yet have all come into being.

Recently, I heard a podcast on "Smart Elevators & Self-Driving Cars Need More Computing Power" that got me thinking about the millions of objects (the Internet of Things) now connecting to the Internet. Vehicles, elevators, hospital equipment, factory machines, appliances and a fast-growing list of things are making companies like Microsoft and GE put more computing resources at the edge of the network.

This is computer architecture for people, not things. In 2017, there were about 8 billion devices connected to the net; that number is expected to reach 20 billion by 2020. Do you want the sensors in your car that are analyzing traffic and environmental data to be sending it to some centralized resource, or doing the processing in your car? Milliseconds matter in avoiding a crash. You need the processing to be done on the edge. Cars are "data centers on wheels."

Remember the early days of the space program? All the computing power was on Earth. You have no doubt heard the comparison that the iPhone in your pocket has hundreds or even thousands of times the computing power of those early spacecraft. Keeping all the processing on the ground was risky, but it was the only option. Now, much of the computing power is at the edge, even if the vehicle is also at the edge of our solar system. And things that are not as far off as outer space, like a remote oil pump, also need to compute at the edge rather than connecting at a distance to processing power.

Plan to spend more time in the future at the edge.

The Trends at the NJEdge Conference 2018

There have been so many posts the past few weeks about trends for education and technology, but one way of seeing what trends may emerge this year is by looking at the tracks, presentations and keynote speakers at EdTech conferences.

I'll be moderating a track next week at NJEdge.Net's Annual Conference, NJEdgeCon2018: "DIGITAL LEADERSHIP & ENTERPRISE TRANSFORMATION," January 11 & 12, 2018 in New Jersey. My track is, naturally, Education and Technology, which has presentations on best practices, innovations and the effectiveness associated with current LMS and online learning tools, effective infrastructure, resources, sustainability models and integrated assessment tools.

But if you look at the other tracks offered, you can see that INFORMATION Technology outweighs instructional technology here. Other tracks at the conference are Big Data & Analytics, Networking & Data Security, Customer Support & Service Excellence, Aligning Business & Technology Strategies, and Transformation Products & Services.

Amber Mac (as in MacArthur) will talk about adaptation and the accelerating pace of corporate culture in the digital economy.

I have followed her career for a decade from her early tech TV and podcast venture to her current consulting business. She helps companies adapt to, anticipate, and capitalize on lightning-quick changes—from leadership to social media to the Internet of Things, from marketing to customer service to digital parenting and beyond. It’s not about innovation, she says; it’s about adaptation.

When it comes to teachers and technologies, the battle cry of Virginia Tech professor John Boyer is "embrace, not replace." In his talk, he presents his view that the best teachers will embrace technologies that help them better communicate with students, and need not fear that those technologies will ever replace human-to-human interaction. Blending the best communicators with the best technology has to offer will produce some amazing and unpredictable opportunities!


Wayne Brown, CEO and Founder of Center for Higher Ed CIO Studies (CHECS), will talk in his session on longitudinal higher education CIO research and the importance of technology leaders aligning technology innovations and initiatives with the needs of the higher education institution. His two-part survey methodology enables him to compare and contrast multiple perspectives about higher education technology leaders. The results provide essential information regarding the experiences and background an individual should possess to serve as a higher education CIO. In collaboration with NJEdge, Wayne will collect data from NJEdge higher education CIOs and will compare the national results with those of the NJ CIOs.

Timothy Renick (a man of many titles: Vice President for Enrollment Management and Student Success, Vice Provost, and Professor of Religious Studies at Georgia State University) is talking about "Using Data and Analytics to Eliminate Achievement Gaps." The student-centered and analytics-informed programs at GSU have raised graduation rates by 22% and closed all achievement gaps based on race, ethnicity, and income level. GSU now awards more bachelor’s degrees to African Americans than any other college or university in the nation. Through a discussion of innovations ranging from chatbots and predictive analytics to meta-majors and completion grants, the session covers lessons learned from Georgia State’s transformation and outlines several practical and low-cost steps that campuses can take to improve outcomes for underserved students.

Greg Davies' topic is "The Power of Mobile Communications Strategies and Predictive Analytics for Student Success and Workforce Development." The technology that has been used to transform, to both good and bad ends, most other major industries can connect the valuable resources available on campus to the students who need them most with minimal human resources. Technology has been used to personalize the digital experience in industries such as banking, retail, information and media by reaching consumers via mobile technology. Higher education has, in some cases, been slow to adopt innovative and transformative technology. Yet its power to transform the student engagement and success experience has been proven. With the help of thought leaders in industry and education, Greg discusses how the industry can help achieve the goal of ubiquity in the use of innovative student success technologies and predictive data analytics to enable unprecedented levels of student success and, as a consequence, workforce development.

Learning and Working in the Age of Distraction

There is a lot of talk about distraction these days. The news is full of stories about the Trump administration and the President himself creating distractions to keep the public unfocused on issues they wish would go away (such as the Russia connections), and some people believe the President is too easily distracted by TV news and Twitter.

There are also news stories about the "distraction economy." So many people are vying for your attention. By some estimates, the average person today is exposed to 1,700 marketing messages during a 24-hour period. Most of these distractions are on screens: TV, computers and phones. Attention is the new currency of the digital economy.

Ironically, a few years ago I was reading about "second screens," behavioral targeting and social media marketing and that was being called the "attention economy." There is a battle for attention, and the enemy is distraction.

Google estimates that we spend 4.4 hours of our daily leisure time in front of screens. We are still using computers mostly for work/productivity and search. We use smartphones for connectivity and social interactions. Tablets are used more for entertainment. My wife and I are both guilty of "multi-screening." That means we are part of the 77% of consumers watching TV while on other devices. I am on my laptop writing and researching and she is on her tablet playing games and checking mail and messages. It is annoying. We know that.

Of course, the original land of distraction is the classroom. Students have always been distracted. Before the shiny object was a screen full of apps, passing notes was texting, and doodling in your notebook and the cute classmates sitting nearby were the social media. But I have seen four articles on The Chronicle website about "The Distracted Classroom" lately. Is distraction on the rise?

If you are a teacher or student, does your school or your own classroom have a policy on using laptops and phones? If yes, is it enforced? Anyone who has been in a classroom of grade 6 or higher lately knows that if students have phones or laptops out in class for any reason, they are texting, surfing the web, or posting on social media.

Good teachers try to make classes as interactive as possible. We engage students in discussions, group work and active learning, but distractions are there.

Banning devices isn't a good solution. Things forbidden gain extra appeal.

A few books I have read discuss the ways in which distraction can interfere with learning. In The Distracted Mind: Ancient Brains in a High-Tech World, neuroscientist Adam Gazzaley and psychologist Larry D. Rosen say that distraction occurs when we are pursuing a goal that really matters and something blocks our efforts to achieve it. They join other researchers who report that our brains aren't built for multitasking. That contrasts with a few decades ago, when being able to multitask was considered a positive skill.

It seems that the current belief is that we don't really multitask; we switch rapidly between tasks. Any distractions and interruptions, including technology-related ones, act as "interference" to our goal-setting abilities.

But is this a new problem or has our brain always worked this way? Is the problem really more about the number of possible distractions and not our "rewired" brains?

Nicholas Carr sounded an alarm in 2011 with The Shallows: What the Internet Is Doing to Our Brains, arguing that our growing exposure to online media is pushing our brains into cognitive changes. The deeper intellectual processing of focused and critical thinking gets pushed aside in favor of faster processes like skimming and scanning.

Carr contends that the changes to the brain's "wiring" are real. Neural activity shifts from the hippocampus, the seat of deep thinking, to the prefrontal cortex, where we are engaged in rapid, subconscious transactions. We substitute speed for accuracy and prioritize impulsive decision-making over deliberate judgment.

In the book Why Don't Students Like School?: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom, cognitive scientist Daniel Willingham asks questions such as "Why do students remember everything that's on television and forget everything I say?" and "Why is it so hard for students to understand abstract ideas?" and gives some science and suggestions as answers. But these are difficult questions, and simple answers are incomplete answers in many cases.

Some teachers decide to use the tech that is being a distraction to gain attention instead. I tried using a free polling service (Poll Everywhere) that allows students to respond/vote using their laptops or phones. You insert questions into your presentation software, which allows you to track, analyze, and discuss the responses in real time. The problem for me is that it all needs to be pre-planned and is awkward to do on the fly, and I am very spontaneous in class with my questioning. Still, the idea of using the tech in class rather than banning it is something I generally accept. But that can't be done 100% of the time, so distracted use of the tech is still going to occur.

And the final book on my distraction shelf is The Filter Bubble. The book looks at how personalization, being in our own bubble, hurts the Internet as an open platform for the spread of ideas. The filter bubble puts us in an isolated, echoing world. The author, Eli Pariser, who coined the term "filter bubble," subtitles the book "How the New Personalized Web Is Changing What We Read and How We Think." The term is another one that has come up in the news in talking about the rise of Donald Trump and the news bubble that we tend to live in, paying attention to a personalized feed of the news we agree with and filtering out the rest.

Perhaps creating a filter bubble is our way of coping with the attention economy and a way to try to curate what information we have to deal with every day.

Then again, there were a number of things I could have done the past hour instead of writing this piece. I could have done work that I actually get paid to do. I could have done some work around my house. But I wrote this. Why? 

Information overload and non-stop media are hurting my/our discipline for focus and self-control.

Michael Goldhaber defined the attention economy in this more economic way: "a system that revolves primarily around paying, receiving and seeking what is most intrinsically limited and not replaceable by anything else, namely the attention of other human beings." In order for that economy to be profitable, we must be distracted. Our attention needs to be drawn away from the competition.

As a secondary school teacher for several decades, I saw the rise of ADHD. It was occurring before the Internet, and lack of attention, impulsivity and boredom were all symptoms. It worsened after the Internet was widespread, but it was there before the net and all the personal digital devices.

Back in 1971,  Herbert A. Simon observed that “what information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

We are collectively wiser than ever before. We have the wisdom of the world in a handheld computer connected to almost everything. But it is so difficult to filter out the distractions and garbage that we don't have a lot of success translating information into knowledge. People used to say that finding out something on the Internet was like taking a sip from a fire hose. Search filtering has helped that, but so far the only filters for our individual brains are self-created and often inadequate.

 

BitTorrent Reconsidered

This past weekend I was wearing an old BitTorrent t-shirt that has printed on the back: "Give and ye Shall receive." While waiting in a store checkout line, a man behind me said, "BitTorrent? Are you a software pirate?"

To many people, BitTorrent is still synonymous with piracy. BitTorrent was, and probably still is, used for some questionable and illegal file transfers, but it’s also being used for many legitimate tasks.

A programmer, Bram Cohen, designed the protocol and released the first available version in July 2001, and it quickly became the preferred way to share large files, especially movies. In the public mind, it is blurred together with other file sharing programs like Napster, which was used to share music (mp3) files.

Like HTTP, which your browser uses to communicate with websites, BitTorrent is just a protocol. People were sharing pirated files of all kinds before BitTorrent using anonymous peer-to-peer networks, but this new protocol made it much faster and more efficient.

The BitTorrent protocol uses client computers to share individual pieces of the file. After the initial pieces transfer from the seed, the pieces are individually transferred from client to client, so the original seeder only needs to send out one complete copy of the file for all the clients to receive it.
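A toy simulation of that piece-trading idea in Python (the piece and peer counts are arbitrary, and this is a sketch of the swarm's economics, not the real wire protocol): the seed uploads each piece exactly once, and the peers then fill in each other's gaps.

```python
import random

# Toy simulation of BitTorrent-style piece exchange: the seed uploads each
# piece only once, then peers trade pieces until everyone has the whole file.

NUM_PIECES = 8
NUM_PEERS = 5
full_file = set(range(NUM_PIECES))

peers = [set() for _ in range(NUM_PEERS)]
seed_uploads = 0
peer_exchanges = 0

# The seed sends each piece to a single peer: one full copy leaves the seed.
for piece in full_file:
    random.choice(peers).add(piece)
    seed_uploads += 1

# Peers fill in their gaps from one another, piece by piece.
while any(p != full_file for p in peers):
    downloader = random.choice(peers)
    missing = full_file - downloader
    if not missing:
        continue  # this peer is already complete; pick another
    piece = random.choice(sorted(missing))
    # Every piece already lives on some other peer, so it can be supplied.
    downloader.add(piece)
    peer_exchanges += 1

print(f"seed uploaded {seed_uploads} pieces; peers traded {peer_exchanges} pieces")
```

Running it shows the point: the seed's outbound traffic stays at one copy of the file (8 pieces here), while the remaining 32 transfers happen between the clients themselves.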

BitTorrent Sync is an application comparable to Dropbox, the popular file sharing system. But unlike Dropbox, Sync doesn’t store your files on a centralized server online. It syncs them between computers you own or computers your friends own. It allows easy file sharing, and you can sync an unlimited number of files as long as you have the space on your computers for them. (Dropbox offers that extra space, which many of us need.)

The most recent version of BitTorrent was released in 2013 and BitTorrent clients are available for a variety of computing platforms and operating systems including an official client released by BitTorrent, Inc.

What are some of the current legal uses? 

Some game companies use it for game updates and downloads. For example, Blizzard Entertainment uses its own BitTorrent client to download World of Warcraft, Starcraft II, and Diablo III. When you legally purchase one of these games and download it, a BitTorrent client does the downloading, and the game’s launcher automatically fetches updates for you the same way.

Facebook uses the BitTorrent protocol for propagating large files over a large number of different servers.

It also has educational uses. Florida State University uses BitTorrent to distribute large scientific data sets to its researchers. Many universities that have BOINC distributed computing projects have used the BitTorrent functionality of the client-server system to reduce the bandwidth costs of distributing the client-side applications used to process the scientific data. The developing Human Connectome Project also uses BitTorrent.

The popular Internet Archive uses the protocol to make its public domain content downloadable.

In 2010, the UK government released several large data sets showing how public money was being spent, offering them via BitTorrent to save on bandwidth costs and speed up distribution.

NASA has also used BitTorrent to make a 2.9GB picture of the Earth available.

Like Napster, which rebranded and reinvented itself as a "legitimate" music service after all the lawsuits, the official BitTorrent website has a list of “bundles” of music and videos. Artists make them freely available to hook fans, just as radio was once used to provide free music to large audiences in hopes that listeners would attend live shows and buy albums.

If we got rid of BitTorrent, another similar protocol would need to emerge.