The Age of Context

The future needs to be contextual. Wait. Hasn't it always been necessary to be contextual? You are not going to get through a day successfully without examining the set of circumstances or facts that surround events and situations. So, why a new age of context?

Add to the discussion some other terms like "predictive technology" and "anticipatory software" and the new age might make more sense. Predictive technology uses learning algorithms and observed data to build behavioral models. It's not that we haven't already built human behavioral models. One problem with those models is that they usually require information from the user, and that part of the data-gathering process often yields inaccuracies. For example, you don't always tell the truth on those surveys.
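As a minimal illustration of the idea (not any particular product's method), a behavioral model built from observed data can be as simple as counting which action tends to follow which, then predicting the most frequent follow-up. This hypothetical sketch uses made-up event names:

```python
from collections import Counter, defaultdict

def build_model(events):
    """Count transitions between consecutive observed actions."""
    model = defaultdict(Counter)
    for prev, nxt in zip(events, events[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, current):
    """Return the most frequently observed follow-up action, or None."""
    if current not in model:
        return None
    return model[current].most_common(1)[0][0]

# Observed behavior, no survey required
log = ["wake", "coffee", "email", "wake", "coffee"]
model = build_model(log)
print(predict(model, "wake"))  # the action most often seen after "wake"
```

The point of the sketch is the contrast the paragraph draws: nothing here asks the user anything; the model comes entirely from watching what actually happened.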

One example of anticipatory software is Google Now. It is a kind of search where you don't search. Google Now looks at your digital life contextually. Some people get suspicious of Google looking in their email for keywords and at their calendar appointments for mentions of things like travel to other cities. But by doing so, it builds "cards" that might tell you the weather in the place you're going this morning and whether the flight is on time. That's useful, right?

I'm sure that some people will find this wonderfully amazing. And some will find it frightening in a privacy sense.

A quick search on Wikipedia also turns up contextual advertising (advertisements based on other content displayed, which Google, Facebook and others already use), contextual design (user-centered design), contextual inquiry (a related user-centered design method) and contextual learning (learning outside the classroom).

In this form of search, it is your behavior that creates predictive/anticipatory searches. I am sure that more of this type of data mining will occur, especially if it drives advertising revenue. I'm sure that Google is not alone in using this and expanding on these models.

I saw a post last week that mentioned Robert Scoble calling this "the age of context." (He is writing a book on it along with Shel Israel.) I'm not sure we are entering an age of context quite yet. There are more and more context-based services and products, but contextualization is not mainstream yet.

If all this data probing frightens you from a privacy point of view, then maybe you need to consider the perspective of people like Jeff Jarvis. His book, Public Parts, has a subtitle that tells you his point of view: "How Sharing in the Digital Age Improves the Way We Work and Live." Jarvis contends that "publicness" isn't really new and takes us back to earlier ages when innovations such as the camera and the printing press also created privacy fears. He is writing about the new industry that is based on sharing. Facebook, Twitter, Google (which he wrote about extensively in an earlier book) and all the photo and status sharing applications certainly exploit our willingness to share.

Whether or not this publicness will ultimately lead to creative collaboration and change the way we think is yet to be determined. It is changing how we manufacture, market, buy and sell. That benefits some, but it would be more significant if it improved how we organize, govern, teach and learn.

Even observers like Jarvis recognize that understanding the limits of privacy is necessary to protecting it. If anticipatory and predictive technologies are going to be successful, then it will require that users trust the system.

Will the day come when our devices will know us better than our friends? That is scary in a sci-fi kind of way.  And like a lot of sci-fi, it just may come true.





Triangulating and Visualizing Big Data


We always had data. But then came big data. And now we need to connect big data. The Chronicle of Higher Education did a special report on "Big Data's Mass Appeal."  The Sloan Consortium devoted a whole journal issue to it.

In trigonometry and geometry, triangulation is the process of determining the location of a point by measuring angles to it from known points at either end of a fixed baseline. (Trilateration, by contrast, measures distances to the point directly.) Triangulation methods were used for accurate large-scale land surveying until the rise of GPS and other global navigation satellite systems in the 1980s.
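The geometry described above can be worked out with the law of sines. As a small illustrative sketch (the function name and the convention that the unknown point lies to the left of the baseline are my own assumptions), given the two baseline endpoints and the angle measured at each one, the unknown point's coordinates fall out directly:

```python
import math

def triangulate(ax, ay, bx, by, alpha, beta):
    """Locate point P from known points A and B on a fixed baseline,
    given angles alpha (at A) and beta (at B) measured from the
    baseline toward P.  Angles in radians; P is taken to lie
    counterclockwise of the A->B direction."""
    c = math.hypot(bx - ax, by - ay)            # baseline length
    # Law of sines: |AP| / sin(beta) = c / sin(alpha + beta)
    ap = c * math.sin(beta) / math.sin(alpha + beta)
    base_angle = math.atan2(by - ay, bx - ax)   # direction of baseline A->B
    px = ax + ap * math.cos(base_angle + alpha)
    py = ay + ap * math.sin(base_angle + alpha)
    return px, py

# Baseline from (0,0) to (1,0), 45-degree sightings from each end:
print(triangulate(0.0, 0.0, 1.0, 0.0, math.pi / 4, math.pi / 4))
# the point lies at (0.5, 0.5), above the midpoint of the baseline
```

Only angles are measured; the two known endpoints and a bit of trigonometry do the rest, which is exactly why the method dominated surveying before satellite positioning.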

It has a somewhat more figurative usage in business and education when we use it in "surveying" data. As is usually the case, the business world was on it faster than education. The big companies have lots of data that is untouched in organizational and software "silos." Triangulation means connecting these silos.

This is what led to "data-driven decision making" back in the 1980s. Now the buzz term is "big data," which uses software to do analytics. In education, we are still early in being able to do this and produce useful results for administrative functions. Our use of it for instructional analytics is even more rudimentary.

Businesses love being able to take data about customer experiences and see trends emerge as customers conduct normal business. These are not surveys or focus groups. The data comes from any number of everyday operations, and it often triangulates customer activities in social networks, including Twitter trends, Facebook likes and other interactions.

How educational institutions will use their data, and what angles they will use to survey that data, are questions that remain unclear. Two areas of concern in education that are using big data (see links below) are improving course success and completion rates.

One site that I have been looking at is the Visualization and Behavior Group at IBM because I like their perspective that data visualization should make data analytics accessible to anyone, not just the experts. They see using social software for communication as the new norm, not a trend.


All About Serendipity

From my daily reading of The Writer's Almanac, I found this post for today:

It was on this day in 1754 that the word "serendipity" was first coined. It's defined by Merriam-Webster as "the faculty or phenomenon of finding valuable or agreeable things not sought for." It was recently listed by a U.K. translation company as one of the English language's 10 most difficult words to translate. Other words to make their list include plenipotentiary, gobbledegook, poppycock, whimsy, spam, and kitsch.

"Serendipity" was first used by parliament member and writer Horace Walpole in a letter that he wrote to an English friend who was spending time in Italy. In the letter to his friend written on this day in 1754, Walpole wrote that he came up with the word after a fairy tale he once read, called "The Three Princes of Serendip," explaining, "as their Highnesses travelled, they were always making discoveries, by accidents and sagacity, of things which they were not in quest of." The three princes of Serendip hail from modern-day Sri Lanka. "Serendip" is the Persian word for the island nation off the southern tip of India, Sri Lanka.

The invention of many wonderful things has been attributed to "serendipity," including Kellogg's Corn Flakes, Charles Goodyear's vulcanization of rubber, inkjet printers, Silly Putty, the Slinky, and chocolate chip cookies.

Alexander Fleming discovered penicillin after he left for vacation without disinfecting some of his petri dishes filled with bacteria cultures; when he got back to his lab, he found that the penicillium mold had killed the bacteria.

Viagra had been developed to treat hypertension and angina pectoris; it didn't do such a good job at these things, researchers found during the first phase of clinical trials, but it was good for something else.

The principles of radioactivity, X-rays, and infrared radiation were all found when researchers were looking for something else.

Julius Comroe said, "Serendipity is looking in a haystack for a needle and discovering a farmer's daughter."

Wiktionary lists serendipity's antonyms as "Murphy's law" and "perfect storm."

When Tim and I launched Serendipity35 back in February 2006 as an experiment in blogging, the name came from the software we used for the blog, and I added the somewhat ironic 35 to make the serendipitous a bit less serendipitous. Still, I do like the idea that the blog has some serendipity and that I am "finding valuable or agreeable things not sought for" and commenting on them so that others benefit.


What 2012 Educational Trends Will Continue into 2013?

I made a presentation at the end of 2012, titled "It’s the End of the University As We Know It," in which I gave my ideas about how the next ten years will transform universities. The fear factor in the title wasn't just a Mayan calendar reference or hyperbole. If you are an educator or institution that still holds onto the university model that has existed for almost 900 years, you have reasons to worry.

I focused on technology in education trends that were big last year that I believe will be around through this year and beyond in some form. The trends I selected are open educational resources (OER), MOOCs, big data, non-degree programs and alternatives to a traditional university degree, platforming and who sets the curriculum.

I was speaking to a primarily higher education audience, so I said that these trends will lead us closer to a University 2.0. But these things will have a broader impact and will probably move us more to School 2.0 or a Learning 2.0 without a focus on post-high school education.

You can't ignore these trends being in K-12 classrooms. Those schools are reconsidering some current fundamental assumptions: giving students grades, partitioning them according to age, and methods of proving competency. Will high schools, and maybe even middle schools, begin to operate less like factories and more like colleges? Will the ubiquity of high technology blur the distinction between being in and out of school? Will high school students start learning in MOOCs? Well, they already are in MOOCs, but probably not with the cooperation of their schools. Who thinks of a 16-year-old as a lifelong learner?

I'm going to elaborate some on those trends and trends from a few other sources.

Massive open online courses (MOOCs) have been around since that term was applied to a course in 2008, but they seemed to explode in 2012. I have written enough posts here in the past year, so I don't want to repeat myself. But when Coursera got Princeton, the University of Michigan, Stanford and Penn on board early last year, and then over the summer California Institute of Technology, Duke University, Ecole Polytechnique Federale de Lausanne, Georgia Institute of Technology, Johns Hopkins University, Rice University, UC San Francisco, University of Edinburgh, University of Illinois at Urbana-Champaign, University of Toronto, University of Virginia and University of Washington all agreed to provide courses, it really caught the attention of higher education and the mainstream media.

I am all for the democratization of knowledge. Making knowledge available is important. But it's not the equivalent of an education. If it were, all you would need is a massive open online library. We have that. It is called the Internet.

Perhaps, when we get the delivery of MOOCs settled, we can focus on the pedagogy. Actually, educators are already making a distinction between xMOOCs (see Coursera, edX), which are closer to our formal (traditional) course pedagogy, and cMOOCs, which are connectivist in design. In the xMOOC, learners are expected to duplicate or master what they are taught. In cMOOCs, that relationship between teacher and learner changes. The learning is distributed, somewhat chaotic, and emergent. The expectations for learners are lofty: to create, grow and expand the domain and then share personal sense-making through artifact creation. There should be less of the xMOOC's centralized discussion and more distributed, learner-created forums and learning/collaboration spaces.

I don't really see the end of the university degree, but I do see it as carrying less weight in the working world. I'm not sure what version of certificates, badges, or corporate endorsements will emerge as the currency of competency and mastery. One thing that has emerged from the advent of open courseware and MOOCs is that there are lots of people out there who really do want to learn “just” to learn and are not interested in paying tuition or getting any credits.

Competency-based degrees may be one result. Lumina and the Gates Foundation are working with institutions that either run competency-based degree programs or want to try them. Some of the big publishing vendors have moved into the learning, course creation and delivery spaces. Pearson is partnering in some competency-based degree programs. Southern New Hampshire University has proposed a competency-based associate degree program that seeks to directly assess students’ competencies rather than mapping them to credit hours. It has secured approval from its regional accreditor, the New England Association of Schools and Colleges Commission on Institutions of Higher Education.

Goddard College’s low-residency semester format comprises an intensive 8-day residency on campus and 16 weeks of independent work and self-reflection in close collaboration with a faculty advisor. This isn't new. Goddard invented the low-residency model in 1963 to meet the needs of adult students who had professional, family, or other obligations. These are frequently students who, besides flexibility, also want learning experiences with relevance in real-world circumstances.

Open Everything: Open source, open software, open textbooks and courses are enabling a kind of Open Learning. 

Who is setting the curriculum table these days? Personally, it frightens me that it is less likely to be a teacher or a department selecting the textbook or the syllabus. Some people call it the "platforming of education." It's companies like Blackboard, Pearson, Cengage and perhaps even Google consolidating their platforms and moving deeper into creating educational content.

Perhaps you don't think about companies like Google being in education in the same way as Pearson or Blackboard, but their Apps for Education product has 20 million users. Those users can now have Google+, bundled education apps from the Chrome Web Store, and some are using Chromebooks acquired through education pricing deals. Google even plays a role, via YouTube, in another trend: flipping the classroom.

That particular trend is one I think was overhyped in 2012. Flip teaching, or the flipped classroom, is a form of blended learning that uses technology to flip learning so that a teacher can spend more time interacting with students instead of lecturing. This is most commonly done using teacher-created videos that students view outside of class time. It flips/reverses the traditional pattern of teaching, where we assign students to read a textbook at home to be discussed in the next class, and their homework would be an assessment that should demonstrate their mastery of the topic. When the classroom is flipped, the student first gets the video lessons as homework and then in class does the assessment "homework" of solving math problems or writing an essay, with the teacher there to tutor the student when needed.

Big Data is big. And complicated. To simplify big data, let's just say that we now collect a lot more data and are just starting to develop the tools to do analysis that allows us to predict outcomes. As Audrey Watters says, "big data was big business in 2012, and lots of companies released data and analytics products." She gives a long list of products being used in K-12 education, like the Shared Learning Collaborative, a Gates Foundation-funded initiative of data stores, APIs, and Learning Registry-related content tags.

Cognitive theory will make learning analytics systems more relevant and effective. Students and instructors can both benefit from access to better information about the state of learning. Big data analytics is here, but for many of us it is still, as George Strawn defined it, "any data we don't understand well enough to computerize." But now we are looking at fine-grained information about student experiences, university processes and emergent trends such as student learning, enrollment, course success, lifestyle and tech use. And this is primarily data that is not coming from surveys but is generated as students and staff conduct their normal business, such as using a learning management system or communicating on social networks.

Do I like educational decisions being driven by data? No, I do not. But I'm not sure we will be given that choice.

In preparing this post, I took a look at the top 10 "Wired Campus" stories from The Chronicle of Higher Education last year. What pattern do you see from the titles?
1. Stanford Professor Gives Up Teaching Position, Hopes to Reach 500,000 Students at Online Start-Up
2. Could Many Universities Follow Borders Bookstores Into Oblivion?
3. Minnesota Gives Coursera the Boot, Citing a Decades-Old Law
4. Khan Academy Founder Proposes a New Type of College
5. Elsevier Publishing Boycott Gathers Steam Among Academics
6. Coursera Announces Big Expansion, Adding 17 Universities
7. 3 Major Publishers Sue Open-Education Textbook Start-Up
8. Students Find E-Textbooks ‘Clumsy’ and Don’t Use Their Interactive Features
9. Now E-Textbooks Can Report Back on Students’ Reading Habits
10. Udacity Cancels Free Online Math Course, Citing Low Quality

Four of those articles are about changing education with offerings that are not from traditional schools. Six articles are about changing the publishing and marketing of textbooks.

A Book Industry Study Group survey of college faculty perceptions of classroom materials found that 88% of professors still prefer to assign the printed versions of textbooks and other class materials. The survey also found that while 32% of faculty reported making digital versions of textbooks available, just 2% of students said this was the primary way they accessed the materials.

Annie Murphy Paul is one of many people who see mobile devices (laptops, tablets and phones) as a learning game-changer. Schools are experimenting with BYOD ("bring your own device") policies. Rather than telling students to close the laptop and put away the phone, some innovative teachers are finding ways to integrate mobile technology into instruction.

That presentation I gave had an apocalyptic feel to it, but I ended my presentation by recognizing that we have already heard about the death/end of the novel, theater, movies, broadcast television, newspapers, print, libraries, record/music sales, CDs, DVDs, software, and traditional computers. And did any of them vanish? No, though some of them have had quite dramatic changes and they may vanish in the next decade. The end comes slower than we imagined. Not with a bang, but maybe not with a whimper either.

I see a few higher education trends that are very real and that are not coming from within colleges. The push from outside to get students through school faster and have them ready for jobs is very real. The government and some voices from the world of industry are looking to people such as Salman Khan (Khan Academy) and to funding sources like the Gates Foundation, which want to see big changes in classrooms. That is a trend that will be with us for more than a few years.



More on Academic Credit for MOOCs

Georgia State University announced Tuesday that it will start to review MOOCs for credit much like it reviews courses students have taken at other institutions, or exams they have taken to demonstrate competency in certain areas.

And Academic Partnerships, a company that works with public universities to put their degree programs online, announced an effort in which the first course of these programs can become a MOOC, with full credit awarded to those who successfully complete the course. The educational idea is that this offering will encourage more students to start degree programs. The financial idea is that the tuition revenue gained by participating institutions when students move from the MOOC to the rest of the program (which will continue to charge tuition) will offset the additional costs of offering the first course free.

Among the first universities planning to make the debut course in their online programs a MOOC are Arizona State, Cleveland State, Florida International, Lamar and Utah State Universities and the Universities of Arkansas, Cincinnati, Texas at Arlington and West Florida.


Futurelearn To Offer MOOCs in the UK

Non-American universities are considering different options for entering the MOOC market, which to date has been dominated by elite U.S. institutions.

The California-based MOOC provider Coursera counts eight foreign institutions among its 33 university partners. Meanwhile, 12 universities in the United Kingdom have launched a new MOOC platform of their own.

The Open University, a distance education institution based in London, recently announced the formation of Futurelearn in partnership with Cardiff and Lancaster Universities; the Universities of Birmingham, Bristol, East Anglia, Exeter, Leeds, Southampton, St Andrews and Warwick; and King’s College, University of London.

Initial marketing material for Futurelearn emphasizes its U.K. identity -- asserting that Britain should be at the forefront of advances in educational technology and stressing that, until now, U.K. universities interested in offering MOOCs have only had the opportunity of working with U.S.-based platforms. However, Futurelearn’s CEO, Simon Nelson, said the company is open to eventually working with universities outside the U.K.


Read "Multinational MOOCs" insidehighered.com/news/2013/01/22/foreign-universities-consider-how-best-enter-mooc-market