Teaching Apprenticeships

When I was studying to become a teacher in the 1970s, the only "apprenticeships" were my student teaching experiences. At Rutgers, I went into secondary classrooms in a limited way in my sophomore and junior years and did my "student teaching" every day for an entire 15-week semester as a senior.

You might think of apprenticeships as a way of enabling students to learn by doing, but the model is most often used in vocational training, where a more experienced tradesman or journeyman models a task and provides feedback when the student attempts what was shown.

A teaching apprenticeship is a program that allows prospective teachers to work in schools while earning a paycheck and getting training. Apprenticeships are paid programs that can last one to three years. They offer on-the-job learning, mentorship, and a postgraduate-level qualification without tuition fees.

Alternate route programs for teachers are designed for people who want to become certified teachers but have not completed a formal teacher preparation program at an accredited college or university.

Where I live in New Jersey, the Alternate Route Teaching Certificate Program is a two-year program that includes 400 hours (24 credits) of education courses. The program is also known as the Provisional Teacher Process (PTP). It is designed for people who have earned a Certificate of Eligibility (CE) and have been provisionally hired by a New Jersey public school district.

The program accommodates candidates' schedules, allowing them to hold a full-time teaching position while completing the required coursework. To pursue an alternate route program, you typically need at least a bachelor's degree.

An article, "The Future of Teaching Apprenticeships," discusses how apprenticeships provide an innovative way for educators to experience real-life challenges and hone their professional skills. They allow aspiring educators to gain hands-on experience, mentorship, and practice in actual classrooms.

Educator apprenticeships strongly emphasize mentorship, pairing novices with experienced educators who serve as their guides throughout the program. Unfortunately, there are few programs like this for higher education teachers.

This mentor-mentee relationship allows apprentices to benefit from the wisdom and expertise of seasoned professionals while receiving ongoing support, constructive feedback, and opportunities to reflect on their teaching practice. The mentor model is not new: according to the Educator Prep Lab at the Learning Policy Institute, it is a best practice backed by a rich evidence base, and it supports educator retention and the other outcomes championed by teacher residency programs.

Can Bloom's Taxonomy Teach Us Anything About AI?

Image: spiral model (gettingsmart.com)

When I was studying to be a secondary school teacher, Bloom’s Taxonomy often came up in my classes as a framework for planning lessons and assessing learners. Recently, there have been several revisions to its familiar pyramid. An article on www.gettingsmart.com suggests a spiral might be better, particularly if you want to use it as a lens for viewing AI.

The author, Vriti Saraf, opines that the most important potential of AI isn’t to enhance human productivity but to enhance and support human thinking, and that looking at AI’s capabilities through the lens of Bloom’s Taxonomy showcases the possible interplay of humans and machines.

It is an interesting idea. Take a look.


Telling Students to Use AI


2023 was certainly a year for AI. In education, some teachers avoided it and some embraced it, perhaps reluctantly at first. Much of the reaction has been to AI that can write essays: some teachers, schools, districts, departments, and colleges have tried to ban it. Of course, that is impossible, just as it was impossible to ban the use of Wikipedia, or, going back to the previous century, the use of a word processor, a calculator in a math class, or copying and pasting information from the Internet.

What happened when an entire class of college students was told to use ChatGPT to write their essays?

Chris Howell, an adjunct assistant professor of religious studies at Elon University, noticed more and more suspiciously chatbot-esque prose popping up in student papers. So rather than trying to police the tech, he embraceded it. He assigned students to generate an essay entirely with ChatGPT and then critique it themselves. As Howell wrote about the experiment:

When I first caught students attempting to use ChatGPT to write their essays, it felt like an inevitability. My initial reaction was frustration and irritation—not to mention gloom and doom about the slow collapse of higher education—and I suspect most educators feel the same way. But as I thought about how to respond, I realized there could be a teaching opportunity. Many of these essays used sources incorrectly, either quoting from books that did not exist or misrepresenting those that did. When students were starting to use ChatGPT, they seemed to have no idea that it could be wrong.

I decided to have each student in my religion studies class at Elon University use ChatGPT to generate an essay based on a prompt I gave them and then “grade” it. I had anticipated that many of the essays would have errors, but I did not expect that all of them would. Many students expressed shock and dismay upon learning the AI could fabricate bogus information, including page numbers for nonexistent books and articles. Some were confused, simultaneously awed and disappointed. Others expressed concern about the way overreliance on such technology could induce laziness or spur disinformation and fake news. Closer to the bone were fears that this technology could take people’s jobs. Students were alarmed that major tech companies had pushed out AI technology without ensuring that the general population understands its drawbacks.

The assignment satisfied my goal, which was to teach them that ChatGPT is neither a functional search engine nor an infallible writing tool.

Source: wired.com/story/dont-want-students-to-rely-on-chatgpt-have-them-use-it/

Report: AI and the Future of Teaching and Learning

I see articles and posts about artificial intelligence every day. I have written here about it a lot in the past year. You cannot escape the topic of AI even if you are not involved in education, technology, or computer science. It is simply part of the culture and the media today. I see articles about how AI is being used to translate ancient texts with a speed and accuracy that are simply not possible for humans. I also see articles about companies now creating AI software for warfare. The former is a definite plus, but the latter is a good example of why there is so much fear about AI - justifiably so, I believe.

Many educators' initial reaction to the generative chatbots that became accessible to the public late last year was alarm that students were using them to write essays and research papers. That use spread through K-12 and into colleges, and even into academic papers written by faculty.

A chatbot powered by reams of data from the internet has passed exams at a U.S. law school after writing essays on topics ranging from constitutional law to taxation and torts. Jonathan Choi, a professor at the University of Minnesota Law School, gave ChatGPT the same test faced by students, consisting of 95 multiple-choice questions and 12 essay questions. In a white paper titled "ChatGPT Goes to Law School," he and his coauthors reported that the bot scored a C+ overall.

ChatGPT, from the U.S. company OpenAI, got most of the initial attention in the early part of 2023. OpenAI received a massive injection of cash from Microsoft. In the second half of this year, we have seen many other AI chatbot players, including Microsoft and Google, which have incorporated chatbots into their search engines. OpenAI predicted in 2022 that AI would lead to the "greatest tech transformation ever." I don't know if that will prove to be true, but it certainly isn't unreasonable from the view of 2023.

Chatbots use artificial intelligence to generate streams of text from simple or more elaborate prompts. They don't "copy" text from the Internet (so "plagiarism" is hard to claim) but generate new text based on the data they were trained on. The results have been so good that educators have warned it could lead to widespread cheating and even signal the end of traditional classroom teaching methods.

Lately, I see more sober articles about the use of AI: teachers including lessons on the ethical use of AI by students, and teachers using chatbots to help create their own teaching materials. I know teachers across K-20 who attended faculty workshops this past summer to try to figure out what to do in the fall.

The U.S. Department of Education recently issued a report on its perspective on AI in education. It includes a warning of sorts: Don’t let your imagination run wild. “We especially call upon leaders to avoid romancing the magic of AI or only focusing on promising applications or outcomes, but instead to interrogate with a critical eye how AI-enabled systems and tools function in the educational environment,” the report says.

Some of the ideas are unsurprising. For example, it stresses that humans should be placed “firmly at the center” of AI-enabled edtech. That's also not surprising, since an earlier White House “blueprint for AI” said the same thing. And an approach to pedagogy that has been suggested for several decades - personalized learning - might be well served by AI. Artificial assistants might be able to automate tasks, giving teachers more time for interacting with students. AI can give students instant feedback, "tutor-style."

The report's optimism appears in the idea that AI can support and help teachers rather than diminish their roles. Still, where AI will be in education in the next year or the next decade is unknown.