I have seen friends on Facebook create decent songs and stunning artistic creations with little knowledge of music or art, all after spending a bit of time getting to know an AI art or music generator. But since the grammar assistants in my word processors often flag what is already correct and miss what I wish they had caught, I never felt that AI writing was advancing very quickly.
And then I met ChatGPT. The Facebook teaching page for my university has taken off on the topic, so I took a deep dive into what it can do. I’ve seen it create (in a flash) movie scripts and comic strips, sonnets and grant proposals, graduate course syllabi and lessons. It can solve math problems, showing all its work with written explanations. Nearly any writing prompt one might assign to be completed outside of class (with a few notable exceptions) can be answered reasonably well, quickly, at no cost, and in a way our current plagiarism software cannot detect, by anyone who takes a little time to learn the nuances of ChatGPT. I am spending the day after Christmas writing this because I don’t want anyone to lament, “Why didn’t anyone warn me about this sooner?”
This coming semester, were you planning on asking students to explain the difference between operant and classical conditioning, compare and contrast the writing styles of Octavia Butler and Louise Erdrich, write a lab report, analyze a speech, create an LLC, or design a brochure? ChatGPT has them covered.
Let me be clear about my first goal: I desperately hope I can convince you to take an hour, right away, to get a feel for what ChatGPT is and what it is capable of. I’ve collected some ideas and resources to get you started. A few will walk away from this exercise thinking, “What a relief. Nothing needs to be modified in the courses I teach; ChatGPT will not impact my instruction, but I’m glad I can converse about it.” But I suspect the vast majority, and especially those who teach online courses, will recognize that this is a game-changer that may require substantial course revisions before the new semester.
But I have a second, more important mission for once you’ve taken a look, perhaps worried about how to approach this technology, and wondered how swiftly AI is going to improve and expand. I want to beg you not to turn to increased punishment, surveillance, and control, and instead consider how this fascinating turn of events might be a reason for rejoicing. Might this be an opportunity to turn away from assembly-line efficiency and toward a model where we help students use AI to extend their capabilities, allowing them to pursue interests and solve wicked problems? Could this be a chance to design a model where students wouldn’t dream of using AI unethically or allowing it to steal their learning, a model where educators find more meaning and purpose in their work as well?
At a time when it may feel like AI is stripping away our humanity, I wonder if it might be possible that it is handing us an opportunity to work with students far more than we talk at them. What if students bring strengths, interests, technical knowledge, and comfort in thinking outside the box and we pair that with our own substantial learning and talents and then bring what AI can contribute into the mix? What could we accomplish then?
Let’s say, for the sake of argument, that basic writing will now be generated by AI initially, tweaked with a few additional prompts, and then polished by the human writer. Or, for more creative, personal, analytic, or cutting-edge writing, the work will start with the human writer and be polished in the end by AI. We’d all continue to do a lot of “writing to learn”: writing to help us think, brainstorm, and make sense of our thoughts and feelings, with no desire to use AI. Would all this be so terrible? Already when I write I use loads of “assistants.” Word checks my spelling and grammar, and I use its thesaurus to find the ideal word. Google helps me fact-check. A friend might make suggestions. Citation generators help with APA style. If I am writing something that follows conventions I am less familiar with, such as a book proposal or grant, I look at others’ examples. Is any of this “cheating”? How might what counts as “cheating” change in the age of AI?
My hope is that this shake-up will force us to address the underlying problem—that while humans are naturally curious and will often pursue out-of-school learning with great fervor, much in-school learning feels trivial and tedious. The pandemic brought even more students to the conclusion that much of the educational endeavor is not worth the considerable amount of time it requires. But as I noted in “The nail in the coffin: How AI could be the impetus to reimagine education,” we could “enchant learning in higher ed in such a way that people flock to it when they need to be energized, when they need a balm, when they are trying to figure out their why, when they have burning questions they want support in pursuing.”
What might this look like? Students would need to direct their own learning with our guidance; I suspect that’s a non-negotiable prerequisite for learning that beckons. We may need to let go of our attachment to a common curriculum marched through in a specified order. When a student is pursuing learning, almost nothing can stop them. When they are forced, developing the intrinsic motivation needed for quality learning is a tall order. What could we imagine if we moved beyond our fears that teaching in our particular field might need to change drastically?
We would spend far more time developing information literacy, teamwork, research skills, study skills, and metacognitive skills so that students wouldn’t be so dependent on us and could engage in more self-directed learning. We’d have loads of discussions on what constitutes ethical use of AI in this brave new world and ethics in general. What do we want our world to be like? How can we move in that direction?
While foundational knowledge is required for higher-order thinking, we often focus primarily or almost exclusively on the foundational. In this new paradigm, we would point students toward the appropriate modules to develop that foundational knowledge, and we’d move students as soon as possible into problem-, project-, and case-based learning, much of it personalized and experiential or field-based. We would mostly be working with and alongside students, facilitating and supporting them, and letting AI do some of the heavy lifting.
What if a key task for more expert students were to create modules for more novice students, with our support? These students would select meaty resources, devise interesting ways to engage others with those resources, and create fascinating and interactive modules that build on what we know about how to make learning stick. The more expert students could learn to work with the novices to clear up confusion, discuss nuances of understanding, and so forth.
That’s my vision. Running with Robots is a wonderful collection of other possible visions, some real and some imagined. While it is focused on high school, the majority of the concepts would be transferable to higher education. What is your vision? If your goal were to make learning so meaningful, worthwhile, and alluring that students wouldn’t dream of cheating themselves AND make your own job deeply satisfying, what would that look like? What role might you play in getting this conversation started at your institution? I’m going to facilitate a faculty learning community on reimagining higher education; perhaps you’ll do the same.
Cynthia Alby has spent most of her career immersed in “avid cross-disciplinary idea synthesizing.” Her primary research question is, “How might we re-enchant learning in order to help faculty and students flourish?” She joined Georgia College in 2001, where she is a Professor of Teacher Education. She recently co-authored a practical book on course design for changing times: Learning That Matters: A Field Guide to Course Design for Transformative Education. In addition, for nearly 20 years she has helped to develop faculty from across Georgia through the “Governor’s Teaching Fellows Program” at the Louise McBee Institute for Higher Education at UGA.
References:
Alby, C. (2022, December 21). ChatGPT: Understanding the new landscape and short-term solutions [Web log post]. Retrieved from https://learningthatmatters.weebly.com/resources.html
Alby, C. (2022, December 23). The nail in the coffin: How AI could be the impetus to reimagine education [Web log post]. Retrieved from https://learningthatmatters.weebly.com/resources.html
Toppo, G., & Tracy, J. (2021). Running with robots: The American high school’s third century. The MIT Press.