On our way to developing a curriculum for implementable real-world projects in professional classrooms, an interesting thing happened: generative AI became ubiquitous. This provided a new perspective on the ideas we established in our previous article, Why Can’t Your Real-World Project Live in the Real World? (Faculty Focus, March 2022).
In that article, we made a case for enlisting real-world clients as partner educators. We tested this idea by working with a client, Artists Alliance Incorporated, a non-profit arts organization in New York City that needed to rebuild its web presence. While we successfully redesigned the client’s website, we learned from this experience that the 15-week course structure presents a challenge for producing an implementable project with real-world impact. Why? Students have limited time to gain the professional knowledge needed to generate implementable recommendations, and partners have limited capacity to consistently answer students’ questions about the project. So, in trying to resolve these issues, we wondered, “Could we use generative AI to ‘hack time’?”
Where to begin hacking
The time-intensive elements we believe need to be “hacked” to help students develop the skills necessary to effectively respond to the client’s needs are:
- Bringing the students up to speed on content knowledge so they can provide a viable scope to the client
- Providing the students feedback on their thinking and recommendations without overloading the client
With the introduction of AI, we wondered if it could be used to augment the access students have to the teacher and the client. For example, could AI be harnessed to help students understand the core concepts of the course and increase their foundational knowledge? Could it also give on-demand feedback based on the client’s stated needs? Both are time-intensive, yet essential, components for implementing a real-world project.
To test our thinking, we developed two generative AI bots and introduced them to our students in a course we were teaching. One was a “Subject Knowledge Bot” (SKB), trained on introductory course content, and the other was a “Project Knowledge Bot” (PKB), trained on transcripts from a client interview detailing their business goals. The SKB provided students with timely knowledge to support the development of their ideas and solutions, while the PKB reduced demands on the client’s time by offering students project-specific insights.
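The article does not describe how the bots were built (they were most likely configured in an off-the-shelf chatbot platform), but the core idea of "training" a bot on course content or an interview transcript can be sketched as grounding: retrieve the passages most relevant to a student's question and inject them into the bot's instructions. The function and example passages below are hypothetical illustrations, not the authors' actual implementation.

```python
# Toy sketch of grounding a "Subject Knowledge Bot" in course material.
# One common approach: score each course passage for relevance to the
# student's question, then place the top passages in the system prompt
# so the model answers from course content rather than general knowledge.

def score(passage: str, question: str) -> int:
    """Count question words that appear in a passage (toy relevance score)."""
    q_words = {w.lower().strip(".,?") for w in question.split()}
    return sum(1 for w in passage.lower().split() if w.strip(".,?") in q_words)

def build_prompt(course_passages: list[str], question: str, k: int = 2) -> str:
    """Select the k most relevant passages and wrap them in bot instructions."""
    top = sorted(course_passages, key=lambda p: score(p, question), reverse=True)[:k]
    context = "\n".join(f"- {p}" for p in top)
    return (
        "You are a Subject Knowledge Bot for an internal-communications course.\n"
        "Answer only from the course material below and give formative feedback.\n"
        f"Course material:\n{context}\n\nStudent question: {question}"
    )

# Hypothetical course passages; a real bot would use the full course content.
passages = [
    "Internal newsletters should lead with information stakeholders can act on.",
    "Tone in internal communications should match the organization's culture.",
    "Budget planning is covered in the finance elective.",
]
prompt = build_prompt(passages, "How should a newsletter address stakeholders?")
```

A Project Knowledge Bot works the same way, except the passages come from the client interview transcript instead of course readings.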
So, how did the bots do?
We piloted the two bots in a Strategic Communications course taught at the NYU School of Professional Studies, Division of Programs in Business, during the spring 2024 semester. During the course, students were asked to redesign a monthly newsletter sent by the administration to internal stakeholders. To start, students were given previous newsletters and asked to assess their quality and efficacy against the foundational internal-communications concepts taught in the course. The students submitted their personal assessments and course examples to the SKB and requested the bot’s feedback. Based on the bot’s response, the students developed a set of recommendations for how the administration could improve the newsletter. The purpose of this exercise was to strengthen the students’ knowledge of internal communications and fill any gaps in their understanding. The bot reduced the time required to build and extend the students’ knowledge base, enabling them to respond more swiftly to the client’s needs.
After faculty review, the students shared their recommendations with the PKB, which had been trained on a transcript of an interview we conducted with the client. In the interview, the client discussed and assessed their newsletter, focusing on layout, design needs, and the suitability of the content for the target audience. Using the bot’s feedback, the students reevaluated their recommendations and prepared a revised response. The students then developed a final presentation outlining what they considered to be their top recommendations based on their conversations with the PKB. The project concluded with the client selecting the most viable ideas, which were then used to produce a year-end newsletter presented as “for-students, by-students.”
What comes next?
When we began this effort, we assumed that AI could provide a way to strengthen the student-client relationship by giving the students unrestricted access to:
- the skills and strategies taught in the course using a bot trained on the course content and
- the client using a bot trained on the project requirements and needs.
Based on what we had previously discovered (Is Your Real-World Experience Real Enough? Faculty Focus, March 2020), our thinking was that deepening the linkages between the learner and the client could foster a collaboration that would lead to an implementable project.
Overall, the bots were successful because they helped the students achieve several outcomes in a shorter period of time. The students became better versed in basic strategies for stakeholder communications, and the bots provided immediate feedback on their communication recommendations. These aspects had previously been challenging due to the constraints of the course schedule and the impracticality of expecting clients to address the high volume of student inquiries.
While we were pleased with the experience and the students were able to produce recommendations that were implemented by the client, we believe it is still too early to determine how effective this approach would be in applied classrooms on a broader scale. For example, some issues that still need to be addressed include:
- Access to generative AI tools: For this pilot, using the bots we designed required a paid ChatGPT account; students without one had to fall back on untrained, free-tier generative AI tools.
- Strengthening student AI skills: Unfortunately, many students used the bots simply to write their recommendations for them, rather than treating the bots as conversational, collaborative partners.
However, despite limitations, we believe there is real potential for AI to “hack time,” i.e., respond to some of the real challenges we, as teachers, have faced in working within a 15-week timeframe to produce implementable projects in professional, applied classrooms. We intend to continue utilizing AI in this manner while refining our approach based on our findings. This includes enhancing the bots’ knowledge base, improving equitable access to the bots, and developing clearer norms and guidelines for their use.
Dr. Paul Acquaro is a lecturer at FOM University of Applied Science for Economics and Management in Berlin and an adjunct assistant professor teaching online with New York University’s School of Professional Studies. He teaches undergraduate and graduate courses in database development, web technologies, IT management, business communication, and project development. Acquaro has over 20 years of experience in information technology, communications, curriculum development, and teaching, and earned a doctorate in education, focusing on instructional technology, from Teachers College, Columbia University. Among his many interests is exploring how to combine the possibilities of online learning with the power of problem-based pedagogy.
Dr. Steven Goss is the dean of the School of Professional Studies at Manhattan College. He joined Manhattan College after serving as the vice provost of digital learning at Teachers College, Columbia University, where he helped to facilitate the institutional mission for online education. Prior to Teachers College, Goss led several successful online initiatives at Bank Street College of Education and New York University. He has received awards from the Association for the Advancement of Computing in Education (AACE) and the Online Learning Consortium (OLC) for his research on learner-centered online education.