As AI continues to reshape the educational landscape, trust in AI-powered tools remains a crucial yet challenging issue, particularly in the humanities. JSTOR, the digital library of academic journals, books, and primary sources, published an interview with Professor Alexa Alice Joubin on critical thinking and AI, a follow-up to her recent webinar hosted by JSTOR / ITHAKA (see the YouTube video). Joubin brings a key insight to critical AI studies: she views AI not as an ultimate source of plausible answers but as a performative and interpretive tool, much like a theatrical performance.
Instead of expecting factual accuracy, students should approach AI-generated text critically, much as they would analyze a literary work. AI, like theatre, creates “virtual worlds” that are rooted in the real but are not always factual. By recognizing the performative nature of AI, students can develop a healthy skepticism, treating AI as a tool for inquiry rather than an unquestionable authority. AI is known to hallucinate or fabricate information, yet this capacity for invention resonates with Shakespeare’s depictions of neurodiversity, ghosts, and strange dreams. As Joubin puts it, “Through words, Shakespeare created virtual worlds. Through simulating textual patterns, AI constructs a similar type of virtuality that is partially rooted in the real world and partially in probable worlds.”
Insights from Renaissance studies also show that this is not the first time new technologies have created anxieties and a crisis of attention. In the era of the printing press, people were unsure how to manage the deluge of printed books efficiently. The illustration shows a Renaissance-era spinning book wheel, a rotating reading desk that allowed scholars to read and compare multiple books simultaneously.
The key takeaway? AI should be a “social technology,” not a replacement for critical engagement. For example, instead of asking AI for a definitive answer, students can use Joubin’s open-access AI to simulate different communities’ reactions to a scenario. This approach can broaden students’ horizons beyond their own age group and backgrounds.