“You won’t lose your job to AI. You’ll lose your job to someone using AI.”
That stark warning, published in a recent Business Insider article, has rapidly become a rallying cry for educators, policymakers, and technologists. It signals a fundamental shift in how we think about artificial intelligence: not as a job-stealer but as a form of literacy. And like all forms of literacy, access is everything.
Access to AI is increasingly stratified in high schools across America. In well-funded school districts like Lower Merion and at elite liberal arts institutions like Haverford College, students are taught to treat AI not as a shortcut but as a partner in learning. In under-resourced inner-city schools, the story is very different.
At Lower Merion, widely considered one of the best school districts in Pennsylvania and known for its academic excellence and affluence, AI is already woven into the classroom culture.
“Staff can use AI-detection software to review assignments that students are supposed to create themselves,” says Amy Buckman, Lower Merion’s Director of Community Relations, “but can also direct students to use AI in the classroom as a thought partner to suggest improvements to their original assignments.” She notes that using AI responsibly is crucial now and will remain so, especially since many AI tools are both open-source and freely accessible.
At Haverford College, AI is already part of the admissions application process. Speaking on the role of AI in admissions essays, President Raymond stated, “Students can use it; they just have to do it smartly.” That careful, fine-line approach is extending campus-wide. “The libraries are taking the lead on working with students, providing resources on their website on how to do this smartly,” she says.
Haverford is even launching a new Institute for Ethical Inquiry and Leadership, to be co-led by computer science professor Sorelle Friedler, who specializes in the ethics of AI. For Raymond, ethical literacy is just as critical as technical fluency.
“Do we want to learn? Sure, let’s use ChatGPT to learn,” she says. “But let’s know enough that we know when ChatGPT is telling us garbage.”
This nuanced approach contrasts sharply with what’s possible in underfunded districts, many of which lack reliable internet access, let alone structured AI literacy programs. While schools like Lower Merion are already piloting AI-assisted tutoring platforms, according to a recent article in Education Next, many public schools in Philadelphia struggle to supply updated computers or to train teachers in the latest technological tools.
The result? A widening gap between students using AI to enhance critical thinking and those using it to bypass it.
That gap is not just about hardware or software. It’s about mindset, opportunity, and long-term consequences. According to Education Next, students who used a specially designed AI chatbot tutor in a physics class showed roughly twice the learning gains and far greater engagement than their peers in standard classroom settings. Another study found that students using AI for math tutoring outperformed their peers by as much as 127 percent, but saw a 17 percent drop when access to the tools was removed, revealing a dangerous dependence.
Educators like Sal Khan, founder of Khan Academy, believe we’re on the edge of a breakthrough—if implemented equitably. His platform, Khanmigo, built around OpenAI’s GPT-4, is designed to serve as a personalized AI tutor that breaks down complex problems into manageable steps and gives students real-time feedback.
But even Khan acknowledges the risks. AI tutoring systems that lack guidance can become just another “football yanked away,” as one article put it, like so many overhyped ed-tech tools before them. As “AI in Education” author John Bailey argues, the key difference this time is that this wave of tools doesn’t just democratize access to information; it provides access to expertise. For schools that can afford to train students and staff to use it well, that’s a game-changer. For schools that can’t, it’s a threat.
Some hope comes from the federal level. In 2023, the U.S. Department of Education released a report urging immediate action to prepare for the expected growth of AI in education technology. But implementation has been slow and inconsistent, and disparities persist.
Until then, the divide grows. Students in places like Haverford and Lower Merion are learning how to prompt AI, spot hallucinations, and wield tools like GPT-4 with discernment. In under-resourced schools, many navigate an AI-driven world without a roadmap—hoping for guidance that may never come.