Why Educators Shouldn’t Be Worried About AI

Technology and machinery have long exceeded human strength, speed, and
efficiency. But OpenAI’s recent release, ChatGPT, is an artificial intelligence chatbot
that—unlike previous technological advances—seems to mimic human intellect in a
manner rivaled only by science fiction.
In general, ChatGPT can perform much of the work done in modern educational
settings. This has led to a spate of articles with alarming titles like “Freaking Out
About ChatGPT,” “Will Everyone Become a Cheat?” “The End of High School English,”
and “The College Essay Is Dead.” A recent article opened with this ominous forecast:
“Professors, programmers and journalists could all be out of a job in just a few
years.”
In response to its unnerving capacity to disrupt core fixtures of education like
homework and essays, many are quick to point out the limitations of ChatGPT in
order to prove human brainpower is still greater than increasingly intelligent AI. Yet
while dialogue models like ChatGPT still have glitches and quirks, they will
undoubtedly get “smarter” and more adaptive with time.
ChatGPT’s capacity to arrange information and knowledge is a challenge to
educational assessment from elementary to graduate school. More accurately, AI’s
threat is proportionate to how we have come to define education.
As Amit Katwala, writing for Wired, put it, modern education is structured to teach
people a single skill: how to collect and transmit information. And in this sense,
alarm is warranted. To the extent education is reduced to absorbing and
regurgitating information, AI like ChatGPT and other adaptive dialogue models will
continue to surpass humans and radically reconfigure core dimensions of education.
But is this the right way to characterize education? Is this the role of a teacher?
Just as fears regarding automation and human labor reveal an impoverished view of
what a human is, fears of artificial intelligence and learning reveal an impoverished
view of education. If education is merely the transmission of information, then
advanced AI could very well be a kind of death knell for conventional educational
instruction and evaluation.
If, however, educational institutions exist for the formative development of
students—their intellect, character, morality, wisdom, judgment, prudence, service,
capacity, and unitive understanding of the world they inhabit and how they act
within it (and here I have the Christian university in mind)—educational institutions
will always have an important relevance for society, even a society increasingly
occupied by AI.
In Nicomachean Ethics, Aristotle claimed that the purpose of education was to
teach the student rightly ordered affections, desires, and impulses; to receive
pleasure and pain from proper objects was “correct education.”
G. K. Chesterton referred to education as giving us “abstract and eternal standards”
to judge fleeting conditions. French philosopher Simone Weil said that education is
a form of attention—seeing beyond the subject and ordering our minds and hearts
to higher things.
In this sense, educational instruction and learning should not be divorced from their
proper use, which involves a distinctly moral dimension.
“If we are disposed to judge apart from the larger questions,” writes Wendell Berry
in his essay “The Loss of the University,” “then a good forger has as valid a claim to
our respect as a good artist.” Or, as ethicist Martha Nussbaum writes, “A good
doctor is also a good poisoner.”
In other words, the difference between a life-saving doctor and a life-taking
poisoner or between an able artist and a master forger lies not in the possession of
knowledge or skill, but in its rightful application.
To use Berry’s phrase, addressing the “larger questions” of life must be part of any
comprehensive educational curriculum. The enterprise of learning is not just the
optimization of an end goal or value, but the prioritization of values.
Yes, schools should teach technical skills, foster the social and economic potential of
students, and prepare them for a dynamic workforce. But in its best sense,
education does not solely exist for job preparation. It involves providing students
with a unifying hub for the various academic spokes—to convey what knowledge,
skill, and craft are for.
That is education worthy of its name.
Former Google chairman Eric Schmidt claimed that by giving Google greater access
to your personal information, the popular search engine can “make you smarter.”
But can Google or any other adaptive AI technology make you better? More
empathic? Selfless? Virtuous?
Knowledge is not enough to constitute wisdom, judgment, prudence, and moral
excellence. One can be, to borrow an expression from the playwright Molière, a
“learned fool.” Or as Walker Percy reminded us, you can make straight A’s and “flunk
life.”
For this reason, Stanford University professor Rick Reis said educators should aim
for “productive discomfort”—creating fear in students because they “either cannot
articulate … the values that guide their lives, or that their values and beliefs do not
withstand scrutiny.”
To this last point, a holistic, unitive education says something about the teachers
who provide it. As most can attest, impactful teachers are not simply intelligent,
credentialed communicators who can effectively convey a concept or teach a skill—
they are more. Good teachers care and show interest in their students, name their
potential, broaden their imaginative horizons, and inspire them.
Though some have attempted to use AI in educational settings to create an
empathetic community, AI cannot empathize in a way a teacher can.
If teachers maintain patterns of connection and compassion for students—drawing
them into a larger story, pushing for “productive discomfort,” forcing evaluative
judgments and moral fortitude (what Robert and Edward Skidelsky called “educating
the sentiments”)—AI may change how assessment is proctored, but it will be
unlikely to change the enterprise of teaching.
Years ago, while doing graduate work at an institution in Scotland, I was privileged
to have renowned biblical scholar Richard Bauckham as a professor. We had to
write five papers for the semester, all graded on a 20-point scale. Submissions
would rarely, if ever, receive a score higher than 18—implying that there is no such
thing as a perfect paper.
My first submission returned a score of 17.5, the highest grade I had received in the
program. At that point, I was convinced I had “cracked the code” on British
assessment and the essay elements necessary to land a high grade. To my surprise,
however, my next paper’s score was lower. The next, lower still. For the rest of the
semester, the trend of decreasing scores continued.
During an end-of-semester feedback session with the professor, I spoke up. “I
maintained the same quality of writing, the same volume of sourcing, and the same
arc of argumentation,” I griped. “I met the stated criteria. I did what was asked. Why
were my scores lower with each submission?”
I will never forget Bauckham’s reply. After patiently bearing my appeal, he calmly
stated, “If you did not receive a favorable score on a paper, it is because you didn’t
dazzle me.”
At the time, I was annoyed. How was I supposed to “dazzle” a world-renowned
scholar? Is that really what I was being graded on?
Today, as an educator, I see the exchange much differently. Bauckham did not
simply want students to follow a pattern and reproduce information. He was not
merely looking for proficiency. Demonstrating understanding and competency was
important, but not enough. He wanted authorial voice—connection and originality
that advanced knowledge through reasoned argumentation.
Artificial intelligence like ChatGPT and its advanced future variants will continue to
elicit both wonder and fear. But while impressive, they cannot dazzle in the manner
Bauckham described. This speaks to the possibilities for modern education.
For this reason, I am encouraged.
First, technology can be used for beneficial purposes. Artificial intelligence can lower
the costs of educational delivery. It can expand educational access across a broader
spectrum of learners. AI can help students brainstorm research topics or consider
alternative perspectives. It may even be used as a tool to share the gospel. But as
Wendell Berry has warned, problems arise from our “willingness to allow machines
… to prescribe the terms and conditions” for our lives.
In other words, AI is a good servant, but it is a poor master.
I am encouraged for another reason: the future of education. As AI capabilities
grow, it will force institutions to define, describe, and practice education in a
manner that mirrors a more holistic vision of formational learning.
If teachers are viewed as gatekeepers of information, then they are already
obsolete. If education is merely rote memorization, information regurgitation, data
computation, and proficiency demonstration, then it must soon be radically
reconfigured.
AI will only grow in its capacity to accurately assemble information. But holistic
education means teaching and learning from an empathic educator who seeks to
locate knowledge and skill in a larger, unified context and to assess their rightful
application.
Poet John Keats warned of reductionist education that would “clip an angel’s wings”
or “unweave a rainbow.” For this reason, C. S. Lewis said the task of an educator is
“not to cut down jungles but to irrigate deserts.”
This kind of education is not only robust enough to withstand technological dynamism but
also an increasingly important fixture for cultivating the minds, hearts, and hands of
tomorrow’s citizens (including our citizenship in the city of God).
Educational institutions that embrace this vision have an opportunity to dazzle in a
way that cannot be artificially replicated.
Kevin Brown is the 18th president of Asbury University.