The Future Is Yesterday
As generative AI tools make waves in fields as diverse as banking and the visual arts, the academy must decide how it will adjust to these new technologies, and quickly
On November 30, 2022, OpenAI, a privately held technology company in San Francisco, launched ChatGPT, a chatbot with a conversational, dialogue-based format that creates the illusion of a human-like back-and-forth with the user.
What makes ChatGPT and similar generative artificial intelligence (AI) programs such as Microsoft’s Compose and Google Bard different from the autocomplete feature on your smartphone is that they can generate seemingly original text based on specific input. Provided with a prompt such as “Write a five-paragraph essay about symbolism in Hemingway’s The Sun Also Rises for a college English class,” a generative AI program does just that.
The initial capabilities of ChatGPT seemed so impressive that just six days after the app was introduced, a headline in The Atlantic asked, “Will ChatGPT Kill the Student Essay?” Author Stephen Marche has apparently made up his mind since initially asking that question: The online article now runs under the title “The College Essay Is Dead.”
Whether his prophecy will ultimately prove true or not, Marche’s essay started a media firestorm of conflicting opinions, informed and otherwise, about how AI would either ring in a transformational new era for universities or sound their death knells. The predictions about how ChatGPT and similar technologies will affect higher education have been nonstop ever since, but what’s clear is that these tools have already taken root, and there will certainly be more of them to come.
AI in the Academy
Pepperdine, like most universities, is adapting to these generative AI programs as quickly as possible. “Generative AI is disrupting teaching and learning by changing how skill and knowledge assessments are done and creating new educational use cases and pedagogical approaches,” says David Smith, senior associate provost of information technology (IT), online learning, and international program facilities. “It could also negatively affect the cultivation of creativity and inquiry among students and pose ethical challenges for teachers and learners. On the other hand, it will also offer opportunities to enhance personalized learning experiences, support faculty in research, expand access to knowledge and resources, and foster interdisciplinary research.”
Cognizant of both the opportunities and risks that generative AI tools pose to higher education, Jonathan See, Pepperdine’s chief information officer since 2012, recently established an AI Advisory Committee under the authority of the Office of the Provost. Comprising faculty members from all five of the University’s schools, as well as representatives from Human Resources, the Office of the General Counsel, the Office of Community Standards, and the IT department, the committee is charged with developing University-wide guidelines and best practices for the use of AI. It has already produced a faculty guide for creating AI-use statements for course syllabi.
“Pepperdine recognizes that these types of generative AI tools are here to stay,” says See. “So the question becomes, ‘How do we embrace and adapt to these tools so that we’re enhancing student learning and maintaining our academic standards?’ AI is only going to get better in the future, so how do we use these tools effectively while maintaining data privacy and security?”
To prepare for his task, See even asked an AI tool about “ethical considerations for uses of AI” to help him generate areas for the committee’s focus. “AI creates a certain efficiency, but it still requires the human element to ensure that the content generated meets University standards,” he says. “We recognize that the human element is critical and cannot be forsaken.”
Questions about the ethical uses of generative AI go beyond how the technology is used; many have concerns about AI-generated content promulgating biases and prejudices, intentional or not, embedded in the source texts that make up a given program’s dataset. “AI was developed by humans who have inherent biases and flaws,” says Hattie Mitchell (MPP ’12), visiting professor of education and policy at the School of Public Policy. “We should expect AI to represent some of the same biases and flaws that the humans who created it do, and early research indicates that it does. But if used with discipline, boundaries, and thoughtfulness, AI can be of tremendous added value to our work at the university level.”
Pepperdine’s faculty guide to the use of AI tools, which has been accepted by the University’s provost, Jay Brewster, and forwarded to the deans of each school to disseminate to their faculty members, provides general recommendations along with sample syllabus statements in three categories: for classes in which the use of AI tools is allowed or encouraged, for classes that allow limited or selective use of AI tools, and for classes in which the use of AI tools is prohibited. Each instructor decides whether to include a syllabus statement regarding the use of AI tools in their classes.
“Each school will update its academic integrity policies, curricula, and syllabi to acknowledge the impact of these new technologies,” See says. “The committee has provided high-level guidelines with flexibility, so we can adapt them as the technology changes. If we are too specific, the technology may outrun us.”
That generative AI will reshape education seems almost universally accepted. “Is it something we should be afraid of?” asks John C. Buckingham, eLearning instructional designer for the Pepperdine Graziadio Business School. “Yes and no. To some extent, we should embrace it as an institution. There’s definitely some fear in education circles that students are going to use this technology to cheat. That fear is understandable. AI is a disruptive force to education, and while it’s going to force education to adapt, it will present some good opportunities along the way.”

“The full landscape of AI tools is going to change the educational environment over the next few years.”
—Jordan Lott
Assessing Assessments
One way in which educators are adapting to AI tools is in their means of assessing student learning. In educational theory, “authentic assessment” is a teaching approach that emphasizes the student’s ability to apply knowledge in new situations, rather than simply memorizing content. Tony DePrato (MA ’02), an educational technology expert and chief information officer at St. Andrew’s Episcopal School in Ridgeland, Mississippi, believes that generative AI may inspire instructors to use authentic assessment strategies more frequently and effectively.
“It could mean a return to the ‘blue book,’ where students write their response to a question or problem in front of a proctor,” says DePrato. “This kind of answer has to actually be read by the assessor, so it’s more work than a multiple-choice test. But that means it’s a more authentic assessment for the assessor as well as the student.”
Another adjustment that educators may have to make is more philosophical. “If teachers see themselves as experts who simply transfer knowledge to students and leave the difficult cognitive tasks for the students to do on their own, then of course the students are going to do that in the easiest way possible,” says Catlin R. Tucker (EdD ’20), an expert on blended learning, whose latest book, Shift Writing Into the Classroom with UDL and Blended Learning, will be published in January 2024. “But if you’re using class time to help students use these tools, then there’s unlimited opportunity for personalizing learning. It’s reimagining the approach to teaching and learning.”
Some educators are proactively taking on the challenge of incorporating AI tools into their students’ learning experiences. Artem Barsky (BSM ’18, MBA ’19), an adjunct professor of information systems technology management at the Graziadio School since 2020, encourages his undergraduate students to share their own experiences with ChatGPT in an online discussion forum, and he shares his uses of the tool as well.
“We discuss its strengths and its limitations,” he says. Barsky wants his students to learn how to adapt to this new technology quickly. “Can they automate processes? Can they give themselves more bandwidth to do other things? Can they build efficiencies?” he asks.
Barsky also encourages his graduate students to use ChatGPT as a supplementary tool on their short-essay examinations (and to disclose if they did so), “but so far, they haven’t.” As to why not, he speculates that because his exams are timed, students may see adapting ChatGPT’s answers as an extra step that would be too time-consuming. Barsky believes that within a few terms, he’ll see more students use AI tools either to create or to review their first drafts. “The world is moving toward more integration of these tools,” he explains. “I want students to be as comfortable with this technology as they are with Google.”
When Teachers Become the Students

“We need to use these tools to maximize student creativity and potential”
—Catlin R. Tucker (EdD ’20)
In an effort to support Pepperdine faculty in their understanding and use of generative AI tools, Seaver College and the University’s Technology and Learning team jointly held a three-day workshop, Teaching and Learning in the Age of AI, in June. Approximately 30 faculty members participated in the workshop, which mixed informative lectures with hands-on exercises that allowed faculty to explore how AI tools can be used by both students and instructors.
For instance, to simulate the student experience, participants used ChatGPT to write an original essay on a topic of their choice, to improve a poorly written rough draft, and to write multiple versions of an essay to find out if, and how quickly, ChatGPT could improve it. Faculty were also asked to experiment with AI tools to revise their test questions, to help craft a syllabus policy regarding AI usage, and to simulate student responses to test questions in order to improve and refine them. These exercises were designed to illustrate how both students and educators can use AI tools to do their work more efficiently.
“The full landscape of AI tools is going to change the educational environment over the next few years,” says Jordan Lott, who serves as senior manager of Pepperdine’s IT training and the Technology and Learning team and was a speaker at the June workshop. “Pedagogy and assessment will need to be adjusted. Very soon, it’s going to be hard to avoid having AI assist with your writing because it’s going to be built into all the tools. So looking at the process rather than the end result may help instructors see where student learning happened and where their knowledge of the content was applied.”
Many experts in the popular press have compared the rise of ChatGPT to the introduction of the calculator or the early years of the internet. At first, using these technologies was viewed as cheating, and some educators treated it as such. For example, students weren’t allowed to use calculators on the SAT until 1994. Almost 30 years later, calculators are so accepted as a valuable tool that the College Board website even provides tips on how to use them most effectively on the test.
“Once the calculator reached a certain level of penetration, it was no longer considered cheating,” says Lott. “Does it change the value of what’s being learned to use the tool? People can still be good at math while using a calculator. There can be value in reading high-quality content generated by these AI tools with the intent of improving it. I have improved my writing abilities by consistently reading the writing of colleagues who write better than I do. Why couldn’t ChatGPT serve the same function?”
The Way Forward
Some see generative AI tools as a much greater leap forward in technology than even the printing press. “Large language models like ChatGPT are able to speak to each other, to understand human emotion, to understand consequences; it’s totally different,” says Barsky. “We are building something that is likely going to become billions of times smarter than us. And if that’s the case, these tools may be successful in solving problems in ways we’ve never even considered.”
But others are more cautious. “There’s a lot of uncertainty and fear around AI tools, as well as a lot of hype,” says DePrato. “Some say, ‘It’s going to do everything!’ And others say, ‘We have to run away from it!’ My advice is to use the tools, to use them at scale, and to build an understanding of how the tools work. We have to get an understanding of where we are and where the technology is going.”
Tucker takes a practical approach to ChatGPT and other generative AI technologies, contending that AI usage can help teachers improve their students’ work. “We need to use these tools to maximize student creativity and potential,” she says. “If we block their use, we won’t understand how to use them, and we will create classrooms that are stuck in time, out of step with what’s happening outside the classroom. We don’t spend nearly enough time supporting students as they write, identifying what’s good, what isn’t, and why. ChatGPT may actually raise the bar because it can provide real-time support to students in their writing.”
So yes, the college essay, as we knew it, may be dead. Long live the college essay.
How It Works

The “GPT” in ChatGPT’s name stands for “generative pre-trained transformer.” In simple terms, the program generates responses to questions or prompts using a model that has been pre-trained on a large dataset according to a transformer architecture, a type of machine learning meant to mimic cognitive attention. In humans, cognitive attention is the brain’s ability to determine which information is most important to focus on amid all the stimuli we receive and to filter out what is irrelevant.
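As a loose illustration of that idea (a minimal sketch in plain Python, not OpenAI’s actual implementation), the attention calculation inside a transformer can be reduced to a few lines: each word’s vector is scored against every other word’s, the scores become a set of “focus” weights, and those weights blend the sequence into an updated representation.

```python
import math

def softmax(scores):
    # Exponentiate (shifted by the max for numerical stability) and
    # normalize so the resulting weights sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(queries, keys, values):
    """Scaled dot-product attention over lists of vectors.

    For each query vector, score it against every key vector, turn the
    scores into weights, and return a weighted blend of the value vectors.
    """
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        scores = [dot(q, k) / math.sqrt(d_k) for k in keys]
        weights = softmax(scores)  # the model's "focus" over the sequence
        blended = [sum(w * v[i] for w, v in zip(weights, values))
                   for i in range(len(values[0]))]
        outputs.append(blended)
    return outputs

# Toy self-attention over three 2-dimensional "word" vectors
words = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = attention(words, words, words)  # Q = K = V: self-attention
```

Real models use learned projections for the queries, keys, and values, many attention "heads" in parallel, and vectors with hundreds or thousands of dimensions, but the weighted-focus mechanism is the same.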

ChatGPT is a large language model (LLM), a type of computer program that can read, summarize, identify patterns in, and generate text based on a dataset of enormous amounts of text, the vast majority of it written by actual humans: books, journal and magazine articles, and webpages. An LLM predicts the next word in a sentence based on the words that have come before it, using probability calculations derived from the text in its dataset, and then the next word after that, and so on. When an LLM generates the first word of a sentence, it doesn’t know what the last word of that sentence will be.
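That next-word loop can be demonstrated with a deliberately tiny stand-in: a bigram model that simply counts which word follows which in a small sample text, then generates by repeatedly sampling a likely successor. Real LLMs use neural networks trained on billions of documents rather than raw counts, but the core loop (predict a word, append it, predict again) is the same.

```python
import random
from collections import Counter, defaultdict

# Count which word follows which in a tiny sample corpus.
corpus = ("the sun also rises and the sun also sets "
          "and the moon also rises").split()

follow = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow[prev][nxt] += 1

def next_word(word, rng=random.Random(0)):
    """Sample a successor for `word`, weighted by observed frequency."""
    counts = follow[word]
    candidates = list(counts)
    weights = [counts[w] for w in candidates]  # probability ∝ frequency
    return rng.choices(candidates, weights=weights)[0]

# Generate a short continuation one word at a time, never "knowing"
# how the sentence will end.
text = ["the"]
for _ in range(5):
    text.append(next_word(text[-1]))
print(" ".join(text))
```

Because "sun" is followed by "also" twice and "moon" once in the sample, the model is twice as likely to continue "the" with "sun" as with "moon"; an LLM makes the same kind of weighted choice, just over a vocabulary of tens of thousands of tokens conditioned on a much longer context.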

And of course, this type of technology can also be used to create images, computer code, and more. Given a large enough dataset to sample, a machine-learning program like ChatGPT or DALL-E (OpenAI’s image-generation app) can mimic and generate almost any type of output.