Demand for liberal arts education has declined in recent years as students increasingly eye college programs that directly prepare them for jobs. But according to many tech and college experts, as businesses launch advanced AI tools or integrate such technology into their operations, liberal arts majors will become more coveted.
That’s because employers will need people to think through the ethical stakes and unintended consequences of new technologies. Companies may also need people to help improve the written commands given to chatbots or resolve challenging customer service disputes that AI can’t handle.
College leaders therefore need to take action as AI changes the workforce, scholars say.
While robotics in the 1990s replaced many blue-collar jobs, AI will replace jobs that require college degrees, including positions such as attorneys that require graduate degrees, predicted Ray Schroeder, a senior fellow at UPCEA. Schroeder estimates the big changes will occur over the next three to four years.
“This is really a significant factor in employment and of course, those of us in higher education are monitoring it closely because we want to be able to give our students the skills that will enable them to thrive in this emerging workplace environment,” Schroeder said.
ChatGPT and other large language models are predictive tools that draw on existing knowledge, said Cecilia Gaposchkin, a history professor at Dartmouth College.
One could train a computer on the ideas of ethics, human morality and a basic sense of human dignity, but that kind of reasoning is instinctual for humans, Gaposchkin said. The tools and capacity of AI, she added, will remain subordinate to the decisions people make and the priorities they set.
While AI makes past knowledge available at one’s fingertips, liberal arts education trains students to think creatively, solve problems, synthesize information, manage ambiguity, ask questions and come up with new ideas, Gaposchkin said.
“To outsource the higher-level orders to an object that doesn’t have that ethical grounding trained by human reason is a dead end,” she said.
There are many settings where people would like to use AI but the cost of failure is high, said Rebecca Willett, professor of statistics and computer science at the University of Chicago. That includes certain AI tools for healthcare, real estate, finance and criminal justice, she said.
A 2023 study from Stanford University’s medical school, for instance, showed that AI tools reproduced debunked, racist medical ideas that could worsen health disparities for Black patients.
And in 2016, ProPublica found that software used by courts disproportionately labeled Black defendants as being at higher risk of committing a crime than White defendants. ProPublica’s analysis compared defendants who did not go on to reoffend — throwing into question the fairness of a tool used to help with sentencing and parole decisions.
People with strong liberal arts backgrounds will contribute to “not only the technical development of these tools, but also to thinking about, how should we approach them? And what are the trade-offs associated with the choices we might make?” Willett said.
Students who earn degrees in the liberal arts and humanities may actually have an advantage in the job market over those who specialize in STEM-based programs, argued Robert Gibson, director of instructional design at Wichita State University Campus of Applied Sciences and Technology.
That’s because liberal arts students could provide a more humanistic perspective on the technology, with an eye to ethics, privacy and bias, for example, Gibson added.
“They don’t sort of wildly go off and promote the use of this technology without sort of stepping back and saying, ‘Maybe this is how we can look at this a little bit differently,’” Gibson said.
Noting that employers are now posting job ads seeking workers with AI skills, Gibson also said AI is being used in marketing campaigns or customer relationship management systems by small businesses nationwide. That includes businesses ranging from art stores and business retailers to production companies that want workers who can wear two hats, he said.
They want workers who are not only adept with using those technologies, but also who have customer service experience, or other human experiences required for their businesses, said Gibson.
Liberal arts students could also be instrumental in developing and operating AI chatbots. “Prompt engineering,” the craft of writing the instructions that shape how chatbots converse with users, is fundamentally an English writing skill and requires adaptability and continuous learning, Schroeder said.
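To illustrate what that writing-centered work looks like, here is a hypothetical sketch; the `build_prompt` helper and its wording are invented for illustration, not drawn from any company’s actual system. The same underlying question, framed with different audience and tone instructions, steers a chatbot in different directions:

```python
def build_prompt(question: str, audience: str) -> str:
    """Compose the instructions sent to a chatbot.

    The choices here -- the assistant's role, the audience,
    the tone constraints -- are editorial, not computational.
    """
    return (
        f"You are a customer-service assistant. "
        f"Answer for a {audience} reader. Be concise and polite.\n\n"
        f"Question: {question}"
    )

# The same question, framed for two different readers.
plain = build_prompt("How do I return an item?", "first-time")
expert = build_prompt("How do I return an item?", "technical")
```

The programming here is trivial; the craft is in the English, choosing the framing, tone and constraints, which is Schroeder’s point about why this work suits writers.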
AI can now write code faster, and in many cases better, than humans can, Schroeder said. The U.S. Bureau of Labor Statistics projects that the number of computer programming positions will decline 11% between 2022 and 2032. Companies will instead need supervisors to troubleshoot what AI generates, Schroeder said.
“If you are in computer science and if you are coding, your job may be in jeopardy,” Schroeder said.
Schroeder said he’s also seen an uptick in companies seeking applicants with human skills such as critical thinking, creativity, emotional intelligence, communication, collaboration and ethics.
“We see those more and more often in position descriptions,” Schroeder said. “While they don’t specifically say liberal arts degrees — because it’s possible to obtain some of these skills outside of college — those are skills that we emphasize in the liberal arts.”
Willett said she does not believe liberal arts graduates will replace all technical workers. Rather, the two need to work hand in hand, she said.
But between two candidates with backgrounds in data science, many employers will find the one with a liberal arts education “extremely attractive,” as that education allows employees to contextualize the technical work they might be doing, Willett said.
Roanoke College is among the liberal arts institutions having discussions about where AI fits into its curriculum, said Brian Reed, vice president for student success and the Roanoke Experience.
Reed said he would like to see the small Virginia-based college create a place in its general education requirements for students to work with AI in intentional ways.
The emergence of AI will provide “a real renaissance for the humanities,” Reed said. But higher ed institutions, including liberal arts colleges, are often not great at explaining how critical thinking skills that students learn can be applied to technology like AI, Reed said.
“Institutions have to be much more clear about the skills and competencies that the students gleaned from liberal arts in really straightforward language,” he said.
Many higher education institutions are already moving aggressively to introduce programs, or entirely new colleges, built around artificial intelligence, Gibson said. Larger institutions regarded as “thought leaders,” in particular, have the financial footing to lead conversations around AI across disciplines, he said.
Earlier this year, Arizona State University announced it has been incorporating generative AI into many of its English Department courses, allowing students to use the technology to, among other things, brainstorm topics, assist with research, and help edit their papers.
Some smaller colleges are starting to have those conversations, although not as fast as Gibson said he would like.
“This is still on the periphery,” he said. It’s harder for small colleges to implement wholesale change, like cross-disciplinary courses on AI, because bureaucratic procedures can slow the process.
To prepare, Gibson said, institutions need to start building conversations around the moral and societal implications of AI into their liberal arts curricula.
Reed said higher ed leaders would be doing their liberal arts students a disservice if they are not having conversations about how to help them embrace AI technology, so that students become comfortable with it and know how to harness it in responsible and ethical ways.
Liberal arts students will need to gain competency on the technical side, said Reed. But the emergence of AI will also require people who are “really thoughtful about: How [do] we prompt? Should we prompt in certain instances? How do we sort of filter bias? How do we see through phantom responses?”