Artificial Intelligence is everywhere, from the news to your search engine to maybe even your homework. It’s the topic of the future. But how will society adapt to this growing tool revolutionizing each industry?
At Bates and other institutions of higher education, AI is a continual subject of conversation about its possibilities and the new “capabilities” it holds.
Big-name universities like Yale, UC Berkeley and Caltech have issued statements on the use of generative AI in the classroom. These statements provide guidelines for the entire college community in using these tools. Interestingly, they often don’t dissuade students from using the tools altogether; many encourage students to use generative AI models responsibly and to be open and honest when AI is used.
At Bates, an institution without such a guideline, students could be asked to use AI for an assignment in one class and to handwrite an essay in the next. This variation between majors, departments and professors has sparked discussion and interest among some faculty.
Professor Sylvia Federico, chair of the English department, said she is excited about the possibilities AI models could bring to her classroom.
“I think it has a lot of potential in certain pedagogical fields, lots of sort of cool things it can do that might really be useful in the classroom,” Federico said.
For example, Federico mentioned that the English department is applying for a grant to help combine historic and complex texts with the capacities of AI. The model would specialize in the research papers and historical context of a source and be able to provide a detailed analysis and an accessible answer to anyone with a question. Such a model, if developed, would offer a refreshing change for English classrooms.
Jennifer L. Koviach-Côté, Associate Professor of Chemistry and Biochemistry, noted that in the sciences, when used properly, AI can function as a brilliant tool for accelerating research.
In her more than two decades at Bates, Koviach-Côté has seen how changes in technology can influence student spaces and learning. She mentioned that several times in her career, the launch of a new online platform has set off a wave of alarm, a pattern similar to what we’re seeing now.
“When Wikipedia came out, everyone freaked out and thought, ‘oh, students are just gonna copy the Wikipedia page,’” Koviach-Côté said, “but we found ways to make sure that students were still learning. So we just always have to be kind of dealing with the next thing.”
Koviach-Côté spoke about AlphaFold, a protein-structure prediction program used in numerous biochemistry courses at Bates. She noted that platforms like AlphaFold have the ability to both challenge and enhance students’ learning.
However, that doesn’t mean AI is without serious flaws that need to be addressed before higher education comes to rely on it. Because a model is only as smart as the content it is fed, Federico explained that “it has all of the symptoms of any other malignant enterprise in late-stage capitalism, I mean, racist, sexist, etc., etc., right? It’s a reflection of the person who taught it.”
Professor Dale Chapman of the Music department agreed, describing himself as “a little bit more open-minded and less draconian, in the sense that there might be dimensions of this that are potentially useful, but there’s disturbing trends as well.”
In music, recent controversies have exposed the exploitative ways AI can be used. The most prominent example came just last year, when the Drake song “Taylor Made Freestyle” used AI-generated vocals of 2Pac and Snoop Dogg.
Chapman explained that in doing this, “You’re using an algorithm to essentially put on a kind of musical blackface in order to try and realize someone’s voice that you don’t inhabit.”
AI has also threatened the music industry by taking away the jobs of session musicians hired to write generic tracks for presentations or corporate functions. Chapman believes that “there’s a likelihood there of it replacing real musicians and thus having disturbing labor implications for the music industry.”
This use of AI in art and writing has raised the question of what makes a piece of work valuable. Does music made by AI have the potential to be as meaningful as a song written by a human artist? Some are apprehensive about embracing the new technology, fearing it would replace the creativity that makes humans so special.
But Federico and Chapman don’t believe this should lead those working in fields related to anti-racism and combating misogyny and antisemitism to disavow the technology altogether. Federico said that instead, maybe “we should be the ones who are putting in the discourse, rather than just condemning the horror that it is.”
In the classroom, some professors are changing their curricula to adapt to new opportunities for academic dishonesty. Formats like online essays might become outdated, replaced by more presentations and quizzes that let students demonstrate knowledge as they generate it themselves. Even in fields where AI’s presence is rising, like chemistry and biochemistry research, Koviach-Côté voiced her hesitations about how chatbot programs like ChatGPT can interfere in student spaces, at both the individual and collaborative levels.
Koviach-Côté believes that AI could hinder students’ learning. “Our learning goals for our majors are to communicate, both in writing and orally, their scientific findings and their reflections about science,” she said. “If students are using an alternative tool in order to do that, then they are not gaining that experience, gaining that expertise, and they’re not fulfilling those learning goals.”
Perhaps a larger study is needed to address why students feel the need to use generative AI in place of their own ideas, but for now, each professor can modify their course as they see fit.
One of the defining parts of a liberal arts education is the ability to engage in research alongside those at the forefront of their field. Chapman believes students gain this by “learning how the process of knowledge building takes place,” applying a “set of research tools that [are] an approach to critical thinking that is grounded in the ability to look up and verify facts and ideas.”
AI tools might not yet be at the point where they can fully replace the “human essence of creativity” in generating ideas and art, but they might not be too far off.
In the English department, Professor Federico said, “We value ideas that are new and exciting but well grounded in knowledge. And AI just isn’t there.”
But before it gets to that point, Bates might want to pause and think about both the devastating environmental impacts of generative AI and the social implications of using a potentially biased tool.
This is why Professor Chapman takes a zero-tolerance approach to AI in his classroom. He wants “colleagues and students in other units who were thinking of incorporating AI to really think carefully about what it is that this technology does and doesn’t do.”
So, would a statement like those at so many other colleges work at Bates?
Both professors believe that Bates should at least consider guidelines for the use of AI on campus.
As Professor Federico said, “It’s here. We have to deal with it.”
And we must deal with it even as AI technology rapidly develops and laws and regulations lag behind.