Many first-year and upperclass students may have been surprised by their sudden free access to Google’s advanced artificial intelligence tools, including Google Gemini, NotebookLM and Deep Research. Bates students also now have access to ChatGPT Plus.
Many first-years are adjusting to the transition from high schools with strict honor codes about AI, where even mentioning the technology could carry serious consequences. Now, at a rigorous college that provides a free AI platform to all students, they may feel they are entering a test of academic virtue in an already demanding environment.
All First-Year Seminar workshops are introducing videos on how to use Gemini, and many syllabi now include a designated section detailing the use of AI, explaining how to properly work through these channels rather than dismissing the technology outright.
Bates’ Curriculum and Research Computing webpage dedicated to AI states: “The use of Gemini from your Bates account affords several protections not in place when you use other AI platforms.” The website emphasizes that Gemini involves no human review, no model training and that data stays within the organization in which it is entered.
Some academic departments have stricter constraints on the use of AI, such as the Environmental Studies Program. Their website explains that “Artificial Intelligence (AI) and Large Language Models (LLMs) are premised on extraction and theft — of water, land, energy, creativity, and intellectual property.”
Other academic pages don’t comment on specific uses. It’s no surprise that Bates is embracing AI and granting access, as many universities and colleges are doing the same. Though Bates has online pages on AI resources, it doesn’t have a page of guidelines for using AI, as peer schools like Bowdoin, Williams and Amherst do.
These schools each have a dedicated page explaining specific boundaries for AI use, along with reminders of ethical values. For example, Bowdoin’s website states, “Bowdoin’s mission, values, and vision are the foundation of these [AI] guidelines, and the College’s commitment to environmental, intellectual, and equitable principles are reflected in our tool selection and usage recommendations.”
Patricia Schoknecht, Vice President for Information and Library Services at Bates, explained that the decision to grant access to AI actually came from Google itself. She stated, “Once Google makes the decision to make something available within the Educational Workspace, (and within our contract), we do not have the ability to refuse.”
That is, if Bates refused to accept this tool, it would have to terminate its entire contract with Google, which would mean replacing the email system and other essential tools the campus relies on.
Schoknecht noted that Gemini “is protected by the privacy components of our contract and they promise not to use our data in the training of their models. So, we are comfortable with those protections.”
Schoknecht said that if there were a possibility of changing workspaces, such a big decision would require a much broader campus-wide discussion. She also noted that even if Bates did choose to switch, other workspaces would present the same challenge of bundling AI into their contracts. Microsoft, for example, has an AI tool called Copilot that would pose the same issue.
Schoknecht agreed that AI is here to stay, and that we have to learn how to use it to our advantage without letting it control our lives.
“There are a pile of issues after that about the appropriate usage of AI. Those are for us as a community to discuss, using our values as a guide,” Schoknecht said.
Professor Zef Segal is a first-year assistant professor in the Digital and Computational Studies and History departments. He uses AI in his coursework, specifically for efficiency and for brainstorming his syllabi.
Like Schoknecht, he believes that in giving students access to AI, Bates allows them to maintain the privacy of their academic and personal work. Segal also explained that granting free access to all students removes the barriers that would arise if some students had access and others didn’t, which would exacerbate an unfair academic advantage.
Segal also noted that universal access bridges a technological gap, so students don’t feel left behind or marginalized by rapidly advancing technology.
When asked why AI should be used in the classroom, Segal said that AI is valuable because of its conversational nature. He explained, “AI should be feedback for our thoughts, for our writing. Sometimes it is not the best feedback, so we have to push against it all the time, but it’s useful.”
He added that AI is not precise, however, so the user must “talk” to the program, which in turn helps our ability to converse. AI, he explained, should help prompt critical thinking.
Segal stated that AI is not meant to provide authoritative answers or precise results, but rather to enhance student engagement. He added that it is important to use AI in the classroom because it heightens our ability to converse and ask questions. He encouraged students not to accept any response unquestioningly.
He noted that we must understand that AI does not have any ethics and that we cannot let it become a leader. “Human beings adjust to maintain survival, adapt to new changes, but we always maintain thinking and creativity in different situations,” he said.
“Technology is in our lives, it will always be in our lives, and we have to learn how to adapt to that change. Technology will bring us to a new baseline, where our creativity will advance from then on.”
