AI becoming necessity at colleges

Artificial intelligence has become part of the fabric of everyday life, from Siri and Alexa to autonomous cars.

The world of education changed around this time last year when ChatGPT came on the scene.

It has sparked debate in every sector and some concern, especially in higher education.

“Artificial intelligence has been around for over probably 60-plus years. In 1959, the term artificial intelligence was already coined. There were a group of philosophers, mathematicians, scientists, I believe at Cornell University, if I remember correctly, that were already talking about machine learning in the 1950s. And we can go even to the 1900s and talking about science fiction and robots and you think about L. Frank Baum when he wrote The Wizard of Oz, like the robot that was already talking to us … So there’s a little bit of art and science with artificial intelligence,” Odessa College Vice President for Instruction Tramaine Anderson-Silvas said.

Anderson-Silvas recently presented a report on AI to the OC Board of Trustees. Although artificial intelligence has been discussed for decades, the arrival of ChatGPT, a generative AI that can answer a question within seconds, still came as a shock.

ChatGPT is trained on vast amounts of text and data gathered from the internet. It looks for patterns in that data and uses those patterns to generate answers to the questions people ask.


Anderson-Silvas said some students take it at face value.

“They can copy and paste that and put it into a discussion prompt or put it into a paper,” she said.

For some, it felt like the sky was falling. In the summer of 2023, there were webinars and meetings on AI with colleagues from around the world. No one was sure what to do now that ChatGPT had arrived.

“I think with education, our world has changed and we’re going to have to evolve with the change,” Anderson-Silvas said.

People are still concerned about cheating and students not learning, but academic and instructional staff are going to have to change the way they engage students with AI, she added.

AI processes information much faster than a human brain, but that doesn't mean the information is right. Students will have to be critical of the information they receive.

“You have to be able to analyze and evaluate it, and you have to be able to synthesize. You have to be a critical thinker with the information that you’re given back,” Anderson-Silvas said.

She noted that OC has a task force on artificial intelligence made up of IT personnel, staff from Institutional Effectiveness, OC Global distance education staff and instruction personnel.

How do you teach students to think critically?

Anderson-Silvas said it’s the type of questions asked, the type of strategies used in the classroom and the type of assignments given.

OC has six Odessa Student Learning Objectives: Critical thinking, communications, empirical quantitative skills, teamwork, social responsibility, and personal responsibility. This also covers dual credit, early college high school and career and technical education students.

There is a group that is leading six different teams to create assignments, or assessments, on critical thinking and communications using real-world application.

OC faculty get to know their students and understand their backgrounds. If a student who was a poor writer at the beginning of the year suddenly improves, the faculty know something is up.

Anderson-Silvas said she talked to several faculty members at a lunch and learn where they shared ideas on AI.

“The future is here. We cannot go back. Our conversation here at Odessa College … is that we’re not going to be able to change it. We’re going to have to evolve and get better so what ways can do (that)? We have talked about putting in your syllabi what your position is; what your stance is on the use of artificial intelligence,” she said.

“As a college, we believe this is a part of what we are and what we’re becoming, but we want our students to be critical and ethical. They need to use academic honesty. They need to be able to attribute,” Anderson-Silvas added. “That’s been our stance, is not to be afraid, but to walk in this with some wisdom and understanding (of) what we can do to get better and improve.”

AI is limited by how recent its information is, but it can generate pictures, copy voices and create presentations, to name a few capabilities.

“We have faculty that are not afraid of it. There are some that express their concerns and we’re working with them,” Anderson-Silvas said.

Faculty are advised to put their expectations into their syllabus and state what they want in writing and orally.

“So if you don’t want them using it, then you need to say that in your statement of expectations,” Anderson-Silvas said.

Anderson-Silvas said there are companies that are able to determine what is and isn’t AI and if students are using it. If students are using it, they are going to have to be taught how to use it ethically and be critical in their approach to using it.

There is also software that can tell you if a student has plagiarized what they’ve written. Turnitin.com is one example, she said.

“When a student writes a paper … and I had this when I was teaching, I can go in and pull that paper and submit it into a database and I can see what percentage of it is plagiarized, or copied. If it’s above a certain amount, then I use my syllabi and say, Okay, I either give you another chance, or I’m going to deduct points because you copied this and you didn’t attribute who you got it from. This is not original content. That’s where Turnitin and Blackboard, our learning management system, are now trying to catch up with how to determine AI, generative AI ChatGPT and Bard (AI). All of this evidence that students are using to turn in a paper or to submit a discussion. What I did tell the board is, you know, I find this akin to like, when calculators were coming into play, and I said, you know, during my time it was when the internet was invented, and we thought, Oh, this is, you know, this is big. This is, you know, horrible, but it does help generate and we evolve in our learning and we evolve with the way that we teach. And we work with students to be able to attribute, you need to cite your sources,” Anderson-Silvas said.
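Turnitin’s actual matching methods are proprietary, but the kind of percentage Anderson-Silvas describes can be illustrated with a toy sketch: count how many of a submission’s word trigrams also appear in a source text. The function names and sample texts below are illustrative only, not Turnitin’s API.

```python
def ngrams(text, n=3):
    """Split text into a set of overlapping word n-grams (here, trigrams)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity_percent(submission, source, n=3):
    """Percentage of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return 100.0 * len(sub & ngrams(source, n)) / len(sub)

source = "the quick brown fox jumps over the lazy dog near the river bank"
copied = "the quick brown fox jumps over the lazy dog in my own words here"
print(round(similarity_percent(copied, source), 1))  # → 58.3
```

A real detector works on far larger databases and more robust fingerprints, but the idea of a threshold percentage triggering a syllabus policy maps directly onto a number like this.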

Amin Davoodi, assistant professor of bilingual and ESL education at the University of Texas Permian Basin, said AI is not new in higher education. People have been using it for the past four or five years, but the advent of generative AI, especially ChatGPT, has made it more popular in academia.

ChatGPT’s information only extends through April 2023, the cutoff date of its training data, Davoodi said during a November interview.

UTPB is working on an artificial intelligence policy, but it’s still in the early stages.

Davoodi said there are two groups of faculty. One group does not recommend using it and warns students in their syllabus not to use it, especially because the plagiarism software used on campus can detect AI text.

It gives not only a similarity report for plagiarism, but also a percentage of how much of the text was written by AI.

Some faculty discourage students from using generative AI and presenting the work as if it is their own.

“But we have another group of faculty who are so interested in integrating artificial intelligence into their teaching and into their assessment. They acknowledge the existence and the emergence of generative AI and they even encourage their students to use it responsibly. … We do not have … a shared view on artificial intelligence. I think across academia, it’s not just UTPB, and I think that’s a good thing, because we don’t necessarily want to be on exactly the same page. We may have different needs based on the student’s need,” he added.

Davoodi said he’s on the side that appreciates AI for learning by students and faculty.

“I encourage my students to use it responsibly because the way that I look at AI, I look at it as another tool that can help all of us to become better learners and better teachers. Having said that, I think people who are kind of worried (about) the fact that AI may lead to plagiarism, they are ignoring the fact that there are hundreds of other tools which are not detectable and they can actually lead to plagiarism. There’s no way we will ever be able to figure them out. Because unlike AI, that you can to some extent rely on these detectors that we have, there are hundreds of companies and websites” that students pay to write essays for them, Davoodi said.

Those essays are written by people who work for those companies.

“So there is no way that we can ever detect those,” Davoodi said. “While I understand the concern, I think AI shouldn’t be blamed. It’s that individual, that student who may find another way to submit someone else’s work instead of working on the assignment themselves. Because as I said, they can easily pay one of these companies and they have hired good researchers, good writers, and they are all human beings. The text is technically produced by a human being, but not the student; someone else that we can never figure out. It’s not just AI. The problem is bigger.”

In his area of bilingual and English as a second language education, Davoodi said, the most challenging skill for second language learners is writing.

“Writing is the most challenging skill. I have come up with a model (where) my students can use generative AI to be the writing assistant, kind of like a tutor. … Unfortunately, many teachers in K through 12, and a lot of professors in academia in higher education as well, when it comes to writing and assessing the writing, they focus on the product. They want to see your essay. They want to see your composition and they’re going to judge that. That’s not what I’m interested in. I’m interested in the process of writing. Because without understanding and analyzing the process of writing, it would be almost impossible to realize the weaknesses and the strength of someone’s writing skill. Therefore, I’m using generative AI as a tool that can help my students, second language learners, foreign language learners, to improve their writing,” Davoodi said.

“This is how it works. I want them to write a draft in a research project. I want them to write a paragraph or an essay about a topic. Then I will give them prompts that they can use to work on ChatGPT; and ask ChatGPT to analyze the writing, the one that they produced and they basically ask ChatGPT to come up with a list of suggestions of how that writing or essay can be improved,” he added.

Based on the guidelines, evaluations and recommendations from generative AI, Davoodi then asks his students to revise their essays.

“Then I asked them to submit both drafts — the first draft that was produced by the student solely and the final draft that was produced by the student (and) revised based on the recommendations of generative AI. In that case, I can see the process and I can see how their writing improved and also they get a chance to submit a revised version of their essay learning about the mistakes, learning about the strategies that they can use to improve their work before even submitting it to me. So basically, ChatGPT here has two roles. He is my teaching assistant, helping my students to improve their writing, and it’s also a writing tutor for my students allowing them to find weaknesses in their writing and recommending areas for growth. So far, I’ve realized that this method has been very helpful with my students and after using it for a while, the first draft that they produce improved a lot. So the generative AI wouldn’t find as many issues as it found the first time that they use it. That’s just one example,” Davoodi said.
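An instructor reviewing the two submitted drafts could surface the student’s revisions with a plain text diff; this is a sketch of that idea using Python’s standard difflib, not Davoodi’s actual tooling, and the sample sentences are invented.

```python
import difflib

# Invented example of a student's first and revised drafts
first_draft = [
    "The experiment were conducted over two weeks.",
    "Results was recorded daily.",
]
final_draft = [
    "The experiment was conducted over two weeks.",
    "Results were recorded daily.",
]

# unified_diff marks removed lines with '-' and added lines with '+',
# showing exactly what changed between the two drafts
for line in difflib.unified_diff(first_draft, final_draft,
                                 fromfile="first_draft", tofile="final_draft",
                                 lineterm=""):
    print(line)
```

Seeing the before/after pairs side by side is a quick way to check that the revisions reflect the AI tutor’s suggestions rather than a wholesale rewrite.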

It also helps with fluency and accuracy in speaking.

“In another study, I’m asking second language learners to talk about a certain topic and using that add-on, they are recording it and it automatically transcribes it and gives it to ChatGPT. Then we ask ChatGPT based on the level of the proficiency of the student, to mark areas for growth and identify the issues that they have regarding the choice of word you know, accuracy, grammar and things like that,” Davoodi said.

He hopes to complete the studies and compile the data by summer. He plans to publish the results of the studies eventually.

Davoodi is also using generative AI with his graduate students.

“I teach a research methodology course. In that course, my students work on a research proposal. One of the things that I don’t like to do, but I have to do, is to check if all the references that they list at the end of their article, at the end of their proposal, was actually used in the proposal. Because sometimes people just list references … to have a longer list of references, but they never use those references … For me, traditionally, I would go to the reference list and one by one, I would go back and mark in the text to make sure that if they list something in the reference list, they actually used it. For one student that can take between 30 minutes to an hour depending on the number of references. Right now, I can do it with ChatGPT in less than a minute,” Davoodi said.
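The reference check Davoodi describes can also be automated without any AI at all. A minimal sketch, assuming in-text citations name the first author’s surname (e.g. “Smith (2020)” or “(Garcia, 2018)”), which may not match every citation style:

```python
import re

def unused_references(text, reference_list):
    """Return reference-list entries whose first author is never cited in the text.

    Assumes each reference entry begins with the first author's surname
    followed by a comma, APA-style.
    """
    unused = []
    for entry in reference_list:
        surname = entry.split(",")[0].strip()  # first author's surname
        # Treat the entry as cited if the surname appears as a whole word
        if not re.search(r"\b" + re.escape(surname) + r"\b", text):
            unused.append(entry)
    return unused

proposal = "As Smith (2020) argues, bilingual instruction improves outcomes (Garcia, 2018)."
references = [
    "Smith, J. (2020). Bilingual classrooms.",
    "Garcia, M. (2018). Language and identity.",
    "Jones, A. (2015). Unrelated work.",
]
print(unused_references(proposal, references))  # → ['Jones, A. (2015). Unrelated work.']
```

A surname match is cruder than what ChatGPT does with a full proposal, but it shows why the task that took 30 minutes by hand is mechanical enough to finish in seconds.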

There is a group of faculty at UTPB working on AI and interested in it. They have had an initial conversation and hope to hold a panel presentation for faculty in March.

“I think by then, we may be part of the group (that is) working on the policy in collaboration with the ITS (Information Technology Services),” Davoodi said.

Friends and colleagues have asked if he thinks AI will replace teachers.

“It’s very similar to that old-fashioned question will we have robots (as) teachers, you know, sort of teaching. My answer to both questions is the same: No. However, I guarantee you that the near future, and that’s not going to be even 10 years, maybe in five years, teachers who use generative AI and other technology are going to replace those who do not because this is the future. You have to adapt yourself whether you like it or not. If a traditional teacher is going to be replaced, they’re going to be replaced by someone who knows how to use this technology. They make teaching and learning more effective.”

He would also argue that it takes a lot of creativity, skill and hard work to come up with the right prompt to ask ChatGPT for an essay about the Permian Basin oilfield.

“Depending on the way that we ask, the essays are going to be totally different. So I don’t believe that using it makes students lazy, it’s just a different form of learning that we have to accept,” Davoodi said.


Odessa College Biology Department Chair Allystair Jones said AI is great progress for education.

“The information is now organized for them in a better way than just Google. They still have to make sense of all of that. … So we can use this or go extinct,” Jones said.

Jones has applied AI in a couple of ways. One was having his students do a paper on vaccines using prompts he typed into an AI tool. Then he asked the students more thought-provoking questions that artificial intelligence couldn’t answer.

“Another way I’ve used it is I’ve had people use a program called NetLogo. It’s a free program that they can use, but it’s code. You have to write code for this program to work. You can spend a semester teaching students and then have them do what you want them to do. Or you can simply type in the AI, write NetLogo code to do this, and it’ll write the code for you. Then you cut and paste that into the program. Does it work? Does it do what I want it to? If not, let’s go change it, come back and forth,” Jones said.

One of his students, who wants to be a rancher, wrote a program to optimize alfalfa in cattle production: getting as much meat as possible for the least amount of alfalfa, measured in pounds.

“He did that using software that he had never seen before. That would have taken him months and months to learn … So AI helped us skip that simple regurgitation step and get to the main meaning.”
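The article doesn’t show the student’s NetLogo model, but the underlying problem is a cost-benefit sweep. Here is a hypothetical Python sketch with made-up numbers, assuming diminishing returns on feed; none of the figures come from the student’s work.

```python
def weight_gain(alfalfa_lbs):
    """Meat gained (lbs) for a given amount of feed.

    Illustrative diminishing-returns curve, not real cattle data.
    """
    return 2.0 * alfalfa_lbs ** 0.5

def best_feed(feed_cost=0.10, meat_price=1.50, max_lbs=400):
    """Sweep whole-pound feed amounts and return the profit-maximizing one.

    feed_cost  = assumed cost per lb of alfalfa
    meat_price = assumed revenue per lb of meat
    """
    return max(range(1, max_lbs + 1),
               key=lambda lbs: meat_price * weight_gain(lbs) - feed_cost * lbs)

print(best_feed())  # → 225
```

The same search could be run in NetLogo; the point of the AI-assisted workflow is that the student skipped learning the language’s syntax and went straight to iterating on a model like this.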

Jones has also used it in his first eight-week genetics class. He asked a student to create a YouTube presentation by telling AI to write a PowerPoint presentation; he then had her go through the information and tell the story behind it.

He said the student could use the same slides if she wanted, but he wanted her to understand what it meant.

Although he hasn’t asked everyone, Jones said he suspects some subjects would have more difficulty embracing AI than others.

“People often think science would be difficult, but for me, it’s always been about the story, the meaning behind sciences and what I want to get to my students. So for me, it has not been difficult,” Jones said.