
Block is a first-year political science student at California Polytechnic State University in San Luis Obispo. He is from Encinitas.
Most students, at least the ones I know, aren’t using artificial intelligence to complete every assignment for them. And the ones who are copying straight from ChatGPT are the same ones who would be copying straight from a Google search.
Since January, I have been using AI tools to help me with my own schoolwork. While large language models like ChatGPT seem to get better every day, they still aren’t quite good enough yet to churn out A-worthy answers (plagiarism issues aside). But my reality feels at odds with the perception of ChatGPT among many adults who aren’t close with any students who use it. I can only speak for myself, so here are some key ways I use ChatGPT and similar models to help me with my schoolwork.
One of the most helpful ways I use ChatGPT is to draft introductions and conclusions. I sometimes find it difficult to concisely summarize all the information in my work. This becomes extra hard when that work is a 12-page research paper, not a 750-word commentary. Like many people, I find writing the first couple of sentences is the hardest part. I will often plug my essay into ChatGPT and ask for help writing a conclusion. And before you get upset: I never directly copy and paste that into my essay. I use the first few sentences of the AI-generated intros and conclusions as inspiration for my full-length, human-written versions. This goes a long way toward getting me past writer’s block and into a flow. The AI draft is a decent (but imperfect) starting point to revise, rework and build upon.
I have also been using large language models to explain specific concepts I’m struggling to understand. I find that internet-connected tools (specifically Google’s Bard and Microsoft’s BingAI) are usually better at this than ChatGPT, which is limited to information from before late 2021 and cannot access the internet without plugins. For example, I was having trouble understanding the difference between a type I (false-positive) error and a type II (false-negative) error in my statistics class. Instead of using Google search, I had BingAI explain the difference.
I predicted in a January essay that students might use AI tools as personalized tutors, but I’m not using them that way as much as I anticipated. As I begin taking more political science-focused classes, where answers can be conveyed well in paragraphs, I expect I’ll use them more. For now, they just aren’t as useful for classes like statistics.
The last major use I’ll mention here is text summarization. Don’t get me wrong. I try to do a full reading whenever possible. At the same time, I won’t deny that I have busy weeks that make it hard to get through dozens of pages of dense academic writing. Again, the internet-connected large language models reign supreme here. They can usually access the text, while ChatGPT often “hallucinates” (makes up information) or is overly broad. Accessing the key points of a text in an instant is incredibly useful when time is limited. Even when it’s not, knowing the key points going into a reading makes the text easier to understand. It lets me pick up the patterns more easily and breeze through in-depth explanations of concepts I already understand.
AI can be extremely useful if used the right way. In just a few months, ChatGPT, Bard and BingAI have helped me more than I could have ever expected. I’m looking forward to seeing how I can continue to use these tools as they become more capable. At the same time, I have to acknowledge the drawbacks of AI. Current models are prone to spitting out false information. (For this reason, I never ask large language models for sources on a project.) It is hard to consistently identify whether a text is AI-generated, making plagiarism easier. And, of course, AI often has trouble understanding context that a human would naturally interpret.
I think my fellow students should use these tools to maximize their time and efficiency, but make sure they still understand, and can explain, anything they turn in. I also think skeptics worried about AI in school should know that while some students may abuse these new tools, the vast majority are using them responsibly (or not at all).
Please cut students some slack and give us the benefit of the doubt. We know that a shortcut now may cost us down the road.