Teacher caught students using ChatGPT on their first assignment. Debate ensues
Professor Megan Fritts reported students using ChatGPT for assignments, igniting debate on AI's impact on education, critical thinking, and reading skills, with educators divided on its classroom integration.
Professor Megan Fritts from the University of Arkansas at Little Rock reported that several students in her ethics and technology class used ChatGPT to complete their introductory assignment. This incident sparked a debate on social media regarding the implications of AI in education. Fritts expressed surprise that students felt compelled to use AI for what she considered a simple task, highlighting concerns about the erosion of critical thinking skills. While some educators defend AI as a tool similar to calculators, Fritts argues that this comparison is flawed, particularly in the humanities, where the goal is to foster independent thought rather than produce a specific output. She noted that students have expressed a decline in their reading abilities and attention spans, attributing this to technology addiction. Fritts acknowledged the need for educators to teach responsible AI use but criticized the notion that the burden of addressing cheating should fall solely on them. Many educators are divided on the issue, with some embracing AI in the classroom while others revert to traditional methods to combat its use. The ongoing debate reflects broader concerns about the impact of technology on education and student engagement.
- Professor Megan Fritts reported students using ChatGPT for an introductory assignment.
- The incident has sparked a debate on AI's role in education and its impact on critical thinking.
- Fritts argues that AI use undermines the goals of humanities education.
- Students have noted a decline in reading skills and attention spans due to technology.
- Educators are divided on how to integrate AI into the classroom effectively.
Related
Why AI is no substitute for human teachers
A study from the Wharton School found high school students using generative AI for math prep perform worse on exams, highlighting the need for guidance and the importance of human teachers.
AI Cheating Is Getting Worse
Colleges are grappling with AI-generated cheating, prompting educators to seek new teaching methods. Many feel demoralized, with some considering leaving the profession due to declining trust in students.
AI Cheating Is Getting Worse. Colleges Still Don't Have a Plan
Colleges are struggling with increased cheating from AI tools like ChatGPT, prompting educators to seek innovative strategies, including curriculum integration and revised assignments, to maintain academic integrity and engagement.
Kids who use ChatGPT as a study assistant do worse on tests
A University of Pennsylvania study found that high school students using ChatGPT scored worse on tests, while a specialized AI tutor improved problem-solving but not test scores, highlighting potential learning inhibition.
Kids who use ChatGPT as a study assistant do worse on tests
A University of Pennsylvania study found high school students using ChatGPT performed worse on math tests, indicating that reliance on AI may hinder learning and problem-solving skills despite improved practice performance.
I really hate these types of questions. Open-ended questions about myself are not something I find useful. Students will usually do them because they have to, and will come up with anything (even before LLMs). I understand that part of the point is for students to get to know each other and for the professor to learn more about their expectations.
I was once a grader for a summer astronomy class open to the public (anyone could take it). Part of the assessment was that in each assignment students would ask one question, and each student should answer at least one other question. I really questioned my relationship with the world while grading those questions. Astrology was a common topic, along with random questions about anything you could come up with related to astronomy (or that you thought was related). I no longer teach or grade, and my experience was pre-ChatGPT. ChatGPT will change education, and educators have to adapt.
They were forced to do it, and they just did it for the grade. The purpose was to encourage communication and collaboration, but in my experience this rarely works. People put in the bare minimum just to get it done. LLMs make this too easy, and it will be too tempting to use them. Some people did a good job both asking and answering, but many were simply not interested and had to do it anyway.
Scientists collaborate out of necessity and because most of them have a genuine interest in their field. They get trained during grad school, and even then they often don't communicate and collaborate well with each other. They are human, after all.
For the case at hand, I can see a student being tempted to write a rough version of an answer to this question without worrying about grammar, typos, etc., and then asking an LLM to rewrite it in an academic style. When would they practice academic writing and learn to express their thoughts properly? Probably never, but that is them tricking themselves.
Another thing that makes this case unique is that the course is about ethics. So let's set plagiarism and university rules aside: a discussion on using LLMs would be a better start for a class about ethics and technology.
How do you all feel your attention span changing? Are you able to sit and think, or read long passages, without distracting yourself? Were you ever able to? If you find yourself struggling, how do you manage?
I find myself reaching for audiobooks much more frequently these days. Longer tasks are still possible if I find sufficient motivation. Curiosity is not always enough. I have to craft specific goals for myself in order to maintain motivation, but this takes me quite far in my self-study.
Why do the teachers care? Let the students not learn and suffer the consequences.
The only reason students pull stunts like this is that it doesn't actually have that much of an impact.
Why do employers stop asking about grades and instead focus on actual experience? Because, all things being equal, that's what matters more. So can we blame students for trying to speedrun these barriers to get to what's actually important? And not what they're told is important by educators with an obvious bias, but what they can see in the reality of the job market.
https://www.reddit.com/r/Professors/comments/17v6478/chatgpt...
He shared that some teachers now require AI/LLMs for homework assignments, such as writing essays. The actual assignment is to critique the output of the LLM.
As a millennial, I remember even our middle school classes teaching information literacy in various forms in the "computer lab". And that was in the nascent days of the web.
AI is a tool, not unlike a calculator or Wikipedia. They were both controversial and even forbidden at times. Students adapted. So did education.
The class, therefore, involved laying the foundations, doing the research, constructing footnotes and a bibliography, producing a couple of drafts, and finally submitting a finished work.
And so the instructor was there to shepherd us the entire way through the whole process, with oversight and feedback at every step. This was not simply, "OK you took a class on <X> and it's time to spew out a paper of <N> pages on it." This was learning how to correctly do a research paper from start to finish.
It was fantastic because it really made cheating pointless. Whether you were going to purchase a paper online, have an LLM write it, or pay a friend to do your work, every cheating method was profoundly irrelevant and useless in the face of this process. At the end, either you had learned something about writing a good college paper, or you hadn't.
I took other classes that taught rhetoric and evaluation of sources, and extra credit was to read and analyze a novel. The instructors were top-notch and highly credentialed, at the top of their game. Especially for a community college course I was attending on a FAFSA grant, I thought the whole process was absolutely rigorous, educational, and extremely edifying, even at my advanced age.
>> Hi I'm ChatGPT and I can't take this class because I am a chat bot.
The smart ones swapped in their own name when they copied and pasted the answer, and went undetected.
https://link.springer.com/article/10.1007/s10676-024-09775-5
(although they concluded it's "soft" bullshit, as it doesn't aim to 'harm')
It really has become a de facto flag bearer for all LLMs on the market, and I don't know a single person who doesn't use it, save for my father, who is from the boomer generation.
English is of course a different thing, because there is no actual grammar; it is just word salad and anything goes.
This seems like a straightforward upgrade of the school experience. There truly are dumb questions; many of them get asked in class because speaking to a rubber duck first would be "disruptive." If you're going to take the whole class's attention down a tangent, why is it bad to get a smart generalist's opinion on the optimal shape for that tangent, first?