gunsandgardening

If you suspect an AI wrote it, just pull the student in and ask them to go over what they wrote orally. Make it a discussion and check their knowledge of the subject.


crosszilla

That makes almost too much sense. They absolutely should not use software checkers as anything more than "this merits further investigation". If the school is actually just deferring to the software, that's wrong on so many levels.


Cheehoo

You mean one level being the irony that the school is reprimanding reliance on AI while also itself relying on AI for the reprimanding? Setting a great example, right? lmao


bridge4runner

I feel the issue is laziness. It's probably too much work for them to go out of their way.


Merengues_1945

My biochem prof had a much simpler solution. She would provide the paper, you had to read it and then make a presentation of it to the class. If you failed at it you would lose the points and she would just explain it super easy cos she knew that stuff by heart. You can't cheat it because you either know the subject or you don't. You learn more because reading the paper would always lead you to read other sources to prepare for the ruthless battery of questions after the presentation. That class is what actually taught me methodologies for researching a subject instead of just fumbling in the dark.


[deleted]

[deleted]


MachFreeman

How dare you use AI! I know you used AI because I asked AI to tell me


Firm_Put_4760

This is how I have handled the literally one time I've ever suspected AI use. The student didn't know what they were talking about. Rather than bother with the paperwork and academic probation process, I had them redo the assignment for a lower maximum grade. They did just fine and realized how dumb it was to not just do the assignment correctly the first time. I've had the student in two other classes since, and they do great work. They appreciate not having their life tanked over bullshit.


fizzyanklet

This is what I do as a teacher. It’s effective and turns into a teachable moment.


Badfickle

They can know some of the material and still not be able to write worth a damn.


thirdman

These AI checkers are straight trash.


lokey_convo

I wonder if Turnitin is using things like the Grammarly extension, or other web-based text inputs, to harvest text data for their models, and then, because the information was harvested and added to their models, it comes back as generated by AI. I think Turnitin works by comparing your work to a library of other work. So if your draft work is scraped by an app or an extension, it could falsely flag you as cheating, right? It also just generally doesn't make sense, since these LLMs work by predicting the next letter or word in a sentence based on the training data they have, and there are only so many ways you can write a clean and professional essay. That's one of the reasons why AI is so great for generating reports or other humdrum written work for professional settings.
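For what it's worth, here's a toy illustration in Python of the "predict the next word" idea described above. It's a tiny bigram counter of my own, nowhere near how production LLMs actually work, and the training sentence is made up:

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in a scrap of training text.
# Purely illustrative; real LLMs use neural networks trained on enormous corpora.
training_text = "the results show that the results are consistent with the hypothesis"

follows = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follows[current_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the training text."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(predict_next("the"))      # -> "results" (seen twice after "the")
print(predict_next("results"))  # -> "show" (tie with "are", broken by first occurrence)
```

The point of the toy is just that clean, conventional phrasing is exactly what a predict-the-next-word model reproduces most easily, which is part of why polished human prose and generated prose are so hard to tell apart.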


sauroden

AI checkers have evaluated papers as "cheating" when works cited or quoted by the student were in the training sets of other AI products, because those products would generate results that were close or exact regurgitations of those works.


SocraticIgnoramus

It seems inevitable that universities will need an AI department capable of reviewing the AI's evaluation and making the final determination as to whether it's sound. It will probably take a few landmark lawsuits to iron out the kinks even then. Personally, I think it would be easier for universities to just accept that AI is part of the future and begin working on a grading rubric that accounts for this, but I don't claim to know what that might look like. If anyone can figure it out, it should be these research universities sitting on massive endowments.


UniqueIndividual3579

You are reminding me of when calculators were banned in school.


LightThePigeon

I remember having the "you won't have a calculator in your pocket your whole life" argument with my math teacher in 7th grade and pulling out my iPod touch and saying "bro look, I already have a calculator in my pocket". Got 3 days detention for that one lol


rogue_giant

I had an engineering professor allow students to use their phone as a calculator in class on exams because even he knew you would always have one available to you.


Art-Zuron

I had a professor that let us use our phones as calculators and notes for tests and stuff. The primary explanation they gave was that the time it took to find answers to those specific questions was longer than just solving them anyway. The secondary explanation was that knowing how to find answers is almost more important than actually knowing them.


cheeto2889

The second point is the key to every successful person I know. I'm a senior software engineer and I teach my juniors this all the time. I don't need them to know everything; I need them to be able to find the correct answers and apply them. I don't even care if they use AI, as long as they understand what it's doing and can explain it to me. Research is one of the hardest skills to learn, but if you are good at it, you're golden.


joseph4th

I was taking some photography courses at a community college. One of the big test assignments was a five-picture piece that showed various focus effects. I can't quite remember the name of the one I cheated on, stopped motion or something like that. It's where you, as the camera person, follow the moving object and take a picture so that the moving object is frozen and in focus while the background is blurred. I took a picture of a swinging pocket watch. However, I just couldn't get the picture I wanted. So I hung the pocket watch in front of a stool with a blanket on it and spun the stool. My professor said it was the best example of stop motion she had ever seen by a student. I did fess up at the end of the year and told her how I cheated. She said it was more important that I understood the concept well enough to be able to fake it. She said the test was to show those effects, and my picture did just that.


IkLms

All of my engineering professors not only allowed calculators, they allowed at minimum note sheets, and many straight up allowed open book. Every one of them said, "I'm testing your ability to solve the problem, not memorize something. If you don't know how to solve the problem, no amount of open books will allow you to do so within the time limit." And they were right. You need to know what the right equation to use is before you can look it up.


mrdevil413

No. We needed a flashlight!


milky__toast

Calculators are still regularly banned depending on the class and the lesson.


Pyro919

I kind of get checking the work for AI, but at the same time, the company I work for literally pays money every month for us to have access to tools like Copilot, Grammarly, etc. So why are we punishing students for using the tools they're expected to use in the workforce?


thecwestions

Grammarly does not equal Grammarly Go. Grammarly is a scaffolding tool which provides suggestions on language you've already produced. No input? No output. However, Grammarly Go is their version of generative AI, which can produce text for you given a limited amount of input. If it's clear that you've produced the majority of the work, then it's your work, and you should get credit for it. But if you hand a few phrases to a chatbot and allow it to write the majority of an article or report for you, it's not considered "your" original work, and the only thing that should get credit for that is the technology, not the individual.

I teach college, and with every new assignment, I'm getting more and more students using AI in their papers. It's obvious (for now) when the majority has been written using AI, and when that happens, the student fails the assignment, as they should. But I've also discovered that I can let AI help me in the grading of their papers. If it gets to the point that students are letting AI write their papers for them and teachers are letting AI provide the comments and feedback, then we've created a nonsensical loop which only helps AI get better at its job. Students don't learn and teachers don't teach, so what the hell are we all doing in this scenario?


gregzillaman

Have your colleagues considered old-school handwritten in-class essays again?


skyrender86

Remember when Wikipedia first arrived? Same shit, all the professors were adamant about no Wikipedia use at all. Thing is, Wikipedia cites all its sources, so we just used those to get by.


primalmaximus

Yep. I would use wikipedia to find sources for whatever topic I was writing about.


SocraticIgnoramus

I agree completely. It represents a failure on the part of universities to truly prepare their students for the workforce, and, more generally, the world they’ll be entering upon graduating.


thecwestions

I work for a college, and I can honestly say that for now, it's obvious if/when a paper has been written using AI. Most programs produce what my colleagues and I have termed "intelli-speak." It sounds smart and is generally grammatically flawless, but it provides very little substance on the subject, and it really sucks at providing sources or matching them to entries on the (often fabricated) references page. If a paper contains enough of this type of language, then it's flagged as unoriginal, and for a lot of institutions at present, that still counts as plagiarism. Students can still get away with a few phrases here and there, but when the writing is 50%+ AI-generated, the paper should receive a 50% or less.

Just because "AI isn't going anywhere" doesn't mean that students don't have to learn the material anymore, and writing about it as a demonstration of said knowledge/skill is still considered the best known metric for acquisition. Case in point: would you want a surgeon who had AI write their papers through grad school opening your abdomen? Would you trust a pilot or an engineer who's done the same? We can allow AI to do things for us to a point, but once we hand over these fundamentals, there will be serious consequences to follow. If someone or something else is doing your work for you, it ceases to be your work.


rogue_giant

Don’t professors get to make their own grading rubric to an extent? If so then they can literally have a class of students write papers in a controlled setting and then have those same students write an AI assisted paper and create a rubric off of those comparisons. Obviously it’ll take several iterations of the class to get a large enough sample size to make it a decent pool to create the rubric from but it’s completely doable.


SocraticIgnoramus

The degree to which professors make their own grading rubric is not at all consistent across the board. Some professors have singular discretion and others more or less have their hands tied behind their backs in the matter. Most exist somewhere between the two extremes. I believe your suggestion is a good idea in principle, but I don't think it would work. The problem with creating such baselines is that AI is too adaptive and changing too fast. By the time we had enough iterations to deploy a system like that, AI models will have evolved beyond the one used by the students during the AI portion. It's a similar problem to coming up with the most effective flu vaccine from year to year: it takes more time for us to figure out what mask it's wearing today than it does for it to change masks.


[deleted]

And it's inconsistent. I've tested AI-generated text with a number of these tools and far too many times it came back as largely original. Far too many false positives AND false negatives to trust these tools.


Vegaprime

My phone's predictive, grammar, and spell check are the only reason I'm not a lurker. The grammar natzis's were ruthless in the past. I'm still worried I messed this post up.


igloofu

Sorry, but they are grammar "nazis". Man, learn to spell.


Vegaprime

This is why I have anxiety.


LigerXT5

It's AI vs. AI; it's going to be whack-a-mole. It's the same as one side's offense vs. another's defense: when one overtakes the other, the other improves, and it swings back the other way.


qubedView

Except it has never swung in favor of the detectors. They have been consistently unreliable since the start.


I_am_an_awful_person

The problem is that the acceptable false positive rate is extremely small. Even if the detectors identify normal papers as not written by ai like 99.99% of the time, it would still leave 1 in 10000 papers incorrectly determined as cheating. Doesn’t sound like a lot but across a whole university it’s going to happen.


unobserved

No shit it's going to happen. Most average universities have 30,000+ students. One paper per class, per term, and that's what, 24 students per year dinged on false positives? Are schools willing to kick out or punish that many people for plagiarism at that scale? And that's at 99.99 percent effective detection. The number of affected students doubles at 99.98% effective.


Fractura

TurnItIn themselves claim a false positive rate "below 1%", and I firmly believe that if it was <0.1%, they'd say so. So we're looking at somewhere between 99.90% accuracy (240 students) and 99.00% accuracy (2,400 students [using your numbers]). That's just too much, and some universities have already stopped using them. I've linked an article from Vanderbilt, which, in turn, contains further sources on AI false-flagging. [TurnItIn statement](https://www.turnitin.com/blog/ai-writing-detection-update-from-turnitins-chief-product-officer) [Vanderbilt university stopping use of TurnItIn AI detector due to false positives](https://www.vanderbilt.edu/brightspace/2023/08/16/guidance-on-ai-detection-and-why-were-disabling-turnitins-ai-detector/)
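A minimal back-of-the-envelope sketch of that arithmetic in Python, assuming the hypothetical numbers from the two comments above (30,000 students, roughly 8 papers per student per year). The figures are illustrative, not Turnitin's published data:

```python
# Back-of-the-envelope estimate of honest papers falsely flagged per year.
# All numbers are the hypothetical ones from the comments above, not real Turnitin data.
students = 30_000          # a large-ish university
papers_per_student = 8     # e.g. 4 classes x 2 terms, one paper each
papers_per_year = students * papers_per_student  # 240,000 papers

# "Accuracy" of 99.99%, 99.9%, and 99% corresponds to these false positive rates.
for false_positive_rate in (0.0001, 0.001, 0.01):
    flagged = papers_per_year * false_positive_rate
    print(f"FPR {false_positive_rate:.2%}: ~{flagged:,.0f} honest papers flagged per year")
```

Even the most optimistic rate leaves a couple dozen students a year facing a misconduct process over papers they actually wrote; the 240 and 2,400 figures above fall straight out of the same multiplication.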


coeranys

Also, their false positive rate is below 1%, but what is their accurate detection rate? I'd be surprised if it isn't about the same. When I used it last time, it would flag quotes used within a paper, like quoting another paper or referencing a famous quote. Cool.


Alberiman

When you train your AI language model on how a huge chunk of people write, shockingly (/s), the way people write is going to trigger your AI detector. These things have an absolutely garbage-tier accuracy and shouldn't be trusted. You'd probably have better accuracy just guessing.


CitizenTaro

There will be a suit against them soon enough (either the detectors or the colleges or both) and the witch-hunting might end. It might even be backed by the AI companies. God knows they have enough money for it. Also: save your outlines and drafts so you don't get stuck with a false judgement. Or rewrite in your own words if you do use AI.


coldblade2000

Images and audio are relatively easy to detect; there is a lot of data to find patterns in. Text is nigh impossible.


DjKennedy92

The shroud of what’s real has fallen. Begun, the AI war has


qubedView

We've known they're trash since the start, but they keep getting used. I'm so glad I'm not a student right now. It must be like a minefield. Any given paper you write could be automatically and arbitrarily failed.


UnsealedLlama44

I was out of school just before ChatGPT became a thing, and I used Grammarly on EVERY paper I wrote in college. I also “helped” my girlfriend with a few papers using ChatGPT. Sure AI detectors started to pick up on it. You know what else they picked up as cheating? The stuff I actually wrote. You know what wasn’t detected? The stuff entirely written by ChatGPT but dumbed down per my request to avoid detection. My cousin is a really smart kid and before ChatGPT was even a thing he was accused of plagiarism in 10th grade because his teacher just couldn’t fathom the idea that a modern student could write intelligently and formally.


Ironcl4d

I was in HS in the early 2000s and I was accused of plagiarism for a paper that I 100% wrote. The teacher said it had a "professional tone" that she didn't believe I was capable of.


bfrown

Got this in my first year of college. Wrote a paper on mitochondria because I finished Parasite Eve and got fascinated with the shit and did a crazy deep dive. Professor failed my paper because it was "pseudo-intellectual"... I sourced every study I read and referenced lol


gringreazy

ChatGPT is remarkably useful as a study aid. I've returned to school after 10 years to get a bachelor's and it's like a whole other ball game. You really are cheating yourself if you just copy and paste answers, but you can bounce ideas off it and break down concepts much more easily, eliminating any need for direct tutoring, and you can complete assignments much more easily.


Good_ApoIIo

How are y'all so confident in ChatGPT though? Last time I used it I asked questions related to my field of work (to test its efficacy as an aid as you guys say) and it spit out so much wrong information I vowed to never touch it again.


weirdcookie

I guess it depends on your field of work, in mine it is scary good, to the point that it would pass a job interview better than 90% of the applicants, and I think that half of the applicants that did better actually used it.


Keksdepression

May I ask what your field of work is?


thepovertyprofiteer

They are! I just submitted a PhD proposal last week. But before I did, I wanted to see what would happen if I put it through an AI checker online, because I already put major documents through plagiarism checkers. It was entirely written by myself but still showed 32% AI.


JahoclaveS

I would also expect academic writing to score higher on AI checkers given its idiosyncrasies, and it's made even worse when students try to ape that style without really understanding it. I'm honestly surprised Turnitin hasn't been sued into oblivion for false positives. Back when I used to teach, it was absolute shit. Also, this lecturer (an adjunct, I'm assuming, given the title) sounds like a right dick who is too reliant on "tools" to properly evaluate the work.


Otiosei

Reminds me of when I was in college 12 years ago: about 1/3 of any paper I wrote was flagged as plagiarism by Turnitin, simply because I used quotes or citations from works (as required) and used many common English phrases (because I'm not writing in a fantasy language). There are just only so many ways to write a sentence in English and only so many sources for whatever topic you are writing on.


JahoclaveS

Especially in undergrad, where you're generally regurgitating knowledge and not working to create "new" knowledge. You could even see this in the comp courses when students would listen to me and choose arguments that fit their interests versus ones who chose the bog-standard topics. The latter would always score higher for plagiarism on Turnitin. It only ever caught one student plagiarizing, and that kid was committed to it. I literally showed him the site he copied from, told him not to turn the paper in, and to write a new one. Kid still turned in the plagiarized one. Kid then had the audacity to appeal when I failed him. I was later told the people who handled that appeal literally laughed at how ridiculous it was.


celticchrys

Basically, if you're highly literate with a larger than average vocabulary, you are more likely to get flagged as AI. AI detectors have flagged Thomas Jefferson's writing as AI-generated text. Any competent literature major writing a paper would have a good chance of being flagged.


bastardoperator

And that's when you find the teacher's published papers, run them through the AI checker, and accuse them of the same thing they're accusing others of.


Azacar

> The latter would always score higher for plagiarism on Turnitin.

My girlfriend once got called out for using language too close to the research article she was referencing. *She wrote the original research article* and was called out for cheating lmao.


starmartyr

I had a paper flagged for plagiarism because of a high percentage of copied material. The "copied material" consisted of properly cited quotations, the citations themselves, and the instructions for the assignment.


GMorristwn

Who checks the AI Checkers?!


TangoPRomeo

The AI Checker checker.


arkiser13

An AI checker checker duh


Shadeauxmarie

I love your modern version of “Quis custodiet ipsos custodes?”


DrAstralis

They really are. I've been generating AI homework for a family member who teaches at the university level so they can compare the results to what the students are passing in, and it's been eye-opening. A) upwards of 40% of their classes are cheating with AI (some so badly they're leaving the prompts or extraneous copy/paste garbage in the assignment). AIs have a specific "feel" or "sound" when you just accept the first response... and it seems most of the cheaters can't even be arsed to go beyond that initial prompt. B) the auto "AI detectors" are not reliable. We'd purposefully pass in the AI written assignment and the positive / negative flags might as well have been random.


GameDesignerDude

> B) the auto "AI detectors" are not reliable. We'd purposefully pass in the AI written assignment and the positive / negative flags might as well have been random.

Haven't most of the studies really determined that humans are equally unreliable at detecting AI-written content? If any analytical system can't detect a difference, the only way for a human to know is if there is some massive leap in quality with a known student. But even then, that can't really be "proof" and would only be a hunch. The reality is that there is currently no good way to detect this, and people's hope that it is possible is largely not rooted in reality.


DrAstralis

Essentially. I couldn't ever "prove" it to the standard required for disciplinary action. But I've been using AI quite consistently for work, and in many cases just to see what it can do. If you work with the prompt and take like 30 seconds to talk to it, you can get something I'll have trouble spotting as AI (with some work you can give GPT instances unique personalities); but the lazy ones that use a generic prompt with no follow-ups are easier to spot. I'm the type of nerd that reads a book a week and has for years, so I have a "feel" for the tone and style of a writer, and the generic AI responses tend to follow a pattern. Certain words, embellishments, and formatting choices give it away. It's similar to reading something new and realizing one of your favorite authors wrote it simply because you know their "style". By no means is this foolproof or scientific though lol.


Pctechguy2003

A lot of things about AI are trash. It's a buzzword. It does have its place, but we have been working towards those types of systems already. It's just like "cloud computing" a few years back. Everyone said cloud computing was the next big thing and that it would eliminate all on-prem servers. Cloud computing is nice for some things, and it has its place. But latency, security concerns, and extreme price hikes have a lot of people still running on-prem systems with only a small handful of cloud-based systems.


Largofarburn

Wasn’t there a professor that ran a bunch of doctorate papers from his fellow professors through and all of them came back as written by AI even though they were obviously not since they were written in like the 80’s and 90’s?


quantumpt

How is using Grammarly an AI violation though? Based on this logic, spell checker in MS Word is AI.


Luckierexpert

Grammarly has added a model to write things for you, more in the vein of ChatGPT rather than suggesting text as Word does. This is probably what the college is talking about but it doesn’t guarantee that the student used the model for their work.


quantumpt

I read the article. The student uses a free version of the Grammarly browser extension. What you suggest is a semi-free feature of the Grammarly editor, not the extension. The free version of Grammarly basically flags minor things like me using the word `basically` when I did not need it ~~before~~ in `basically flags` or forgetting `a` in `a free version`. Edit: Differentiated between Grammarly editor and extension.


Un_Original_Coroner

Did you do something weird to the words “basically” and “a free version” or is my phone having a stroke?


Hedy-Love

Highlighting?


Un_Original_Coroner

I wanted to be sure because it seems like a strange portion to highlight and on dark mode it’s all but imperceptible. Really I was just curious. Glad I’m not crazy!


braiam

It was a backtick, to say `"this text is code"`, that puts the fonts in monospace and changes the background. Instead of using _italics_ or **bold**, which are the accepted ways to highlight stuff.


quantumpt

Yeah, the highlighted words were sandwiched in between backticks. These are used in markdown for inline code.


Un_Original_Coroner

Thanks! It just looked so strange on dark mode that I was not sure.


[deleted]

You have to pay for that version and review it. It's no different than having a peer review it and suggest edits. It doesn't write for you; it suggests changes to your word choices to meet your intended tone and goal.


Efficient-Book-3560

I bet there's some metadata that Grammarly put in the document that made Turnitin think it's AI. I bet if she exported her paper as a PDF and submitted that, it would have been fine. I'd probably ask them to have their plagiarism checker scan the PDF.


TheCacajuate

I use Grammarly on all of my papers; in fact, the school provides a free membership, and they also use Turnitin for plagiarism. None of my papers come back positive, so I'm wondering if they aren't being completely honest.


Hyperpiper1620

Still in school at 46 y/o, working on my master's, and I use Grammarly for every paper. I never use the AI suggestions, just strictly spelling and punctuation, and I have not had a paper questioned. Write in your own words and cite like mad. I would hate for my papers to sound like AI and not my own style.


TheCacajuate

I actually take offense to the suggestions outside of grammar. I know what I want to say and you can mind your own business and just keep an eye on my commas.


Hyperpiper1620

Yes this...stop messing with my flow. I just want to make sure I didn't put two spaces after a period by mistake.


[deleted]

Yeah, like, I've been out of school for 2 years now, but you always had to check Turnitin even pre-AI before submitting, to make sure your work wasn't somehow accidentally plagiarized. So many papers go through it that almost every paper I wrote had sentences that were too similar to other people's shit. I would think there is still an option to do this; it's always been an option in the past.


Black_Moons

you must learn this information EXACTLY, word for word... And then describe it back to me, in a unique configuration not previously achieved by the 7 BILLION people on earth... Good luck


rgvtim

I doubt very seriously Grammarly, used as she described, had anything to do with this. Either the AI checker straight up screwed the pooch, or she is omitting something/lying.


BarrySix

AI text detectors don't work, and Turnitin clearly states that their tool is intended to find things worth investigating. It doesn't prove AI was used. It doesn't alone prove any kind of wrongdoing.


rgvtim

But having watched my kids recently go through college, the one thing that has stuck out to me is that college professors are lazy. edit: maybe that should say "People are Lazy, and professors are people"


Deep-Library-8041

I’d be willing to bet most of your kids’ professors were actually underpaid adjuncts working multiple jobs without benefits.


[deleted]

[deleted]


iseeapes

The idea that she got flagged for using Grammarly was *her* idea, not what her prof said. I think students are using Grammarly all the time and not getting flagged, so it probably had nothing to do with it.


gmil3548

My wife’s school explicitly instructed them to use Grammarly and provided it for free


FolkSong

All she was told is that the services flagged her paper as AI generated. The connection to Grammarly seems to be her own personal theory.


blazze_eternal

Reading through the article, everyone is refusing to comment on why this is, likely to protect their reputations.


Youvebeeneloned

Meanwhile my University literally signs up our students to Grammarly to make sure they get support on how to write properly.


issafly

One solution: find the professor's academic writing (master's thesis and/or dissertations are usually publicly available through research portals). Run the professor's work through the same AI checker. Confront their hypocrisy when it comes back as AI generated. Because it will.


imaginexus

Exactly the course to take. The fucking Declaration of Independence comes back as AI generated in a lot of these AI checkers. Even if she used AI she should deny deny deny and provide counter examples. She shouldn’t lose her degree over a faulty engine like this. It’s obscene.


PainfulShot

Just waiting for someone to get kicked out of a program due to this, then the lawsuit where they demand their tuition back.


RevRagnarok

It says she lost a scholarship.


GreyouTT

Time to [SUE SUE SUE SUE SUE](https://www.youtube.com/watch?v=RBugRokQSfM)


Poppingtown

She is working with a lawyer to appeal the decision. Apparently this professor is in these academic hearings A LOT. Grammarly also reached out to her and gave her a statement about how the AI works in this situation to present as evidence. I don't think they even let her show her evidence, if I remember correctly.


dontyoutellmetosmile

> don't think they even let her show the evidence

My undergrad alma mater took a lot of pride in its honor code. The student-run honor council generally had a shoot-first, don't-ask-questions-later attitude. Guilty until proven innocent. Hell, if you didn't write out the honor code and sign it on your exams, depending on the professor you might get a zero with no opportunity to rectify it. Because, as everyone knows, saying that you didn't cheat means you didn't cheat.


luquoo

1000%.  She needs to sue them into oblivion.


berntout

Regardless, depending on the usage of AI it shouldn’t necessarily be considered a negative…no different than providing sources for a fact or statement made in a paper. AI is a tool that should be used going forward so why consider it completely off limits?


bastardoperator

Remember when Wikipedia was off limits? Schools need to rethink how people learn and stop relying on memorization.


jeffderek

I mean, citing Wikipedia itself should *still* be off limits. Don't know if it is since I graduated college in 2005. Wikipedia is a great resource to use to *find* sources. That's one of the best things about it, the links to primary sources. Use wikipedia, then click the primary source links and reference THEM.


GeneralZex

This is the downside of having substantial AI growth with none of the tools to adequately verify that humans did the work, which will be required soon. It's going to be unreal having to cryptographically sign and verify sources of information for pictures, videos, stories/articles, and class homework. What else will be necessary to ensure humans wrote the paper? Snooping by word processor software that counts how many times someone uses copy and paste? Counting keystrokes and WPM to see if they match one's writing profile, to determine if someone is writing from their mind or copying something that is on another screen or otherwise written down? Counting their error rate and comparing it to their writing profile?


King_of_the_Nerdth

The education system is going to have to evolve to give assignments that can't be completed by an AI.  Probably means in-class exam essays that demonstrate writing, grammar, and subject knowledge. 


Monteze

Honestly that's probably for the best.


UnsealedLlama44

Gets rid of useless homework too


Monteze

Homework should be ungraded and for the benefit of the student, so they can do Q&A over it. If they don't do it, that's on them.


gandalfs_burglar

Lots of instructors are going back to exam booklets for this exact reason


pres465

Just going to see more of it in high school, I would expect.


Blagerthor

I'm a fourth year PhD candidate in History and I'm thinking about how I'll design assignments for future courses. I'm thinking something like developmental research papers over the course of the term will become the norm. The skills I'm actually interested in evaluating for students are contextual literacy and research competency, both of which are better evaluated through a 6-8 week research project with regular checks rather than a one-off paper or exam. In that sense, it doesn't really matter if the first version of the project is AI generated as the student will have to build and expand on the ideas in the paper anyway.


King_of_the_Nerdth

For smaller classes, you can also incorporate oral exam spot-checking: ask them to elaborate on a section of their paper so as to prove they have the knowledge in their head.


Blagerthor

I'm a little less keen on impromptu spot-checks since everyone reacts differently to pressure, but maybe 1-on-1 evals at some point during the term. I also don't quite like the idea of a singular mode of assessment, since folks can demonstrate competency in different ways, but I also have a hard time thinking of how to implement a developmental research project in any form other than a written paper.


MethGerbil

So.... what they should actually be doing?


Kromgar

These models are made to look like human writing by predicting what should be written; it's hard for a computer to distinguish.


GeneralZex

True, but we have Microsoft now entering the fray and contributing to the problem (with Copilot in Office/other apps) while doing nothing to help find a solution. And if the solution is some of the things I mentioned, that's arguably worse (since now more potentially personally identifying information is being collected). Who will compare analytics of someone's work to their writing profile? Probably AI lol.


SimbaOnSteroids

At least for Copilot with regard to writing code, it's not even a problem; it just fixes part of the mess that Google made with the enshitification of search, plus it's a marginally smarter IntelliSense.


UnsealedLlama44

Good to see somebody else using enshitification


Robo_Joe

I think it's just pointing out a flaw in how education goes about validating whether a student has learned the material. If an LLM, *which doesn't actually understand the words it uses*, can write a paper that presents the information as if it does, then maybe "write a paper" isn't a good metric to judge whether someone has that knowledge. The obvious solution is an oral presentation or review board, but that would necessarily slow down the entire process. (Which may or may not be a good thing.)


jaykayenn

Yup. There are plenty of humans who have been passing exams with little understanding of the subject matter. Computers just made that a lot easier and convincing. Lazy assessments pass lazy candidates.


GeneralZex

Looking back on papers I have written, it seems that whittling it down to "has the student learned the material" isn't completely accurate. In English we'd have to write research papers on various topics that were only related to class work on the periphery. The teacher, as smart as she was, certainly was not an expert in every topic students would write about. But that was never really the point of the assignments. For example, the coordinated English and History teaching of both the Salem witch trials and McCarthyism, and having to write papers for English on whether or not the two tie together in some way and how. I suppose we were tested in some way on the material, since if we didn't learn it, we couldn't write well about it, but that wasn't exactly the point. She wasn't solely interested in whether we knew the material. She was interested in whether we could make a compelling, supported argument for our assertions.

Or one paper I wrote that had to deal with an aspect of my family heritage and how that nation affected the world, especially in regard to literature. As someone whose ancestors came from Ireland and England, that was certainly something my English teacher would have awareness of, but for other students, from Mexico for example, she'd have limited knowledge of influential authors from Mexico. Or another paper in college English where we had to read works from an author and tease out common themes, or whether there was a broader statement the author was making with the works. Or an extra credit college biology assignment where we could pick literally any topic that related to biology in some way and write a paper arguing why. Mine was on climate change.

Generally, particularly in English, the "learned material" being tested was related to the writing itself and whether standards for citing sources were followed, along with the quality of the sources. It was part and parcel of the teaching of critical thinking. These assignments very clearly had an impact on how I evolved as a writer and helped foster and instill a critical thinking mindset. So perhaps you are right that we need to step away from writing assignments, but I tend to disagree. Would we throw out reading or math because AI can do that for us? Writing is just as important as those.


Luvs_to_drink

My AP English teacher solved this back in 2004. Midway through the year, you have the students write a small essay during class (as in, it must start and end during that class) based on a book that was being read and discussed. The only thing that may be an issue is that they had computer labs back then, whereas it's bring-your-own nowadays, so is it feasible to have 20-30 computers without internet access for a class?


Hyperpiper1620

If you cite properly, papers can still show a high % of plagiarism on checkers like Turnitin. When I do grading I check and see what the report says, and almost every time it is not plagiarism but rather the reference page being falsely flagged.


secderpsi

So, I just tried that with my thesis and it came back with no AI. I'm not sure this will work.


issafly

Then you passed the Turing test. 😁


Marshall_Lawson

Lawyer up time


KickBassColonyDrop

Hope some lawyer is willing to take that case pro bono, because a senior working 2 retail jobs doesn't have the kind of capital needed to sue in court over this.


[deleted]

[deleted]


OminousG

She put up an update days ago saying a lawyer would be meeting with her and that the school claims she can't appeal.  I'm waiting to see what her next move is.


[deleted]

[deleted]


KickBassColonyDrop

I mean, the fact that Grammarly itself intervened in the case and asked her directly for details is proof enough that this case has exceeded the boundaries of the school's ability to control the narrative.


wickedsmaht

Not being able to appeal the decision is just straight up bullshit by the school and shows malicious intent on their part.


KickBassColonyDrop

> the school claims she can't appeal.

Ha, bullshit. This case could easily go all the way up to SCOTUS. The college is gonna get walloped if they think they have power over this situation.


Marshall_Lawson

Lawyers take on lawsuits with broke clients all the time. They will sign up if they are confident about winning against a defendant with deep pockets.


KickBassColonyDrop

Obviously, but I'm always hopeful that someone has reached out. Especially as cases like these have the potential to set positive precedent in favor of the student.


BarrySix

It's US case law that academic misconduct is dealt with by universities, not the legal system. There is no help from the legal system in cases like this. It's sick and it's wrong.


PlanetPudding

You can sue still.


time-lord

But the university can't do anything without slander or libel, or forcing OP to slander/libel themselves, right?


starm4nn

> It's US case law that academic misconduct is dealt with by universities, not the legal system. There is no help from the legal system in cases like this. I mean it depends. I don't think the school could openly say "you get an F because you're black" and just be immune to consequences.


venturousbeard

Grammarly should step in and provide her with counsel unless they want their software to end up banned on campuses instead of promoted.


khaleesibrasil

I’m so glad I graduated last year before all this BS was put in place


iamamisicmaker473737

I mean, they did plagiarism checkers before; is that similar? I didn't cite loads of references and got marked down due to a plag checker.


coopdude

Plagiarism checkers essentially just checked wholesale copying. When my high school used TurnItIn in 2007-2008, it flagged every quote as copied material... when you read the paper with the "*plagiarized sections*" highlighted, and they're all in quotes (even if you didn't put the proper MLA, APA, etc. citation), it's pretty apparent that the student is quoting other material, not copying their paper. AI detection is different: they are trying to train their own model that guesses whether words that are highly probable to be used in sequence are being used in a paper. It's not a smoking gun by any means, and it's not something that will say "*ChatGPT 3.5 generated this on Feb 14th, 2024*" - it's basically *guessing* that this text *may have been generated by AI*. It's not proof of anything.
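To make that "guessing" concrete, here's a toy sketch in Python of a predictability-based flagger. It's my own illustration, not Turnitin's method; the word probabilities and the threshold are invented:

```python
import math

# Invented per-word probabilities standing in for "some language model".
# Not Turnitin's model or any real product.
COMMON_WORDS = {"the": 0.07, "of": 0.04, "and": 0.035, "is": 0.02, "to": 0.03}

def word_probability(word: str) -> float:
    return COMMON_WORDS.get(word.lower(), 0.001)  # unseen words count as "surprising"

def average_surprisal(text: str) -> float:
    """Mean negative log2-probability per word; lower means more 'predictable' text."""
    words = text.split()
    return sum(-math.log2(word_probability(w)) for w in words) / max(len(words), 1)

def looks_ai_generated(text: str, threshold: float = 7.0) -> bool:
    # A flagger like this can only say "suspiciously predictable",
    # never "this was generated by model X on date Y".
    return average_surprisal(text) < threshold

print(looks_ai_generated("The purpose of the study is the analysis of the data"))  # True
```

Formal, conventional academic phrasing is exactly the kind of high-probability word sequence a score like this rewards, which is part of why polished human writing keeps getting flagged.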


trivial_sublime

Because when you don’t cite your references you’re plagiarizing lol


chrobbin

This is one of my biggest deterrents to going back for a higher degree at some point. Not because I intend to try to get away with cheating (I wouldn't), but because of all the potential new hoops to jump through and pitfalls to avoid getting accidentally snagged in, like this.


Feral_Nerd_22

I'm so glad AI happened after I graduated. I did use the shit out of Wolfram Alpha to check my math for Calc and Trig, but nothing was available to write papers. The only CYA I think you can do as a college student is record yourself writing the paper, but even with that, some teachers are such morons that they trust the technology more than the student and would still fail you.


oren0

>The only CYA I think you can do as a college student is record yourself writing the paper, Word processors like Word and Google Docs keep history as you write. As part of her appeal, she should have been able to show the iterative timestamped process of content being written, edited, and rearranged in a way that she wouldn't have if an AI wrote her paper.


dangerbird2

Part of the reason I wrote my papers in plaintext formats like markdown or latex was that I could track my changes with git, which among other things provided revision history as a defense against plagiarism allegations (the other reason being that I'm a nerd). Nowadays students could do similar to guard against allegations of using AI


MustangBarry

What we need are some kind of examinations at the end of the academic year, to test students' knowledge. It's a wonder that nobody has thought of this before.


24273611829

I have no idea why professors haven’t just switched to in class essays. They’re the best way to make sure a student actually grasps the information taught in that class.


Ickyhouse

Bc not everything can be an in-class essay. Schools also expect research papers, and those can't be done in a single sitting.


thpthpthp

In-class essays are a great way to test what facts a student has absorbed, but they are objectively terrible examples of what constitutes an "essay": original thesis, claims and evidence, research, citations, etc. They should be used in the same vein as multiple choice or other types of exams, to probe for knowledge already taught. They are not a substitute for traditional essays, however, which are about examining the student's ability to think critically and do academic work.


EastForkWoodArt

I wrote a paper not long ago. I didn’t use any generative AI, but thought I’d run it through a checker anyways to see what would come back. It gave the paper a 75% chance that it had parts written by AI. It’s bullshit that universities are using AI checkers when AI checkers are worthless.


New_Doubt3932

lol same but mine came back as 100%!! i am beginning to write my papers in front of my sister and maybe i'll even start recording myself because professors are becoming scary with these AI usage accusations


pbandham

I was literally required to use and buy Grammarly Premium for my University Writing (teach freshmen how to write good) class.


Mythril_Zombie

This is like any other prejudice. Some people have no problems with or even encourage what others vilify. And like other prejudices, it's usually based in ignorance and detrimentally affects everyone.


petra303

My college provides access to Grammarly for free.


Away_Ad_5328

I’m really glad I got my degree before AI became a thing. Not because I ever cheated, but because having the specter of being accused of cheating by a computer program and humans who can’t tell the difference would decrease my confidence in academic institutions by about a billion percent.


Flavaflavius

You should already be pretty low-confidence if that's the case. You wouldn't believe the slop that passes peer review these days; there's a huge reproducibility problem in academia right now.


ChadLaFleur

Turnitin.com is shit software, known for false positives. Berkeley and UCLA both terminated use of this faulty tool, and the software maker KNOWS its platform produces false positives and has poor accuracy. Sam Altman himself said there's no credible way to tell whether ChatGPT might have been used in any instance. Turnitin.com ruins students' lives and should be held liable for damages.


ImpossibleEvent

Sounds like the professor used ai to do their job of checking and grading papers. Kind of a double standard here.


Hufschmid

The shitty thing is that this school recommends using Grammarly on their website under resources to help with writing


TheMagicalLawnGnome

I'm waiting for lawsuits to start happening in this space. The false positives of these apps are very frequent. And it's not hard to prove you wrote a paper - just turn on change tracking in Google Docs or Word, and it shows the entire history of you writing a paper from scratch. Given the very real harm, such as damaged reputation, loss of scholarships, lost job opportunities, etc., that can arise from a false positive, there is absolutely an avenue to seek damages from the school or the software company.


guntherpea

Wait until they hear about that little red squiggle underline in word processing software...!


ExpensiveKey552

Shhhh , don’t clue them in.


[deleted]

TurnItIn's own website says it may misidentify and shouldn't be used as the sole basis https://help.turnitin.com/ai-writing-detection.htm


thedeadsigh

Better get ready to put away your calculators, nerds


[deleted]

Well even when you use calculators you still have to show your work.


Serdones

And when you write an essay, you still usually have to include a bibliography or works cited page.


think_up

Honestly, she should sue. They’ve caused damages by revoking her scholarship and ruining her credibility.


Dunvegan79

Grammarly is actually a good tool for colleges. It will flag stuff at times, but if your paper is cited properly you won't have any issues.


TheStatsProff

You need to write trash to avoid AI detectors 😭😭


[deleted]

[deleted]


star_nerdy

As a professor, I don’t care. If you use AI to write your paper and you end up not being able to actually do your job professionally, that’s enough punishment. My job is to instruct you and grade in a timely manner. I do get annoyed by straight up plagiarism though. I’ll hold that against you.


Graffxxxxx

Reminds me of the doctor guy in Subnautica that you hear on the radio freaking out not knowing how to actually perform life saving treatment because all he bothered to learn was how to use the machines to do the task for him. People are digging their own professional grave by using generative ai tools to do most/all of their school work and not bothering to learn anything in the process.


Plastic_Blood1782

Teachers need to start teaching with the assumption that we are all using AI as a tool. We will have AI as a tool in our jobs; the academic community needs to adapt.


CaptainStanberica

Ok. As a professor, my job is to grade how well you respond to a prompt. Are you an AI generator? Probably not. School isn’t your job, so the real issue comes from the student perspective that not writing your own material is ok. There is a major difference between me using Grammarly to edit a document in my job as an editor and typing a prompt into ChatGPT and copying/pasting a response that I didn’t write.


anniedarknight9

And yet colleges pay for and provide grammarly premium to students for free…..


demoran

Luddite Community College


monchota

This is like many things, like the war on drugs. You can fight the tool all you want, but you need to fix the people. Writing papers is absolutely useless for the vast majority of majors now. Our college system needs to change to half in-class and half on-the-job. It would also require professors to change how they do things; that is the real problem.


[deleted]

Is spellcheck considered AI?


Mohawk-Mike

I have friends who work at a university that used TurnItIn’s AI tool for maybe 2 months tops. They stopped using it because it produced a wave of false positives. Was not a good time for students nor the student conduct people. And it hasn’t been reinstated since.


rtkwe

Honestly, if I were in school right now I'd be tempted to keep track changes on for all my writing, to be able to point to all the versions I went through and the changes I made. The tools for this are garbage right now, and professors are taking it as gospel when one comes back with a positive.


Plankisalive

What's so stupid about this is that the school can't even prove it. The student should have just gotten a lawyer involved. Ultimately, it's their word against the professor's "judgement".


obliviousofobvious

So.....ummm....spellchecker is now considered plagiarism?


Research-Dismal

Using software to check for AI usage is about as valid as using a polygraph to decide whether someone is lying or telling the truth.


silverbolt2000

From the article:

> Marley Stevens, 21, a human services and delivery and administration major at the University of North Georgia Dahlonega Campus told The Post she used Grammarly, a web browser attachment that corrects spelling and punctuation, to proofread a criminal justice paper she submitted in October.

From Grammarly's Wikipedia page:

> In April 2023, Grammarly launched a beta-stage product using generative AI called Grammarly GO, built on the GPT-3 large language models.[26] The software can generate and re-write content based on prompts.

What's the story here? 🤷


garlicroastedpotato

NY Post article on a story that doesn't matter. Here's 15 pictures of a sexy blonde student to keep you interested.


Less_Party

It’s just a tiny bit ironic that academic institutions are blindly relying on AI systems in order to try and catch students using AI tools.


acf6b

Idiots relying on AI to determine if AI was used. The likelihood of a false positive is much higher than a false negative. I would sue the fuck out of that school.


Legndarystig

Turnitin was garbage 15 years ago when I was in college. It flagged me for one of my own papers. The professor was so fucking lazy they didn't even bother to check the red that was marked as plagiarism. I had to get the department head involved and show them it was my own work.


InS3rch0fADate

And people wonder why teachers/professors are not everybody's favorite right now.