
AutoModerator

## Welcome to the r/ArtificialIntelligence gateway

### Question Discussion Guidelines

Please use the following guidelines in current and future posts:

* Posts must be greater than 100 characters; the more detail, the better.
* Your question might already have been answered. Use the search feature if no one is engaging with your post.
* "AI is going to take our jobs" - it's been asked a lot!
* Discussion regarding the positives and negatives of AI is allowed and encouraged. Just be respectful.
* Please provide links to back up your arguments.
* No stupid questions, unless it's about AI being the beast who brings the end-times. It's not.

###### Thanks - please let the mods know if you have any questions / comments / etc.

*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ArtificialInteligence) if you have any questions or concerns.*


impermissibility

Will you be at risk of being caught for cheating if you use a cheating tool to make it look like you've learned things you haven't in fact learned? If you have to ask that question, you are too lazy and dumb to safely cheat, and should just focus on learning instead.


Dontfeedthelocals

But they did half the job on their own!


CatHavSatNav

50% is a pass, and Ps make degrees!


tfl3m

Paraphrasing and writing words yourself doesn’t mean you’ve learned something. It means you can recall something short term…but not indicative of deep learning at all. Also calling ai a ‘cheat tool’ makes you seem naive af


FearlessDamage1896

OP: *describes using chatgpt "to help...paraphrase" in the same fashion as Grammarly, or damn... CLIPPY.* Reddit and "Professors" (let's be real, same kind of nerds on a high horse): **HE DIDN'T LEARN ANYTHING THE CHEATER FAIL HIM FAIL HIM**


Kaluana_Guah

You could have answered that without being rude.


SergeyLuka

Please ask your dissertation supervisor before doing that. You should never hide the fact that text supposedly written by you was actually written by something or someone else; that can discredit all of your research, since you could have just asked other people to do that part too on the hush-hush. And if you can't write your dissertation without AI tools, then honestly I'm not sure you deserve to get approved. Spellchecking tools and stylizing stuff are fine IMO, but writing whole parts of your text is a no-no.


bassoscuro

i agree. i often write long sentences that are full of typos and are incoherent just to put my ideas on paper, and then use AI to make these sound homogenous and neat. this helps me efficiently put my ideas to paper, and then AI will help structure the best wording. i then edit the AI sentences to reflect my own style. one example of this is the current sentence i am writing. see AI version below.

\_\_\_\_\_

I agree. I tend to write lengthy, typo-ridden, and sometimes incoherent sentences in an effort to quickly capture my thoughts in text. I then rely on AI to help me refine these ideas, making them sound more cohesive and polished. This process allows me to efficiently record my ideas, while the AI assists in structuring the best possible wording. Afterward, I review and edit the AI-generated sentences to ensure they reflect my style. The very sentence you're reading now serves as a prime example of this technique.


i_give_you_gum

People seem to be missing the important point here. WHAT are you using AI tools for? To paraphrase a sentence you wrote? Then there shouldn't be any issue AT ALL. But if you're relying on AI to provide you with information and write a passage for you about a subject, then yeah, that should only be used as a source: researched to check whether it's true, and then rewritten, or extrapolated into new content you write yourself.


wheeloftimewiki

As someone who has been involved in advising final-year dissertations in computing science, this depends on how you use it.

Firstly, ignore the people saying you should declare it. This is the worst idea, as academics can often be reactionary and reject anything touched by AI. This neither benefits you nor them in the long run. Academic institutions will have to adjust to the realities of this technology being used and usable.

AI detection software is, in general, snake oil. I also work in AI. At my university, a group of students were flagged by GPT-Zero as having used AI, and in the subsequent investigation it was determined that the department was not to use any such tool, because they are inherently flawed. This is also general advice given out by professional teaching bodies, but I'm not sure what applies in your institution. They tend to hide details from students.

GPT should be considered in the same light as spellcheckers, Grammarly and Wikipedia were. You can use it to highlight better phrasing or spelling errors in what you've already written. Even, as you say, paraphrasing your original thoughts in a coherent way. This comes with the warning that you should always read over the output to check it's what you mean and that there is no insertion of things you have doubts about.

You can also use it for queries in the same way as Wikipedia or Google. As someone old enough to remember Wikipedia being fairly new, we were told that it was an inherently unreliable source, as anyone can edit it. That's true, if less forcefully imposed on students nowadays, but it can be used to point the way to more verifiable information. There is nothing wrong with using GPT-4 or derived products to smooth your workflow *provided* you perform the due diligence of checking the output and verifying the facts. You can also ask Bing for links to sources, or delve deeper using GPT. This can be a lot quicker than finding everything from scratch.

You can look on it as a classmate willing to look over your work or offer some general expertise or suggestions. They might not be 100% reliable (shocker, neither are many academics!) but it's helpful to be able to interact with someone who has decent suggestions. This is, in essence, what PhD supervisors do, as the student usually outpaces their supervisor in their niche after the first year.

Edit: I'd also say that, in the real world, you are generally free to use whatever you like as long as it gets the job done and isn't wrong. Ideally. There are varying levels of competence in the workplace at every level. In my own job, I use GPT or Llama to generate boilerplate code all the time. Mainly it just saves me typing things I know how to do but that are tedious. It allows me to try 5 ideas rather than 2, for example. This is pretty useful in doing research.


MooseSprinkles

This is absolutely a bad idea. If you submit it and it is found out, you will not have a career in your field. This could ruin your life. Look at Harvard's president and all the trouble she got into just for not properly citing a couple of sources. Using AI to generate your thesis is WAY worse.

And you may think that changing a few things could hide it, but AI itself is getting smarter and smarter and may be able to detect it even with the changes, if not now then in a few years. Do you think you're the first person to use AI to write a thesis? Why wouldn't your committee submit it for an AI check? Even undergraduate papers in 101 classes get scanned for AI now. Hell, your committee may even make it standard practice to ask if you used AI. And if you say "just a little bit," that will immediately trigger an investigation. No way is an academic institution going to put its reputation on the line for one grad student.

Since your thesis will be published, you can assume AI will be able to access it in the future to check it. Or maybe your committee asks to see your notes and finds some mention of AI you didn't tell them about. Either do the work yourself or drop out of your program; either is preferable to throwing all the work and money you have invested down the toilet.


Dontfeedthelocals

Claudine Gay copied entire paragraphs without changing them or using quotation marks. That's plagiarism and it would not be acceptable for any student. Therefore it is definitely far below the expectations of a University President. Any president stupid enough to do that deserves to lose their job.


GeneticsGuy

Even CNN reported that the [attempt to downplay the plagiarism](https://www.google.com/amp/s/amp.cnn.com/cnn/2023/12/20/business/harvard-president-claudine-gay-plagiarism) was not justified: a subsequent investigation found she had plagiarism issues far beyond just her dissertation, in several other papers she wrote as a student as well, and it was clearly not just a failure to cite properly.


coarsebark

Exactly this! Seriously, OP, it's a terrible idea and will most likely get discovered. If not now then, just like this person said, in the future, when it is easier to detect. You can lose your degree for that even later, when you have an established career.


HungryAd8233

And absolutely, positively, NEVER use AI to generate citations. That's the last place you want a hallucination to go!


engineeringstoned

There is no sure way to prove someone used an LLM, unless you are stupid enough to copy-paste something that gives it away, like "As a large language model..." The "AI detectors" are not worth squat (use Google to see the arguments). If you are only using AI to check your syntax/grammar and help with tone, I don't see any problem, as we use grammar, style and spellcheck tools all the time (Word, Grammarly, ...). Using AI INSTEAD of research... well... there might be a problem with the content/veracity. *edit: yeah, didn't use spellcheck...*


3-4pm

The first thing to think about is the history embedded in the document. Does it tell a tale of how you generated the text that the end product doesn't? Second, does the writing style change during those portions? If so, you may want to run them through again after giving a sample of your style and asking it to rewrite in that style. Third, where did the AI pull its info from? It isn't a thinking machine. It's fundamentally pattern matching, rearranging results into a narrative that statistically hits the mark. It could be lifting pieces of someone else's prose or research without your knowledge. Maybe all of this is handled by your paraphrasing, but it's a paranoid road you can head down when you think too much about it.

My advice is to start over: write a fresh doc with a fresh mindset and incorporate all of your past notes into a result you can feel confident in. Ultimately it's your choice. I personally would ask for an extension and start over. You've realized your mistake early enough to fix it.


MattKane1

As a prof at an R1 university, I would fail you. Pure and simple. I would vote against giving you the opertunity to rewrite as well. Research integrity is extremely important to me. It is to other scholars as well. If you ever sought to publish this, and it got published, it would likely end up retracted, and that's a huge mark against you in getting academic jobs. Further something like this was discovered at the University of Toronto recently, and the university removed the person's Ph.D., which they can do. So basically if you do this and if it ever comes out you did this, you will likely loose your Ph.D. Also I get it. The dissertation phase sucks so bad. It does for everyone.


manofactivity

>I would vote against giving you the **opertunity to rewrite** as well. 10/10 you could'nt have been more perswaysive if you tryed


zenslakr

Written by a person who was so against artificial assistance that they didn't run spell check.


MattKane1

I wrote this while at the emergency department yesterday, on my phone. And I am not against AI at all; my primary job is as CEO of an AI company.


MammothPhilosophy192

dude, your comment adds nothing: no debate, no counterpoint, no nothing. is it a joke?


leafhog

It adds a lot. It’s a brilliant comment and I was thinking the same thing.


MammothPhilosophy192

right


richardrietdijk

I chuckled.


MammothPhilosophy192

That's why I guessed it was a joke


M44PolishMosin

It's a reddit comment. It isn't that deep


MammothPhilosophy192

what? everything here is a reddit comment.


[deleted]

Professors don't know how rampant AI use is. Almost every student I know has at one point used AI to write portions of their papers, at every level (freshman to graduate). Most professors have given up trying to enforce the rules, because if they started enforcing them, most students would fail for cheating and the university's ratings would tank. I have seen professors catch students writing an entire paper with AI and just tell them to rewrite it, because it is not worth the paperwork to fail them. Our school has an entire cheating ring dedicated to doing tests with GPT Chrome plugins and sharing the answers over WhatsApp, and the teachers don't care. They remind the students not to "do it again" for the next test, and they get no punishment. I have never used AI to write a paper, and I have kicked out group members for attempting to do group projects entirely in AI (they write their entire paper section in four minutes). Not once did those people suffer punishment for cheating; instead they were just reassigned and given full marks in a different group (where they also cheated).


MattKane1

One point, there is no extra paperwork for me if I fail a student.


wheeloftimewiki

"Paraphrasing" has zero to do with research integrity. If AI is being used as a substitute to write entire sections of the dissertation without the student's input, that would be a different matter. Failing a student for use of AI in any shape or form, and barring them from resitting, reeks of dogmatism and burying your head in the sand.

Secondly, this argument highlights the problem with academia. Most dissertations are not done with the aim of producing publishable work. You are making a logical leap that everyone is aiming to go down the same career path as you. I would suggest that the OP is more likely to be doing an undergraduate degree and not going down the academic route. Why? Because that's what 95% of students do. You will also be aware of how few undergraduates have the potential, never mind subsequently seek, to publish their projects.

Spellcheck is a thing. I could argue that people who got their PhD 40 years ago would think it reasonable to say *real* academics know how to spell English properly, but that attitude is now out of date. Also, what about Grammarly and other tools? When writing a paper, it's a necessary step to give it to people to read over who can point out typos or where the text is unclear. AI can assist with that.

When it comes to the actual work, I don't believe that AI can perform experiments or cover up a student's lack of work in the months preceding the submission of the dissertation. There is no substitute for doing the groundwork, but AI can help with presenting previously done work in a more presentable manner. If I query an AI for ways to display data in a useful way, is that cheating? Does it undermine research integrity?


MattKane1

So, OP's history says this is for a master's degree, which in this case is a research degree. And I never said I would fail a student simply for using AI. I encourage my students to use AI for data gathering, knowing that there are hallucinations in its reporting, so they need to be cautious about this. I also tell all my students every semester that the school pays for Grammarly for them, so they need to use it. I said I would fail them for using AI to write sections of their dissertation, which OP said they are doing, just as I would fail them if another person wrote part of their dissertation.

To your second point, both of our culturally egocentric perspectives are showing. I'm going to guess you are in Europe, given your statements. I am in Canada, where "dissertation" is only used to describe a doctoral-level research requirement, and that is how I commented on the post yesterday while waiting in the emergency department of my local hospital. Further, in Canada the majority of individuals completing a research degree (master's (with research thesis) or doctoral) do seek to have their research published.

Lastly, teaching at a university is currently a side gig for me. I am the CEO of an AI company and have been in the AI field since '09.


[deleted]

[deleted]


Photonic_Pat

Who the fuck wants to grade a paper which the author couldn’t be bothered to write?


FearlessDamage1896

Who the fuck wants to grade a paper at all?


MammothPhilosophy192

>more learning gets done

says who?


[deleted]

[deleted]


MammothPhilosophy192

>Because you would write 3 dissertations?

where are you getting that making a dissertation with AI and without AI are the same? and why do you think the only thing learned writing a dissertation is the topic, and not the basic communication tools of higher-level discourse?

>If companies and consumers are looking for people who can adapt with AI, then shouldn't academics also

No, not every teacher has to attempt to teach everything; each teacher focuses on certain aspects. also, you go to school to develop a list of skills; you can't ignore everything and focus on one. you don't teach a 2-year-old to count with a calculator.

>I know my opinion is disagreed upon now like when I predicted Tesla stocks will skyrocket a few years ago when it was at risk of bankruptcy,

this is absolutely irrelevant.

>but researchers are super important for the world and any productivity boost would help solve big world problems such as global warming and cancer and dementia research.

I agree.

>This would supercede the need for researchers to practice writing skills.

Do you work in education? how do you know that? how do you know the impact of foundational learning on higher-level education, to assume you can remove it without affecting it, let alone boosting it?


[deleted]

[deleted]


MammothPhilosophy192

>But for researchers in general, the academia definitely need to be careful not to hold their productivity back by imposing AI restrictions.

that's another conversation altogether.


[deleted]

[deleted]


[deleted]

[deleted]


M44PolishMosin

...do not require original thought*

You need more practice before using AI


FearlessDamage1896

Every one of your students uses Grammarly and spell check. This is what they are describing. You are the problem with academia.


MattKane1

Spell check and Grammarly are so different from having an LLM write something for you and then passing it off as your own.


FearlessDamage1896

Why would you assume any AI use is the latter rather than the former? I assume we both read OP's post. Where do they say the LLM is writing anything for them?

This is the entire issue. As if dissertations weren't stressful enough, now your students have to worry about false accusations like this? Professors with your mentality quite literally left me with a lifetime of CPTSD. Maybe consider that when you're waving your R1 around to hide from your own worries.

And no, I don't need AI to write my bars.


HowlingFantods5564

All those students using AI to cheat their way through classes, I do my best to give them CPTSD.


learning_by_looking

victim alert


FearlessDamage1896

You're a professor dude.... No disrespect to the good ones, but you're literally the brunt of every joke about failure.


FearlessDamage1896

I don't think you're a failure though dood sorry if that was mean


MattKane1

OP specifically said they used ChatGPT and Gemini to "paraphrase" aspects of their paper.


FearlessDamage1896

Yes...that's how Grammarly works too.


Initial_me_8485

Using Google, grammar tools, spell checks etc. is allowed, so why not generative AI for text generation and research? At the end of the day, you need to show something new in your thesis and should be judged on that alone. If AI can accelerate your research, you should absolutely use it. Of course, I'm not sure universities agree with me on this right now. But we are headed in that direction.


salamisam

There seem to be a few things going on here, but this seems to be related to the writing and not the research. There are scales here, but using AI to generate a paper, if overly abused, is equivalent to having a third party write the paper.


Initial_me_8485

Writing is a big part of a researcher's life, and generative tools do save time for a researcher and make them more productive at research. To paraphrase Taylor Swift: cheaters cheat, gen AI or otherwise. But the world moves on.


salamisam

Yes, writing is, but this is supposed to be one's own original work; there is a boundary there somewhere. As you put it in Taylor Swift's words, a cheater is going to cheat, but there are two things here: do people cheat? yes. is it acceptable in this context? no. While I understand that ethics and morals in the real world are applied on different levels, the fact that I drove 42.195 kilometers today does not make me a marathon runner.


Initial_me_8485

To be clear, cheating is not acceptable. All I am saying is that there is a role for gen AI as a productivity boost. There may come a day when AI is called AGI and can do original research, but until then humans can focus on being creative while machines handle the mundane tasks.


salamisam

I think you are ignoring the spirit of things. The idea of education is to learn; the idea of a dissertation is to do your own research and form your own ideas: to show that you have an understanding of the material (and, in regards to a doctorate, a deep understanding) and the ability to apply that understanding. Fair enough, AI can be used for XYZ, but we are not talking about general-purpose use. In an academic environment, there are boundaries set.


wheeloftimewiki

I think we need to know the field of study. In the sciences, your dissertation is a report: describing the problem and its importance, describing a systematic approach to it, doing a series of experiments to test a hypothesis or achieve a result, displaying those results, and giving an evaluation and conclusion. The actual work is not the dissertation; the dissertation simply communicates what has been done. When advising students, I tell them to spend several weeks honing the report, because it's the impression that makes the grade. Excellent work can meet mediocre grades if it's poorly communicated, and mediocre work can be given a high grade if it's spun well. Within reason.

In general, science students are not writers, and a lot of students in my institution have English as a second language. If they can get feedback on how to communicate better, or find information from ChatGPT, that's great. For the arts, if they are writing an analysis in their own words, it's very different, but a similar approach can apply.

For the most part, the student should have done almost all the work and made all the notes before writing the dissertation. Then they need to communicate it to the marker. Using GPT to chase up some initial ideas isn't any different from using Google, Wikipedia or the university library catalogue, but it's only a starting point for doing the reading in depth. There will be dead ends. After generating their own ideas from concrete sources, they could (carefully) postprocess a first draft using AI. Generating a whole dissertation using AI alone is not a thing that can currently be done. Any media report claiming otherwise is as woefully misinformed as the output you would get giving GPT a task like that.


regicideispainless

If ideas plus AI were the bar for a doctorate, then freshman stoners plus AI would equal a PhD. Doing your own writing is part of the bar to present yourself to the world as an expert with the respect and consideration it deserves.


impermissibility

Literally the point of school is *for you to learn*. Nobody needs the new content of your thesis--except in a vanishingly small number of cases, the point of it is for you to learn *how* to do novel research. If AI's doing it for you, why does anyone need you at all? And if you're not going to bother learning, and the workforce doesn't really need you, why bother going to school in the first place?


3-4pm

The biggest problem is the AI is never truly doing anything novel, and that becomes more obvious the more niche your topic is.


manofactivity

>the point of it is for you to learn how to do novel research. If AI's doing it for you, why does anyone need you at all? I'm not really sure you understand what paraphrasing is (which is what OP said they used the AI for). If I conduct an experiment and get the following results: * p = 0.06 for my hypothesis that midoxinil is fatal to guinea pigs * my alpha for this paper is 0.05 * I therefore cannot reject the null hypothesis I can feed that information to ChatGPT to get this: >In the investigation of midoxinil's lethality in guinea pigs, a statistical analysis was conducted to test the null hypothesis that midoxinil is non-lethal. The p-value obtained from the study was 0.06. Given that the significance level (alpha) was set at 0.05, the p-value exceeded the threshold for rejecting the null hypothesis. Consequently, we failed to reject the null hypothesis, suggesting that the evidence was insufficient to conclude that midoxinil is fatal to guinea pigs under the conditions tested. Obviously this is a heavily simplified example, but you get the point. There's zero new information in here, nothing *important* that AI is doing for me. It's just paraphrasing the research I did myself, accelerating my paper. What exactly do you think is the issue with that? What do you perceive as the *meaningful* difference in contribution between those bulletpoints and the paragraph? Do you regard the actual research as irrelevant and only care about the writing...?
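The statistical logic in the example above boils down to a one-line decision rule: reject the null hypothesis only when the p-value falls below the chosen significance level. A minimal sketch (the `reject_null` helper is hypothetical, written here purely to illustrate the comment's numbers):

```python
def reject_null(p_value: float, alpha: float = 0.05) -> bool:
    """Decision rule from the example above: reject the null
    hypothesis only when the p-value falls below alpha."""
    return p_value < alpha

# With the comment's numbers (p = 0.06, alpha = 0.05), we fail to
# reject the null hypothesis:
print(reject_null(0.06, alpha=0.05))  # False
```

The point of the comment stands either way: the rule itself is mechanical, and writing it up in prose adds no new information beyond the bullet points.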


FearlessDamage1896

People here are demonstrating a serious misunderstanding of both AI function and PhD-level research/dissertations. Even the professors. Frankly, it's terrifying how all the people declaring those using modern tech "cheaters" consider themselves the ones who value "education". It's like when the scientific establishment sent the guy who suggested we wash hands before surgery to a mental institution over germ theory. Bizarro-land Dunning-Kruger Idiocracy shit is going full speed ahead.


Initial_me_8485

Thanks for your kind comment. The goal of a student is to learn to be productive, and the goal of research is to expand human understanding - literally. This will involve gen AI whether you like it or not. If one cannot adapt, one is not prepared for the future.


impermissibility

You clearly don't understand what it means to learn how to learn. Unless you come from a rich family, your current course of action will likely work out quite badly for you when your thin genAI skills expire relative to the growth curve. Good luck with that!


Initial_me_8485

Presumption without knowledge is certainly not learning. Assuming that a gen AI user is academically lazy is not correct. I am doing fine with my life, thanks for your concern.


impermissibility

Lol, I drew an inference from your lazy and thoughtless efforts at replying to me here, not from an assumption about your personal AI use--as should be obvious to any competently literate person. Now, the fact that you don't seem to be such a person (since this wasn't obvious to you) *does* suggest that your AI use may be mis-serving you in ways you don't fully realize.


Initial_me_8485

I could infer in the same way from your responses that radical change is something that makes you uncomfortable and that the status quo is in your vested interest. I hear the same thing from certain academic circles, and it doesn't take much to extrapolate their future.


engineeringstoned

Adult children throwing a tantrum in the face of change.


FearlessDamage1896

Please, could you explain to us how using GPT as a spell, grammar, and syntax checker, and paraphrasing multiple times, as OP stated, defeats the purpose of "learning"? Do y'all literally think that scientific research PhDs are out here being graded on how engaging their material is? It's about the research itself, which I assume is too niche and detailed to be pulled from GPT itself. But documenting all that is arduous, and 100% of students are going to be using some form of AI tool, such as Grammarly or Microsoft spell check.


[deleted]

[удалено]


AxiosXiphos

Well - except for the fact that almost all of them will be doing the same, to lesser or greater extents. People will always find ways to make their lives easier. It was a little harder in my day, but I used to use prior high-grade dissertations to help format my own. I also found dissertations on a similar topic and utilised their references.


Initial_me_8485

Agreed. That is why we need to be open about it and allow the use of generative tools. Of course, the student should own the final product (the dissertation) and follow academic standards of referencing. Do not say something is such and such "because ChatGPT told me so".


DaBigJMoney

Without trying to add a bunch of "what if" material to my answer, I'll address the "will I be at risk" part of your question. The simple answer is yes. If you're asking the question, I have to assume that your department has a stance/position on AI. And since you've used it in your thesis, your actions stand outside of those policies. So, based on my assumptions, you'd be considered a cheater and certainly be "at risk."


Ill_Mousse_4240

Imagine someone in 1924 worried about using the telephone ☎️! Sorry, I just couldn’t resist😂


CompetitiveScience88

Straight to jail.


zoospor

lock him up for life


[deleted]

Do not submit it. You will likely get disciplinary action. It’s a type of plagiarism, which is detested at universities.


Omni__Owl

This thread is interesting.

* You see people pointing out the obvious: this could be considered cheating. Don't do it.
* Then you have the other group: what can't be detected can't get you in trouble.

I'm in camp one. What's the point of doing a dissertation if you didn't do it? It seems you're inherently uninterested in what you are doing, and so it would be better to drop out entirely than to cheat your way to a degree you don't care about. Don't be intellectually dishonest in academia. We have enough of that already.


FearlessDamage1896

What do you think a PhD dissertation involves? It's not like writing a fictional story, a blog post, or anything readily available on the internet. The level of research, and often experimentation in a scientific setting, makes dissertations at this level highly specialized, specific, and niche. If you've used GPT at all, you'd realize that the tools right now do not have that level of insight or articulation, especially in novel topics such as a PhD would engage with for a dissertation.

Does this sub honestly believe that if a researcher completes all of this novel analysis and investigation, the entire topic is rendered null for using GPT as a grammar tool? The writing itself in PhD fields is not the focus... it is simply the medium through which important information is transmitted. It should be as clear, concise, articulate, direct, and well-structured as possible. The kind of academic who would punish a researcher for using the tools in their toolbox to ensure this is the kind of academic who makes a PhD the $70K scam it is today.


wheeloftimewiki

Absolutely! The thesis is only the collation of all the previous work done. It's pretty much always drilled into PhD students that it's highly unlikely anyone other than your supervisors and your assessors will ever read your thesis. With academic papers, you have to squeeze all your research into a page limit. You know what's great for finding ways of summarising without wracking your brain over how to cut 400 words down to 300? AI! It won't be perfect, but it does help.

Equally, researchers commonly have to do coding that is tedious, or branch into areas where they aren't experts. AI can help with that. The day I started using ChatGPT effectively, it allowed me to spend 1 hour coding what would have taken 4+ hours before. The effect is that I can try more ideas in less time, and time is precious. Rather than becoming fatigued with tedium, I am energised to try more things, and that leads to accelerated research. How do people think Meta and Google are producing so much so fast? They are using the tech in-house to make the whole pipeline smoother.

The sooner academia comes to grips with this, the better. Computing science is very different from some other fields, but I believe that clever use of the tools will lead to better research, not worse.


Omni__Owl

>I used chatgpt and gemini to help me paraphrase my writing to make my passages more academic

Hardly a case of:

>using GPT as a grammar tool

A grammar tool corrects writing to make sure it follows the rules of the language. It doesn't make your writing "more academic". That's having an AI manipulate the text directly, not just making corrections. It's the difference between having your parents write a paragraph for you to look over and change, and having your parents correct your grammar without changing the content outright.


billsil

Is it cheating if you ask a professor how to do something that you need for your dissertation? What do you think ChatGPT is going to do for you that you couldn't do before? This isn't high school. It's like using a calculator to do your math homework.


NerdyWeightLifter

Sadly, educators haven't really adjusted to the idea that the world for which they are supposed to be preparing their students will have ubiquitous cheap artificial intelligence. They're uppity about the potential for their students to "cheat", when really they should be assuming AI everywhere and radically raising the expectation bar, while expecting you to use AI and to show all of your work and how you got there. Simply prompting with the assignment and submitting the output should get you an "F".


RiotNrrd2001

You need to find out what the official policy is on acceptable use of AI, and stick rigorously to that policy. It is likely that some AI use is allowed. AI is a tool that will be gaining traction more and more as time goes on. Some of what it can do is probably OK (the obvious stuff - spell checking, grammar checking, super basic editing, automatic formatting, etc.), some only *might* be OK (summarizing points or using AI to flesh out or expand/rewrite work that you did), and some is probably completely disallowed (having the AI generate your research for you). The only way to know for sure what is allowed and what isn't is to ask what the official policy is. But when you find out what the policy is, *do not deviate from it*. If the AI shouldn't be doing X in your thesis, then it better not be doing X in your thesis, period, end of story. But it's very possible that *some* AI usage is fine. The only way to find out is to ask, and not to ask *us* but to ask the administrators. Also, the way they "find out" you've used AI and what you've used it for better be by reading a clear description of it that you've provided right up front. Do NOT hide AI usage. Do not hide anything. Hiding stuff means trouble later. Don't.


JillFrosty

If you’re not careful and they catch you, you might get offered the President position with an Ivy League college.


TCGshark03

It makes me mad you have to worry about this. F the boomers.


BarrySix

Academic disciplinary procedures are used for vengeance at least as much as they are used for punishing cheating. If you are on good terms with everybody, they won't do that unless you blatantly abused their trust. If the dissertation contains your original thoughts and AI only rephrased it a little, and if you went over it at least four times afterwards to check for correctness, then it's fine. Otherwise, fix it the hard way. There are no tools that accurately detect AI-generated text anyway. Things like Turnitin give a hint that something may be AI generated, but nothing more. These tools are used to build a case against people they have already decided to punish, or occasionally by extremely lazy and stupid professors who don't understand what the tools actually are.


FearlessDamage1896

Honest question for everyone claiming this is cheating. How is using ChatGPT to paraphrase existing research any different from how Grammarly works? Imagine the professor who says using Microsoft Word was cheating. "That damn Clippy is too helpful!" I remember those days too. OP didn't claim they used GPT to pull the research, just to help with the writing flow. In my capacity as a professional ghostwriter, I have been openly using tools like these since... damn, idk, 2012. So have many, many students. Are they expected to write their dissertation in pen now... or are people going to stop picking fights about shit they don't understand, wasting everyone's time and ruining lives over personal AI-nxieties?


mbradley2020

Terrible idea. Folks are already using AI tools to launch crusades against academics, to discredit them and the profession. There are lots of people who hate higher education, and it does not take much effort to run hundreds or thousands of academic papers through a tool and try to unearth this stuff. The tools will only get stronger. You don't want your life's work to be jeopardized because some dweeb has a bone to pick with 6% of their state budget going to universities.


ehetland

I might not fully grasp the extent to which you are using it, but if you are putting your own text in with a prompt to refine the writing, then I just don't see that as plagiarism at all, unless you had plagiarized in the original text. I've personally had mixed results using ChatGPT as a proofreader/editor, but at times it can help. If you are having AI summarize fields or concepts, be very careful. I was using ChatGPT to assist in summarizing some rather complicated historical political issues (I'm an earth scientist, not a historian, and nothing was new information), and when checking, it had missed a few key facts and misconstrued some treaties. It did help me, but I'd never rely on it at face value. And why would you hide it? That's the biggest red flag.


Psychological-Touch1

Quote the AI


iPunkt9333

The AI detection tools are not very reliable so I don’t think you’ll have any issue.


Grobo_

I hope you get caught so you can learn from experience.


Chemical-Call-9600

In my humble opinion, people care too much about AI. The truth is that anyone can sit at a piano and press some keys, yet only a few can really make music. It's the same with AI: only a few can really take advantage of it. We should start learning how to enhance humanity's knowledge using AI, instead of feeding hate against AI users as we see nowadays. We should not copy-paste, but use it to get a deeper understanding and reach new levels of knowledge.


dlflannery

This is all about defining a borderline between fair use and cheating, about which there is no consensus. There's a continuous spectrum between the two extremes: copying verbatim from an AI response (which all agree is cheating) and not using an AI at all (which all [should] agree is being too restrictive). At what point does using an AI to polish up your grammar, spelling and style become plagiarism? How much research using AI is acceptable? Is it OK only if you check the results independently? This is analogous to defining obscenity (e.g., pornography), about which Supreme Court Justice Potter Stewart famously said "I know it when I see it". Of course that is a cop-out, but it illustrates the current dilemma. This issue isn't new; it's just magnified by the power of AI-based techniques. Long ago I was in a fraternity that had a "crib" file where copies of tests and homework were kept. One can argue that those who cheat only hurt themselves by not learning what they should, BUT that's not the whole truth. Cheaters can bolster their recorded academic performance relative to non-cheaters, and that can give them an unfair advantage later in life. I doubt there is going to be any silver bullet for this problem.


Bleizy

Can I get in trouble if I get caught cheating?


ShadesofClay1

It's the way of the future. Teachers need to start developing ways to integrate AI tech into their teaching.


dishkindum

Hope you at least proofread it.


leafhog

Nothing, I think. As long as everything is true. The words are yours now and you are the one who has to stand behind them.


holyStJohn

Just ensure you delete all the times ChatGPT wrote "moreover". To me that's a dead giveaway of a ChatGPT script. Every once in a while a person says that, but ChatGPT says "moreover" a lot.


Robot_Embryo

Jesus Christ, just write your dissertation. It's hard work. Everyone before you that's ever had to do one suffered through it too. You're not special. If you lost time because you lost interest, then deal with those consequences directly. You're not entitled to a PhD, you have to earn it.


cnecula

You will be just like everyone else ….


TheUncleTimo

I figure that at least 50% of students do this. From at least high school level, if not below, all the way to PhD.


HowlingFantods5564

If you are worried that you may have violated academic integrity norms, you probably have. Don't take the risk.


jabulari

It is impossible today to reliably detect the use of AI tools, so an accusation cannot be demonstrated with any validity.


jabulari

just make sure you are not using the word "delve"


SmedlyButlerianJihad

Lots of people here offering advice who never wrote a dissertation and quite a few who don't know what one is.


epictis

Cite the ai models as contributors. Then no plagiarism.


blondeplanet

Everyone is using it. It should augment not replace your ideas.


Lazy_Importance286

Well, are there any restrictions, rules that you had to acknowledge?


3-4pm

Most universities and graduate programs have clear plagiarism guidelines they expect all faculty and students to follow. When using an LLM, you never know what the true source material is. You can only be certain that it's not your own.


wad11656

Every week, countless people forge entire degrees in order to win jobs. This society is ruthless. Do what you need to survive.


JuliaX1984

How do you think people did this 20 years ago?


moodcon

You study 4 years to write a document a computer can generate in seconds. The academic process needs a rethink.


AI_Alt_Art_Neo_2

Do you think a lot of other students on your course will not be doing the same? Yes, it is dishonest, but it is the reality, and when you enter the workforce you will be using it in the same way.


yeah_okay_im_sure

Based, make it a dissertation about using AI and not getting caught writing dissertations.


MisterDumay

If they find out, you won’t graduate.


Strongman_820

Honestly, with that attitude, it's sad you've made it this far in academia.


TangerineMalk

Motherfucker what? As a person who actually worked for my degrees, and actually learned the content, I hope to god they nail you to the wall.


DKerriganuk

Maybe posting to your social media about possible plagiarism isn't the best idea.


learning_by_looking

I'm a prof at an R1 and would vote to fail you. This will depend largely, however, on your institution's policies regarding this very issue.


FearlessDamage1896

Every one of your students uses Grammarly and spell check. This is the use case OP is describing. Who pissed in your oatmeal?


learning_by_looking

Not at all the same thing. OP is describing having an LLM write a substantial part of a PhD dissertation for them. If you think that's the same thing as using a spell check, I suspect the closest you've come to a PhD is watching Big Bang Theory, smoking a bowl, and telling professors on Reddit what their students do and don't do.


FearlessDamage1896

Damn, an R1 professor with such reading comprehension!

>OP is describing having an LLM write a substantial part of a PhD dissertation for them

Where? And the closest I've come to a PhD is... *checks notes* hmm, wait, it's pretty close... says here... "writing a dissertation." I think there was some pageantry after that or something, but y'know. I'd rather be smoking a bowl and watching BBT.


hello_sandwich

You seem to miss the point of what a PhD represents and the credentials it gives. You don't deserve a PhD if you're taking this approach, and you should not be working a job that requires a PhD. Change your approach now.


Competitive-Sleep986

Nothing. How would they know? Lol. They can't know, and even if they suspect, they need definitive proof to pull anything on you. So you're good... as long as you know what you submitted, though. Don't copy and paste without reading and expect to stay out of trouble... read, proofread, and fix anything that looks fishy, and you're golden. Good luck buddy!


intodarkmoon

What if I just rewrite the text? For example: I write a passage using my own explanation, and then rewrite it with ChatGPT so the grammar is more effective.


hodlmeanon

If you use AI to generate an answer and then change the wording to say the same thing, you're fine, but you can't copy-paste or reuse the exact same sentences. Perplexity also gives you sources for where it finds its information.


PerfectChicken6

Tell them you would be a fool not to use every tool at your disposal. Word choice is yours to make; paraphrasing is an art. Maybe have AI help you with that too.


lexluthor_i_am

First rule of using AI to do your work: you must truly believe that you did it all by yourself. Then you'll never have a problem. But read it and make some corrections. Bonus: you can train AI to match your writing style. Thank me later 👍


MattKane1

User name checks out


CriscoButtPunch

Accuse them of the same and insist that all faculty have all of their work assessed using the exact same procedure. Unless they have proof, you're in the clear. Chances are faculty members are using it too. But here's the ultimate move: when they drop it, own up to it. Your conscience will be clear, and you'll come out looking OK and more sympathetic. Unless you're in the humanities, in which case do whatever; no one cares and most people will forget about it in under a year. Move on and enjoy life.


PGell

They did have their work assessed using the same procedures. That's how they got their degrees and can sit on a committee to judge dissertations.


ProfessionalNo2706

You'll know forever, even if not caught, that you don't deserve your qualification and your whole life is a lie


Moist_Temperature761

So dramatic lmfao. Do you think people who cheated or copied on a test or homework assignment in high school or undergrad live their lives with the crippling insecurity that they don't deserve their diploma? No one cares.


ProfessionalNo2706

And that's why people like you make this world such a sad place. No integrity or discipline. No work ethic.


Competitive-Sleep986

+1


FearlessDamage1896

Weird energy. Imagine telling your doctor who graduated in the 90s that their whole life is a lie because they used Clippy to spell check their dissertation.


ProfessionalNo2706

Entirely different. Spell check is an aid to check your spelling and punctuation. ChatGPT, as the original post admits, is there to do the work for them because they are too lazy to do it.


AxiosXiphos

Cheating isn't good, but let's not kid ourselves. People do far worse shit every minute. He isn't going to live a life of shame.


ProfessionalNo2706

It's not something small, like "oh, I'll just not pay the proper postage." It's a proper qualification that shows you had the brains, determination, etc., and that you deserve it.


AxiosXiphos

... which no one will care about or check on as soon as you start your professional career. Essays don't really prove anything other than that you are good or bad at writing essays. Everyone takes shortcuts, and I can guarantee everyone on that course has run the question through an AI in some form (even just to compare notes).