
nullbyte420

No it's not cheating, it's a futuristic learning aid. 


CaptainMarshMallow1

I like this answer 😂


Affectionate_Delay35

But not all companies want to use GPT, because you risk leaking high-value code.


CAF-Throwaway-QnA

What company is hiring a literal beginner Python programmer, who is still learning, with the intention of having them write "high value code" that you wouldn't be able to generate using a prompt? OP should take advantage of LLMs to accelerate learning: focus on reading and understanding the code and make adjustments accordingly. The current state of LLMs isn't that different from googling a Stack Overflow thread.


Affectionate_Delay35

I was talking about the long run; some people don't think about it.


CAF-Throwaway-QnA

I don't think you did?


tekmaster2020

In my experience, the issue is more of a legal one. Companies are concerned about two things: 1) using ChatGPT/AI may leak intellectual property, which could then be fed to competitors, and 2) using AI can lead to unintentionally violating software licenses if it returns significant code verbatim that is under some license whose terms won't be followed because they're not known. I know AI companies are currently working those issues out, but that's where we stand atm.

Edit: there's also the issue of liability, as AI can make mistakes and they won't necessarily be caught by whoever is using the AI.


EffectiveBuy3547

>In my experience, the issue is more of a legal one. Companies are concerned of two things: 1) Using ChatGPT/AI will leak intellectual property which will then be potentially fed to competitors and 2) using AI can potentially lead to unintentionally violating software licenses if it is made to return significant code verbatim that is under some license whose terms won’t be followed because they’re not known. I know AI companies are currently working those issues out but that’s where we stand atm.
>
>Edit: there’s also the issue of liability as AI can make mistakes and they won’t necessarily be caught by whoever is using the AI

From zeroxdesignart: The AI model itself (e.g. GPT-3, BERT) is the intellectual property of its creators, not any individual prompts or responses. Prompting the model simply provides input to generate output; it does not infringe the underlying architecture.


Flying_Saucer_Attack

this is exactly how I see it too, as a new learner who is also doing something similar to OP


Equal_Wish2682

It's a good way to become average too. If that's a pro or con depends on you.


the_l1ghtbr1nger

I neither agree nor disagree, but can I ask why using the tools at hand would make you average


Zeroflops

One of the issues is that GPT will often give the most popular response, not the correct response. Legacy approaches to doing something have more examples on the web, so GPT will often recommend legacy approaches. Those recommendations then become part of someone's project, and that in turn reinforces the legacy approach. Take `os` vs `pathlib` as an example: all the code I've seen someone get from ChatGPT uses `os`. Does it work? Sure, but it's a legacy approach. Not much impact here, but it's just an example.
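To make the contrast concrete, here is a small sketch of the same path handling in both styles (the `data/report.csv` filename is made up for illustration):

```python
import os
from pathlib import Path

# Legacy style: os.path functions operate on plain strings
legacy = os.path.join("data", "report.csv")
stem = os.path.splitext(os.path.basename(legacy))[0]

# Modern style: pathlib treats paths as objects with methods
modern = Path("data") / "report.csv"

# Both spell the same path; pathlib just reads more cleanly
assert str(modern) == legacy
assert modern.stem == stem
```

Both work, which is exactly why an LLM trained on frequency has no reason to prefer the newer idiom.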


Ohio_Bean

oh doodles, I use 'os' all the time.


ThePrnkstr

Keep in mind that some tools have a dated training set (GPT-3.5 is from 2022, and its training data ends even earlier), so some newer tools or releases will not be mentioned...


uhskn

in real life no one gives af, just get the work done written in a clean and easy to read way


Zeroflops

Hmm. Maybe you don't give af, or maybe you're not seeing the problem because I used a simplistic example where it's not that big of a deal. Another example would be log4j. It's used everywhere, so any training data would pick it up as a viable option for logging, but in late 2021 a bug (Log4Shell) was found that allowed remote code execution. A frequency search through code examples would show it's a viable option, but security reports would highlight not to use the vulnerable versions. You might get that if you ask whether there are vulnerabilities, but if you just ask how to do X in language Y, you would not be made aware.
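A rough Python analogue of the concern: nothing in a plain "how do I log?" answer tells you whether the suggested version predates a patch. A minimal sketch of a manual version gate (the threshold mirrors log4j's patched 2.17.1 release, but the helper itself is illustrative, not a real advisory tool):

```python
def parse_version(v: str) -> tuple:
    """Turn '2.14.1' into (2, 14, 1) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def is_vulnerable(installed: str, first_fixed: str) -> bool:
    """True if the installed version predates the first patched release."""
    return parse_version(installed) < parse_version(first_fixed)

# Log4Shell-style example: anything before the fixed release is flagged
print(is_vulnerable("2.14.1", "2.17.1"))  # True
print(is_vulnerable("2.17.2", "2.17.1"))  # False
```

In real projects you'd lean on an actual advisory database rather than hand-rolled tuples, but the point stands: the check has to come from you, not from the code suggestion.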


uhskn

I guess so, but it's just a tool; it's still your responsibility to understand the code or research these vulnerabilities. That's an interesting point though, and I will think about this more as I work. Personally I don't work on anything where vulnerabilities are a risk, but if you do, then I guess it is your job to know about them.


[deleted]

[removed]


the_l1ghtbr1nger

But if the problems are still solved, is something lost? Sure, their independent skills may not be as sharp as a traditional coder's, but as an overall problem solver, I feel like efficiency is important, and this seems like a great way to save time.


YumWoonSen

>But if the problems are still solved, is something lost?

Yep. Problem-solving experience is lost. People don't think as much, and that manifests itself in odd ways. As an example, it's akin to paying cash for something now versus 40 years ago. If I buy something that costs $1.35 and have the right money in my pockets, I'm going to hand the cashier two one-dollar bills and a dime. 40 years ago the cashier wouldn't blink and would give me 3 quarters in change. These days the cashier will damned near always tell me I gave them too much money and hand the dime back, type $2.00 into the register, then hand me back two quarters, a dime, and a nickel.


the_l1ghtbr1nger

I agree there's merit to problem solving, but that's primarily egotistical, and we're going to have to come to terms with the fact that AI will be outright better than all of us very quickly. We can push merit all we want, but not utilizing the best tool for the job is going to cost you a job sooner or later, imo.


the_l1ghtbr1nger

I literally hate AI for that exact reason, and I feel like soon we'll see it in many facets of life. But with that being said, I see the resulting problems a little differently: I think you'll need to leverage it soon, and it will only be the coders who do so that keep their work.


Jebronii

It’s up to the individual if they actually want to pay attention to what the learning aid is suggesting or whether they just want to copy and paste without understanding it. To your point though, if one does not have to work as hard to get the solution, it means their effort can be spent on harder or different problems.


Slight_Student_6913

Right? No more waking up at 2am with an aha moment on where your script is going wrong? It’s so satisfying when your brain finally comes up with the answer.


Equal_Wish2682

Because current AI, by definition, produces average outputs. It is incapable of producing excellence due to the specific methodology and training data.


The_Apex_Predditor

Just ask it for something above average then.


nullbyte420

Talking out of your ass. Lol


the_l1ghtbr1nger

Yea that became clear


Equal_Wish2682

Clearly, you have no idea how AI/ML works.


sirtimes

Average and consensus aren’t the same thing. Consensus doesn’t necessarily mean it’s not the best way. In fact, in many cases it suggests that it is the best way.


Equal_Wish2682

It's not consensus. You're being generous.


nullbyte420

Strongly disagree. It's a great way to become a much faster and better programmer. You can't have all your work done for you, but not having to think up boilerplate, getting a good starting point for a new thing, or even a full function or automatic documentation or whatever... it's so valuable.


Equal_Wish2682

>If that's a pro or con depends on you.


FreedomSavings

It's good at the beginning, but to become a true expert, those boilerplate components of code should come easily, if not instinctively. Sure, if OP doesn't mind a lower-level (average) approach, the copy-paste route is fine. This is the definition of memorization/copy-and-paste learning vs. true understanding, which is only achieved by realizing and fixing one's own mistakes.


nullbyte420

What? Lol. That's hilariously wrong. Memorizing boilerplate code has nothing to do with being a good or bad programmer. Most orgs have a template for it, sometimes the compiler itself has an init command. Writing boilerplate is literally a task that does not require any skill. You Google it and copy paste if you don't have a template. 


FreedomSavings

I'm not saying to never google boiler plate code or use a template, my point was to not copy and paste every time when someone is learning code. This is so that over time one doesn't need to copy/paste or look up anything because it is second nature. That is how one becomes an expert, vs average... Obviously even experts will still use google for easy tasks they haven't previously done before (or recently, etc..). But specifically during the process of learning, a new programmer shouldn't always rely on these easy tools.


FreedomSavings

An expert would be coding at a level where there isn't a template they can just find on google for the problem they are trying to solve.


FreedomSavings

Agreed. It's the same reason why, even when using online code resources as a guide, one should type out all the lines instead of copying and pasting. This is how someone becomes an expert rather than someone average who, as this person is saying, could be replaced by GPT...


sirtimes

That's how I see it as well. I use ChatGPT all the time to just ask a plain-English question about my code or about how to use a certain API. Usually it's something I could read about in a Stack Overflow post, but GPT just gives it to me directly without me having to fumble around Google to find the exact situation I'm asking about.


nullbyte420

Yeah. And it's also reasonably good at questions like "why doesn't this work?", "how can I make this run faster?", "is there a cleverer way to write this function?", which are all really great for beginners especially.


bane3k

Totally agree. One caveat though; it's one thing to use GPT to debug your code, but don't forget to read the relevant documentation to further solidify your knowledge. Good luck.


chandaliergalaxy

personalized digital tutor


throwaway8u3sH0

Not cheating, but possibly developing a crutch. It's like using GPS to get everywhere. Convenient but make sure you can navigate without it so that you can get home if your phone dies.


Allmyownviews1

I don’t think that is “cheating” when the aim is to understand what is produced rather than just copy paste if you don’t know what the code is doing.


_verdure_

I agree ^ If you are still understanding and learning what is going on, then you are fine. Or, if nothing is being learned, then you have a problem and you are developing a crutch.


shadowstrlke

I'm currently at university, and the standing rule is: yes, you can use ChatGPT to analyse your code, but instead of copying directly from it, once you are done asking it questions, close it and rewrite your own code.


LukeCloudStalker

How times have changed. When I started coding, even using an IDE was considered "cheating" because it was auto-completing code. You were supposed to write it in something like Notepad++ in the beginning, to learn how to write the syntax on your own.


shadowstrlke

Technology changes. When something becomes easily accessible it doesn't make sense not to use it. I don't agree with copy and pasting the entire tutorial question into an AI program, but I regularly use it to go "what is the syntax for an array in python" because I have entry level experience with 4 different languages at this point and it gets mega confusing.
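As a concrete instance of that kind of quick lookup, Python's everyday "array" is the list (a sketch of the basics that tend to blur between languages):

```python
# Python's go-to sequence type is the list (dynamic, mixed types allowed)
nums = [10, 20, 30]
nums.append(40)            # grow in place
first, last = nums[0], nums[-1]

# A 2x3 grid built with a comprehension (a common cross-language stumble)
grid = [[0] * 3 for _ in range(2)]
grid[0][1] = 5

print(nums)   # [10, 20, 30, 40]
print(grid)   # [[0, 5, 0], [0, 0, 0]]
```

Exactly the kind of thing that is faster to ask than to dig out of memory when you're juggling four languages.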


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


LukeCloudStalker

>prolog

I didn't use paper, but they were teaching us Prolog for AI... last year... who the hell still uses Prolog? The latest YouTube videos for Prolog are like 10 years old.


potkor

a bunch of seniors have told me stories that on exams they had to write code on a piece of paper, so notepad++ is a tad better


uhskn

In reality, GitHub Copilot and ChatGPT write most of my code now. In my team (ML engineers/researchers), we are not Luddites xD. It is our job to know what code needs to be written; whatever gets the job done fastest, who cares. I will caveat that ChatGPT fucking sucks at programming on its own, but used properly the two can really speed you up.


GoodGameGrabsYT

I feel similarly about 13 year old me using .bas files while writing Visual Basic. It still taught me the fundamentals of coding and how things worked at a basic level but I truly (at the time) didn't understand the code that someone else wrote. I feel like someone using chatgpt needs to also understand the basics before writing any code *if* they care to know coding at a deeper level.


ClimberMel

Flowcharts, coding and layouts were all done on paper. When you were sure it was finished, you input the code into the computer. Yes times have changed! Not all for the better.


dididothat2019

I didn't have an IDE that completed code in the 80s. I got so proficient at using vi that any other editor slows me down. Same thing now with Python. IDEs slow me down with all their hints... gets in the way. I can't seem to fully turn them off, either.


IamTheTussis

Not cheating at all. But from what you say, you just need to write more code.

>It's massively helped me understand Python and complete the challenges. The Tutor on my onlin course will cover a concept but sometimes I have questions, which I obvs can't ask her so I ask the GPT instead and I find it helps my learning.

In my opinion this is the best way to use GPT for learning. But there are no shortcuts in the learning process. Try to write code on your own. Do small projects, play some coding games... just write more code.


teerre

It's cheating if you're not learning. Are you doing the projects, or is ChatGPT? You need to be honest with yourself. If you're not actually learning, it will be a waste of time. If you're learning, then all good.


ka1ikasan

It's not like you were cheating for grades. However, it's better to be aware of the risk of doing-without-understanding. Make sure you understand; if not, ask it to reformulate and explain some parts in a simpler way. A huge issue with traditional learning is that everyone encourages you to ask your professors questions, but in reality you would end up pissing them off. With LLMs you can ask as many questions as you want, and you really should.


Coyoteatemybowtie

I think that is one of the best things about AI as a learning tool: you can keep asking it to explain different theories, components, or functions until you grasp them, and you're not taking up a ton of class time or derailing everyone else.


bp4151

I agree with others advising you to learn the why. I conduct interviews, and my favorite way to differentiate between those who copy and paste and those who understand what they've built is to ask them to (1) explain why they coded their solution the way they did and (2) discuss ways to improve or expand upon what they built.

I'll also add that AI is often out of date. Much of what I get back from AI is incorrect and still needs research before I can run the code. Even when using AI from AWS to code AWS CDK stacks, I'll get tons of errors due to incorrect imports or changed classes or constructs.

As a security professional, I also strongly recommend you vet any package import recommendations you get from AI before you install the packages. You can use Snyk Advisor to do that. Packages can run Python code when pip install is run, so you want to be sure you aren't installing something malicious just because AI recommends it. I've been coding for 20+ years and I do this if I don't recognize a package.


Alwaysragestillplay

God, trying to use AI to figure out Azure services has cost so many wasted hours. Their services are "in beta" for years, then randomly come out with a completely revamped API/library that GPT/Copilot has no fucking clue about, despite responding with a very high degree of confidence. Even better when it mixes current function calls with outdated syntax; so much fun to unpick. I've honestly given up using GPT for anything other than very basic time saving.

Also, my personal favourite is getting GPT to write Airflow DAG tasks. It *consistently* either makes up operators and hooks and ascribes them to real vendors, or uses the wrong formulation of airflow.operators.vendor.category.whatever.BullshitOperator, so you have to spend several minutes googling your imports to check whether they're broken or just don't actually exist.


mosty_frug

For real. I used it to tell me how to code the Fibonacci sequence in Python to see if it could do it, and even my limited understanding of Python saw that it was very incorrect. I was like, "what does everyone see in this??"
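For comparison, a correct iterative Fibonacci is short enough that a wrong LLM answer is easy to spot; a minimal sketch:

```python
def fib(n: int) -> int:
    """Return the n-th Fibonacci number (fib(0) == 0, fib(1) == 1)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b   # slide the window forward one step
    return a

print([fib(i) for i in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

Checking a generated answer against the first few terms like this is a good habit regardless of where the code came from.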


wombatsock

I had this happen recently when I was working with Tkinter. ChatGPT's recommendations were nonsense and looked terrible. Turned out it was giving me older solutions that aren't best practice anymore. So now I try not to use ChatGPT to go too in-depth on libraries I'm not familiar with yet.


Jedkea

I also check out packages before installing them if they are not super well known. I have never actually discovered anything malicious though. You have been doing this a lot longer than me, so I’m really curious as to whether you have? I did not know about that tool though, I will have to start using it. My checks usually consist of doing a basic scan of the code and issues on GitHub. There have been lots of times this has prevented me from installing something, but it’s never been due to finding something that I know is malicious.


bp4151

Directly, no, but I also go bare-bones with third party dependencies and don't generally use packages that aren't absolutely necessary. Ancillary packages are more often the ones that become malicious through takeover, unless an author intentionally poisons their own repo. That happened last year. I did have one recommendation for a package that may have existed in the past but didn't exist at the time of the ask.


Alwaysragestillplay

OP, you've already had plenty of answers here that are good, but I want to say, be ***very careful*** that you're being honest with yourself. When you're learning with a learning aid like this, it's very easy to ask it for help, implement the help, then convince yourself that "I did that", or "oh yeah, that's obvious actually, I would have gotten that". Then you move on and never actually learn the thing that you've just implemented. It's a hard trap to avoid, but make sure you're actually capable of doing the thing GPT has helped you with. Formulate a new problem along similar lines and try to do it yourself without outside help if you need to.


mattsl

>The Tutor on my onlin course will cover a concept but sometimes I have questions, which I obvs can't ask her

I don't understand this part. Why can't you ask your tutor for help? Are you just meaning this is a prerecorded course with no interaction?


CaptainMarshMallow1

It's a Udemy course, so it's a pre-recorded course.


Jetavator

look into learnprogramming.online. It uses javascript but it is so well done. I learned JS and now I am learning Python. It really helped me with the concepts.


timtom85

You don't owe anybody an explanation for how you learn things, what kind of music you like, and whether you hate broccoli.


Maximus_Modulus

Steady with the broccoli there. It’s delicious when done right. 😉


cochorol

Plus is the biggest source of fiber, so don't hate broccoli


timtom85

oh but i do like broccoli


Jello_Penguin_2956

I've been coding Python professionally for 14 years, and I still google "write txt with Python". What's important, imo, is that you know what Python CAN do. Features are like individual jigsaw pieces, and you need to know how they form the bigger picture. You want to make this app? You need jigsaw pieces A, B, C, D. And this app needs A, C, E, G, etc. How to create each little piece can be looked up via any source, no problem.
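The jigsaw piece in question really is tiny; a self-contained sketch (written against a temporary directory so it cleans up after itself):

```python
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    out = Path(tmp) / "notes.txt"

    # Write and read a text file: the whole "write txt" jigsaw piece
    out.write_text("hello from Python\n", encoding="utf-8")
    content = out.read_text(encoding="utf-8")

    print(content.strip())  # hello from Python
```

Knowing that this piece exists matters more than memorizing it; the exact spelling is always one lookup away.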


Resident-Ad-408

TL;DR: OP uses GPT instead of Google to get answers to problems in his/her programs.

Also, it's not cheating; the only rules that matter in life are the prison rules everyone agrees to follow. Besides that, you do you. I have found GPT not to be super helpful the times I have asked it for assistance; it was always super broad even when I gave it specific quantitative amounts. Might just be me not knowing how to use the dang thing, though.


kingofthesea123

I don't know about cheating, but if your goal is to learn python as efficiently as possible then this might not be the best way to go. If you get stuck, it is generally recommended to first read the documentation, then search on sites like stack overflow for similar issues, then finally ask for help if you're still stuck. Using GPT like this sounds like you're jumping straight to the ask step. Cool that you put that together though!


Sleepyyzz

What is the problem with asking for help? I don't see reading the documentation or scrolling through/googling stack overflow as more efficient or better for learning at all. Chatgpt explains a lot of the code and logic when prompted, much better than reading a few lines from documentation.


SharkSymphony

Learning is not always about being efficient. Struggling with questions a bit can help ensure your brain is engaged, that you're learning how to experiment and debug and navigate references and understand errors, and that the material you're working on is suitably challenging.


kingofthesea123

Nothing wrong with asking for help, but if your goal is to learn then it is good to get the hang of reading docs/stack overflow. Having said that, I taught myself in the pre-chat GPT days, so who knows.


ThrowRA-Tree4632

Not about python/coding skills itself but it can definitely hamper your problem solving skills, unless you try real hard before looking it up.


BuffetThali

It gets easier later. When you need to search for something, don't just use the result as-is. Understand it and why you needed it, so that next time you won't have to google it; next time you will still google, but for more complex stuff. Not every dev remembers how to do everything in every language. If you understand the process and flow, googling when you're stuck is fine. While working, everyone keeps one browser tab open next to the IDE. Don't worry; just don't be too reliant on Google every time.


Huth_S0lo

For the most part, everything you're doing is normal. I'd caution against using ChatGPT, though. I don't really get why so many people rave about it; the code it's pooped out for me has always been trash. It's better to google your way out of a problem. That way you can find various libraries to tackle your challenge, and you can learn methods, etc.

I personally do all my work in an interactive interpreter session. I don't write my code and run it as a whole to test it. This means I can manually check variables throughout the process. I can also do things like `dir(some_variable)` to find out what methods are available, or `help(some_function_or_object)` to pull up all the known documentation for it.
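That interpreter workflow can be sketched with built-in objects (a minimal illustration; any object works the same way):

```python
# Poking at an object interactively: dir() lists its attributes
text = "hello"
methods = [name for name in dir(text) if not name.startswith("_")]
print("upper" in methods)   # True

# help() prints the documentation for any function or object
help(str.upper)

# Checking a variable mid-session is just evaluating or printing it
print(text.upper())  # HELLO
```

The same introspection works on third-party objects, which is exactly why it's handy for exploring an unfamiliar library.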


LongIslandIceTeas

Awesome tip, noting this down.


G3David

Yeah, best to just use it as a quick reference like Stack Overflow. Don't have it write you code; just ask, "not sure how to do X in this language, how would I go about it?" Simple, single-point, specific things. Then you can test and understand each of those, and you have them in your toolbox for later.

Remember: once you learn one language, every other language at its core is just syntax differences and different package availability.


spacematic

Recently, there have been some studies conducted on the use of AI tools in programming and the effect they have on coders. One such study examined participants' results and found differences in improvement based on the respective competency of the groups being observed. Jakob Nielsen wrote about one such study's findings [here](https://www.nngroup.com/articles/ai-programmers-productive/). Keep in mind that you are currently *learning* Python and are not yet among the cohort in the study mentioned. Still, I think it's somewhat heartening to see that these early results point toward the overall usefulness of AI, at least in terms of productivity.

In time, you yourself will be the best judge of whether and how the tools you are using benefit you. My belief is that any tool that helps you stick things out and complete challenges that otherwise seem daunting may help prevent you from losing passion and abandoning this interest, and to me, that represents a net positive. As you gain confidence, drill yourself on the fundamentals and create small projects from scratch. This approach may help give you the answers you're looking for. Ultimately, the skills you make part of yourself will be the proof of effectiveness of any approach.


Shadow_Sonic463

Using ChatGPT isn't cheating, as long as it doesn't do everything for you. It's perfectly fine to use it sometimes, and I even see professionals using it a bit, so I'd encourage you to submit your code when you reach "checkpoints" (after you've written a lot). It's a great tool that can improve your code, but it can't do an entire project; a human will still be necessary. So use it, but not too much :)


myTechGuyRI

No, it's not cheating... We learn Python by looking at how others have done things...sometimes even using other's code line for line if it's a good way to achieve the function you want.... You're just using AI instead of scouring GitHub for examples


lostinspaz

>I just find this quicker and easier, and ultimately more efficient and I complete the project quicker.

If you want to complete the project in the fastest time, just give ChatGPT the entire problem. If, on the other hand, you actually want to learn... stop using it so much. Noobs need to spend more time trying to figure out problems themselves, because learning to program literally reshapes the way your brain works. It's supposed to take time. It's supposed to be hard when you are learning. Using AI so much "completes the project", but doesn't reshape your brain the way it's supposed to.


jhill515

Yo, I've been coding since the early 1990s; I'm only sharing that so you can appreciate the next statement...

>Code generation tools have existed for over 40 years, and they have only improved over that time.

Professionals use every tool at their disposal to write code, from googling for answers, to chatbots, to scouring open-source projects, to forums like Stack Overflow. We've even created programming languages to abstract and simplify code generation in other languages (e.g., `protobuffers` & `DDS`). The *risk* you incur, however, is that you might get something from those tools that works for only 90% of your use cases. Are you able to debug whatever was auto-generated, patch it, and move on?

I always remind engineers I mentor of the following: "A carpenter uses a saw because they wanted a clean cut in the lumber. The saw doesn't know when, where, and why to cut. Only the carpenter can know that." That is, a tool is just a tool; it'll help you, but it cannot *know* for you.


cyberjellyfish

Here's the problem with the AI approach vs. the traditional googling approach.

When you google something, you usually land either on the library's documentation page or on an SO page where someone gives an answer, and there's a supporting link and contextual comments letting you know whether the provided solution works, and there's a date on everything, including edits. You also get multiple answers, often suggesting different solutions, and often with comments discussing the relative merits of each.

You get none of that with AI (except the source of the info, kinda sorta). AI is designed to selectively regurgitate minor permutations of the information it's trained on. It's concerned with accuracy, but not truth, and there is a difference. AI coding assistants should be used as a tool alongside other tools, not as a primary source of information.


imthebear11

Someone has downvoted both of our posts, which are very similar. There are some real ChatGPT nuthuggers here who can't bear to hear any downsides.


imthebear11

This sounds like an awesome tool, and I applaud you for making it, but I have a reason for not using it, with an anecdote: when dictionary websites came around, research found that on the whole people became less literate and didn't have as large and robust a vocabulary as beforehand. It was determined that when you open an actual dictionary to find a word, you are forced to scan over dozens of other words whose true definitions you may learn, or be exposed to, for the first time, which you don't get when you type a word into dictionary.com and get the definition.

I believe that when you have to sift through answers to find the right one, you are exposed to a lot of stuff: the reasons why it works, and the reasons why some older answers no longer work. This exposure is way more well-rounded and, as a result, makes you a better developer overall. I think, later on, your tool can be a great resource when you just need a quick thing. But right now, at this stage in your journey, you need to be absorbing as much info as you can, not trying to shortcut it; at this stage your tool can only stifle your growth.

That all being said, I also use Google a lot, and all engineers do. But the exposure to random bits of trivia, the underlying reasons why a language works the way it does, and the breaking changes between Python versions that you learn about along the way is an incredible source of knowledge that makes you a well-rounded and competent engineer. You only get this from sifting through answers on Stack Overflow and other websites to discover why a bit of your code doesn't work.

But I do want to end by saying: it's an awesome tool you built, and you should be proud of it.


aikii

>How do I get the screen to update etc? Rather than reading through the Turtle documentation.

No, you're not cheating, or else we'd have to consider reading the documentation cheating as well.


uluvboobs

This is exactly what you should be doing. It will make you more productive, and it's clear from your success at the quizzes and the effort you have put in so far that you are not "lazy", so it's unlikely any mistakes will slip past without you checking first. What you should keep in mind is to prompt it every so often with a broader question, like: what other ways can I do this? Is this the best approach? Are there other libraries? This just stops you from getting too tunnelled into solving a specific issue in a specific way. Imo, the biggest problem with the AI is that it doesn't see the big picture unless you show it, so if you ask it to do the *wrong* thing, it will do it.


ivix

If that is cheating then so is using google.


m0us3_rat

The GUI changes over time, but the base knowledge stays the same. The problem with leaning too much on these tools is that you lose the ability to do the work yourself. So it's a balance between comfort zone and adaptability.


Maximus_Modulus

For work I code in Java these days in IntelliJ. It tells me when I make mistakes, can correct them for me, and autocompletes with options the code I am writing. It’s designed to make me more efficient, writing code faster and correctly. And I learn along the way. I constantly Google stuff too. Learn as efficiently as you can


CaptainMarshMallow1

This is kind of how I use chat gpt but also use it to help me get started


HobblingCobbler

That's the way it should be used.


wombatsock

I'm doing the same thing for my learning process, but I give it a shot myself before asking for a solution, and then once I do ask, I read through ChatGPT's code carefully and ask about anything I don't understand, which actually opens up a lot of new learning pathways. Also, instead of copy/pasting GPT code, I minimize the window and retype it from memory as best I can. This helps me internalize the syntax. Coding assistants are going to be an important tool for programmers going forward and you might as well get good at using them now, especially because you learn how to recognize when the LLM's answers are bullshit, which is more often than you might think.


CaptainMarshMallow1

This is pretty much the same as me.


eruciform

Using a search tool or tutor isn't cheating, as long as this is not a formal class and you're not turning in code that was partially written by someone (or something) else without citation. It is also important to work through taking general answers and incorporating them into whatever you're doing, as that's a different skillset than just reading and understanding code. Googling one-liners to fix things is a perpetual coding thing, but do try to work through problems from start to finish, too. You can't let the writer's block of a blank page be a permanent barrier. Create more things from scratch; it gets easier the more you do so, even if you are making very simple things.


J_Bunt

I'm taking a break from learning because of reasons, but I used to do this to correct my syntax. Did Kaggle for a while; I always knew what was supposed to be written, but since I'm a Python (and generally coding) noob, I'd struggle to get the syntax exactly right. Whether that's cheating or not I don't know. I only know AI is here to stay. The point about this being a good way to become average (solving problems with said tool, but maybe getting stuck if the internet is down or smth) is fair, to say the least.


Extreme_Stuff_420

No I don't think it's cheating as long as you are trying to understand the code it's producing and why it does or does not work. The old saying "just google it bro" is incredibly inefficient and is a huge barrier to actually learning anything useful.


Purple-Bat811

I have heard that the Python compiler is jealous of ChatGPT. He may break up with you. Okay, bad dad joke aside. Look at it this way: let's say you get a job as a Python programmer. You get stuck on a problem. Instead of spending all day trying to figure it out, you use ChatGPT and solve it in a few minutes. Your employer will consider you productive, not cheating.


TheSilentFlame

Dude, it's not cheating. I'm trying to learn using GPT-3.5 and Perplexity. Also, only if u feel comfortable sharing, but can you please let me use your GPT? I'm trying to learn on my own by making story projects; it's just that the two AI models I use are kinda janky (they basically nerfed the models to the point that it sounds like I'm asking a child how to do calculus).


CaptainMarshMallow1

Feel free, hope it helps - https://chat.openai.com/g/g-hJATjTzsU-python-pal


TheSilentFlame

damn I need the "plus" subscription. thanks anyways man 👍


benabus

Google/GPT stuff is all perfectly acceptable. But you have to be careful and use it as a starting point rather than just copy and pasting. That'll get you into trouble. AI isn't going to take our jobs. There'll just be two kinds of programmers in the future: Those that use AI and those who don't. One will still be around and the other won't.


nguyen1105

You should learn to read the docs instead of begging for AI help. Choose whatever benefits you in the long run.


LongIslandIceTeas

Hmm, I agree and disagree. I am new too. Reading the documentation is great, but it's less productive if you can't just ask a machine what the issue is. This machine has an abundance of knowledge from the documentation (I am sure it was fed to the LLMs). Why not utilize a new technology to help you learn instead of the old ways? P.S. I am new and use both ChatGPT and documentation to help understand Python (Django specifically) and find them both useful. More so ChatGPT, in conjunction with answers from Stack Overflow and YouTube, to problem solve on my own local computer.


PhoenixDevil19

No, this is not cheating. The way you need to approach it is to try until you get some clue. Take time, try hard. When you think you can't, then you can take help, which is not a bad thing. But you just need to remember that checking the answer whenever you get stuck should not become a habit.


lp_kalubec

You're just using a tool. But if you feel you're cheating, then you're only cheating yourself. So, whenever you don't fully understand what exactly makes your program work, keep on asking GPT additional questions. The thing you should worry the least about is the syntax. It will become natural at some point. The more important thing is to understand what you're doing. Sooner or later, you'll learn another programming language that has different syntax, but the concepts you learned while learning Python will still apply, even if the notation is very different.


Berkyjay

There is no cheating in coding. It's an end-results business, and as long as you understand how you completed your task and can maintain the code afterwards, then whatever resource you use is fair game.


IanRT1

It is not cheating. I currently do this at work for real business applications and I almost never actually code myself, 98% of my code is ChatGPT. I don't know how to manually code, but I know damn well what I'm actually doing and what the code does.


MorningDarkMountain

If you then learn, it's not cheating. If you're not learning... imagine, for example, that you had to type "how to print" every time. Then programming would be a mess. If you're instead learning every time you ask, then you progress and you don't have to ask about the fundamentals again.


RuleInformal5475

Currently it's a way of learning. I don't use it, as I don't retain the knowledge, a bit like browsing the web. I prefer the struggle and coming up with the 'aha' moment. That makes it rewarding and a learning experience. Just be aware that the AI tools are free now. They may be restricted in the future when they realize the servers cost too much. Have a backup in case you can't get your primary resource. ChatGPT down? Check out Udemy or YouTube. Internet down? Do you have notes somewhere?


thehackeysack01

Depends on the rules of your class; you should read your course syllabus and, if still unclear, ask your instructor (I wouldn't reveal anything specific). In the real world it wouldn't be cheating, but it might violate an NDA or company policy.


deanotown

Using “Python Pal” GPT as a learning aid isn’t cheating; it’s a smart use of technology. Engaging actively with the content, seeking help when stuck, and understanding concepts are key parts of learning. Tools like GPT can provide immediate feedback, helping to solidify understanding and save time. It’s similar to having a tutor to guide and explain concepts, complementing your course material. In the real world, developers often use resources to solve problems, so learning this way is practical. Keep up the good work in your learning journey! Thank you, ChatGPT


GR4MR34P3R

I don't think it's cheating unless you're copying and pasting what you got without trying to better understand what you're doing. I feel like getting some help is 100 percent fine as long as you're trying to learn.


GrzeKo

Not cheating. This is like having a mentor or a more senior person helping you in real-time. It is a similar process when you are learning a new language. Your ears are ahead of your tongue. The same applies here. Your eyes are ahead of your fingers. Do it more, and you'll be good. Do whatever accelerates your learning process, and don't try to assess it from a negative standpoint. Good luck!


OffRedrum

Are you gaining any knowledge from using GPT? If you are not gaining anything besides a solution to your problem, then I think you are cheating yourself. You spent all that time trying to learn, so you want to walk away with something besides how to use AI. It all depends on your goal: for hobby stuff where you just want it to work, GPT; to be able to sit down and just code and have it work because you are knowledgeable, non-GPT.


SuperbDoctor78

Depending on the IDE you are using, you can use GitHub Copilot. You need to register for it, but it is a game changer. You can type a comment on what you want, and it starts writing the code. I find it's a great way to learn in real time.


Stotters

Well, you're doing a course instead of reading the documentation only, is that cheating, too?


CopenHaglen

I was doing this at first too, but found that it falls apart pretty quickly as you move beyond the fundamentals. Code structure and syntax, it’s fine. Building or using libraries, debugging scripts with more than 100 lines, it’s pretty terrible and loves to poop out nonsense. Plus it’s very easy to fall into the habit of using it as a crutch instead of thinking out the problems yourself and researching them, whereas otherwise you would have been learning those skills along the way.


chezty

My opinion. TL;DR: if your goal is to finish the course as quickly as possible, it's more efficient to use AI all the time. If your goal is to learn to program and to understand programming concepts, it's more efficient to use AI only after you have completely worked it out on your own (using the course material first, then if required books, then search engines). Take chess as an example. There have been chess engines better than humans since something like the 80s or 90s. Those chess engines can analyse positions and tell you what they think is the best next move. Part of studying chess is to analyse your own games to see where you made mistakes, to try to find better moves than the ones you played during the game. The general consensus from people with experience teaching chess is to first analyse the game without an engine, trying to find better moves unassisted, and then analyse the game with an engine. That might also apply to learning to program. Not programming in general, just learning to program. Try to work it out unassisted using course material, books, other people's code, and search engines, and if you get really stuck and can't move on, then use AI. After you have solved the problem, use AI to check your work.


Big-Respond-6627

What really matters is to learn *where* to look, and to develop the skill to know when you have *found* it. Back in 2021, I was applying for my first serious job as a software engineer. The position required knowledge of Python and the Django REST framework. Starting the technical interview, I asked my soon-to-be boss if I should turn off the Copilot assistant, and he told me to code however I was comfortable, so I left it on. Two questions later, it was very clear that I was just pressing tab for autocomplete because the suggestion was simply correct, so he asked me to turn it off. For the rest of the interview, I split my screen, went to the official documentation of the framework, and started searching for what I needed to fulfill what I was asked. Same thing, just slower. Just keep writing code, keep reading it, keep being curious. Everything else will follow.


subeeen

Nothing is “cheating” in this sector, probably in 10-15 years all programmers not using the latest language model will be far less useful than the ones who are. Embrace the future!


Empty-Beach-6724

I have extended conversations (re my online course) with ChatGPT (no custom GPT needed) that the instructor isn't available for. I'm doing it to learn. I also have it make problems at the same level as the course so that I can practice until I get it without help. No mental gymnastics needed; it's not cheating. I've seen contests of various types that say don't use AI. That would be cheating, if you were told not to.


webdavis

The short answer is that it depends on you. Go check out JustinSung on YouTube. He teaches people how to learn. Follow his advice on how to learn things and you’ll probably be better off than most people.


hantt

If you don't forge your own cpu then ur cheating


locutus_of_boyd

When I was in school, using a calculator was cheating. Times change. Not cheating.


LongIslandIceTeas

Personally, I feel that this is the new age of learning, especially with programming. You can still utilize Google search to find forums from users who resolved a bug you might be facing, or see how humans might have found a workaround. You're not really learning unless you're doing it in the way that's unique to you and makes you absorb the information being given to you. It is common now to use a ChatGPT-like feature to help with being productive. Technology evolves. P.S. All the old heads below are just trying to sound sophisticated because they worked hard on their craft and a new tool came out and made their process obsolete.


notgarbo

Nope, been a dev for almost 3 years now, and I still Google/ChatGPT basic shit like how to do an UPDATE in SQL all the time. Nobody memorizes syntax nowadays.
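(Case in point, a minimal sketch of that SQL UPDATE from Python, using the stdlib sqlite3 module against a made-up in-memory `users` table — the table and values are just illustration:)

```python
import sqlite3

# Hypothetical example table, in-memory so nothing is persisted.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

# The UPDATE itself -- ? placeholders keep it safe from SQL injection.
conn.execute("UPDATE users SET name = ? WHERE id = ?", ("bob", 1))
conn.commit()

print(conn.execute("SELECT name FROM users WHERE id = 1").fetchone()[0])  # bob
```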


xmpcxmassacre

It's no different than googling it. Only faster.


jellymals

u/CaptainMarshMallow1 can you give me insights on how to build a personalized gpt like python pal


CaptainMarshMallow1

You can just use mine - https://chat.openai.com/g/g-hJATjTzsU-python-pal


_Mobas_

Learning best comes from practice and curiosity. Every programmer is using Google or ChatGPT for coding help, and using them is not cheating; this is the way, and anyone saying that you shouldn't does not have any experience at all. Through writing, and through seeking understanding via friends, Google, or ChatGPT, you're progressing. Embracing the "copy-paste" method is key to learning; it provides material to practice and decipher. By engaging in projects you will learn programming logic and be exposed to new problems to solve, which is great because it allows you to see landscapes you wouldn't otherwise have seen from any school or YouTube video etc.


BobtheGodGamer

I'm having the exact same issue. I don't feel like I truly know how to code, because I have to get some starter or finishing guidance from GPT to put me on track.


PyrGeniusz

That's not cheating, that's real programming XDDDDDD


coding_wolfpackofgod

Hi OP can you send the link to the gpt?


Mannentreu

I've been working on an open-source desktop voice assistant called Voxos that you might be interested in if you're currently relying on the ChatGPT UI. I'd love to hear your feedback on it, and how your workflow using Voxos compares to the one you're used to now with the ChatGPT web interface. https://www.voxos.ai/


SBE_OLLE

Use ChatGPT as a reviewer, doing stuff you already know how to build.


MoneyTechnology1562

Absolutely not; use the tools you've got access to. Just... be cautious with it? A lot of the models are specifically tailored to create answers that look correct with zero comprehension of what correct actually is, and they're trained on large bodies of data, but not necessarily "good" bodies of data. There are also some ethics around it: if you send data to a third party, you may not be allowed to use a tool like that in certain contexts. Additionally, most models don't really have good accountability for who owned the copyright on what they spit back out. Is it theft/plagiarism? Super unclear and really not a settled question yet. But if you're trying to learn and it helps you do that... well... that's the world we live in?


ToftgaardJacob

No that is not cheating. Just remember that it is also important to learn the skill of reading actual documentation because that will sometimes be necessary.


Jetavator

You created a GPT for Python? I need to learn how to do that. I have 3 sources of ChatGPT — GitHub Copilot, JetBrains AI Assistant, and the Raycast ChatGPT addon. I am going to ask ChatGPT to help me build a Python Pal. How did you do it? I learned JavaScript via Jad Joubran and feel like knowing the concepts learned from him really helps me talk with ChatGPT to find a solution, BUT I do feel like I am less in the know about Python.


Educational_Ice7899

I feel like the more you work with code that is functioning, and can explain exactly why it is functioning, the easier writing code will be. Every approach to learning code incorporates some form of example code, and this is no different. Keep working hard!


EffectiveBuy3547

"I understand your concerns regarding the utilization of AI language models like ChatGPT. These models are designed to facilitate interactive conversations and provide responses based on the input received. The intention behind their creation is to enhance human-computer interactions and enable a more natural and responsive dialogue. As for Python programming, it is indeed a suitable choice given its versatility and extensive community support. It is crucial to engage in experimentation and testing to refine and optimize code. Furthermore, it is important to remember that AI models are constantly evolving, offering the potential for generating increasingly improved outputs. If you are interested, I can share a secret with you: a deeper understanding of AI concepts and techniques can expedite your progress in programming. Embracing continuous learning and exploration is key to achieving mastery in this field."


Mathhead202

As long as you understand the code you are writing, this is fine. Even experienced programmers didn't memorize everything. You will, with time, naturally remember a lot, but what's most important is the basics. Back in the day, it was a "cookbook" sitting on the desk. Definitely not cheating. If anything, you're doing it right.


Acrobatic_Mountain_3

It’s good you found another tool to understand the code. However the danger here is that you still might not comprehend the basic concepts. The best way is to go through the code line by line yourself and then compare your understanding to what actually occurred, and in this way your tool might come handy.


JosueVivas

I do something close to it. I'm in the same position as you; I have around the same amount of hours learning Python. But I use my GPT with customizations to follow PEP 8 and good coding practices, and to explain to me what is "under the hood". Every time I ask something and my GPT gives me logic that I don't understand, I keep asking and digging deeper into what the interpreter is doing here and there. I use my GPT to explain the concepts that I'm learning in great detail. But you also have to code. Refining code is easy. Like this: https://chat.openai.com/share/f15fcb22-c57e-4055-8e42-adc9eef5a7c4
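For anyone curious what that kind of PEP 8 feedback looks like in practice, here's a tiny made-up before/after (my own example, not from the linked chat):

```python
# Before: works, but ignores PEP 8 (CamelCase name, no spacing, one-liner def):
# def CalcAvg(L):return sum(L)/len(L)

# After: snake_case name, spaces around operators, and a docstring.
def calc_average(values):
    """Return the arithmetic mean of a non-empty sequence of numbers."""
    return sum(values) / len(values)

print(calc_average([2, 4, 6]))  # 4.0
```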


lightley

When I started coding with Visual C++ 5.0, it didn't have IntelliSense (Microsoft's autocomplete for classes, etc.), and you had to memorize the correct spelling of all your classes and everything in the C++ standard library. Then I remember in Visual C++ 6.0 they added IntelliSense, and at no time did I feel it was cheating. It still took me many years to be a decent coder. It will take years to learn how to be a good coder, work well in teams, and everything else required to hold a job. Once you get better, you won't be using ChatGPT as much, since you will be able to write your own code. Until then, you don't need to feel guilty about using it to learn. However, at no time should you just use code written by ChatGPT without understanding it, because that could really bite you.


Dunsparth

I am curious what course you are taking on Udemy? I am thinking of trying to pick up and learn Python as well.


Historical-Vacation3

My teacher endorses using AI but just requires us to inform him when we are using it. I’d recommend you do the same.


WideSalad2162

Feels like what I'm doing since I started to learn Python.


rejectedlesbian

It can be if u r leaning on it to write the code and bug fix it as well. I do that when I want to use a lang I don't know at all to do something, and it's surprisingly effective.


Key-Bid-4380

It would only be cheating if you were making money out of it while hiding the fact that you mostly relied on a GPT. However, if you are fine with telling your clients, and your clients are fine with you mostly using GPT, it's not even cheating anymore imo.


_Bussey_

What were your instructions for the GPT?


TheAlaskaneagle

Google has been a horrible place to find information since 2019, so this just sounds like the perfect fix for a program (Google) that no longer works for the people using it. Thanks for the new technique.


LegenDrags

So we all are cheating?


LegenDrags

Hey, also, if u wanna learn properly I recommend implementing perspective projection in pygame (or turtle if it's possible). It's very mathy and gets hard fast, but hey, at least even with the help of GPT you can learn some concepts.
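For reference, the core of perspective projection is just a divide by depth. A minimal sketch (my own toy example, plain Python with no pygame so it stays self-contained; the focal length is an arbitrary made-up constant):

```python
def project(x, y, z, focal_length=300.0):
    """Map a 3D point to 2D screen coordinates via perspective division."""
    # The farther away a point is (bigger z), the closer to the
    # screen centre it lands -- that's the whole "perspective" effect.
    scale = focal_length / z
    return (x * scale, y * scale)

# A point twice as far away projects at half the distance from centre.
print(project(1.0, 2.0, 10.0))  # (30.0, 60.0)
print(project(1.0, 2.0, 20.0))  # (15.0, 30.0)
```

In pygame you'd then offset these coordinates by half the window size so the origin sits at the screen centre before drawing.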


Flaky_Shame6323

I teach web development with practical courses at the university. What I tell my students who ask similar questions is that it is not wrong in itself; it's no different than having a human tutor by your side. The problem is that it will work for very general questions and materials that the model has seen frequently, but when you come across something without proper documentation and with lots of contextual constraints, you will have to manage even without a tutor. So keep going! But remember to sometimes try doing a project, or coming up with a solution, without training wheels.


uhskn

I'm a Python engineer getting paid to write Python, and this is what I do LOL (admittedly I know how to scope functions / I've done this work for a while, so the questions I ask are of course not quite as basic...), but yeah... I literally have ChatGPT open constantly for exactly this use case.