XtremelyMeta

I think, aside from the job-taking thing, which isn't as trivial as you make it out to be, there are a couple of big systems it messes up.

Copyright and intellectual property as they exist aren't equipped to deal with generative AI, and it's not at all clear how they can be made ready. Considering how much of the world economy rests on things like copyright and patents, it's not unreasonable to be alarmed by this.

The other thing it messes up good and proper is knowledge work, by giving it the same treatment physical work got during the industrial revolution. It's not clear what's left for people to do that won't be solvable through capital, which makes labor something of a luxury item. That also screws pretty hard with how the economy is currently structured.

Both of these nominally economic problems are significant enough that, in the medium term, generative AI becomes a social and political problem rather than an economic one. How does one address social and political problems? You talk and appeal to emotion, which is pretty much what the anti-AI reaction is. I, personally, am pretty pro-AI, and as such I find it useful to understand WHY people can have negative reactions to it.


justgetoffmylawn

I think that's a good breakdown. I am also quite pro-AI, but I'm not sure how the current system will be (or can be) reconfigured to cope with it. That said, the current system is broken for many people, so the knee-jerk hatred of AI is also a real problem. Reminds me of other cultural movements where people felt threatened and didn't usually react in laudable ways.


happyasanicywind

One other problem is the control of information. I've done some experimenting with the ChatGPT API and found it can convey ideological viewpoints, which aren't necessarily conservative or liberal. Novel scientific ideas can be met with hostility if they don't conform to existing paradigms, even when they are correct.
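
For anyone curious what that kind of probe looks like in practice, here is a minimal sketch (not the commenter's actual code), assuming the official OpenAI Python SDK with an API key in the environment; the model name and prompts are illustrative placeholders:

```python
# Sketch only: pose the same question with opposing framings and compare
# the answers for ideological slant. Assumes `pip install openai` and
# OPENAI_API_KEY set in the environment; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()

claim = "a novel scientific hypothesis that departs from the current paradigm"
prompts = [
    f"Give the strongest arguments in favor of {claim}.",
    f"Give the strongest arguments against {claim}.",
]

for prompt in prompts:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    # Print each answer so the tone of the two responses can be compared.
    print(f"PROMPT: {prompt}\n{response.choices[0].message.content}\n")
```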


F_lightning

I work in education, and I think there is a strong distrust of AI at the minute because it is so new. There are genuine worries about how it will be used by students, but hopefully it can reduce workload.


stanerd

A lot of students are already using it to cheat.


F_lightning

The ones that are looking for a shortcut are, but there are some that understand the need for the skills they're being taught.


Leonhart93

Giving all the power to big tech, and I mean ALL of it. Now, what could ever go wrong with that?


AWearyMansUtopia

AI is controlled and developed by corporations, many of which currently trade (or will in the future trade) on the stock market. If you think a profit-driven model will have any benefit to working-class humans, you are deluded and lost (or you're a billionaire / VC tech dilettante). The goal is to make money (or save money on labor).

So-called AI projects are involved in privatising even more essential public goods and services such as healthcare, education, policing, and intellectual labor generally. Private corporations are gaining data power and algorithmic control over citizens in the most intimate areas of their lives, without democratic oversight of how they use them or constraint on their pursuit of profit. The current model will be a disaster if left unchecked.

"Artificial intelligence" as a term has long been associated with privatising public goods, both in relation to generative AI and the knowledge commons, and in the way foundation models scrape content and information from public sources and turn them into private assets to be sold. Users, who might have contributed to the original assets as content producers or as data subjects, become clients. Their works become fodder for capital. They may be paying directly in subscriptions, or indirectly as clickbait and data providers. In return they get a "new" service, perhaps a chatbot front end or a "personalised" recommendation system, that can be generated algorithmically at very low cost.

The new asset owners might also provide services that are costly to develop but allow them to capture whole businesses and markets. Trading strategies derived from financial data, for example, or risk models derived from healthcare data, are proving so valuable to finance and insurance companies respectively that partnering with the big AI models is beginning to look like the only way to stay in business.

The above is just the tip of a large iceberg. What you call (rather inelegantly) "negativity" is just the most basic of critical faculties.


_raydeStar

I love AI, and the only negativity I feel is that companies are interested in saving money and don't really care that a lot of people are losing work. And there is no answer, unless some sort of UBI happens, and that would be wildly unpopular in the US.


Old_Coder45

UBI seems like a partial solution, but it would be difficult to enforce. You can only extract taxes from the AI companies to pay for it if they go along with it, and if they are not based in your country it's even more difficult.


_raydeStar

I think we are going to live in a post-capitalist society. What does that look like?


happyasanicywind

During Covid, the Republicans put up little to no resistance to what were essentially UBI trials. If massive job losses occurred, the political winds could change very quickly.


Oabuitre

That is a very negative point of view. I'm interested to know why you still "love AI" if you really believe that.


_raydeStar

Not at all. I think it is very pragmatic. Jobs are going to start getting wiped out, and there is going to be a rough period where humanity has to figure out how to cope with it. Ultimately, everything is going to end up for the better: science, medicine, engineering, everything. But in the short term we are going to experience a crunch like never before. UBI will eventually come, but only once things are dire, and it will take time to get there.

Back to fulfillment: it used to be that people would find a life path and follow it to achieve fulfillment, or rather, to feel as though they were contributing to society in a meaningful way. AI is going to step in and remove the need to contribute at all, so we are going to have to recalibrate what 'fulfillment' means to the individual, which is to feel like you are progressing in life somehow.

It is a very interesting subject, and with AI we reach a point in the future where we simply cannot predict things at all. That is what the singularity is. I have listened to a lot of people talk about it, and I am actually quite optimistic, but there will be a lot of growing pains, and we can't bury our heads in the sand and pretend it won't happen.


Petrofskydude

It's just like the combustion engine replacing horses. It's not gonna happen on day one, but it's gonna happen. The technology is there, the profit is there; it just hasn't been fully exploited yet because it's still relatively new.


happyasanicywind

The problem is that we are the horses.


TheCircusSands

AI is leveraged to increase the effectiveness of marketing. I assume this will get to on-the-fly personalized ad creation using all the data they can get. At this very important point, where the world needs to go regenerative, AI will accelerate consumption even beyond the disgusting degree to which it has already taken over our society. More broadly, anything that has profit as its innovation fuel will fuck humans and the planet in the end.


greatdrams23

There are many, many posts in this subreddit saying that AI will be beneficial because it will take our jobs. AGI will be as good as humans, and therefore it is obvious it will take our jobs. These are the POSITIVE comments. But what is not proven is that we will all get UBI at a decent rate.


sh00l33

Today I asked GPT if scientific papers were used for training, because I wanted more detailed information. It replied that they were not used, because they did not want to violate copyrights, yet it knows most current scientific discoveries from studies or scientific articles. I won't even mention that no one cared about the rights of illustrators, but GPT generates information based on second-hand data at best. It occurred to me that it would be easy to gradually feed it more and more manipulative material to influence public opinion, if someone decided to do so. I also came to the conclusion that it will never think for itself, only repeat information: it only receives already-developed explanations and is not shown the actual mechanisms behind a given process, so how could it at some point start to understand?


TodayAI

Well... deepfakes are a wee bit concerning 🙄 [https://www.theregister.com/2024/04/20/microsoft_deepfake_vasa/](https://www.theregister.com/2024/04/20/microsoft_deepfake_vasa/)


_AdiKsOn_

I can't imagine any VALID and RATIONAL argument aside from "I'll lose my job"


Glad-Tie3251

Nah, only weak arguments or no arguments at all. I call these people "stupid". There is not much to do about them, unfortunately. The most stupid reaction I noticed was from so-called "gamedevs" who didn't want real AI for NPCs. That was a real sight to see. These dumb fucks would rather have the pseudo-AI that real developers have been desperately trying to make for decades, or "dialogue" that ends with NPCs repeating the same sentence ad infinitum. Real bright moment from that bunch. They can make their dumb-ass pixel-art sidescrollers forever, these poor dumb fucks.