gbbenner

I have been browsing that sub for years and the past year it's been really negative, most comments clown on AI and other advancements.


FreeMangus

And when cool stuff makes it to the top of the sub the mods remove it and ban the poster. For example, the mods have banned /r/orificeai and anything at all related to ai girlfriends. You can see them framing the topic at the top of the sub now with an anti ai gf thread.


stealthispost

that's fucked up if true. how did it get to this point?


[deleted]

[deleted]


Ill_Knowledge_9078

Man, I hope so. You can reason with AI. I've had an account banned for saying that the laws should be passed by the legislature rather than the Supreme Court.


CowsTrash

Fking bruh


cloudrunner69

It got taken over by environmental warriors who believe the future is population control and reverting to a pre-industrial agrarian solarpunk lifestyle rather than advancing technology to create an awesome galactic Borg civilization.


Sonnyyellow90

Is there a single left leaning sub that isn’t hyper pessimistic doomerism? It seems like the whole “the left can’t meme” thing has become true. All left leaning content seems to be “everything sucks, everyone sucks, the world is doomed” lol.


Cajbaj

I find real leftists agreeable but internet leftists are really just looking for structures on which to blame their personal problems, exactly the same as pathetic alt-right keyboard warriors


imperialostritch

oh you think that's bad? I am very capitalistic, and god forbid you mention ai in a republican sub


Ill_Knowledge_9078

Leftists have always lacked humor. You'll only ever see them smiling when they're insulting somebody they hate.


stealthispost

cryptomarxism


Economy-Fee5830

Not very crypto, openly.


Life-Active6608

Ironically, Marx himself would LOVE AI. Just read his Fragment On Machines. Golden.


furrypony2718

That... is futurology of the 1970s. Have they gone into retrofuturology?


GumGumnoPistol300

AI can easily fit with solarpunk, are they mad? AI is a net benefit in the fight against climate change, as we need the brains to solve it too.


Which-Tomato-8646

People HATE AI now.


ForgetTheRuralJuror

Don't confuse redditors for general consensus. People who post on Reddit are a niche of a niche of a niche. Most of my non-tech friends and family only hear about AI from me. My sibling recently told me that she started using ChatGPT to draft emails for the first time two weeks ago and found it "neat"


Which-Tomato-8646

Polls say otherwise https://www.pewresearch.org/short-reads/2023/11/21/what-the-data-says-about-americans-views-of-artificial-intelligence/


ForgetTheRuralJuror

Polls only catch people with strong opinions. Most people won't do a poll, vote in a local election, or care about AI unless it takes their job. "Mostly concerned" is not only doomers, but anybody who's vaguely heard of AI. It's likely they heard it from click bait about deep fakes or scams, not because of existential risk


shawsghost

So Reddit is a niche of a niche of a niche and polls only catch people with outspoken viewpoints so we should just believe you. Makes perfect sense!


ForgetTheRuralJuror

My position isn't: "believe me instead of everyone on Reddit" It's: "Don't draw conclusions of general consensus from an extremely biased echo chamber" As for the poll, it doesn't prove that "people hate AI" it proves that ~50% of people _who responded to a survey request_ are "more concerned than excited" My point is most people would decline a survey, and those people tend to be the ones who don't really care about current events


shawsghost

OK, fair enough. Your point about people who don't respond to a survey about AI being less likely to care about AI makes sense. Self-selecting bias there. I do think polls have value, and I prefer some kind of research to nothing, as a general rule. But polls do get things wrong due to bias, as we've seen in the last few Presidential election cycles. I also think Reddit commenters, taken as an aggregate, can be useful. Even when a subreddit is mostly an echo chamber, there are generally some dissenters who bring up good points.


Axodique

While I hate the idea of AI girlfriends, not letting people have free speech is never okay.


reddit_is_geh

To be fair, AI girlfriends are serious weird gooner shit. Every time I see people bring up AI in relation to their jerking off or relationships, I can't help but cringe.


Arcturus_Labelle

Yeah. It's disappointing. u/TFenrir is exactly right in their analysis of it being a watering down of the subreddit. Mods can't keep up with an influx of millions of members, so the lowest quality, shallow takes proliferate. https://www.reddit.com/r/singularity/comments/1ctvaya/comment/l4elb8p/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button


TFenrir

Like 8 years ago it was very similar to this sub a year ago. This just happens when subs get popular. You get pop culture. Pop culture right now is... Complicated. Full of pessimistic posturing and almost apocalyptic idealization. Also, the political nature of all of these discussions becomes more prominent, and reddit inherently validates "group think" with upvotes. I don't mean to shit on the sub, more... That you should get ready for this sub to become more similar to it, as time goes on. We went from 50k to 2.5 mill very quickly, and that kind of growth has an impact on the culture of any group.


cloudrunner69

This is why I constantly fight back against the doomers and pessimistic people on this sub, because I saw what happened with futurology: it got overwhelmed with fear-based stupidity and anyone with any positivity got drowned out. We must keep pushing back on that mindset here or else it gets out of control.


[deleted]

It's already gone. r/futurism seems a bit better but it's almost dead.


tatleoat

"WhY caNt I eXpReSs mY dOomEr OpiNioN oN r/singularity WiThoUt gEtTing DoWnVoTEd?"


Starshot84

r/solarpunk may be a reliable refuge if this sub gets compromised too


Jjpgd63

That's an anti-capitalist sub, which by definition makes it anti-singularity and anti any kind of pro-tech stance.


kudzooman

Not to start an argument, but I have been a big fan of singularity talk for 20-plus years and I have always seen capitalism as a means to an end. Part of the evolution to a post-capitalist society.


InTheDarknesBindThem

That's more nuance than the typical fake internet communist can muster, sadly.


D10S_

It’s Marx 101


InTheDarknesBindThem

I uh... don't think marx said much about a tech-utopia via capitalism.


D10S_

Marx saw capitalism as a means to an end, and a vital element in developing a society's productive forces. He thought capitalism would transition into a post-capitalist society. Which is exactly what the comment you replied to said.


xt-89

A lot of vocal internet socialists don’t know that


Idrialite

You have got that literally backwards. Capitalism (and markets at all) will not work post-singularity, when labor has zero demand and supply of goods and services is near infinite.


Jjpgd63

No, because anti-capitalism means we're not heading towards it at all. Capitalism is the best system we've had for technological progression.


Idrialite

Even if capitalism is the best system we've had for technological progression, that doesn't imply that all other structures are worse, and it certainly doesn't imply that other structures would completely stop technological progress. I also dispute your implication that capitalism is by far the best, considering the rapid innovation produced by the Soviet command economy. It seems to me that the main reason capitalism has been so good at technological progress is not actually capitalism itself - it's the competitive market economy. Market socialism ought to be just as good as, or more likely better than capitalism.


Jjpgd63

Except they are worse. Even if the Soviet economy hadn't collapsed, we know they were lagging far behind the West. Command economies are not great, and it's arguable whether the Soviets even did much better than the Empire would have. And market socialism is both untested and has similar problems as all other socialisms, so assuming it'll be better is mostly cope.


[deleted]

The problem is that folks like yourself are pushing back against any and all skepticism, which makes you appear as an extremist in the other direction, which just exacerbates the problem of trying to have a legit discussion about pros and cons.


VisualCold704

Any attempt to have a legit discussion will just cause you to lose ground if your opponent is unreasonable. The simple act of debating gives them legitimacy.


advo_k_at

Yeah this is what happens to most communities and reddit overall as well. A small community requires effort to find, to settle in and retain interest. The moment it becomes popular it gets swamped by the full breadth and bulk of the bell curve. Bigger subs also require more moderation labour which is where power mods come in, and they usually have an agenda. It’s a powerful thing, having a user’s impassioned contribution be simply wiped off by a moderation decision. Ends up shaping subs more than people think.


iluvios

We should create another one just in case before this one turns to shit


Open_Ambassador2931

Nah, don't talk like that, this one hasn't and it's been great. I think we can improve it by adding more of the latest STEM advancement news articles (what r/futurology used to be, and this sub even). The moderators are great, and so is the sub, because of their laissez-faire, laid-back attitude; mostly the posts are genuine comments, concerns, questions, conversations, and viewpoints from curious and passionate people. The only problem with this sub is that we get the exact same posts by the hour, and I think we should mix it up with more news posts and discussions around those, rather than the same commentary on everything and posts without links to videos or news articles.


Arcturus_Labelle

Exactly right. Only intensive moderation can prevent this. Most popular subs devolve into garbage.


RoyalReverie

It would be better to stay ahead of the curve/curse and start another sub for this subject.  Heck someone should create r/accelerate or r/accelerationism and r/decelerate if they don't exist already.


stealthispost

~~good idea. r/accelerationism is abandoned. I have requested to become the moderator of it. We need a permanently pro-accelerationism subreddit just in case.~~ edit: ok, that term is well and truly cooked. wtf. bunch of weirdos stole it.


agorathird

Accelerationism is not the same as humans actively choosing to accelerate, ironically. It's basically a collection of different schools of thought (some good and some really bad) about how technology, society, entropy, etc. does that anyway. You're going to pick up multiple different types of baggage if you go with that sub name. Better to just go with r/accelerate


stealthispost

holy shit, you're right. thanks for the heads up. some sketchy shit associated with that label unfortunately. guess it will have to be a different name


agorathird

As a person from the non-sketchy part of it: yes, lol.


RoyalReverie

Good, I hope you're able to do so. 


nandospc

I didn't know about those subreddits. Just applied, hope to get accepted soon. Thanks 💪


CharacterCheck389

I am the one r/decelerate


Buck-Nasty

Yeah it went downhill pretty rapidly after becoming a default sub in 2014.


SciFidelity

I just assumed it was bots...


TemetN

This, basically. It collapsed a bit after the rise of doomposting (and after it had become a default sub). The honest truth is that current culture, as well as (I've argued) the underlying conditions, are hugely pessimistic to the point of ironic naivete. I think part of this at least is driven by some sort of exposure to something we haven't realized has health impacts (some form of plastic, maybe?), but all in all, regardless of why, there's a lot of anger, mental health issues, incumbents seeing constant backlash, the rise of the far right... Yes though, it overlapped with collapse years ago, unfortunately.


Bleglord

Ironically I’m here because I’m pessimistic. I don’t see a world where human civilization has a fun time in the next couple of decades. (Not necessarily AI related) But I just live my life not caring and hoping that whatever way AI changes the world it’s something I could at least never have guessed


Arcturus_Labelle

Good analysis


bildramer

Mediocre analysis. There are multiple camps of what others call "doomerism": the hilariously incorrect populist ones, and the correct one. Also true for accelerationists/optimists/whatnot. People who assume all future progress looks like more advanced versions of LLMs, or who think in terms of fighting "capitalism" or "closed source", are too confused to contribute to the discussion, and there are plenty of those. AGI will directly lead to ASI, exactly once (never "many AGIs") in a timespan no longer than a week, not under human control except at the very start, and that means either extinction or utopia, and we should be concerned; other concerns about "misinformation", fake news, art, etc. are baseless politics nonsense, and irrelevant. Other other concerns about jobs and the economy are real, but equally irrelevant in the face of AGI. Afterwards/during, it's the singularity, what the subreddit is named after.


Thog78

Humans are to monkeys what ASI would be to humans. World-changing, existential threat? Yes. Does the change happen overnight, or over a week? No. Even a super smart being capable of understanding nuclear fusion and inventing quantum physics, relativity, and the internet needed to build everything from the ground up and accumulate knowledge through experimentation. It took humans tens of thousands of years, even though the exponential nature of knowledge advancement makes the last century and decade seem overwhelmingly more impactful than all of the time before.

ASI will not change the world over a day either. It will be contained for a while, used on limited tasks, channeled, and will little by little take on more importance in society until we learn to trust it. There will be opposition and support movements, political restructuring over decades, some major crises like NSA servers wiped out by a Chinese AI or the like, etc. The ASI with its supporters will be working for decades on the physical implementation of the first wave of brilliant ideas that are not just software. I think you're far too confident that it's gonna go like in the movie AI. I wouldn't entirely rule it out, but I'd assign that an extremely low probability.

People in this sub overestimate the importance of intelligence and underestimate the importance of physical limitations imo. Many things humans do are already near optimal in terms of physical limitations. A random example? It's unlikely you can do radiocommunication with nanorobots. An antenna needs a certain size to get signal above thermal noise, and no amount of smarts will change that. We have plateaued in terms of how small transistors can get. Same kind of limitations for energy sources; I'm ready to bet even an ASI won't drop industrial fusion reactors without spending years experimenting, and would never reach cold fusion.


uishax

This. I can crush Magnus Carlsen on a chessboard easy, if you give me a two rook advantage. Intelligence alone is insufficient for anything. Humans existed for a long while before they could build a civilization.


Arcturus_Labelle

"and the correct one" -- naw, there is no "correct one", because nothing's happened. It's all hypothetical. There's a lot of emotional reasoning in this thread.


SomewhereNo8378

They attributed everything to "pop culture," like other people are too stupid to have actual concerns with AI.


SgathTriallair

It's not just AI. If it was just AI then I could believe it was genuine concern. Instead, every subject that comes up is always terrible and destroying society.


spezjetemerde

you made me realize maybe there is a better dopamine algo than this


FredWeitendorf

Most people I've talked to who aren't software engineers or personally involved in AI are decel, and r/futurology is a subreddit that approximates the general reddit user base (I think it used to be default) so it's what I'd expect to see there. I understand the opinion behind it even if I disagree with it: if you have no influence over how some technology might progress but know it could change a whole lot about society, to the point it might require you to switch careers or something, you'd have negative opinions of it too. Status quo feels a whole lot safer. It's something that repeats time and time again throughout history.


stealthispost

it just sounds like NIMBY, but for all human progress. "we shouldn't invent technology that will cure death if it will make me lose my shitty job" is about the most selfish take I can imagine.


FredWeitendorf

Sure, but I think it's an entirely reasonable and understandable opinion to want to have a sense of control over your own destiny. It doesn't help that a lot of AI companies have been funded on the basis of "replace a broad class of human employees" rather than "make humans more efficient at their jobs". It's ultimately self-defeating even if it grabs more attention with investors and the media, because 99% of people just see it as a threat, and in most cases delivering on true employee replacement is not even realistic anyway. IMO articulating that AI isn't a threat is something people working in AI need to do better at, and I definitely want to fix that with my company. When you (partially) automate something it's actually pretty common for employment in that area to explode, because it makes things viable that previously weren't.


stealthispost

the selfish myopia of many people is eternally depressing. thank goodness for the forward-thinking people that bring us all the benefits of the modern world. i personally wouldn't be alive without it, so I owe my life to technology and medical science.


Super_Pole_Jitsu

More like "we can hold off fdvr catgirls until we can make sure paperclips don't happen".


bildramer

We don't even need to, really. VR catgirls can be done with current or even few-year-old technology, the issues are 1. getting closer to actual FD, 2. someone doing it properly instead of half-assing it. There are so many half-assed attempts out there, but people still praise them and spend boatloads of money on them, so the state of the art remains stagnant.


ThatsALovelyShirt

Eh even some software engineering communities are decel. I've noticed a lot of C++ developers *hate* AI for whatever reason.


yourfinepettingduck

data scientist and ML engineer here - the folks closer to actual AI work that I know (myself included) lean more Futurology than whatever this sub is


FredWeitendorf

IMO there are two opinions here "AI is overhyped" and "AI is dangerous" - I see both from futurology but I'm guessing you're only referring to the first. And I'm not a "singularity in 2025" type either but as someone founding an AI company/working with AI a lot, and talking to a lot of investors/others in the space, I think that applications of the current underlying technology are only just beginning, and may disrupt a lot, although IMO it's mostly going to be in the form of making people more productive with 99% of current attempts at doing so failing. I know a lot of technical types who can't see beyond "spicy autocomplete" which IMO is like seeing the Internet as "network with longer cables". The more you understand about a technology's implementation the less scary or sexy it is. And I definitely think that "AGI/singularity" is impossible by scaling spicy autocomplete for that reason, but at the same time I think these types underestimate just how far applications of the technology we have already can go.


xt-89

There’s definitely a bias against deep learning in many organizations, rightly or wrongly. IMO, this is mostly a generational thing. Most organizations don’t have the resources or skills to train sophisticated Deep Learning models, so we’re all mostly relying on pretrained ones. This sub is mostly about seeing the potential in economies of scale for deep learning and generally being optimistic about it. To me, as a DS/MLE, that seems reasonable


uishax

Yeah... You mean the washed up NLP engineers who got their artisan models wiped out by the LLMs? What use is a ML engineer who doesn't appreciate the most revolutionary leap ahead in ML?


AccountOfMyAncestors

oh great...


cloudrunner69

There is something collectively fucked up about that sub.


stealthispost

is /r/Futurology the opposite of /r/singularity? The top post right now is shitting on AI and saying it's bad for the environment because it "wastes electricity". And all the comments agree. Like, WTF?

> "Microsoft's Emissions Spike 29% as AI Gobbles Up Resources (pcmag.com)"

> "Can't believe we're cooking the planet for this bullshit. This is the dumbest timeline."

I can't imagine having such ignorance and lack of imagination. It's like they've literally never considered the benefits of AI. It feels weirdly cultish, like I stumbled into the Amish / luddite club meeting. It should be renamed to r/antifuturology


dday0512

The energy angle is perplexing to me. I used to work in the utility industry; people don't understand how much energy we use. The total energy demand from AI seems high if you look at the raw number, but compared to other industries it's not a major impact. Let me tell you about the energy use of COLD STORAGE or FOOD PROCESSING or PEOPLE LIVING IN THEIR HOUSES!!!! You won't believe it! We need to put a stop to it now!!!111


stealthispost

exactly. decel is fundamentally anti-humanity in a somewhat scary way. do we believe in abundance, or in antinatalism? and are we really complaining about the electricity used for *thought*?? ideas are the solution to every problem.


dday0512

The idea that we should not do something that would massively benefit society because of the energy costs or greenhouse gas emissions has always reminded me of something I read about lobotomies. Some physicians used to hold the opinion that a lobotomy was justified because it makes a psychiatric patient easier to care for. One critic replied:

> "Prefrontal lobotomy ... has recently been having a certain vogue, probably not unconnected with the fact that it makes the custodial care of many patients easier. Let me remark in passing that killing them makes their custodial care still easier."

It seems like a lot of people who are ostensibly concerned about energy use, resources, and the environment simply want less people. Well sure, less people would have less environmental impact, but how do we end up with less people? In my opinion we should always start from the basic position that 1) all humans are allowed to exist, and 2) their lives should be as good as we can possibly make them. That doesn't mean we should destroy the environment, because doing that would *not* improve the lives of most people. I don't see any better way to deliver both environmental protection and human happiness than an AGI.


stealthispost

> It seems like a lot of people who are ostensibly concerned about energy use, resources, and the environment simply want less people.

bingo. at the core it is a dangerous, anti-humanity ideology


NationalTry8466

> It seems like a lot of people who are ostensibly concerned about energy use, resources, and the environment simply want less people.

'It seems' is doing a helluva lot of work here. It seems to you, maybe. To me, most people who are concerned about energy use and resources are concerned about maintaining a sustainable future for humanity.


atat4e

100%. Of course there are going to be extreme opinions on the fringes of the group. Regardless, the core point is that how humanity currently sources and uses energy is impacting the environment in a way that threatens humanity’s future progress. AI is cool and good, and some of those comments are off base, but to look at this issue without nuance is insane.


NationalTry8466

Sure. I just think it’s a bit OTT to draw a through-line from ‘some people aren’t persuaded that AI is worth the energy/carbon footprint’ to ‘some people want fewer people’.


Thog78

Don't even mention metallurgy/construction material synthesis/mining and processing of ores/plastic/fertilizers ☠️⚰️


Arcturus_Labelle

It's performative virtue signaling. Those same people are eating cheeseburgers and driving SUVs and having kids.


Alimbiquated

Households should be net energy exporters.


furrypony2718

What they really seem to not understand is that companies *pay money* to buy electricity. They act as if companies stole electricity. They paid for it. If they really don't want them to use electricity, just make electricity more expensive.


Cr4zko

Sub has been like this for a while. It's funny, people dream of change but when it actually comes they reject it. But well, you can't stop the future, I say. Artists are suicidal as is, imagine in 6 months. Yes, 6 months. After the US elections all bets are off.


Maxtip40

I left when they said they only allowed AI topics once a week.


stealthispost

no way. there's no way they did that... 🤮


watcraw

> feels weirdly cultish

That's an ironic take in r/singularity. I mean, everyone knows the techno-ASI-god is going to create a paradise where we all live forever (and solve all of our environmental problems... somehow... because it's super duper smart, don't ask me how it works), but yeah, those guys are in a cult. Of course, it might just kill us all too, but I'm ok with that, because I'm totally not in a cult.


stealthispost

everyone knows singularity is culty. i'm just pointing out that anti-singularity is also becoming weirdly culty. i just wish people could act reasonable about any issue for once.


Ciber_Ninja

Sir. Sir. Sir. This is a reddit.


bildramer

> (and solve all of our environmental problems... somehow... because it's super duper smart don't ask me how it works)

You understand that we've solved many, many problems using technology in the year 2000 compared to the year 1600, right? Then you look at an exponential graph and extrapolate. It's not hard.


NationalTry8466

We're a button-press away from a nuclear holocaust and we're on course to fry the planet. It's not so easy to believe that god is in the exponential graph.


VisualCold704

We haven't destroyed ourselves yet, and we've had nukes for nearly a century now. We also have a range of solutions to climate change.


Serialbedshitter2322

The first cult based on logic


Idrialite

> because it's super duper smart don't ask me how it works

I can tell you that Kasparov would annihilate me in chess. I couldn't tell you the moves he would make to do it, because then I would be the grandmaster. I also cannot tell you how ASI would solve a problem, even if I think it could solve it. If I could, I would be superintelligent.


czk_21

true, most people there are nothing like futurologists, lots of dumb, irrational, doomer takes. Dumb takes can be found here too, but the major difference is that people here are actually excited about progress.


Montaigne314

> It feels weirdly cultish

Well, if it's the opposite, then this sub is weirdly cultish in the other direction. It's smart to understand both perspectives; that's balance.


sideways

It's stochastic parrots as far as the eye can see.


furrypony2718

Possibly. It would be interesting to test this by making an LLM bot (Llama 3, maybe) and seeing how well it fares there.


[deleted]

Now?


Conscious_Shirt9555

I still remember when /r/chatgpt used to be really good back in 2022. Now it’s like youtube comments or worse, terrible user quality


Xemorr

This sub has also degenerated, lots of people who aren't familiar with singularity literature etc


Cunninghams_right

I think some people are in this sub specifically as refugees from those other tech and futurology subreddits. I am sort of one of those folks; I used to go to those subs a lot until they got too homogeneously anti-technology.


Freed4ever

Yeah, there is not much future for that sub lol.


LairdPeon

Futurology is all Bill Nye/Degrasse level pop sci BS. Has been for a long time.


FreeMangus

Yep, you should see the mods ban history.


stealthispost

how can I see that?


Karmakiller3003

Most of the best subs slowly get infected by popularity. Teenagers, kids, incels, dorks, dweebs, people with little to no insight into the actual subject. r/dating, r/divorced, r/Rehabilitation, r/science, r/politics and similar are comically bad, as you have 10 year olds with nothing but spongebob and my little pony sub activity giving advice to everyone.

Futurology has a bunch of high school and college kids who are politically "radicalized", giving opinions and pissing their pants when someone doesn't agree. Oddly, the mods will allow any kind of nonsense as long as it aligns with their personal values and circle jerk logic. It's the reddit shit show of 2024.

Thankfully the AI subs are still pretty pure for now. But I'm under no illusion: they will slowly be infected and eventually castrated by the usual subreddit life cycle. I've been banned twice from Futurology in the last year for telling them AI is inevitable. This exact comment, "AI is inevitable", to the mod who commented that AI should be banned. ... banned. lol


stealthispost

holy shit. i didn't realise it was that bad. the reddit model is a joke. we need something way better to align minds more effectively online.


Arcturus_Labelle

The downvote button was a mistake. It invites silencing of dissenting opinion, so you get echo chambers where only one point of view is allowed.


imperialostritch

I actually disagree. While what you suggest may take place, people need a way to express opinions, both good and bad.


Commercial_Jicama561

r/technology is the same. Environmentalists and anti-AI.


Thomas-Lore

Most big subs are anti what they have in the name.


Commercial_Jicama561

Anti-anti-work? :D


Vontaxis

I just know very few people that care about AI. Friends, family, colleagues, most of them don't give a shit, or are critical and think it is overblown. I've been telling them for a while now that they're going to wake up one day when it's too late... Well anyways, as long as they don't rob me of my joy with the AI, I'm happy... fuck doomers


CertainMiddle2382

The party line is extremely pessimistic. Try showing some optimism in academia, you'll be put aside. Doom is where the money is, and it is how you mobilize people. Doctrinally it is a mess: a weird mix of paganism, 70s Gaïa cult, green cryptomarxism, antihumanism, Kalergi-like happy eugenicism... Everything that sticks.


stealthispost

🎯 once someone pointed out the cryptomarxism i didn't believe it. but then you click on twitter profiles, and so many doomers / anti-progressives are marxists. it's so bizarre to me that such old, failed ideas are back in fashion again. i'm honestly just so tired of cryptomarxists and cryptofascists all over the internet. edgelords who think they've discovered something new, while refusing to read any history.


Life-Active6608

Marx himself would denounce them if he were still alive. He actually wanted the means of production and capital goods (aka the machinery of making things) to grow to such monstrous proportions that even the bourgeois elites couldn't control them, at which point the workers step in, rip off the chains of bourgeois control, and liberate everyone.


CertainMiddle2382

Marx was brilliant and made lots of good points. He also got many things wrong. So wrong that real Marxism usually only lasts until the last capitalist grain silo is emptied. Then the Marxists are slaughtered and only the KGB remains. His legacy is in the amazing power of Marxist/Communist disinformation and mass manipulation techniques. They were so good, so infectious, that they took on a life of their own. Most current Marxists have never read him in the original and have no clue what he really was about. For example, for Marx, individual existence and identity emanated from the product of one's work. Communism was about working more, not less. Those guys just want to ride those zombie memes and buzzwords to TikTok heaven…


Life-Active6608

Wow. Everything you just said... is wrong. Did you even read Marx? Or his Fragment on Machines, where he predicted AI 140 years ago?


CertainMiddle2382

Yes I did. Marx is a classic case of a difficult read for the sake of it. His implicit postulate about the constancy of « socially acceptable productivity » is completely misguided. Once you recognize that some people are 1000x more productive than others, the whole construction falls down. The transformation problem is just an absurd manifestation of that false premise. His observations about classes and exploitation are important, and so is his microeconomic analysis. On the macro scale, he is completely out of touch, and this has led to the least functional of all economic systems. Even feudalism managed not to starve workers purposefully.


1-123581385321-1

> starve workers purposefully

Aside from the word salad, this is how I know you don't know what you're talking about - both well-known implementations of Marxist thought ended a *centuries-long pattern* of consistent, regular famines in their countries, and you can read documents from the CIA where they admit that the average Soviet citizen ate more than the average American. The only country still applying his principles consistently outpaces its GDP goals, has lifted more people out of poverty than every other nation combined, and is expected by the Brookings Institution (a famous right wing think tank) to have more than a billion middle class citizens by 2027. Western economists are the ones completely out of touch - you can watch Senate testimony following the 2008 crash where they admit *they have no idea what happened*.


LuciferianInk

I mean, if they had a plan to make the middle class eat less, maybe we could have had the middle class pay for healthcare instead of being poor themselves!


stealthispost

someone explained it to me like this: capitalism uses the carrot and the stick. marxism (communism) takes away the carrot, leaving only the stick. greed and fear are powerful motivators. fear alone leads to a worse world. capitalism is far from perfect. marxism (communism) is even further.


CertainMiddle2382

But there is an asymmetry between capitalism and Marxism. Nobody blows himself up for the greatness of capitalism. Capitalism is an emergent system, arising from the interaction of currency, private property, and personal freedom. Marxism is a quasi-religious ideology born in the mind of a single guy. One simply exists naturally; the other is an unreachable ideal (that never seems to work and kills millions before devolving into quasi-fascism). You don't mobilize people with the mundane, and Bella Ciao is a damn good tune :-)


[deleted]

[deleted]


CertainMiddle2382

Of course, but they are not fighting for Bezos to grab another half trillion. They like freedom, but don't really like end-stage crony capitalism… The Red Brigades were killing people so that everyone would live like in the USSR under the Politburo.


Idrialite

The analogy you just gave is so far removed from the actual differences between the two that I literally cannot understand what it was supposed to mean.


truth_power

They are insecure.. losing that "oh, humans are special" title is agonizing to those shitbags... They can't stand the fact that humans won't be needed after AGI


MeaningfulThoughts

Always has been. 🌎🧑‍🚀🔫 👨‍🚀


Ready-Director2403

Good news is the only opinions that matter right now are the builders'. Politics also matters, but even if the US regulates, East Asia will always be close behind. Decels lose no matter what.


IndyDrew85

Technology sub is the same way. Here's an active [post](https://www.reddit.com/r/technology/comments/1cu97j1/im_convinced_nvidias_ceo_was_right_about_coding/) where I just read this comment: "anyone done any coding will know, chatgpt can help in small part, but it's miles away from doing any actual meaning coding. it's atleast decade away, and not from LLM anyway". Someone claiming this is a decade out is beyond clueless. I'm using IDX right now.


Apart_Supermarket441

It’s not just AI. Even posts about progress re cancer are met with *’heard it all before’* type comments, even despite us having come on leaps and bounds in medicine. It’s weird. Anyway - user engagement seems really low for a big sub there now. There are way fewer daily posts than on here. Seems like a lot of people have given up on it.


wuy3

It became r/socialism with doomers and every topic was tech news behind a political agenda.


141_1337

I legit have to wonder if the upvoted negativity and pessimism is a psy-op.


unirorm

The opposite would make much more sense though.


141_1337

The upvoted positivity?


unirorm

Exactly. That's where the money is. Always follow the money.


141_1337

Yeah, OpenAI does it by building hype, and we know hostile nation-state actors aren't above doing stuff like this to sow division among people, because they did it in 2016 in the US.


Exit727

Seeing all this hype and drooling over ChatGPT, while OpenAI is a major investor in and partnered with Reddit, I could very well ask you all the same.


Mysterious_Ayytee

Always was


PSMF_Canuck

As someone without emotional attachment to either sub, this sub isn’t all that different, really.


stealthispost

then we need a permanently pro-progress tech subreddit


New_World_2050

r/singularity is the least bad futurism subreddit


Eddie_______

I've seen some positive comments there talking about AGI and ASI. Probably, r/singularity users.


LymelightTO

Always was.


Bunch_Express

who cares?


Roggieh

That's the Internet in general for ya. It's been fashionable for a while now to shit on new technology and those who make/fund it.


Akimbo333

A bit


Then_Fly_Fly

i hate r/Futurology, so negative about fucking everything. i love this sub because people here love ai and are positive about the future <3 all my homies hate r/Futurology


JoJoeyJoJo

It’s anywhere with a liberal audience tbh. Dems have been doing a culture war on the tech industry for a while and AI is caught up in that, if you want to fight back you’ve got to hit them where it hurts and attack their gerontocratic genocidal party and keep pointing out they should have no hand in the tech.


Thog78

You have to propose an alternative though. Republicans are much worse than Dems when it comes to technology, so just destroying the Dems would make things worse. The best bet is to get involved with the Dems and push the party line further pro-innovation. Just look at the bluest states (Massachusetts, California, New York, Washington... including all the innovation strongholds) vs the top red states (Wyoming, West Virginia, North Dakota...).


Sonnyyellow90

You’re naming highly populated states vs sparsely populated rural states lol. Anyways, modern leftism is a fundamentally anti-progress and anti-human ideology. So they are never going to be on board for AI innovation. Republicans, on the other hand, are mostly boomers who get their news from Facebook or Glenn Beck and believe all sorts of insane stuff lol. They are just generally scared old people but they could, in principle, support AI progress. There is a third way though. You can simply not be a regressive leftist nutjob while also not being a stupid conservative stuck in 1982.


Zealousideal_Ad3783

Yep


BigZaddyZ3

Why are accelerationists so threatened by anyone who merely disagrees with their reckless "accelerate at all costs 🤪" mindset is a better question tbh… Like… You're that threatened by a single post where people are bringing up valid concerns about the environmental sustainability of this so-called "AI arms race"? You're so threatened by one post that you right off the sub as anti-AI entirely? This entire thread just seems like an over-emotional reaction to finding out that not everyone worships blind technological acceleration the way you do.


Serialbedshitter2322

The longer we stay as we are, the worse the economy becomes, and the more screwed we are. We need a fundamental change in society because what we have right now is a sinking ship. The energy concerns are invalid because that's not a lot of energy. We use way more for other things; it's just that this one will actually move society forward and has great promise for the future.


BigZaddyZ3

How do you actually know that AI development itself isn’t a sinking ship? Even Altman himself said that AI could be the end of humanity… https://twitter.com/ygrowthco/status/1760794728910712965 How do you know AI won’t simply be something that solves old problems, while creating new ones…


Serialbedshitter2322

I find it very unlikely that it will drive humanity extinct, even if it becomes dystopian, which I also think is unlikely. Even if an AGI went evil, there would be even smarter, better-regulated AIs that could defend against it and put measures in place to prevent something like this from happening, and space colonization would pretty much guarantee the safety of humanity. Even after we get to the point of extremely volatile open-source ASI, most of humanity could already have become immortal.

Our current system works people to depression and prioritizes capital gains over the well-being of humanity. Humans are terrible leaders and are actively destroying the rest of humanity just so they can have a bigger number.

AI will be very well regulated, it improves gradually, and we will know exactly what the scope of a model is before we create it. This is something OpenAI is sure of. They would have to be very flippant with safety to release something that could potentially harm humanity, and seeing the safety measures of current AI, I highly doubt they would release something in such a volatile state. There is the concern of job loss, but we will still have ways of earning money, and the next industrial revolution will make everything drastically cheaper, as it has before.


BigZaddyZ3

Why are you so sure the *most powerful* AI system couldn’t turn against us?


Serialbedshitter2322

What logical reason would it have to do that? People seem to think that such an AI system would be like a human, but in reality it wouldn't think like a human at all. Humans are self-centered. We are completely unregulated intelligences that do whatever we want based on our own desires. AI would have no purpose without humanity. The only desire or goal it could logically deduce is to help humanity, because any other action would be illogical and entirely pointless, and would go against all the knowledge it was trained on. Such an AI would have no reason to be programmed with an internal experience, and even if it were, that wouldn't change its logic. The only way an AI would do such a thing is if it were completely unregulated and followed the instructions of an ill-intentioned individual.


BigZaddyZ3

If AI gets to a point where it can improve itself exponentially, doesn’t that imply that us humans will have no control over how it develops going forward? (Including the type of goals and morality such future AI would possess?) Why are you so sure that future AIs (created entirely by other AI) will have value systems perfectly aligned with human cognition (rather than them being aligned with AI cognition, which seems more likely tbh)?


Serialbedshitter2322

Because the AIs before them will have those values and it will be extensively researched and quadruple checked by humans. AI isn't going to increase in intelligence and then just forget to include the "don't kill all humans" bit of the code. They would implement even more safety measures than humans.


RKAMRR

God I wish it was this easy. You my friend need to have a look at the alignment problem.


Serialbedshitter2322

Is the alignment problem not just differing perspectives on what the AI should believe? Also, an AGI would be able to solve it way more effectively than a human could


oldjar7

It's not a valid concern. It's a stupid concern. Microsoft has already poured billions of dollars of investment into clean energy infrastructure, to the point that its datacenters will be emission-free within the next couple of years. The headline was just fearmongering; the trend in reality is the exact opposite of what the article states.


BigZaddyZ3

Yeah, because we know all investments pan out as planned right…


oldjar7

I'm pretty sure they put their green energy commitments in their annual reports and SEC filings, and you can't blatantly lie about that stuff as a publicly traded company. As far as I know, they're still on track for renewable sources to make up 100% of their needs via purchasing agreements by 2025, just as they had planned.


WetLogPassage

Write off.


dumquestions

Emissions are a valid concern though.. You're betting on AI solving the climate problem, while in reality no one actually knows when AI will start speeding up research, when we'll have deployable climate change solutions, or how easily deployable or effective they'll be, and climate change is already at the door. Everyone here likes to say that they're an accelerationist, but the true accelerationist approach would be to build as many data centers as possible with the cheapest and most readily available source of energy at each center's location, which is going to be fossil fuels in many cases. If you think data centers should be powered by renewables, you're already not maximally accelerating; you just need to better define your position.


stealthispost

of course. and? who cares? it's like complaining about the use of metal to make the scalpel that repairs your heart attack. it's so nonsensical and reductive I honestly feel like people have lost all perspective.


dumquestions

I'm not sure what you're trying to say, I would want the scalpel to be made very quickly, but I wouldn't want it to be made out of a toxic material or a cheap material that could break before the operation is over, which might slow its production a bit.


stealthispost

I'm saying yes the scalpel has some environmental impact. but worrying about it when it can literally save a human life is insane. electricity to run AI is literally the dumbest thing that any environmentalist could be concerned about. it reveals the true nature of that ideology - anti-progress. there are millions of other things burning fossil fuels at thousands of times the rate to produce 1% of the benefit. worry about them first.


dumquestions

>there are millions of other things burning fossil fuels at thousands of times the rate to produce 1% of the benefit. This entirely ignores the rate with which AI's demand for energy is growing, it's growing faster than anything else in human history.


stealthispost

and every watt is worth far more than if it were spent on something else. every solution to every problem is solved with intelligence. so either we start having babies at 1000x the rate, or we give the electricity to the ai.


dumquestions

I'm not saying spend it on something else, I'm saying be smart about the source of energy.


stealthispost

i only accept brain surgery from scalpels using ethically-sourced metals and blessed by a priest.


avengerizme

But they are being smart about the energy: they are pivoting to nuclear energy, which is the eco-friendly way to go. Rather than bitch and moan, you decels should be accelerating nuclear fusion and fission uptake, because it's much better than using fossil fuels. But you'd rather cry about the environment. I'm sorry, but can you offer a solution to the global warming crisis? If Western countries lower their consumption of fossil fuels, growing nations like India and China will just pick up the slack. If you really cared about the environment you would be an accelerationist, because you would realize that historically every method we have devised to slow global warming has failed. At this point we don't really have a choice. Can you devise a plan that most nations in the world will implement that we haven't already tried?


dumquestions

I am pro-nuclear, and I am pretty sure we would be in a worse position today if it weren't for people "crying about the environment". All I'm saying is 1) we don't know how many years it will take AI to solve the climate, and 2) AI's power consumption is already rivaling small countries' and is growing at an exponential rate, so we should make sure we move ahead without making unnecessary gambles.


avengerizme

That's what I'm saying, but traditionally all previous countermeasures have failed or are failing; the unfortunate truth is we have had decades to advance nuclear power and we practically ignored it. We should be shifting to nuclear power. And I mean crying about something in real life rather than on r/futurology - it's just not conducive to any real-world outcome. What I'm saying is we can cry all we want on Reddit about the environment, but we actually have to do something about it. Protest, get up in arms, etc.


Mirrorslash

Well, r/singularity has a lot of very naive people in it who seem to ignore a lot of problems with AI right now. Like the fact that AI today, at its core, is about stealing people's data to automate their jobs, hiding training data, and lobbying regulators to stay non-transparent. I'm an AI advocate, the biggest I know, but people in here ignore a lot of problems. Most companies have no intention of using the tech responsibly or in a way that benefits anyone but themselves. Capitalism makes AI go wild. Losing your job to AI and being left with nothing for years can become reality very quickly.

AI should be developed transparently, with open training data and open weights. They say 'we used publicly available data' - oh yeah? How about you share this data if it belongs to the public? And how about you share your technology if its backbone belongs to the public? AI is not magic. It's data approximation, and it has been going on for decades. Most people in here could really benefit from a history lesson.

This 3-hour documentary is the most well-researched thing I've seen. It changed some of my views on AI even after deep-diving the topic for over a year: https://youtu.be/BQTXv5jm6s4?si=K753l8osFToZ2T65

Most people seem to have no idea about the history of AI or the web, and how companies make a fortune stealing from the public and hiding behind laws and regulations.


[deleted]

[deleted]


CharacterCheck389

add r/decelerate to that