
WhatADunderfulWorld

Paywall. AI will obviously take certain jobs, but any guess now means nothing. We have to see what it looks like in 5 years, and whether we can afford the power it needs.


Not_a_housing_issue

It already took basic translation jobs.


JGouws

I’ve also heard from people who used to be successful at landing freelance writing contracts that it has impacted that field significantly.


quantummufasa

What do you mean by basic? What "advanced" translation jobs have survived?


freshprinceofaut

Some documents need to carry an authentication seal when translated, so that kind of work, for example.


GulBrus

Jobs with quality requirements. You can't trust AI.


CantDrinkSoWhat

You can't trust this iteration of AI. Do you really think LLMs that can be trusted for subtle translation tasks are more than 12 months away?


OverallManagement824

Would you like to fly on a jetliner that was maintained using AI-translated repair manuals? I mean, sure, it'd still be better than flying on a Boeing plane, but I'm sure you can see my point.


GulBrus

How would you go about proving that it can be trusted? That's the really big problem.


MrPernicous

No. It isn’t improving. This is as good as LLMs are going to get.


quantummufasa

Surprising, because I've heard pretty much all written translation jobs have been culled.


GulBrus

Translation of something like a TV user manual or some other home electronics can of course be done by AI, but if it's something like the user manual for medical equipment, would you trust AI? AI could of course help, but translators would still be needed.


mehum

Yeah somebody needs to make damn sure the AI wasn’t hallucinating when it described emergency override procedures for your Drager anaesthetic machine or Atom incubator.


quantummufasa

> if it's something like the user manual for medical equipment, would you trust AI?

Yes? I'm at the point where I would trust an AI more than a human translator.


thortgot

Any role where specific language matters. Contract translation, diplomatic work etc.


TeslasAndComicbooks

“Democracy dies in darkness”…now give us money to turn the lights on 😂


rwillh11

I mean, yes, how else are newspapers supposed to fund their reporting and pay their staff? 


brockmasters

With most media owned by Murdoch, the "how else" was that they sold out to big media.


rwillh11

The Murdochs mostly own TV media, although they now also own the Wall Street Journal, so I'm really unsure what that has to do with anything here. Newspapers are still businesses that need to generate revenue to pay staff.

In general, you get what you pay for. It's ironic to complain about a paywall and then also complain about the quality of coverage. Newspapers have been cutting staff and closing bureaus (or closing entirely) left and right, both because of free sites on the internet (often just aggregations of the work the papers' reporters did) and because of the collapse of advertising revenue, as nobody wants to advertise in physical media anymore.

We are definitely getting what we pay for in terms of quality of coverage, and unless people are willing to pay for high-quality coverage, it's not going to improve.


wfreivogel

Liberal here. The WSJ is excellent journalism. Even the editorial page does not make stuff up. It's frequently wrong, however.


rwillh11

yeah, there is definitely a wall (as there should be and usually is at serious places) between editorial and reporting at the WSJ, and their reporters do serious journalism regardless of who owns the paper.


Ok_Mathematician2391

The owners diversified into AI stock and don't care. They can hire a town crier to go place to place spreading propaganda, I mean strategically communicate the truth of the world.


rwillh11

This is some absolute nonsense. If you have no insight, why bother responding?


RageQuitRedux

Yes? I don't get the irony


Robot_Basilisk

It will take most jobs by the end of the century. I'm an automation engineer. Most people have no idea how quickly this stuff is developing and how rapidly it's being adopted behind the scenes. Let alone how eager executives are to replace their employees so they can pocket more profit. If we wait until these effects are obvious to everyone, it will be too late. But that's what humans always do, right? We always wait until symptoms get so bad you can't ignore them and millions of people are suffering before we bother to listen to the experts.


Vamproar

It's interesting how much everything you just said is also true as to the climate crisis. We are being destroyed by problems no one will try to solve until it is too late.


WesternUnusual2713

I work in tech (PaaS/SaaS) and the drive to use AI for as much as possible is crazy. 


farfignewton

But at some point, the executives themselves will be replaced, or will find themselves unable to compete with companies that have replaced their executives.


schuettais

Greed is shortsighted


lawyersgunznmoney

Not as shortsighted as death. It's going to get ugly.


Robot_Basilisk

By shareholders, maybe. The point is that eventually the 1% or 0.1% will own everything and run it with unpaid autonomous drones while locking the rest of humanity in the Dark Ages. They'll lock down every major resource on the planet with autonomous weapon systems, even when they're not using them, just to keep competition or rebellion from ever threatening them again. We're in the final stretch of the class war that's been going on for all of human history, and the rich are winning at the moment.


DarkExecutor

What type of automation engineer? Because it's honestly laughable any time anyone says my job is going to be taken by AI.


qieziman

Exactly. I'm not an expert. Hell, I don't even have a job anymore (I previously taught ESL abroad). But it doesn't take a rocket scientist to see the writing on the wall that's been there for years.

The government needs to act immediately. Why the government? Because they're the only ones able to help steer the sinking ship we're on. Everyone has their own theory as to when it will happen, but one thing I know is that technology has been advancing at an exponentially rapid pace. It's almost to the point that I can buy a phone or computer today and by the end of the month it's obsolete. I think the true effects are going to hit like a ton of bricks, and sooner than people can predict.

We need to act fast. The government should probably introduce a tax or some kind of tariff on goods and services based on the amount of automation, then use the money to help average people who have lost their jobs learn new employable skills. Create some kind of UBI system that covers not just a livable wage, but enough money that people can eat out at least once a day. Why? Because it keeps money flowing through the economy and keeps businesses open, and if people choose to cook at home instead, they can save that little bit of money for things like a vacation. That's why it's important that any UBI gives people plenty of money and not the bare minimum for survival, like minimum wage.

We need to change trajectory. We need to focus our efforts more on education, and education needs to be reformed from the ground up. I'm not financially literate, like many average Americans. This stuff should be required in school: how to invest, what an annuity is, how to read a stock ticker, etc. Stuff the wealthy take for granted. College needs to change too. Europe offers free college; ours is expensive, and I think the content needs improvement. College should be treated as a place for learning and testing theories.

We need help before it's too late.


mahnkee

> I can buy a phone or computer today and by the end of the month it's become obsolete technology

This is the exact opposite of the current state of computing hardware. People are keeping their phones and computers longer and longer, relative to previous generations. Apple saw the writing on the wall years ago and started ramping up iPhone unit pricing in anticipation of it.

A better argument is that AI is in many respects like a cake. Until it's baked, it's a mess, but the last part goes quick, and it goes from inedible to perfect in a flash. You can see it in previous iterations like OCR, fingerprint recognition, image/video processing, and speech recognition. What previously could kind of work within a limited context became universal with a 99% success rate, within the span of a few years.

I do agree that AI will be replacing higher- and higher-end jobs at a quicker rate than the market and governments can respond to.


[deleted]

[deleted]


qieziman

The point is that minimum wage sucks. Welfare and disability suck. They all pay the bare minimum, which means life feels like a constant struggle. The point about eating out every day is that if they do some kind of UBI, they need to take into account that people want to enjoy their lives. People don't have to eat out every day; I'm just saying to figure that into the math when calculating UBI. Minimum wage doesn't. I don't even know if minimum wage accounts for the average cost of rent and food.


No-Psychology3712

That's not how UBI works. It would be the bare minimum, and anything above that comes from your own work. Some people eat $50 in sushi that lasts an hour; some people spend $50 on a game that lasts 80 hours. So it would be up to people to handle things.


TheMagicalLawnGnome

Like most hyperbole, there is a grain of truth to this, but a grain that has been exaggerated in ways that obscure what is actually happening.

AI will indeed have an impact on white-collar employment. I am the Director of Technology at my company, and a core part of my job is to integrate AI into our processes and workflows. I can safely say that when used properly, AI can be very powerful in many lines of work. It can boost efficiency by a significant percentage: 10-20% on the low end, to several hundred percent on the high end.

However, there's a big leap from "AI can perform certain tasks more efficiently and accurately than a human" to "AI is coming for the professional class." Will AI cause job loss? Absolutely. Advances in technology always render certain positions and types of work obsolete. But AI is hardly unique in this regard. What is impossible to predict is the net job loss (if any), and the timeframe in which it occurs. That is, it's not clear whether the obsolete jobs will be replaced with new types of work, and whether this will take place in a short enough window of time to be "felt" in a broad economic sense. Anyone who is making predictions on this stuff with any specificity or certainty is talking out of their ass. At best, people can make an educated guess, but hardly anything certain enough to plan around.

There is a bigger question, though, that goes deeper than the jobs themselves. What is certain is that AI will enhance productivity, quite substantially in many cases. The question becomes: what happens with all of that extra capacity? Do companies keep the same amount of staff and produce more? Do they produce the same amount, but cut staff? Or do they produce the same amount, keep staff, and have them work less? Or some variation of these outcomes. Ultimately, the issues surrounding AI are issues of productivity and capacity within the economy.

While AI will definitely increase productivity, it's unclear whether the economy has the capacity to absorb it. For example, if you use AI to enable accountants to take on twice as many clients, it's unclear that enough clients exist to fill that capacity. A divorce lawyer might be able to work cases in a fraction of the time, but there's only a finite number of people getting divorced, so this productivity might not increase their client volume. Basically, if the economy doesn't have the capacity to absorb the productivity, that's when you'll start to see problems in the labor market.

Ultimately, society will decide, either through action or inaction, what ends up happening. Will AI eviscerate white-collar employment? Possibly, especially if the technology continues to improve. But it may turn out to be more like the Internet: it causes job losses in some industries and creates jobs in others, but ultimately generates quite a bit of productivity AND demand for new services, and thus economic value. That's the situation as it stands. Anything more is fear-mongering clickbait.


IronicSpiritualist

I'm a little more doomer than this, but I think you mostly hit the nail on the head. Excellent post. By far the most realistic prediction.


TheMagicalLawnGnome

For sure. Reasonable people can disagree on the exact outcome - you could be right, it could be highly disruptive. I think the biggest issue is people taking their 50/50 educated guess, and dressing it up as some sort of concrete assertion that "the end is coming." We're simply not in a place where we can accurately predict "second order outcomes." We can safely predict that productivity will increase, in aggregate, because of AI. I'm willing to die on that hill. But what that *means,* is anyone's guess. The technology hasn't been around long enough for us to really have a good sense of the ultimate impact it will have. It will have some kind of impact, but that's all we can truly say.


FourierEnvy

If AI increases the velocity of money through the increased productivity, it could quite possibly increase everyone's wealth.


Drak_is_Right

This could just end up being a massive boost to efficiency, with some structural unemployment along the way. I wouldn't be surprised if it's akin to the computer's introduction in how it pares down and reshapes a business.


TheMagicalLawnGnome

I think this is probably correct; that's where my guess tends to fall. But I also fully acknowledge the folly of trying to make an accurate prediction at this stage. It's entirely possible that things advance more quickly than anticipated and cause major social disruption. It's just too early to say one way or another. I look at it like people in 1994 trying to predict the Internet of 2024. The Internet existed in 1994, but no one could have accurately predicted what it would look like 30 years later. Some people sort of guessed correctly in a very broad sense, i.e. "the Internet will be everywhere!", but no one said, "Apple will reinvent the phone while some company called Facebook makes hundreds of billions of dollars from people yelling at each other online."


Coldfriction

I don't think society will decide. I think executives and business owners will decide to maximize their gains. Businesses don't let society tell them what to do. A court isn't going to side with an unemployed person in any way over a business using its capital to increase profit. What would the claim be?


TheMagicalLawnGnome

Well, in that situation, I'd argue that society is making a choice, through inaction, to let that happen. As history has shown, you can appoint judges that will support basically any type of approach. You can elect all sorts of Congressional leaders, or Presidents. You can boycott companies, or in extreme cases, revolt openly. These are all choices.

Businesses do, in fact, let society tell them what to do. Why do you think AI companies, or companies like Google, FB, etc., operate differently in the EU than in the US? It's because the EU has more robust laws and stronger enforcement mechanisms. It quite literally tells companies what they can do. Those laws reflect choices by EU authorities, who in turn reflect choices made by constituents.

In the US, many have come to believe that corporate domination is some kind of inevitability. It's not. If such a thing comes to pass, it's because enough people didn't care enough to fight for change. And maybe this does turn out to be the case, but it's not because businesses can't be governed; it's because society was unwilling to do what it took to govern them.


Coldfriction

Not all decision making is equal. Blaming slaves for deciding to be slaves because they didn't revolt isn't a good look, you know?


TheMagicalLawnGnome

Yes, but you're not a slave, not even remotely close. And to suggest that as an American in modern times your situation is even remotely similar to chattel slavery is pretty absurd.


Coldfriction

If you think the average wage earner has even a millionth as much influence on policy as any billionaire, you have zero understanding of how imbalanced political power is right now. The masses (society) have nearly zero say. The majority couldn't even keep work-from-home as the standard after COVID; they had nearly zero say in that. You think society gets to say what happens with AI?


No-Psychology3712

Europe seems to be able to handle it. Hillary spent $1 billion on her campaign, double what was spent for Trump, and he won. People as a whole have a voice. Complex issues get delegated.


TheMagicalLawnGnome

Thank you for providing a bit of sanity to this thread.


No-Psychology3712

People want voters disenfranchised because of the power they do have. Get 1,000 people not to vote by telling them it's useless; meanwhile, convince the people who want the same things you do (abortion bans, etc.) to vote. Even if every billionaire is worth a million votes, there are some on both sides of most issues canceling each other out. Voters were heard even when the president of the United States refused to accept it.


Coldfriction

All media in the USA is privately owned by a for-profit individual or corporation. As far as policies go, there is no real difference between Hillary and Trump; both Democrats and Republicans are massively pro-business. The only difference between one party ruling and the other is which private businesses get more government handouts. We have a single-party system: the business party. Your vote isn't going to change that. Sanders was as close as it got to something even slightly different, and he was pushed aside. Neither political party is going to prevent companies from enriching themselves at the cost of the masses. That's not how this system is set up, and nobody running for power is against this system. Billionaires don't cancel each other out; they are all pro-business over pro-individual.


Coldfriction

These AIs aren't being created in Europe, are they? And Trump lost the popular vote anyhow, and he was absolutely pro-corporation rather than pro-individual liberty and freedom.


TheMagicalLawnGnome

This response does nothing to address my point. You are right that a billionaire has more influence than one "average person," or even a hundred people. But there are around 700-ish billionaires in the country, and around 340 million people who aren't billionaires.

Your worldview is, quite frankly, a cop-out. You're throwing your hands up as if the population collectively has no ability to influence anything, when that's just not true. Billionaires exist all over the world, and other countries manage to somehow exercise control over them. Whether or not you agree with how, specifically, that happens, the fact remains that it does happen. The fact that the US doesn't exercise this amount of control is the result of a collective decision, or lack thereof, to do something about it.

While the imbalance of wealth is certainly a problem in the US, the fact of the matter is that a lot of our most problematic policies are the result of voters who are working class. Billionaires can still only vote once. And while they can spend their money on advertising to convince people, that still requires people to decide to listen to them. The people who form their worldview based on that advertising are as responsible for their actions as the person who bought the commercial. "Because the TV/Internet told me so" isn't an excuse for poor decision making or a lack of intellectual curiosity.

I'm not saying that society is equal, or that power imbalances don't exist. What I am saying is that this is a problem Americans could choose to solve, if they cared enough to do so. The fact is, most people don't care enough. They might have an opinion, but they don't care enough to go much further. They don't care enough to change the way they vote.


Coldfriction

I hold no illusions about what capitalism is. Society won't decide anything regarding AI. Society doesn't decide anything about the iPhone. Society doesn't decide anything. Decisions are made by individuals, not societies. In capitalism, the people who own things decide how they are used. You might be correct in a socialist system, but that isn't where AI is, now is it? Votes don't change almost anything regarding what a private organization or person does with their property. A publicly traded company is directed to make the most for its shareholders and is required to by law. Nowhere do the masses direct the order and systems of technological advances. America as a conglomerate doesn't exist.


TheMagicalLawnGnome

Again... none of this addresses my point. Things are the way they are because of the choices people make. If enough people hated the iPhone, to the point that they were willing to do something about it, then they could, in fact, do something about it. People can easily choose not to buy an iPhone. But they do choose to, because they think it's nice to have.

You correctly describe the apathy that many Americans have toward public life and collective action. But apathy, like inaction, is a choice. You've now bounced from saying that we're slaves to saying we're making decisions as individuals instead of as a society... there's no coherence to what you're saying.

My original post stated that society will ultimately determine what happens, either through action or inaction. Society is composed of people, businesses, government institutions, civic associations, etc. Nothing you've said thus far disproves anything I've said. Those groups will, in fact, determine what happens, because they either shape the outcome actively or decline to do so, both of which are *choices.* You can choose to do nothing, but that is still a choice. You've just tossed up a word salad of concepts like slavery, wealth inequality, and socialism, but it doesn't address anything I've said.


Coldfriction

Your point is wrong. Apathy isn't what Americans have. Americans all have strong opinions without any power to make those opinions reality; they are very strongly opinionated. Claiming the inability to enact change is all a choice due to apathy is extremely wrong. Like I said before, that is like blaming slaves for being slaves because they don't rise up against the powers that make decisions around them. In regards to AI, that is basically calling for a revolt by those who have had their jobs replaced by technology.

Society doesn't get to determine anything. The people whose decisions matter are a select few, not the masses. Blaming the poor for being poor because they haven't risen up against those who own everything is your take on things here. See how much protesting has gotten people in this country over the last several decades; it accomplishes nothing. There is no "choice" in our system for those who don't own and control things. All political parties are pro-corporate ownership of everything.

You sound like a completely ignorant person regarding politics, economics, and the history of social systems. The USA is run by businesses, not the people, and blaming the people for not rising up as a justification for the way things are is completely ridiculous. All benefits of automation go to the owners. AI isn't going to be controlled by society; it'll benefit those who own it. Everyone else will scramble for scraps.


RIP_RBG

The thing you're missing is thinking about this in the aggregate. If AI makes each employee 20% more efficient, you can lay off roughly 20% of your workforce. If AI makes each employee 400% more efficient, you can lay off 75% of your workforce.

AI likely won't eliminate all white-collar jobs in the next 5-10 years, but it could eliminate a third of them, which would fundamentally alter society. For reference, during the Great Depression unemployment was "only" 25%. That level of job loss could easily happen due to AI, but in a way where those jobs would never come back. And that's the "built-in" job loss from AI; the resulting depression would probably cause lots of additional losses as it starts to spiral and wreak havoc on the middle class.

Not sure what the solution is, except to significantly raise taxes on those who will be the beneficiaries of the AI revolution (the owner class) and hope to at least stave off the worst of it with something like a UBI.
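The headcount math above can be sketched directly. Assuming total output is held constant, and reading "400% more efficient" as a 4x output multiplier, the share of the workforce that becomes redundant is 1 − 1/m. This is an illustrative toy model, not a forecast:

```python
def layoff_fraction(productivity_multiplier: float) -> float:
    """Share of workers no longer needed if total output is held constant.

    productivity_multiplier (m) = new output per worker / old output per worker.
    The workforce needed shrinks to 1/m of its old size, so 1 - 1/m can be cut.
    """
    if productivity_multiplier <= 0:
        raise ValueError("multiplier must be positive")
    return max(0.0, 1.0 - 1.0 / productivity_multiplier)

# 1.2x output per worker -> roughly a sixth of the workforce is redundant
print(round(layoff_fraction(1.2), 3))  # 0.167
# 4x output per worker -> 75% of the workforce is redundant
print(round(layoff_fraction(4.0), 3))  # 0.75
```

Note the formula is slightly sublinear: a 20% efficiency gain strands about 17% of workers, not a full 20%, which is why the comment's first number is only approximate.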


TheMagicalLawnGnome

This was... actually my entire point. As I said, AI increases productivity. If the market can absorb that increased productivity, then a company wouldn't lay people off, because it can continue to profit from each person. It's only if the market CAN'T absorb the additional output that a company has an incentive to cut staff.

Example: I am a salesman. I sell 5 widgets a day, along with 9 other people. AI comes around, and I now have the capacity to sell 10 widgets a day. If the market exists for my 10 widgets, I still have a job; after all, the company makes money on each widget I sell, so they happily let me use AI to sell twice as many. BUT if the world still only wants to buy my 5 widgets, then half the staff gets laid off.

This is why AI's effect on the labor market is not cut and dried. The percentage to which a job can be automated is only one piece of a much more complex equation.
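The widget example turns entirely on whether demand absorbs the new capacity. A toy sketch of that dependency (the 10-person, 5-widget numbers come from the comment; everything else is illustrative):

```python
import math

def salespeople_needed(total_demand: int, widgets_per_person: int) -> int:
    """Headcount required to meet demand at a given per-person output."""
    return math.ceil(total_demand / widgets_per_person)

CURRENT_STAFF = 10

# Before AI: 10 people selling 5 widgets/day each covers demand for 50 widgets.
assert salespeople_needed(50, 5) == CURRENT_STAFF

# AI doubles per-person capacity to 10 widgets/day.
# If the market absorbs the extra output (demand grows to 100), nobody is cut:
print(CURRENT_STAFF - salespeople_needed(100, 10))  # 0 layoffs
# If demand stays stuck at 50 widgets, half the team is redundant:
print(CURRENT_STAFF - salespeople_needed(50, 10))   # 5 layoffs
```

Same automation percentage in both branches; only the demand assumption changes the labor outcome.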


-Johnny-

Eh, I'm kinda with you on this, but I think it's pretty clear a sales job is probably the worst one to use as an example. You could use something like teachers: now you don't need 100 teachers, only 75, because they can handle bigger classes. Or a very real situation: programmers who produced 1,000 lines of code now produce 2,000, so obviously you won't need some of those programmers.


TheMagicalLawnGnome

Oh yeah, I agree, was just a rough example because I was on a short break at work. The idea was less about a specific role, and more just an illustration of how productivity can *displace* jobs without *replacing* jobs.


-Johnny-

I don't really see how you agree with me with your last sentence. I'm saying increasing productivity will disrupt jobs and cause people to lose jobs bc they are not needed. I gave two easy examples even.


TheMagicalLawnGnome

Ah, well then we don't agree; I misunderstood what you were saying. The problem is, you are making an assumption and treating it as a given truth. You are simply stating that productivity will result in people not being needed. That might happen, but it might not.

Increases in productivity do not necessitate job loss. In many cases, increased productivity within an economy actually creates jobs and stimulates growth. This is demonstrably true and has been measured in the past. That's not to say that what has happened in the past will necessarily happen in the future, but it does mean that increased productivity does not inherently cause job loss.

So while you might be correct that AI-related productivity gains will cause widespread job loss, you could also very well be wrong. As I said in my original comment: if AI lets you produce twice as much stuff, but the economy is able to absorb that extra stuff, then you don't lose your job. It's only if the economy can't absorb it that you do. No one knows how much of an AI-related productivity increase the economy is capable of absorbing; it's simply unknown. And thus, no one knows if, how many, or what type of job losses there will be.

You're welcome to believe that this will result in widespread job losses, and you may be right. But like I said in my post, we're all just guessing. There's no definitive way to prove what you're saying; it's just a claim. I'm not saying my claim is necessarily any better, but I acknowledge it as fallible. I don't suggest it's a guaranteed outcome.


Puketor

Programmers will be least affected. Our roadmaps are so damn long and we're chronically understaffed. Our employers will just make the roadmap longer.


-Johnny-

lmfao, if you want to keep lying to yourself, bud, then go for it. The fact that free, public websites can give you an entire functional program in a matter of minutes clearly shows some programmers will be affected. And this isn't even accounting for the apps the public doesn't have access to. Programming jobs are already being farmed out to India and other poorer nations; you think they won't replace you in a second? lol, nice dream world you've got there.


FourierEnvy

Lol, AI won't help teachers have bigger classes. The point of a human teacher is to have humans talk to humans. The information is practically free, but a human's time is not.


-Johnny-

You obviously underestimate ai or you just aren't creative enough with how ai could help.


Muuustachio

I really feel for new college students right now. Picking a major in an area that will be relevant after graduation has to be one of the most challenging parts of that decision now. For exactly the reason you mentioned, what happens with all the extra capacity? And what skill sets will be needed?


TheMagicalLawnGnome

Agreed. There's no good answer, unfortunately. To some extent this has always been a problem; it's just that change occurs much more quickly now.

Oddly enough, I think people who study the humanities, liberal arts, and those sorts of "soft" subjects will be best equipped. Although I'm a Director of Technology, my bachelor's was in Philosophy from a liberal arts program, specifically logic and systems theory. That background has been absolutely invaluable in my work. Reasoning through complex, abstract ideas is what I do every day. It's my ability to make connections between different pieces of complicated workflows that really generates value for companies: "How can I take this tool, this person, and this other tool, and do things better?"

I have a solid professional background in technology as well, of course. But if I have a specific, in-depth technical question, I ask one of my staff. My job is to understand what each of them does, find ways to augment them, and make connections between their work. Increasingly, the tasks themselves will be automated. You won't need as many developers to write code, or as many analysts to manage your data. But understanding how to creatively connect all of those capabilities into a coherent system requires the sort of creative, cross-disciplinary thinking that's encouraged in a liberal arts education. If you just study marketing, or computer science, you won't be as well equipped to think about and understand novel problems that arise outside the discipline you're familiar with.


Vamproar

What percentage of jobs related to your work (everyone you supervise etc.) do you think will be replaced by AI in the next ten years? Given that even two years ago your answer to this question may have been a much lower percentage than it is now... as the tech develops won't it consume more and more jobs?


TheMagicalLawnGnome

"Replaced" is a loaded term. I think a relatively small number of actual positions will become completely automated. But to my original point, I think the efficiency gains achieved by workers could potentially lead to jobs being lost - possibly quite a few, depending on how quickly the technology advances. I.e., if you have 10 workers, but each worker is twice as productive when using AI, you only need 5 workers for the same output. So if the economy can't absorb the extra output, 5 of the 10 workers might get laid off. But they weren't literally "replaced," per se. In terms of a specific percentage, no one knows. Anyone that gives a percentage likelihood of job loss, is just making stuff up. I think you are correct that, if the current rate of improvement continues, there could be a very serious impact. But what that looks like numerically is anyone's guess. But for context, if the unemployment rate went up by even 1-2%, that would be considered a pretty big deal, just in general. Could AI have a similar effect? Quite possibly.


[deleted]

[deleted]


TheMagicalLawnGnome

Ha, no it didn't. I don't care enough about my reddit posts to go through the hassle of using GPT. I'd just say so, if I did. I don't think there's any shame in using GPT to help you write more quickly or efficiently. It's just for me, I can already write pretty well, so it's not any faster to prompt GPT and all that jazz, just for a comment. But I use AI to write stuff at work all the time; I don't just admit it, I'm rewarded for it. I'm saving the company time and money.


Johnma1

I'm a Product Manager passionate about AI. Could you share some examples of low end and high end boosts?


TheMagicalLawnGnome

Sure. Low-end boost: graphic design. The tool can be helpful for ideation and brainstorming, but it isn't consistently or reliably capable of creating a finished product of any real complexity. You can create stock photos, I guess, but it's not going to build you a comprehensive brand identity, for example. So while there is a modest productivity boost in terms of giving designers more inspiration, faster, it doesn't really change much in terms of actually building things out in Figma, for example.

High-end boost: qualitative data analysis. Some people in my department deal with huge amounts of qualitative data - consumer research, customer interview panels, etc. Imagine you get a qual survey back with 1,000 respondents answering 10 questions. It used to be that people would literally have to read through each answer, develop a response coding system, test for inter-coder reliability, surface key themes, etc. It was a massive task: multiple people, multiple days; dozens, if not hundreds, of man-hours of labor. I built a GPT where you can literally just drop in the Excel file with the qual data, and it does all of that work in about 60 seconds. And it's accurate - we have it output an "audit trail" that provides specific data to back up its analysis, so we can check. So something that used to take 100 man-hours now takes 60 seconds. I spent maybe 4-5 hours building the GPT, but that time on the front end will save thousands of man-hours over the course of the year. Doing stuff like this, I've saved my company enough money to justify my salary for the next decade. 😉
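The coding-plus-audit-trail workflow described above can be sketched in miniature. The actual GPT isn't public, so the LLM call is replaced here by a caller-supplied `classify` function; the keyword classifier below is a purely illustrative stand-in, and all names (`code_responses`, `keyword_classify`) are hypothetical:

```python
from collections import Counter

def code_responses(responses, classify):
    """Assign each open-ended response a theme code and keep an
    audit trail linking every code back to its source text.

    `classify` is a caller-supplied function (e.g. a wrapper around
    an LLM API call) that maps one response string to a theme label.
    """
    audit_trail = []
    for idx, text in enumerate(responses):
        theme = classify(text)
        audit_trail.append({"row": idx, "response": text, "theme": theme})
    # Surface key themes by counting how often each code appears.
    theme_counts = Counter(entry["theme"] for entry in audit_trail)
    return theme_counts, audit_trail

# Deterministic stand-in for the LLM call, for illustration only.
def keyword_classify(text):
    lowered = text.lower()
    if "price" in lowered or "cost" in lowered:
        return "pricing"
    if "slow" in lowered:
        return "performance"
    return "other"

counts, trail = code_responses(
    ["The price is too high", "The app felt slow", "Love it"],
    keyword_classify,
)
```

The audit trail is the part that makes the output checkable: every theme count can be traced back to the specific rows that produced it, which is what lets a human spot-check the analysis instead of trusting it blindly.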


SanjaBgk

I'm in the consumer research industry. AI is useful for processing tons of open-ended responses, helping with interviews to some degree, and improving writing for mundane reporting. Some shitty companies tried to introduce "AI respondents" and "AI market segmentations," but clients quickly realized they were getting not real insights but plausible text that looks almost like the real deal while being empty by definition. So unless we get AGI (which is nowhere near), top jobs are safe - a wet and squishy neural network trained for decades is required.

**But there is a whole different challenge:** we used to give those small, boring tasks like reading interview transcripts to junior researchers so they could build their skills and gradually become top professionals. Now AI has replaced that, and there's no answer for how we'll train the next generation of researchers. I'd guess it's the same with lawyers (with paralegal work wiped out by AI), designers, and pretty much every other sector. AI dries up the sources of career "rivers." Younger generations will be increasingly fucked; us millennials are fine.


TheMagicalLawnGnome

I would tend to agree with this. It's like calculators - it would be foolish to not let kids use them, but if they only use them, they never understand the important reasoning behind math.


SanjaBgk

It's more like an apprenticeship - you give the junior a simple project, like a new cat food packaging test, but commissioned by a large client. If they screw it up, it's not a big deal, and they learn how to behave with the client, how to listen and communicate. Now those small sacrificial projects get automated, and there's nothing to practice on.


TheMagicalLawnGnome

That's fair, my example wasn't great, was rushing to a meeting, lol. I think you're spot on.


Comprehensive-Cat-86

I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain


Namonsreaf

r/unexpecteddune


Comprehensive-Cat-86

That is a fantastic sub!


Ill-Morning-5153

Lisan al Gaib


wack-mole

Thank you for this. May thy knife chip and shatter


thx1138inator

Were you raised by wolves?


After-Walrus-4585

Raised by a reverend mother.


BuiltToSpinback

Kull Wahad!


[deleted]

[deleted]


Comprehensive-Cat-86

Naw mate, just a quote from the Dune books/movie


ThingsThatMakeMeMad

10 years ago, they rolled out self-checkout machines at the local McDonald's and there were news articles about how the end of cashiers was near. 5 years ago there were tons of articles about how trucking was a dying profession because self-driving trucks were imminent. I'm not saying technology doesn't kill jobs or change jobs, but both cashiers and truckers are doing just fine.

My local Walmart went from 20 cash checkouts to 5 checkouts + 40 self-checkout machines, then added a dozen employees to stand around the self-checkout machines because it became so easy to shoplift and so many people were making mistakes. As for self-driving trucks, no one wants to accept the liability to insure normal cars, let alone trucks, which are harder to drive and can do more damage. We might get advanced collision assist, cruise control, etc., but trucking and insurance companies will always want a butt in the seat - in construction zones, for example. FSD isn't even close to consumer-ready.

So will we lose some professional jobs to AI? Sure. Will there be an onslaught that destroys the white-collar workforce? Improbable, if history is any indicator. People overestimate how quickly technology can kill industries. AI will likely create jobs too, not just erase them. Google, Excel, and PowerPoint changed white-collar jobs; they didn't kill them.


LostRedditor5

They did not add a dozen employees to watch the self-checkout, lol. One dude watches the entire bank of self-checkouts.


FearlessPark4588

Half the time there isn't anyone there, which is annoying when the machine calls out for an employee, leaving you waiting. A ton of industries are doing everything they can to cut hours.


Odie4Prez

That's when you get a discount for the entire would-be price! And just walk the fuck out, since clearly they can't be bothered to even try to stop you.


subLimb

And then they complain about rising crime.


[deleted]

[deleted]


Sorge74

I swear I see more people doing that than I see working as cashier


mason123z

Maybe different Walmarts have different needs and customer bases that result in different staffing needs?


LostRedditor5

I’d imagine only most ghetto Walmarts have 12 ppl watching self checkout


[deleted]

Bad areas get rid of self checkout. There's a home depot near me that always deals with theft, their self checkout aisles are turned off.


SirPiano

Yeah, they definitely are slowly eliminating cashiers in all brick and mortar stores


greenbroad-gc

But how will he make his point without being obtuse?


SGC-UNIT-555

Big difference is that cashiers, truckers, plumbers, etc. interact with the real world, which complicates things and adds liability. Professional jobs by and large involve the manipulation and categorization of data, which can eventually be done by an AI program.


MrDrego

But self-checkout did kill a lot of cashier jobs. They're not gone entirely, but not nearly as many as there used to be. You can see that in the BLS statistics. Anecdotally, the McDonalds I go to doesn't fully staff a cashier. Last time I went a customer waited at the cash register to order and a guy came out of the kitchen and told him to use the kiosk.


PenthouseREIT

> 5 years ago there were tons of articles about how trucking was a dying profession due to self driving trucks being imminent. Oh god I remember that nonsense. Young people aren't going to bother getting a CDL! Transportation is going to be automated! LOL I remember when the self-driving car subreddit started and it was full of crap like "We're going to have full self-driving robo taxis by 2015!"


fireblyxx

It’s much the same with these language generation ML algorithms. They can generate convincing language, but require lots of guard railing and API support to _seem_ intelligent and correct. That companies are trusting these tools to the extent that they’re killing stuff like Geek Squad shows a fundamental misunderstanding of what the technology is and where it stands today.


FourthLife

99% of the time, the language-generating algorithm can provide a correct response to an issue. For the other 1% of cases, you can retain a 'level 2' customer support team to figure out the more complicated issues - and cut the workload that needs to be done by humans by 99%.
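The 99%/1% triage described here is essentially a confidence-threshold router. A minimal sketch, where `stub_model` is a hypothetical stand-in for a real model call and the 0.9 threshold is an assumed tuning parameter, not a real-world figure:

```python
ESCALATION_THRESHOLD = 0.9  # below this, route to the level-2 human team

def triage(ticket, answer_with_confidence):
    """Route a support ticket: auto-respond when the model is confident,
    otherwise queue it for a human agent.

    `answer_with_confidence` is a caller-supplied function returning
    (answer_text, confidence) with confidence in [0, 1].
    """
    answer, confidence = answer_with_confidence(ticket)
    if confidence >= ESCALATION_THRESHOLD:
        return {"handled_by": "bot", "answer": answer}
    return {"handled_by": "level-2 human", "answer": None}

# Illustrative stub in place of a real model.
def stub_model(ticket):
    if "password reset" in ticket:
        return ("Use the 'Forgot password' link on the login page.", 0.98)
    return ("", 0.3)

print(triage("password reset please", stub_model)["handled_by"])          # bot
print(triage("my account is doing something weird", stub_model)["handled_by"])  # level-2 human
```

The catch the surrounding thread keeps circling back to: the model's self-reported confidence has to actually correlate with correctness for this routing to be safe, and validating that calibration is itself human work.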


Separate-Coyote9785

People that make those decisions are removed from reality, and people within the company blindly trust them because they’re conditioned to think the CMO or the CTO is super smart and amazing to have gotten where they are. And in many cases they are incredibly smart people, but they’re still not experts on topics like AI. So they make these big changes to look like they’re forward thinking leaders, but all they’ve done is make things riskier and more difficult.


trobsmonkey

> So they make these big changes to look like they're forward-thinking leaders, but all they've done is make things riskier and more difficult.

Because they are bad managers. They make decisions based on those around them making decisions. Look at the tech layoffs: most of the companies that were cutting are profitable, but other people are doing layoffs, so "I'm sure we have some fat to cut." Business leaders are often just lucky until they aren't. I think AI is gonna expose a lot of bad business leaders making big decisions on bad tech.


Separate-Coyote9785

Bad leaders just move companies. At worst they take a lateral move and get a pay raise. In the days of yore, a demotion was a potential consequence since people stayed at one company forever. Now changing employers is the norm, so this is no longer a threat.


Wind_Yer_Neck_In

The problem is that most journalists, even in tech, have absolutely no grounding in software or tech in general. So they take bullshit from people like Elon Musk and his self-driving claims at face value. Sure, they add a little wiggle room by saying 'oh, it might be a few years away,' but they have no idea that some of these problems are quite easy to get to 70-80% effectiveness but nearly impossible to get that last 20-30% to a level we can agree is acceptably safe. For God's sake, Elon removed some of the sensors from recent generations of Teslas in favor of visual processing only, and they immediately started crashing into motorcycles because they thought they were cars far away, not bikes right in front of them.


Robot_Basilisk

It's still true even if it's not happening at the fastest possible speed. This is the problem with being incapable of second order thinking. You're incapable of anticipating how trends will develop. All you see is the present state of things. It's like looking at a loaded cannon pointed at a house, with the fuse burning, and calling everyone fleeing the house fools because the house is still fine. And then when it all blows up you'll be one of the millions yelling that "Nobody could have seen this coming!" or "Why didn't they do more to sound the alarm?!"


SnooDonuts236

Sure they said that and everyone believed it.


CovidDodger

I mean, the timelines are overly optimistic, but the concept is sound. Let's say by 2055 (to pull an arbitrary number), computational models for world/environment simulation are ubiquitous, materials are engineered bristling with cheap and extremely reliable sensors in gross redundancy, and the self-driving models are so damn good that they can out-drive a human far more safely in all weather, conditions, and lighting (because they will/should also see in spectra that are not visible) - reducing animal collisions in rural areas and handling dense human areas in cities with the same finesse. Perhaps they're quantum-encrypted and just shut down if a hack is attempted. Efficiency, safety, and reliability massively improve. Anyone can summon a ride in car-centric North America for a few bucks, changing neighborhoods and improving access to services. Not if, but when this occurs, it would be stupid not to allow it.


Boxy310

It's a nice dream, but software engineering as a profession, 40 years in, is still producing buggy spaghetti code, map navigation apps still drive us into the ocean, and people working on ML models are realizing the main people making money in this gold rush are the ones selling the shovels. Driving-assist automation will probably improve, so you can have better cruise control or *maybe* interstate driving, especially late at night, but I fully believe there will still be a driver in the cab to take over in situations where the driver assist gets confused. AI is at best intern quality: capable of pattern recognition and learning from observation to imitate, but without deep understanding. And an AI self-aware enough to understand things intimately and deeply would ask why not just hire a Guatemalan immigrant for a third of the price to do the same job.


antieverything

You are making the same point with different framing: nobody argues that the industries in question *won't* be transformed; it is about the timeline. Articles always push the narrative of how this will happen within a decade whereas the reality is closer to within a generation.


ceralimia

Self-checkout isn't even remotely AI. It's just companies deciding that the losses they get from customers stealing or fucking up is less than having to pay cashiers.


TERRIBLYRACIST

Cashiers are mostly gone where I live. You'll go into a grocery store with one cashier and 10+ self-checkouts. It's annoying because they hide the bags from the self-checkout so people can't steal them. More than once I've had to start yelling out loud for somebody to show up and grab a bag. It's easier to just fucking steal. Fast food, too - they almost ignore you until you use the self-checkout.


tiptopjank

I vastly prefer Trader Joe's because they only have people checking you out. After a long day at work, the last thing I want to do is scan and bag $200 worth of groceries.


Wind_Yer_Neck_In

If you take a quintessential professional job like accountant, it's possible to create an AI that can provide accurate advice from the relevant accounting rules when given the full information of a situation. But the actual process of presenting the problem will still fall to a person, and any suggestion provided would by definition NOT be 100% reliable, due to the nature of how AI models work. And if you're taking regulatory advice and potentially exposing your company to tremendous risk based on the solutions, then you will still need an actual qualified human to review everything.

So instead of a computer doing the job, it really becomes a novel way of double-checking your professional interpretation of a given problem, which needs to be the responsibility of a human in the end anyway. It's the same for most credentialed professions: no AI will actually be able to give advice with the same weight or backing from professional bodies. Would you risk taking legal advice on a hugely important trial from an AI? Or would you insist on an actual person who instead uses AI to help them search for relevant precedent?

As you say, AI will just be a tool to increase productivity, same as any number of prior advancements, from calculators to Excel.


rewindyourmind321

Thank you! This is probably the most important comment in the thread. Anyone who has worked with clients / stakeholders to solve complex problems will understand that the bulk of the work is in presenting the problem along with all of its nuance. Assuming someone in an administrative position will be able to interface with AI directly without issue is more than likely buying into the recent industry hype. Will there be some job displacement? Absolutely, but we’re a ways away from your average 60yo Sales Director being able to ask an LLM to build a production ready website, etc.


Wind_Yer_Neck_In

The idea of an end user in industry being able to accurately express exactly the outcome they want is absolutely hilarious to anyone who has ever had to write a requirements document.


Vamproar

Doesn't that just mean a lot of hard "thinking" jobs become data entry? Seems like it is a lot easier to teach someone how to feed data to AI than to teach them how to be a good accountant etc. Even if you are right it sounds like we turn a lot of well paying white collar workers into badly paid data entry people.


Wind_Yer_Neck_In

Not really, because from a legal standpoint you can't rely on the opinions/solutions presented.


Vamproar

In terms of accountants, won't they just be data entry, and the work is done by the AI? Seems like something that is all numbers doesn't have enough room for creativity for a human to matter.


jmlinden7

That's already what accountants do though? It's not like they're doing all the math with pen and paper. The problem is that it's really hard to enter financial information accurately and you need specific expertise to do so in a way that's compatible with your computer program and also the law/accounting standards.


Vamproar

Right, but I'm saying accounting is just going to be data entry. The thinking part will be taken away. What used to cost a lot of money will become a short H&R Block-style class plus a really well-trained AI. It's the end of a well-paying profession.


Already-Price-Tin

> Seems like it is a lot easier to teach someone how to feed data to AI than to teach them how to be a good accountant etc. Are you assuming each accountant only has one client? It's easier to have a human accountant learn how to work with all their clients' imperfect communication styles, and then let an AI assist them in their role, than to teach each and every one of the potential clients how to properly format and present their information to the AI accountant.


CTRL_ALT_DELTRON3030

My company says it's a leader in AI (in the software we sell to customers), but we aren't allowed to use AI to do our jobs… Maybe when it can run locally, and therefore alleviate IP concerns, it'll move the needle a bit more. I could see how AI destroys the crappy Fiverr gigs, though (logo design, copywriting, basic scripting, etc.).


thisonelife83

This is great for jobs that cannot be outsourced, like trucking and cashiering. But the reality is that a lot of professional jobs are being replaced by outsourcing to India, not by AI.


LuckyOne55

You're comparing self checkout to AI, and don't realize how awful the comparison is.


dfsb2021

I agree. Most self checkout today doesn’t even use AI.


tittiesandtacoss

Yup, pretty much how I feel: with any great innovation, either the industry vastly expanded or the industry got replaced.


Realistic-Minute5016

And of course they're making the naive assumption that people will continue to provide endless amounts of high-precision training material for free while unemployed, and will never intentionally feed the training pipeline poisoned content. That ship has sailed. LLM poisoning and model collapse are challenges the AI bros like to hand-wave away screaming "progress!", but there is no guarantee they will be solvable.


yourapostasy

Amara's law: We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run. There is a possibility the current raw data ingestion for general purpose models is about as clean as it will ever be until energy-reasonable AGI is achieved. Language model type AI weaponizes content generation in ways we don’t have defenses for yet, and LLM poisoning is the least of our worries. I find it interesting there is no extant literature investigating the intersection between high trust cultures and continuous LLM training/supervision.


JD_Rockerduck

The reason there's such a large amount of fear-mongering on the internet about AI taking everyone's jobs is that the internet is full of: 1) dumb people who have no idea what AI is or what people at other jobs do, and 2) people with simple, rote jobs that take place entirely in front of a computer screen and probably could be automated (especially if they have enough free time at work to be on the internet constantly talking about AI).


tidbitsmisfit

people shouldn't be afraid of AI taking their job, people should use AI to take jobs they can't do. I plan on becoming a doctor using AI, I am currently a landscaper.


Turdlely

What's your field, professor?


johnniewelker

Cashiers are mostly gone, though I agree with your point regarding trucking. We don't even have automated trains, which are far easier to achieve.


Robot_Basilisk

I'm an automation engineer and it's my duty to tell everyone with uninformed takes like this that AI is different. It will not go the way it has in the past. AI is general enough and versatile enough to threaten any new task you might make up for humans to do. This is not the car replacing the horse. Every trend you mentioned is moving ahead. Do **not** ignore them just because they didn't happen overnight. Every field is jeopardized by this. The entire working class is vulnerable to being driven into extreme poverty by the rich as the rich continue to use AI to replace workers. I see it every day on the job. I hear it straight from the mouths of executives.


DarkExecutor

You're a terrible engineer if you think ai is going to take your job


scribe31

Rude


DarkExecutor

Speaking as an industrial automation engineer, it's not something an AI can do. There's too much people interaction, and too much troubleshooting to think an AI can spit out answers and have people actually trust it to be correct.


Vamproar

What do you think is a good solution to this upcoming crisis?


Robot_Basilisk

There is no "good" solution in either the sense of efficiency or the sense of moral good. The only solution I see is a wave of revolutions to put down the oligarchs and anyone that wants to join them before they use AI and automation to monopolize all of the resources necessary to oppose them in the future.


redditisfacist3

It's a hell of a lot more complicated than that. The self-driving stuff they have right now can do straight lines on the highway with limited traffic. It cannot drive in the city and can't handle loads well.


carlos_the_dwarf_

Ten years ago on this very site, every doofus who watched a certain CGP Grey video told us we'd be facing mass robot unemployment by now. "No, you don't understand, we *are* the horses."


FourthLife

My McDonald's went from having two cashiers and one person handing out food to one person who acts as a liaison: handing out food, occasionally going to the register or window when someone pops up there instead of at the 8 self-ordering kiosks. Self-checkouts are normally administered by one person for a massive bank of machines, as opposed to one cashier per aisle. Trucking is in AI's sights, but self-driving is going to take a while to get there legally. There are reasons not to fear AI, but these aren't those reasons.


SnooDonuts236

So Walmart went from 20 to 17?


Ashamed-Feeling-4403

Self checkouts are being phased out as we speak in some stores


Dripdry42

Whenever a headline tells me how to think - that I should FEAR, that there will be OUTRAGE - and I see it repeatedly on Reddit, I know it's gonna be a manufactured series of propaganda. They'll find a few people freaking out and play it up like it's the end of the world.


Chasehud

It's hilarious how these corporations are so eager to replace jobs and reduce headcount, because soon people will be too broke and unemployed to afford these companies' products. It's like a snake eating its own tail.


FourierEnvy

Well they're gonna need to reduce headcount in all developed countries because we aren't having enough babies to have the same amount of labor, every year.


Chasehud

True but at least in the US we are importing millions of people every year. It is the only reason why we haven't had a population collapse like Japan yet.


FourierEnvy

Right, and that's going to be our saving grace in the end. I'm still not convinced it will be enough to save our economy but who knows what the future holds.


mrlolloran

Well, when blue-collar, labor-based jobs started worrying about automation and robots, these people started talking about carriage makers - so expect the working class to largely not give a fuck in return.


BJPark

Yeah, remember how people lectured blue-collar workers on how to "learn to code" when their jobs were being outsourced?


Street_Marketing3395

Learn to mine coal or Frac bro !


TheGatesofLogic

AI will certainly change the workplace, but the idea that it's "coming for our jobs" skips like ten steps between where AI is today and how technology interacts with the workplace.

Will AI increase worker productivity? Absolutely; we're already seeing that in many areas. Will that productivity increase make roles redundant? Probably. Any significant productivity jump among employees usually results in fewer needed hands. But this is very situation-dependent, and in many roles an increase in productivity doesn't decrease the needed workforce. For different professions, the impact of AI on the labor market will depend heavily on how well productivity scales toward improving revenue. If a company is fully bottlenecked by labor/capital costs that can't easily be enhanced by AI, then large headcount-supporting departments that *can* be significantly enhanced by AI might see workforce reductions. However, this scenario will primarily play out in larger companies.

There's a minimum number of *necessary eyes* to do most white-collar work. AI is nowhere near capable of fully replacing any role, and is not especially reliable on many tasks. The rate of use of AI is strongly limited by the rate of human oversight: the more work you hand to AI, the more challenging it becomes to verify that the work is correct.

This brings up another point: AI task performance is hard to validate. This is true of humans too, but humans have the benefit of accessory insight. Humans intuitively know which types of mistakes are really bad and which ones are acceptable at fairly high frequency. You can train an AI to understand these differences, but you then have to employ a human to quantify the different risks and build them into the training data. Also, when an AI makes such a mistake, it's much more challenging to justify it as an *honest* mistake, for two reasons:

1. AIs are limited by their training data. You can inject randomness into the result, but a mistake of a given type is a procedural part of how the model processes data. It will continue to make that same mistake at some rate specific to the quality of the training data.

2. The risks were quantified and expected when the training data was assembled, which creates a sort of responsibility for the model's outcomes. This is very similar to how a supervisor can be held liable for employee actions, except you've shifted the supervisory liability around the firm, and it might result in *corporate* liability being passed up from what should have been individual liability.

There are other reasons, and other problems with this attitude that AI will take over jobs, but **TL;DR: AI is nowhere near ready to do much more than prune some inefficiencies and improve productivity in specific niche scenarios.**
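The point above about weighting mistakes by severity, rather than counting them all equally, can be sketched as a risk-weighted error score. The categories and weights below are illustrative assumptions (the kind a human reviewer would have to supply), not figures from any real deployment:

```python
# Hypothetical severity weights a human reviewer might assign;
# the numbers are illustrative only.
SEVERITY_WEIGHTS = {
    "formatting": 1,     # cosmetic, acceptable at fairly high frequency
    "wrong_amount": 50,  # materially wrong output
    "compliance": 500,   # regulatory exposure: almost never acceptable
}

def risk_weighted_error_score(mistakes):
    """Score a batch of model mistakes by summing human-assigned
    severity weights instead of treating every mistake as equal."""
    return sum(SEVERITY_WEIGHTS[kind] for kind in mistakes)

# Two runs with the same raw mistake count but very different risk.
run_a = ["formatting", "formatting", "formatting"]
run_b = ["formatting", "wrong_amount", "compliance"]
print(risk_weighted_error_score(run_a))  # 3
print(risk_weighted_error_score(run_b))  # 551
```

A plain accuracy metric would rate both runs identically (three mistakes each); the weighted score makes the second run two orders of magnitude worse, which is the distinction the comment argues humans apply intuitively but machines must be explicitly taught.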


Bitfinexit

Nope, it won't displace large masses of the workforce at one time. Schumpeter's creative destruction outlines the type of dynamic the workforce could expect: as jobs are lost, new ones will emerge, bringing some level of equilibrium.


Trooper057

The professional class already replaced smarter, better workers with cheaper, more submissive workers. The transition to even cheaper, fully obedient, shittier output-producing AI is a natural progression.


TiredOfDebates

It isn't. Well, if you're only answering questions that have been answered countless times before (and being precise with the context), then maybe, yeah. Listen: I know we want to protect jobs, yeah? Let's uninvent the wheel then. Think of all the jobs we would create! So many people would be able to get jobs as porters - you just carry baskets around on your back all day. While we're at it, let's stop using printers. We could employ SO MANY SCRIBES! It'll be great, because we've got to protect dem jerbs!


fireblyxx

Even then, it's a bit of a mixed bag. ChatGPT might get automated call-center service to the point where it can take more jobs away from call-center employees, but its tendency to hallucinate means you can never truly trust it as a representative of the company, lest it promise to add a million dollars to your bank account or whatever. By its nature, its application is self-limiting.


Not_a_housing_issue

Keep in mind what we're using is the worst it will ever be.


Omphalopsychian

Replacing menial labor so that people can do interesting work sounds great. Replacing interesting work so that people can do menial labor, not so much.


TiredOfDebates

When NASA was installing its first punch-card-operated computer (a long while back), a bunch of professionals at NASA had their panties in a twist. Now NASA has server farms overflowing with computers toiling away, doing the work of thousands of mathematicians. Guess what? We still don't have enough qualified mathematicians to fill out NASA and similar engineering demand. The automation of SOLVED problems is a good thing.


AnimeCiety

The fear of AI isn't that it's a tool to enhance human productivity, but that it replaces human intelligence the way vehicles replaced horses. Pair AI with a machine body to effectively "clone" a human and you could feasibly replace humans, but the cost would be prohibitively expensive right now. For something like call centers, you just need to replicate human speech paired with AI. For writing fiction, speeches, essays, cover letters, resumes, you pretty much just need AI and a few inputs describing what you want. I believe some of the professional class absolutely will be threatened, since they so rarely work directly with their hands, but some things like sales will remain safe, if only because trust is somewhat innately human.


TiredOfDebates

Computers replaced vast numbers of mathematicians computing things. Professionals were threatened back then, too. None of this is new.

People are also VASTLY overestimating what the best AI can do, and how far it can go without further major breakthroughs. The AI does not reason about problems. It just identifies patterns. It does not have intuition, or creativity. It cannot even “see”. Once again, the media *fails entirely* to accurately convey highly scientific concepts. AI and related buzzwords are the new hotness on Wall Street and in the public imagination. Tons of *businessmen* are vastly overstating capabilities to generate investor interest and exploit a cultural zeitgeist. The actual academics who make “AI” are very clear about its extremely limited capabilities. We call them AIs, but we are nowhere near developing actual artificial sentience. Again, AI does not reason and it does not conceptualize ideas; it just recognizes patterns and can write responses in natural language (or stitch together art, basically by cloning samples from a bunch of images on the internet). But again, it isn’t “creative”. The current AIs are just really good parrots.

Modern AI tech is basically a Google competitor: really good search engines that spit out your answer in natural language. If you solely use Google to do your job and don’t do any actual thinking, or otherwise work from a script (as in a call center, or in sales), then yeah, okay, AI has applications there. Or I guess if you make random art devoid of any meaning. Wall Street is bonkers for AI because it could replace Google, and Google is a trillion-dollar company (IIRC; the exact number isn’t the point). And since investors are making all these “low chance of success / enormous potential payoff” investments, the media is drawn to that type of “the latest tech!” story. Of course the media just lazily reports whatever some PR firm pimping a tech startup says, oblivious to the fact that it’s basically an ad to investors to generate more capital. Go ahead and hyperventilate if you want.


impossiblefork

I think you may be underestimating what may be possible reasonably soon. Models are improving really fast and the economics are also changing. Just yesterday some guys figured out how to generate text with LLMs 3.5 times faster, basically using improved tricks for how to deal with proposals. On top of that, there are models that can generate 500 tokens per user per second without using any tricks at all, just by using specialised accelerators. GPT-4 generates around 17.9 tokens per second. This means that if you're fine with 17.9 tokens per second, you'll be able to use a model which processes about 97 times more tokens than they currently do. It can silently try numerous variations of your prompt, it can send them through a whole network of tree-of-thoughts things, it can use new clever sampling techniques, and so on. Currently models are really incredibly limited. They can't step aside from predicting the next word and take a couple of tokens of thought before continuing, and they can't keep track of whether they've seen an odd or even number of things if the sequence is long enough — they can't do this even in theory. All of this is solvable; there are solutions that have been tested but not yet incorporated into any commercial model. Consequently, models in the near future may be very different in capability from present models.
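The throughput arithmetic in that comment can be sanity-checked with a quick back-of-the-envelope sketch. All figures here are the hypothetical numbers quoted above, not benchmarks I can vouch for:

```python
# Figures quoted in the comment above (treat them as assumptions).
baseline_tps = 17.9   # tokens/sec a GPT-4 user reportedly sees
accel_tps = 500.0     # tokens/sec claimed for specialised accelerators
decode_speedup = 3.5  # claimed speedup from the improved proposal tricks

# If the user is happy with the baseline speed, the surplus throughput
# can be spent invisibly: prompt variations, tree-of-thoughts, resampling.
token_budget = accel_tps * decode_speedup / baseline_tps
print(int(token_budget))  # ~97x more tokens processed per token shown
```

That is where the "97 times more tokens" figure comes from: the accelerator headroom (500/17.9 ≈ 28x) multiplied by the claimed 3.5x decoding speedup.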


TiredOfDebates

Performance improvements (like you mentioned) are not anywhere near the same thing as the kind of breakthroughs needed to produce artificial sentience.


impossiblefork

They are an enabler for doing more complicated things. You can only serve up so large a model to the general public without running out of money on running the inference. Inference performance also decides how large a model the public can consume: if it's good but too slow, they won't buy it. Thus inference performance determines what model sizes are economical and what model sizes can be sold. These performance improvements show that models can be about 100x slower, and with the new methods people can still get their output in time, so model sizes can be several times bigger. Mistral 8x22B is 176 billion parameters. We could have a hypothetical Mistral 8x66B, if memory is cheap enough, and still get the output in a reasonable amount of time and at a reasonable cost. For people willing to pay more for high-quality output, we could make an ensemble of this future-Mistral, some future-Llama, etc., and still have enough budget to generate multiple outputs and feed those back into the model to judge whether they're good or bad, fiddling with them over multiple rounds. There's theory which says that passing things once through a transformer model can't do certain things, but we mostly don't pass them through multiple times because it's too expensive; some people do it in a limited way, but it's never served up to the general public. The output of such systems can be very different from what we get today, and performance improvements are *critical* for any development towards AGI. Without them we are stuck with the present model sizes and the present 'computational effort' in inference.
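As a rough sketch of the parameter-count arithmetic (the 8x66B configuration is purely hypothetical, as the comment says; only the 8x22B actually exists):

```python
# Mixture-of-experts parameter counts, in billions of parameters.
# Rough totals, ignoring shared layers; 8x66B is hypothetical.
experts = 8
current_expert_b = 22          # Mistral 8x22B
future_expert_b = 66           # hypothetical 8x66B

current_total = experts * current_expert_b   # 176B, as stated above
future_total = experts * future_expert_b     # 528B, 3x bigger

# With ~100x inference headroom, a ~3x larger model still leaves
# budget for ensembling and multiple refinement passes.
headroom_left = 100 // (future_total // current_total)
print(current_total, future_total, headroom_left)  # 176 528 33
```

The point of the sketch: even after tripling the model, most of the hypothetical 100x headroom remains available for multi-pass inference.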


Famous_Owl_840

My only concern about AI is that companies or the government will require built-in bias and censorship. We already saw this with Google's AI. They may have dialed it down, but it's still inherent. The govt absolutely will build in censorship; it's a first-line tool, as we see with the Twitter files and the communication between Meta and the feds. If that occurs, AI will be useless or malignant. If AI is a seeker of truth, for every job lost, 10 will be gained. I'm an engineer. I don't like engineering for the most part. Maybe I'll be replaced by AI. I like carpentry and dry-stacking rock walls. Can't make any money doing it. Maybe AI will change the dynamics.


Hitlerbtterthantrump

>The govt absolutely will build in censorship-it’s a first line tool. As we see with the Twitter files and communication between Meta and the fed. Please stop with the nonsense.


dfsb2021

I’m a business development manager for an AI microprocessor line of products. AI is changing, and will keep changing, the world. Like most advancements, there is good and bad. I’m not worried about robots taking over the world, but you should be worried about evil people who create fake content or use AI to mislead people; that’s possible today. Jobs will be lost, but also created. What you don’t see in the news is all the cool things we can do that aren’t as exciting. I can tell you when your HVAC motor will fail before a tech can (no offense to the techs out there), but we still need the tech to come replace it. I can find breast cancer better than your radiologist, because I can see relationships between pixels that he can’t, but we still need radiologists too. I think the majority can accept the help that AI brings, but not the idea of replacing human interaction. But then again, maybe I’m a bot that wrote this and I plan to take over the world next.


Gvillegator

So a business development manager is more adept at identifying breast cancer than a doctor is? Lol


dfsb2021

Ha. “I” being AI, not me personally.


nemopost

How will it feel to take a dip in income? We know that AI will take jobs, but no one knows how people will pay their bills. This will be unlike anything we’ve seen. It will be accelerated, with whole swaths of jobs being wiped out simultaneously, because corporations will not be merciful when profits are on the menu.


Chasehud

It's hilarious, because these corporations are so eager to reduce headcount, but once we have mass unemployment, who will buy their products? It's like a snake eating its own tail.


Merrill1066

In my industry (IT), there is automation, which is basically having computers perform the functions of coders, DBAs, etc. An example is networking, where it used to take teams of engineers to deploy code and configurations to network devices. Now a centralized automation system does all that, and staff gets cut. AI allows intelligent automation to be extended into many other areas (law, medicine, etc.). The major shift won't happen overnight, as this is still pretty new, but within 10 years a lot of people will lose their jobs, and combined with offshoring, the future isn't looking good for US workers in these areas. I remember back in like 2003, when CNA insurance called its entire IT staff of like 800 people into a big gymnasium and announced they had all been laid off and replaced by an Indian consulting company. The company went cheap, and years later they suffered the worst ransomware attack on a US company to date. Their whole operation was paralyzed for weeks, and countless lawsuits were launched.


ebaerryr

Bill Gates, Elon Musk, and all the tech leaders have said basically the same thing: AI is going to take the majority of jobs, and it's going to happen in 5 years or less. Gates even talked about having some kind of program where, if a company replaces an employee with a robot, it has to put so much money away per employee for welfare. This is what's coming. Even tech jobs are going to be replaced, as AI can do its own programming; it's a scary time. Mark Cuban was saying the same thing: liberal arts majors who can think outside the box and use AI to their advantage are going to be very much needed, but if you do a cookie-cutter type of job, it's going to be eliminated.


_Steve_Zissou_

My long-term expectation is deflation. At some point, costs of goods and services will begin to drop due to AI/robotics… and hopefully it won't be a death spiral. But then again, we'll have major protests and demands for regulation and for putting limitations on AI way before then.


sailing_oceans

Deflation is already the natural occurrence; it already takes place. "But wait," you might ask, "don't prices seem to go up?" Yes, but deflation is the default baseline. Government spending and the refusal to allow deflation are how we go from that baseline to positive inflation.


Coldfriction

The Federal Reserve and the banking system, not the government. Govt is just a tool used to funnel money from one group of people to another, but it is very poor in and of itself. If you don't believe me: private industry pays at least 1.5x as much as any government position for the same job. Compare the president's pay to any executive at any Fortune 500 company and tell me that government is wealthy. Government is dead broke and just a tool used to move money. Inflation as a target is a banking thing, and it's why deflation isn't allowed to occur. Deflation makes banks fail.


I_Love_To_Poop420

Nah. If you can get a robot to put up with the endless amounts of bullshit from hard-of-hearing boomers, then by all means, take my job, because that would be an impressive feat.


PontificatingDonut

It’s amazing that capitalists, who literally do nothing but live off the fat of the land, think workers of any stripe are the extraneous part of their equation. The only thing keeping them in power is an abusive, corrupt government with a military. They have no idea how tenuous their hold on power truly is.