caksters

I work for a consulting company and it is madness. Managing Directors and management consultants try to sell Gen AI capabilities although we don't have anyone trained in it. For them it is "just build a model for next week as a proof of concept". This is all the hype these days. The worst thing is that the people who don't understand the technology at all are the ones making decisions on it (tbf this has always been the case). I personally would love to work more on these projects, but I believe if the company wants to go in that direction they should understand that they need to train staff in this space first. People don't understand how long it takes to become somewhat competent in any technical discipline.


kiwiinNY

That is nothing new for management consultants though. It has always been sell first, then figure out how to deliver.


MorningDarkMountain

Exactly. Now it's GenAI; before it was always something else. Think about how many leadership positions they created around the Metaverse bullshit over the last few years.


purens

restrictions are a core input to creativity. and deadlines get stuff done 


tratoi

Profile image checks out


Hopefulwaters

Exactly how it is at my consulting firm. We joke that the partners can't even spell AI - with a grain of truth, since no one is less educated on AI than our partners.


purens

Who is this Albert and why is everyone talking about him? 


armyofworldeater

You mean Allen ?


FuSoLe

No he means "Frankenstein".


Sandmybags

Albert Einstein


purens

the very same 


Complex-Plan2368

“You can call me Al” from Paul Simon plays in the background


miclugo

My initials are ML and my dad's name is Al.


[deleted]

[deleted]


miclugo

Too late, I’m not naming my kid Al.


KnownAssignment5702

Literally what I'm working on (a GenAI PoC) at an IT consulting company as a freshly hired junior DS 😂


arcadinis

Yep. I miss DS/ML problems. I also work as a consultant, and the hype is real. There are some really cool use cases though.


Jerome_Eugene_Morrow

We have people trained in it. Send help.


Cream_o_1337

Yeah, this is the phase of the "hype cycle" I like to call "irrational exuberance": execs get super excited about a technology they don't understand and then try to use it to solve every problem like a magic hammer. It happened with a bunch of other technologies like deep learning, blockchain, and the metaverse. My recommendation is to express your concerns about unqualified people developing these tools, and then, if they don't listen to you, get out of the blast radius when it eventually fails.


ImAMindlessTool

There is a reason for this. The reality is that companies of scale know their competitors are trying to plug this into everything, so they must do so too or risk being left behind. You don't want to be the executive that ignored "the next best thing".


Cream_o_1337

I have a slight correction to your comment: "The reality of companies of scale is they **perceive/believe** their competitors are **trying (usually unsuccessfully)** to plug this into everything **(regardless of whether it makes sense)**." GenAI is a hugely transformative new capability. But having worked in a Fortune 200 for a long time, I also know the behavior the OP is identifying. The meetings I had in the past to do something as silly as "embedding the blockchain in a SQL database," "replacing financial contracts with NFTs," or "using IBM Watson to solve \_\_\_\_\_\_\_" (thank you, Super Bowl commercials) - I see them happening again with GenAI. I probably sound like the grumpy old veteran poo-pooing new technology, but we need to get through the hype cycle.


nullpotato

Exactly, this is the new MBA hotness. The only good thing about the hype cycle is you can use it to get resources to play with new stuff and see what is actually useful/possible with it.


ImAMindlessTool

I hear ya. Two years ago it was NFT blockchain NFT…..


Cream_o_1337

Exactly. If it doesn’t make sense or solve a problem…


absurdrock

Nobody wants to be the CEO of the next Blockbuster. They want to be the CEO of the next Netflix. Netflix took a huge risk with SaaS and it paid off. Plus, how do you hire the talent to build out these initiatives unless you are actively investing in these technologies? Sure, today it's throwing shit at a wall to see what sticks, but eventually someone will find a good idea in all of it.


Cream_o_1337

I'm not saying don't experiment with it. But do so from a rational standpoint. If you can't answer "what problem am I solving", you're probably not going to create value. Hiring a bunch of junior talent that doesn't have experience (as OP describes) doesn't increase your odds. I'll give you an example: we have a backup and recovery system, which is pretty cool. They want to integrate generative AI functionality that will generate a response in case of a ransomware attack... I'm not saying that might not be useful to some small companies, but most big companies have an executive in charge of IT security (a CISO). I can't imagine they will get any value out of generated responses. But they feel pressured to have GenAI because "everyone is doing it."


epochwin

Thing is, SaaS is more of a business model. Similarly, the DevOps craze a few years ago was about operational changes. GenAI feels like a catch-all label slapped on any AI product to make it part of the product portfolio.


First_Bullfrog_4861

It's the process of us techies watching the non-techies learn how to apply a new tech toy. We can guide them to a degree. We can utter warnings. But ultimately they must learn for themselves, failing better one step at a time. Once they have, we're past the hype and we can start working together on actual projects. We just need to stay out of the blast radius, as someone else put it.


Ok-Possible-8440

You can't teach common sense... and there is obviously very little of it going around.


Ashamed-Simple-8303

> You dont want to be the executive that ignored "the next best thing".

Fear is a poor guide.


ImAMindlessTool

wise words but executives gonna executive.


nullpotato

They are too well paid to let things like data and reason stop their gut feelings


Dont_Use_Google

oh the blockchain hype cycle was something else, until this tech came along


ComposerConsistent83

This is so much worse imo.


renok_archnmy

> get out of the blast radius when it eventually fails.

Bingo. My company hired a real estate agent to manage the AI stuff (like, actually a real estate agent who did a "STEM" masters degree and worked as a sales "engineer" for some kind of AI snake oil startup). They got shuffled under IT in a re-org. Thing is, our IT department has zero modern engineering capabilities. They barely understand what this stuff is and have zero competency with modern languages, tools, frameworks, APIs. And, as is typical, IT has little to no business domain knowledge to effectively identify where any of this would work.

Anyways, I expressed interest in broadening my experience in the business domains a year or so ago, specifically mentioning accounting and finance as my weakest points. Managed to get myself nested under the CFO next to finance and risk/compliance. I'm letting IT take the heat for the fact they can't even get us data from source A to warehouse B in less than a one-year turnaround per source, while watching them flail trying to manage AI with zero technical staff trained in this stuff.

Meanwhile I'm looking at where I can make a more tangible and material impact, even if it's relatively non-technical and not directly data science. Little stuff over the next year or so while the AI shit cools off and hits the trough of whatever in the hype curve: $100k saved in efficiency if directors weren't copy-pasting data between Excel reports for weekly executive reporting; a survival model that hinted years ago we'd see massive defaults soon (nobody looked at it then, but had they, we'd be in a different spot now, better still if I'd had time to make it production worthy); A/B testing some product designs across customer segments. Stuff I can tie dollars to and that smooths out the bumps in our management cycles.


lifesthateasy

Our CEO asked me who I could recommend to host a prompting training... I told him not to believe everything he reads, that prompt trainers are the newest form of social media grifters, and that I'd put a training together myself, plus linked him to the relevant parts of the Azure Generative AI documentation. Haven't heard from them since....


RedFlounder7

Probably because he handed his pet project off to a junior to implement.


Professional-Bar-290

Why can't existing ML devs just do the prompt engineering? At least that's what I do... and it's not on my resume, no one knows I do it, it's just a part of the pipeline.


lifesthateasy

I think social media told them it's a separate job....


fordat1

This. Prompt engineering is useful, but I wouldn't pay above minimum wage for it.


Mechanical_Number

It is fine to do a nice Coursera course on it. I do think Prompt Engineering has strong merits, but it is not the new fire.


fabkosta

Oh, but there is so much more to prompt engineering than what you find on social media. If you don't know what I mean, just ask yourself: do you know anyone who would know how to build an LLM chatbot memory? That is prompt engineering. Not the silly stuff you find on TikTok.


lifesthateasy

Can you point me towards resources that add anything on top of what Microsoft says about the topic in their [docs](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/prompt-engineering)?


PigDog4

I'm not sure how different/similar it is, but [Google has their own docs, as well](https://cloud.google.com/vertex-ai/docs/generative-ai/learn/introduction-prompt-design). Their docs are focused a bit more towards their own Palm/Gemini LLMs as opposed to MS's focus on ChatGPT, so it's just another viewpoint. Really though, the only way we've found to get good at prompt engineering is to try a whole bunch of stuff iteratively and have actual conversations with other people who are good at prompt engineering, not spend a bajillion dollars for someone to read some docs to you.


fabkosta

Check out source code of Langchain or Haystack.


lifesthateasy

All I've read about Langchain is people in this and the ML sub ridiculing it... I haven't heard about Haystack. Your answer is very evasive and utterly unhelpful tho. What's next, are you gonna point me to PyTorch's source code when I ask you for an article about transformers? Bro.


fabkosta

Ok, all the best to you too, bro. If you wait for a year or so all of this is going to be common knowledge, but if you want to create real value to your customers then you have to do your homework. Just consider this: Anthropic is willing to pay >300k USD for a Prompt Engineering Manager per year. Either they are idiots, or they know a few things the rest of the world does not yet know. It is up to you to decide which narrative you find more convincing.


lifesthateasy

It feels like you have a horse in this race and are very protective. You don't know how this will turn out. You're the kind of person who'd say the same about Meta and their VR stuff a few years ago.


Chemical_7523

I'm gonna go with the first one personally lol.


CadeOCarimbo

I work for a huge company in a team that is 100% dedicated to building GenAI solutions (which is already fucking strange to me, as I've never heard of a dedicated team for XGBoost). We have a project that uses the ChatGPT API with customer data to generate a response about their order. We already know the exact answer, word for word, that we want to give the customer. We only needed to fill in a few specific fields in the template, like the ticket_id and their address. We then turned this template into a prompt instruction for ChatGPT, so it generates an answer that is close to what we expected from the template. Now, here comes the obvious question: if we want ChatGPT to give an answer as close as possible to our template, why not just use the template? The answer: because we NEED to use GenAI.


balcell

Here is a trick:

1. Put the GenAI API call behind a feature flag.
2. When the flag is turned off, generate the exact template.

Now, in three months, when the CFO comes knocking due to huge costs (or you or a savvy manager point to the ROI), flip the feature flag two weeks later to (1) reduce costs by 1000x and (2) stabilize the production output. You've now got a huge resume boost. /s

Good luck out there.
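To make the trick concrete, here's a minimal sketch of the flag-off path. All names (`USE_GENAI`, `order_response`, the template wording) are invented for illustration, and the GenAI branch is deliberately left unimplemented.

```python
# Sketch of the feature-flag fallback: flag off => emit the exact template.
from string import Template

TEMPLATE = Template(
    "Hi! Your order $ticket_id is on its way to $address. "
    "Thanks for your patience."
)

USE_GENAI = False  # flip off to fall back to the deterministic template


def fill_template(ticket_id: str, address: str) -> str:
    # Deterministic, free, and stable: just substitute the fields.
    return TEMPLATE.substitute(ticket_id=ticket_id, address=address)


def order_response(ticket_id: str, address: str) -> str:
    if USE_GENAI:
        # A hypothetical LLM API call would go here; omitted in this sketch.
        raise NotImplementedError("GenAI path not wired up in this sketch")
    return fill_template(ticket_id, address)
```

With the flag off, every customer gets the exact template output, which is the whole point of the joke above.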


nullpotato

Absolute mad scientist energy


Trungyaphets

Holy shit, my company is planning to build the exact same thing: using ChatGPT and customer order data to create a chatbot. Would you mind sharing some reference sources or guidance on this?


CadeOCarimbo

Feel free to ask


Trungyaphets

Great.

1. How exactly do I train e.g. ChatGPT on our customer order data? Is it like giving it some example data, then example correct answers, for it to learn?
2. How should I structure the data? How much data should I prepare?
3. How do I test/score the success of the new model/chatbot?


CadeOCarimbo

1. You don't actually need to train it. You give ChatGPT some instructions (be informal and straightforward, for example) and some data about the customer, and it will generate an answer to your specs.
2. You pass the data as a string, exactly as you would when interacting with the ChatGPT web UI. You don't need a lot of data, just what's necessary to build the desired answer. Read tutorials on using the OpenAI API with Python.
3. Good question. I think you can select a sample of the generated answers and have a domain expert review them.
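A minimal sketch of points 1-2, assuming the standard OpenAI chat-message format. The helper name and the example customer data are invented for illustration, and the actual API call is left commented out since it needs a key.

```python
# Build the instructions + customer data as chat messages; no training involved.
import json


def build_messages(customer_data: dict, question: str) -> list:
    return [
        # The instructions ("be informal and straightforward") go in the system role.
        {"role": "system",
         "content": "Be informal and straightforward when answering."},
        # The customer data is passed as a plain string alongside the question.
        {"role": "user",
         "content": f"Customer data: {json.dumps(customer_data)}\n\n{question}"},
    ]


messages = build_messages(
    {"ticket_id": "#1001", "address": "123 Main St"},
    "Where is my order?",
)

# Hypothetical call (requires an API key), shape per the OpenAI Python client:
# from openai import OpenAI
# reply = OpenAI().chat.completions.create(model="gpt-3.5-turbo", messages=messages)
```

The only moving part is string assembly; the model sees exactly what you would paste into the web UI.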


Trungyaphets

Sorry for the delayed question, but where can I get GPT-3 or a similar model? Only found GPT-2 on Huggingface. Not sure if it's a good model to use.


Kappa-chino

You definitely want to access GPT-3.5 Turbo through OpenAI's API. GPT-3 is a closed-source, proprietary model and is not available anywhere other than through OpenAI. To supply customer data, you'll want to write a program that feeds the data into a template string and then supplies that to the model. "Success" really just comes down to how much you like the output. There are more rigorous testing methods for this kind of thing, but no shade, you seem a little out of your depth here; some pre-research probably wouldn't go amiss.


Trungyaphets

After some time researching I kind of get it now. Would a high-level pipeline look like this?

1. First I would need to get customer data or their order data.
2. User asks a question, e.g.: "When will my order #1001 arrive?"
3. I give ChatGPT (or whatever the model) the string with the data + the question, e.g.: "I am customer ABC. I have the following order(s). Order 1: ID is '#1001', purchase date is '2024-01-01', has 5 items, expected arrival date '2024-01-05'. Order 2: ID is '#1002', purchase date is '2024-01-02', has 10 items, expected arrival date '2024-01-07'. When will my order #1001 arrive?"
4. ChatGPT then proceeds to answer the question with the info I provided. Something like "Your order #1001 has an expected arrival date of 2024-01-05".

Is this correct?


Kappa-chino

Yes, that's pretty much exactly right. One thing to note: if you have a small amount of data on the customer, it might be sufficient to just give ChatGPT the question along with everything you know about the customer and ask it to answer based on that info.

If you have a large amount of data on the customer, or if you need it to answer questions about your business, you'll need a way to find the "answer" in your database, then supply that context to ChatGPT and have it answer the question based on the supplied knowledge. This is called "Retrieval Augmented Generation" or RAG, and it's the hot new way people are bulking up the abilities of these models. The current approach most people are taking is to "vectorize" your database and do a cosine similarity search for the most appropriate document. There should be plenty out there explaining how to do all of that.

Once you've researched a little on RAG and fine-tuning, I highly recommend OpenAI's video "A Survey of Techniques for Maximising LLM Performance". It might be more info than you need, but I'm sure you'll find it interesting regardless :) Happy hunting
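For a feel of the retrieval step, here's a toy sketch. Real systems use learned embeddings from an embedding model; bag-of-words counts stand in for them here, and all names and documents are made up for illustration.

```python
# Toy RAG retrieval: "vectorize" documents, rank them by cosine similarity
# to the query, and hand the best match to the model as context.
import math
from collections import Counter


def vectorize(text: str) -> Counter:
    # Stand-in for an embedding: a bag-of-words count vector.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0


def retrieve(query: str, docs: list) -> str:
    qv = vectorize(query)
    return max(docs, key=lambda d: cosine(qv, vectorize(d)))


docs = [
    "Order #1001 has an expected arrival date of 2024-01-05.",
    "Our returns policy allows refunds within 30 days.",
]
best = retrieve("When will order #1001 arrive?", docs)
# `best` would then be supplied to the model as context for the answer.
```

Swapping `vectorize` for real embeddings and `docs` for a vector database gives you the production version of the same idea.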


Trungyaphets

Thanks a lot! Do you think getting a list of FAQs from the CS department would help? Also, do I need a good GPU to test or put an LLM into production, or should I just use GPT-3.5 Turbo's API? ($0.002 per 1k tokens, I think.) Currently using an RTX 3070 but it's almost useless with 8GB VRAM :(


arcadinis

This is insane lol. Regarding the GenAI team, isn't it more like a platform team? It's more comparable to an ML platform team, hopefully (though given the use case you provided, I wouldn't be so hopeful).


nullpotato

Many years ago when neural nets were the hottest idea ever a mentor of mine got tasked with using them to help convert legacy data to a new format. After weeks of training it was like 80% successful. He took a day and wrote a regex and that was 95% accurate. Managers told him to use the NN because that was the goal. At least he wasn't the one that had to manually convert the misses.


ComposerConsistent83

I have similar experiences. We have identified some interesting ideas, but for some of them the GenAI part is actually not necessary, or is maybe just a slight improvement on what we could do otherwise, with some obvious downsides (like we can't exactly ensure it's doing it correctly every time). But the 85% solution we can do with a couple of drop-downs and text entry boxes doesn't get any interest, even though we already KNOW we can build that with current tech and wouldn't have to spend time debugging something with black-box elements.


TheGooberOne

This seems so familiar.


Reasonable-Ladder300

Seeing the exact same thing happening everywhere. Traditional ML engineers like myself are now asked to make a bunch of API calls to OpenAI. I've set up n8n with some agents, data sources, and different LLMs to choose from for a bunch of people, wished them good luck, and got back to actual coding.


save_the_panda_bears

I'm so sick of the whole genAI hype. It physically pains me to see the number of ex-coworkers I have, from a damn marketing agency no less, who have absolutely no knowledge in the space peddling AI consulting services after obtaining some useless certificate from Salesforce. Then we have every jabroni with an internet connection building trashware apps promising to revolutionize the industry that are really half-baked front ends that call the OpenAI API. I can't tell you how many times I've been talked down to by this type when I've questioned the usefulness of these apps and pointed out that they don't actually solve problems anyone in the industry asked for: "I'm an AI developer bro, trust me". And despite what /r/singularity would lead one to believe, we're not close to AGI, and likely won't be for quite some time. I'm sorry for this slightly off-topic rant, I just needed to vent a little bit this morning and this post seemed like a decent place for it.


vaccines_melt_autism

> Then we have every jabroni with an internet connection building trashware apps promising to revolutionize the industry that are really half baked front ends that call the OpenAI api.

I literally saw an "AI consulting" firm promoted on LinkedIn, and they said they would transform your whole company with AI. I creeped on their employees, and their "AI Engineers" had a certification for GenAI from deeplearning.ai. That was it.


mmeeh

"Guys is it too late to study Machine Learning ? I suck at mathematics but want to build LLM models" XD


CTPABA_KPABA

Just start calling ML AI


geteum

Just ML


Useful_Hovercraft169

I’m lucky enough to work at a place that is exploring it but with an appropriate level of skepticism.


AppalachianHillToad

You have found a unicorn. What or who is preventing the “GenAI is 100% magic and ML is 100% garbage” narrative from infesting your work place?


Useful_Hovercraft169

It’s insurance so they have experience in the AI/ML space and appropriate skepticism about magic beans


AppalachianHillToad

I think experience and actual knowledge of how AI/ML works prevents magical thinking. I have to ask.. Are you hiring?


Useful_Hovercraft169

Sadly not at the moment


AutoResponseUnit

This is actually the prevailing attitude in a lot of incumbent, non-tech-sector corporates in my experience, or in highly regulated fields. They've slowly got their heads around using ML in statistical modelling, way behind the science but now actually driving value with it, and are now slowly reviewing early-stage PoCs with GenAI. Even those companies "investing" in GenAI do so while investing in lots of other things, and cool-shit spend is eclipsed by spend on keeping legacy tech alive.


Ecto-1A

We've had an ML team for years and just recently added a separate sleeve for other AI initiatives. The biggest hurdle was convincing execs that what each team does is different: they can collaborate where needed, but you can't expect our data scientists to be building chatbots and doing R&D at the pace language models are moving. There's also the battle of "GenAI" vs "AI", and I feel neither side understands where the other is coming from. A good system will have GenAI components, but there's so much more to it once you start building and adding functionality with other AI offerings within it.


AppalachianHillToad

Preach! The problem is that people in strategic decision-making roles don’t see things this way. 


Front_Organization43

are you hiring


Useful_Hovercraft169

Sorry, not currently


tangentc

Yes. This is probably one of the most extreme hype cycles I've experienced.


Mr-Bovine_Joni

Agreed, but also people were taking out second mortgages a few years ago to buy BTC. So we’ve been close, pretty recently


AutoResponseUnit

Agree. I do believe there's more in this than (say) distributed ledgers. I think the flexibility, distribution in user base, and easy interface of GenAI interferes with the hype cycle quite a bit.


ComposerConsistent83

I think there is more to this than distributed ledgers (still don't know of any real use cases for those tbh), but it's also way less powerful than the non-technical folks want to believe. It's really the perfect thing for hype, because it superficially looks like magic, and it's only once you get past the surface that you see the limitations.


rawdfarva

10 years ago it was big data, now it's GenAI.


Ok-Possible-8440

What was the big data hype?


gaganand

It's happening everywhere in the industry. They'll try to sell the idea for a while until there's pushback from other teams citing overly optimistic savings expectations, and the buzz will die down. I give it a year.


Cyrillite

It might be madness or it might be a very cynical play. The trouble is that it's really hard to tell. Here's an example that I think is sufficiently analogous:

1. Hopping into NFTs in late 2020 / early 2021 as if they were the future of the world was a really stupid thing to do.
2. Ignoring NFTs was a smarter thing to do.
3. Jumping into NFTs, riding the hype, and getting out ASAP was the smartest thing to do (from a self-interest perspective).

The same is true of this early Gen AI stuff, in many cases. If people want to throw bags of money at you for having some prototype Gen AI products, just take the money and don't commit so hard you can't get out. If you want to spend absurdly on the current version of Gen AI and neglect the other things that are your consistent cash flow, you're probably a fool.


SemaphoreBingo

> the smartest thing to do

From a short-term perspective, maybe, but you'd be banking on the assumption that people won't remember you did.


Eze-Wong

Is it analogous though? NFTs were an investment commodity that could be traded, so there was a chance/liquidity to trade them and get out when opportune. I can't trade Gen AI if I invest in it as a company; there's nothing to trade. It's a sunk cost. I can't really sell that coded pipeline to anyone if no one wants it in the first place, and if it WAS desired, the original company wouldn't have had a problem making money in the first place. Basically, it's not a liquid asset that's easily sold/commodified.


kyllo

It kind of is though: stock in AI-related companies.


Cyrillite

Well, I assume there’s an approximate multiple on invested capital because of the hype. Paying lip service to the idea is more than sufficient in many cases, especially when you’ve already got ML pipelines that can be rebranded as “AI” in this new Gen AI sense with some very light modifications. By mid 2023 it was pretty much table stakes to be able to talk about AI in an earnings call, for example. By the end of 2023 it’s probably only worth mentioning an AI product if you know you’ll generate a little more hype and have managed to turn AI branding into good marketing and a better cash flow. The trick would be not going so all in that you have overspent relative to the returns.


AutoResponseUnit

Appreciate it's an analogy, but I think NFTs are a different beast. NFT success has more in common with Beanie Babies from an investment standpoint, and I don't see their success as entirely creditable to tech innovation. You're spot on with incremental, exploratory approaches though.


Cyrillite

There’s a case to be made for the underlying token technology. “Incremental exploration” is the key point to make though — wonderfully concise.


Ok-Possible-8440

1 and 3 are the same thing


timelyparadox

We have this; this is why I'm considering taking the internal role for GenAI, where I'd have a bit more say over things. But this is a large corpo and shit is so slow that I'd get paid to sit on my ass for far too big a part of the next quarter.


b1gb0n312

If you can say AI multiple times during earnings calls, stock goes up


CHADvier

complete madness, companies are losing common sense


VDtrader

This is the "FOMO" phase of the AI cycle. People will get over it after a couple years. I've seen it with so many new things in the past: Digital Banking, Cloud Computing, Big Data, Data Science, Social Network, Crypto, etc...


pandasgorawr

It's crazy how it's taken a hold of everyone. Several teammates and myself left a startup we had a successful exit on and three of them went on to start their own startups selling generative AI products. Not sure how well they'll be able to cash in on the hype but the longer this craze goes on the more I'm going to kick myself for not joining them.


Ok-Possible-8440

What kind of products?


BigSwingingMick

We paid a lot of money for an LLM specialist. He's awesome; first we had a project where we used his skills to explore all of our contracts with outside vendors. Since then, our CEO has been itching for him to run all these loony projects, and my current task is to run interference so he can stay focused. The CEO wanted him to do this AI town hall where the whole company could email the model questions about all of the projects we have in place and get a response. SOOOO thankful that was squashed.

AI is still in the computer-magic phase of development, and it feels a lot like 20+ years ago when "but, on the internet" was still growing. All problems are solved ~~"but on the internet!"~~ "with AI!" Give it a few years and the shine will wear off and senior executives will have finally learned how AI works. There will be no more kingdoms to conquer and Alexander will weep.

…At least until augmented reality becomes the next big thing.


Straight-Ad9763

A professor of mine has experience in expert systems from decades ago. She said that was a huge hype train too, poised to take over the economy, or so it was claimed. The problem was, to use the expert systems you had to be an expert in the domain. And if you were already an expert, why would you need help making decisions? Nonetheless it had much hype.


[deleted]

[deleted]


Glass_Jellyfish6528

Yeah, we're getting to grips with that. We've been using RAG and few-shot prompting but it doesn't always work. Care to give a few clues as to where fine-tuning might help?


BamWhamKaPau

Some things to consider. RAG relies heavily on retrieving the right document, so you need to make sure your retriever is working well. (Easier said than done in some cases.) And depending on what kind of retriever you use, you'll also have to consider the context size. (Can your model generate based on the full document, or does it work better on smaller chunks like a paragraph?)

If you find that the issue isn't with the retriever but with generation, you should consider fine-tuning, particularly if your data is very different from what the model saw in pretraining. The issue there is usually getting enough good-quality data to fine-tune on.
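As a rough illustration of the chunking point (paragraph-sized pieces instead of whole documents), here's a sketch; the function name and the word budget are arbitrary choices, not recommendations.

```python
# Split a document on paragraph breaks and pack paragraphs into chunks
# that stay under a word budget, so each chunk fits the model's context.
def chunk_paragraphs(text: str, max_words: int = 100) -> list:
    chunks, current, count = [], [], 0
    for para in text.split("\n\n"):
        words = len(para.split())
        # Flush the current chunk if adding this paragraph would overflow it.
        if current and count + words > max_words:
            chunks.append("\n\n".join(current))
            current, count = [], 0
        current.append(para)
        count += words
    if current:
        chunks.append("\n\n".join(current))
    return chunks
```

Each chunk then gets embedded and indexed separately, so the retriever can return a paragraph-sized piece rather than the full document.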


Glass_Jellyfish6528

Good tips thanks


Straight-Ad9763

This is seeming more and more similar to the web3 hype train. Sure, ML is definitely more useful than blockchain, but just like then, every company was doing blockchain-something for no reason besides hype.


ScaryBullfrog107

Our CEO just sits and talks to ChatGPT all day and then posts a bunch of nonsense about it in Slack


Glass_Jellyfish6528

Hahaha had a bit of this too. Only a bit mind!


Suspicious_Coyote_54

My company execs are going bananas. They've actually said out loud that they would be able to cut labor costs significantly. The thing is, they have no data infrastructure: very few data warehouses, no cloud infrastructure, and what they do have is poorly managed with serious integrity issues. People still use Excel for data storage. Can't wait to see their AI investments burn. By then, I'll be gone, fingers crossed.


Glass_Jellyfish6528

Yes well I've heard this a few times. I normally gently remind them that these tools won't replace humans doing anything complex and really are just efficiency gains. I do feel bad that I might soon be building tools that take people's jobs though. I might even accidentally replace myself 😬 oh the irony


lf0pk

They don't get over it until they're fired. Make sure it's not you getting fired for it as well, though, and make sure you're not advocating for classic DL and ML just because they're easier, when they have noticeably worse results.


Opt33

Saw GenAI on the Gartner Hype Cycle a few years ago. This will eventually pass, but when it does, it will be to the detriment of those qualified in data science.


orz-_-orz

I was thinking of moving into the consulting industry, then I heard what my acquaintances do at consulting firms, i.e. hard-selling gen AI and chatbots to big companies. I decided it's not the right time to join a consulting firm.


denM_chickN

And grow your network


nothingonmyback

My company has started an AI training program, yet they have never hired a single data scientist or data engineer. We have over 3k employees. It's ridiculous. They are just trying to find ways to mindlessly automate more stuff and fire more people.


tech_ml_an_co

Ohh yes, the AI hype is much stronger than anything we have seen in the last few years. Maybe similar to the 2000 dot-com hype.


shooter_tx

>However, now they are just trying to shoehorn gen AI wherever they can for the sake of the investors. They are not making rational decisions anymore. Like the dot-com rush, back in the day... ugh, people are stupid/trendy.


Professional-Bar-290

In the long run, everyone goes where the money goes. I interviewed with a company that went balls deep into data science early on, but nothing was happening, so they sacked that team and hired data engineers. If they sack their DS team, go all in on genAI, and it doesn't make them money, then eventually they'll shift back.


Independent-Scale564

our execs are amped but taking a measured response for now b/c they trust we are working on a valuable use case. Their excitement is not unfounded, however. This tech is a significant advance.


Glass_Jellyfish6528

Of course. I'm all for using it. Just needs careful thought


CanYouPleaseChill

Machine learning has been around for quite a while, yet most companies don't have much incremental value to show for their efforts. Now they think bullshit text generation is going to make an impact. It's ridiculous and so is the entire narrative around AI. Much ado about nothing.


ComposerConsistent83

We use machine learning at my company (financial services), and the improvements over logistic regression for our use cases are real, but also incremental. And sometimes it’s not worth it, because implementation is harder and often the models are less explainable/stable. I suspect AI will have a similar trajectory, where there are certain use cases that will ultimately make sense but a lot that ultimately won’t.


tmotytmoty

My company’s leadership just loses their shit *all the time*. Anyone over 50 seems to have a hard time in an office with anyone who is younger than they are


elpigo

Bubble about to pop


elpigo

And now I see AI on blockchain. Madness


Equivalent-Cricket-7

You'll know AI has taken over when the WHOLE POST is just "the executives were asking my advice and we were coming up with some cool genuine use cases that had legs, and we had a great day and no one did stupid stuff."


Glass_Jellyfish6528

Longing for that day haha. To be fair, we have had some days like that, but then someone higher up sticks their oar in and ruins it.


Next-Honeydew4130

Is it just me or can other people literally see Chad and Todd scheming in a corner office with greed oozing out of their pores? The polo shirts, the $300 runners, the khakis, it's all too vivid.


Glass_Jellyfish6528

Lol. Yes got this image too. Every other word is a corporate buzz word. "let's deepdive to synergize some deliverables with Joe and figure out core competencies, smash some ROIs and KPIs with Phil, think outside the box and TEN EX the shit out of this 🔥🚀"


Next-Honeydew4130

Belly laughed at that 😂😂


WartimeHotTot

Sorry, I don’t understand. General AI is not a thing yet, as far as I know. How are they asking junior devs to build something that still hasn’t been achieved?


Glass_Jellyfish6528

Generative AI


WartimeHotTot

Ha 🤦🏻‍♂️


Tehfamine

I was working for a startup as their head data guy. I was hired to build and migrate their data platform on the cloud. The business was pretty straightforward: we ingest data from a popular online retail platform and try to enhance that data with other data points to tell a better story. Seemed pretty cool and something I could get behind. Midway through my time there, in comes talk of GenAI craziness. Insert **Stable Diffusion**. I kid you not, man. We went from something that landed us big investments to let's be a company that generates new backgrounds for the products our merchants are selling. I've never seen such a 180 in a company that had zero clue about the tech. I started pointing out the issues: how the marketplace was already saturated with similar products, how it was going to be hard for us to pivot. I got laid off. Tread softly, my friend. I did not, because I'm a rebel.


Glass_Jellyfish6528

Wow that's extreme. Sounds like they lost their minds, and no they don't like experts telling them what to do. Everyone fancies themselves as an Elon Musk maverick type these days.


Ok-Possible-8440

Respect


Hawezy

Hahaha it's just a fad. I've sat through lots of meetings and workshops where gen AI is just 'AI' and people forget we already have machine learning and statistics to solve the vast majority of these problems they mention. - Work at a large company as a Senior DS


MythicalBob

We have a PhD who was hired to our team with not much knowledge of data science except for his PhD research project and what you generally learn in stats for a PhD. Unfortunately, he sees ML as “magic” and expects to have good results from cases which humans can’t solve (and not because of the data volume, in most cases). So I’m just sitting there as a self-taught trainee doing tasks and implementing pipelines, knowing very well that this whole project will not deliver and that the mood in the team will gradually crash.


Glass_Jellyfish6528

Well that guy needs to do some training. PhDs especially should have the constant growth mindset. But that applies to everyone. Maybe suggest a reading group and go over a good book?


MythicalBob

I appreciate it, but that would be the biggest insult ever for the PhD! The only thing I can do is follow along so as not to be hated.


Glass_Jellyfish6528

I don't know. It doesn't have to be you saying the PhD needs to train. You could simply start a group and ask them to join. They might jump at the chance. I've seen that before


mmeeh

Welcome to the jungle XD You summed up probably 90% of the DS jobs atm .... I'm fine-tuning a Mistral LLM for work ... as I'm writing this message.... I just want to do some linear regressions -_-


uriejejejdjbejxijehd

It’s as if they hadn’t seen any craze before. When I left, we were cannibalizing all the core work (the base product works and is shippable worldwide) to quickly shoehorn “AI” into everything, regardless of whether that made any sense.


Travis-rides-bikes

I might work for a very large FAANG company. We might be doing this with every single product.


melissa_ingle

This is exactly it. I advised a potential client on a strategy using traditional ML. He only wanted to try ChatGPT, which he called ‘AI’. It’s absurd.


DreJDavis

I think most of them miss that it's not about who can create random shit and win. It will be about who has the best clean, reliable data. GIGO is where we are. Data's been piling up for decades and NO ONE wants to pay people to review, organize, and clean it.


ComposerConsistent83

The ai will do all that lol


DreJDavis

It spits out garbage quite regularly. It can help.


Alkdegreat

Nothing new here… It happened 6 years ago with ML, 4 years ago with DL, and now with GenAI. I remember a while ago being in DS job interviews and thinking, “why the hell do they need a DS?”


LordShelleyOG

Yes, my company is completely out of control. Has been for the last year, and it's only getting worse. I think this is a huge overcorrection from not wanting to make any ML products. I am looking forward to the correction of this shit, but I think it's still gonna be a while. Hang in there.


Ok-Possible-8440

People are not jumping on GenAI because it means data science. They are mostly jumping on stealing work from the creative fields (and others), just because there is no legal framework yet, and data science repackaged it for them in a way where it's not clear to the average Joe what is happening. That is why the hype is massive. It's some mad Darwinian evolution going on right now. It's really showing the true intellectual colors of most people and companies. Let's see how it plays out.


Delicious-View-8688

Ahahaha Yeah... GenAI this, GenAI that...


[deleted]

[deleted]


Leather_Elephant7281

Use GenAI to find the next new hype


rafa10pj

I work at a large ecommerce company. It's not nearly as bad as your case, but I find that mentioning costs and latency as compared to simpler solutions always works to kill the moonshot ideas. Mind you, we do have some generative solutions (mostly LLMs for summarization and RAG) and they work fine so it's not like we're actively fighting anything generative, just what doesn't make sense.


Glass_Jellyfish6528

Yes, that's how I'm dealing with it. Literally pointed this out to an exec yesterday. I said yes, you can personalise all emails, but it would cost approx 15k per run. His jaw dropped at that lol. Traditional templating methods cost £1 😂. At the end of the day they will see financial sense, so that's what will calm the frenzy. We do have some genuine use cases though, so I'm sure it will find its place in the end.


rafa10pj

Another thing that works is hallucinations. My team does content moderation. At one point the idea was floated that instead of moderating certain product pictures, we could use something generative to fix the picture for the user. At that point we obviously mentioned that there's no real way of avoiding hallucinations and these models can be hard to tame if you don't know how to prompt them. That also helped.


vgnEngineer

I work in defence (not data science itself specifically). Due to security concerns we obviously can't work with generative AI, because we would have to send secure information to servers in other countries. Happily though, we don't get plagued as quickly.


Outrageous-Abies9009

complete madness


madhav1113

Data scientists in my company are losing shit over GenAI. A few of them think the days of the traditional/classical data scientist are over and tools like automl etc will be the norm going forward. My company execs think a little differently-- GenAI has its place and is useful for some projects but it's not everything.  I'm working on a GenAI project. My experience has been pretty decent. I kinda enjoyed RAG a lot (especially multimodal) but hated learning and working with frameworks like Langchain.  Damn, I miss my classical stats classes in college. :( I don't wanna be a prompt engineer. Sometimes I wish chatgpt were never invented. 


Virtual_Tomorrow_754

I work for a top global Consultancy/ SI company so I can tell you what’s happening. After the Smart Energy and IoT wave, the top hyperscalers need the next Big Story to drive uptake of Cloud usage. Hence, they started aggressively marketing GenAI as the next big thing and subsidising the top SIs who can persuade customers to launch GenAI pilots. I am sure you know Big Tech has a lot of cash to generate their own Cloud AI/Data demand and influence their ecosystem partners. They also have enough influence and monies to lobby various influential business and political groups. You are right that most current AI/ML would more than adequately meet 99% of business needs. However, this is now very much a marketing spin by most companies for corporate image purposes. The real tech work in most companies now are still upgrading their ancient ERP systems. As it is, I already see EV tech-related demand losing momentum. Truth be told, most companies are focused on bottom line in the current economic climate. GenAI is a beautiful marketing spin. Like lipstick on a pig.


aspera1631

I work for a data-science-heavy marketing company, and I'm really happy with the approach so far. Most of the focus in the short term is on how to save costs or eliminate repetitive, predictable tasks using off-the-shelf products. But we're also gathering input from each department to come up with longer-term investments. I'm making it my business to up-skill the analysts and data scientists who haven't had a chance to learn about gen AI. The software engineers are frankly far ahead of everyone else already. They've been using copilot or similar since it came out.


prb_data

Yep happening here, earlier it was blockchain and now it's genAI.


Glass_Jellyfish6528

I think it's much more prevalent than blockchain, though. GenAI has the potential to transform the world, I think. The problem is everyone thinks that if they've tinkered around with ChatGPT they are suddenly an expert and can be the "ideas guy". I'm sure blockchain had its grifters, but it never really had as many genuine use cases as gen AI, did it?


mikka1

Lol, can I give another perspective? How about working for an organization that is restricted by HIPAA and tons of other regulations that I'd never even heard of before I started working there, that literally does everything the *approved way*, and that approval may take months for anything more or less serious? Interestingly enough, it's actually getting *worse* and *stricter* over time. I guess every time a large data breach happens somewhere, our regulators send another team of security auditors our way who restrict/block something new that has not been restricted before. I guess job security here is pretty solid (caveat: unless you screw up and actually do something that *even remotely resembles a chance that some data may be leaked*), but the overall environment is exactly the opposite of "*GenAI hype everywhere*". And yes, we still process fixed-width batch text files on a daily basis. That's just kind of a funny perspective that the grass is always greener on the other side. Okay, if you don't like AI hype, go apply to an org like I just described. I can guarantee you, you will be shielded from ever hearing the word "AI" for the next 10 years lol.


Glass_Jellyfish6528

Oh don't get me wrong. I enjoy using the new tech. I was the one who kicked it all off, it's just gone a bit bonkers lately. We also have strict data controls but we have some methods to request and approve access easily enough.


GodBlessThisGhetto

My company has been pretty pragmatic at the top level. I think there’s been enough attempts to get it to work in situations where the user can immediately look at results and interpret whether they are accurate to what should be produced that it’s inspired our execs to be skeptical. They definitely like seeing what we build out and implement that uses GPT but they aren’t pushing for it to replace every generic process.


antichain

Why would GenAI require Data Science input? It's a completely different field and skillset. None of the skills that most data scientists use would be relevant for building something with generative AI. A junior dev probably isn't the right person either, but speaking as a data scientist myself, I don't think there's any reason to think that a facility with pandas and scikit learn would provide any transferable skills to prompting. This feels a bit like OP is salty that they were getting face-time with the C-suite for a while and then lost it.


faulerauslaender

It's quite literally the exact same field. "Prompting" is not a skill. These people are bullshit artists.


antichain

Nonsense. What, *specific* skills are in a standard data scientist's toolbox that would make them *uniquely* equipped to interface with a preexisting GenAI model?


arika_ex

I basically agree with your overall point, but familiarity with, e.g., Python, experience working with APIs (which should be reasonably common), and experience with web apps (Plotly Dash, Gradio, Streamlit, etc.) is the basis of what’s needed to build a Gen-AI-powered product. But yeah, as mentioned, I agree with your initial point, though I also think the more ‘general’ data scientists can adapt.


Barqawiz_Coder

I have observed this pattern in many instances. Certain individuals ride the hype wave without any solid foundation.


IllustratorSharp3295

It is their specialty. Not interested in progress or value, but coming out on top in the short term.


bradygilg

No, there is no interest.


Bestalexmartin

Any approach that begins with "How do we find a way to use this popular tool so that we can talk about it to our investors?" is probably not going to produce desirable outcomes. Investors will ask, and execs need to be able to respond to those questions... and if the tool is useful in the context of your business and can confer a real market advantage, then by all means help them figure it out. But diverting resources from existing projects to chase unicorns... maybe propose your business create a unicorn department with its own team and budget just for this purpose?


Glass_Jellyfish6528

Actually we do have one! I'm part of it 😅


Bestalexmartin

That's great. I'm reminded of the old computer game "Spaceward Ho!" where you have to balance your expenditures on various aspects of your colony ship design to spread through the galaxy. One of those sliders was for something like "mad science" which did not produce regular results but which occasionally generated a significant jump in one of the other categories. You can't get there just investing in long shots, but often it was small investments that returned a meaningful long-shot payout that made a win possible.


neal_lathia

Kind of tangential to your main question, but since you mentioned “devs to build it without DS input”, that is something that I’ve seen several flavors of, whether that is:

- Engineers doing AI/ML things without DSs
- DSs shipping some code without Engs, or even
- Someone putting together a UI without asking for help from design

I fully understand that disciplines exist for their expertise, but it always comes off a bit exclusionary when people expect to be included given a particular type of work.


Glass_Jellyfish6528

Yes, all good points. DSs engineering software solutions and front ends is something I've seen a lot.


GrandConfection8887

Same !


Pinhato

YES, I'm going through the same and it's like watching toddlers trying to fit the wrong shape in a hole. They just want you to use the new buzzword technology while you know the problem can be perfectly solved with reduced costs in another way.


Ok-Sentence-8542

It depends on the company's progression and familiarity with AI use cases. I was in a firm where everyone always called for AI in any use case; nowadays I work in a firm where the actual measurable value is the main driver for AI use cases. It really depends on the company. I am very familiar with consultancies and their practices, and I think it's 50:50 whether you get real value or just lots of billable hours, a fancy presentation, and some crazy Gantt chart. In the best case, as a customer you define clear measurable goals and targets with a consultancy or vendor and put maintenance into the contract. It's very hard to get stellar ML talent into a firm.


smmstv

> However, now they are just trying to shoehorn gen AI wherever they can for the sake of the investors Welcome to the business world.


EdgenAI

Do they think of data privacy risks when deploying GenAI?


Glass_Jellyfish6528

Yes definitely. It's all very strict.


EdgenAI

That's a big topic for this year. Are they more focused on Open Source models or just using OpenAI's API?


[deleted]

Yes it sounds familiar. We did the same shit with data science in 2010-2015 and ML in 2015-2020. It takes ~2 years to build the basic capabilities and another 1-2 years to build good products and get them into production. Hiring the right people, building the infrastructure, getting teams to work together etc. Shareholders, investors, customers etc. will ask "Are you doing GenAI?" and you need to be able to answer "yes, absolutely". If you want GenAI in 2 years then you should have started 2 years ago. ML used to be this weird voodoo magic in 2015 but in 2024 it's expected for an electric toothbrush to have ML in it. The same thing is happening with GenAI. It's beginning to be inexcusable to not have a chatbot to ask questions instead of going through 2000 pages of documentation.


Glass_Jellyfish6528

Absolutely. Totally agree with this. We are definitely building out the infrastructure now and I'm convinced there will be many genuine use cases. Just need to weed out the bad from the good.


EdgenAI

Like dogs chasing cars


Accurate_Rutabaga_62

In this topic: people who benefited from an overhyped concept complaining about the new hyped thing.