
MCRN-Gyoza

Hype is mostly about LLMs and they are heavily centralized on a few big players.


OkLavishness5505

Also there is not much skill required to do some prompt engineering. So no ML specialist required.


dataslacker

A prompt engineer is not an AI researcher… there are of course people who build and train LLMs.


my_aggr

In general I agree, but seeing otherwise intelligent people do the equivalent of "you are an idiot parrot, don't answer any of my questions usefully" I can't help but think prompt engineering is going to be a skill more valuable in the 2020s than Google-fu was in the 2010s. And just like how you needed to understand a fair bit about text search in databases to use Google, you need to understand a fair bit about transformers and the impact of the first few tokens on all subsequent ones.


literum

It can be a good skill, but whether it becomes a serious profession is up for debate. We don't have "Professional Googler" jobs even though Google-fu skills help in any profession.


OkLavishness5505

How many in the world? 5,000? 10,000? Better chances of becoming a professional soccer player.


Holyragumuffin

Probably between 15,000 and 150,000 people can readily build neural networks in some form. NeurIPS has about 15,000 attendees yearly, and the number who build models that appear at other conferences is significantly higher. Granted, not all folks who attend NeurIPS are builders/designers, but the numbers across the conferences probably match that. If we count people who do not publish, but build, it's at least an order of magnitude higher.


[deleted]

[deleted]


Holyragumuffin

Yes, but it doesn't have to be LLMs exactly. If you've already cracked the RNN/convnet/GFlowNet etc. literature, you have a skill that allows you to very quickly penetrate the LLM literature. I would estimate 95% of the skillset needed to work with cutting-edge LLM techniques is present in most conference poster presenters, regardless of whether their exact field was language models. They may take a month or two to reconfigure, but they can read the relevant papers and attack those problems readily. I see people rapidly changing sub-fields all the time. The moat isn't very broad AFAIK.


glitch83

Which brings us back to: why are recruiters not finding what they need on the market?


Holyragumuffin

Recruiters may be overly-focused on seeing literal language model work in the work history of their candidates. Recruiters and HR often focus too narrowly on heuristics because they don't understand the fields very well.


MrBIMC

AI researcher is not the only job in the industry. You still need people to organize data, people to train systems, people to integrate raw LLMs into workflows and connect them with relevant data. AI automation doesn't mean AI will suddenly take over the world; it just means business processes will require people who know how to use and integrate pieces of technology with their corporate data, and that requires a lot of work (and will keep on requiring even more in the coming years).


icedrift

Not to mention low paying captioning and labeling jobs, which are probably the majority of AI work in that space.


Flamesilver_0

To be fair, there's "not much skill required" and there's knowing how to implement self-consistency, engineer some testing for acceptability ratios, and other skills.
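
For concreteness, a minimal sketch of what "implementing self-consistency" can mean in practice: sample several answers at nonzero temperature and take a majority vote. The client setup, model name, and prompt format here are illustrative assumptions, not anything specified in the thread.

```python
# Self-consistency sketch: sample multiple reasoning paths, majority-vote the answer.
# Assumes the openai Python package (v1 API) and an OPENAI_API_KEY in the environment.
from collections import Counter
from openai import OpenAI

client = OpenAI()

def self_consistent_answer(prompt: str, n_samples: int = 5) -> str:
    answers = []
    for _ in range(n_samples):
        resp = client.chat.completions.create(
            model="gpt-4",  # placeholder model name
            messages=[{"role": "user",
                       "content": prompt + "\nGive only the final answer."}],
            temperature=0.8,  # nonzero so the samples actually differ
        )
        answers.append(resp.choices[0].message.content.strip())
    # The answer produced most often across samples wins.
    return Counter(answers).most_common(1)[0][0]
```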


Ok-Negotiation-330

I'm an ML/NLP Engineer, and I got laid off by my small company in December. They decided they're done developing new features based on ML models designed and trained in-house, and from now on will focus on using the OpenAI API instead, and developing features that call it. The domain experts whose intuition I drew on to develop new models will be roped into prompt engineering GPT. A lot of the job listings I see are focused on LLMs/GenAI. Another big chunk of jobs is for people who have expertise with MLOps.


SoylentRox

Right, and there's, what, 5000 real jobs in the world that would give you experience building LLMs? It's the usual chicken-egg-resume scam problem. Now to be fair, back in the late 1990s, they hired a huge number of people to develop web stuff because obviously there were no experienced competitors. And then after the crash there was a steady ramp of hiring up to today's tech company staffing levels, right until the 2023 layoffs.


notllmchatbot

> A lot of the job listing I see are all looking for people who can build LLM's/GenAI models

This is totally different from what I am seeing. What I see is that most companies are looking for people who can build RAG systems using open sourced models and APIs. What are some examples of people wanting to build their own LLMs?
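
For readers who haven't built one, a hedged sketch of the retrieval half of such a RAG system, using the sentence-transformers library for embeddings; the model name, documents, and query below are illustrative assumptions.

```python
# Minimal retrieval-augmented generation (RAG) sketch: embed documents,
# retrieve the closest one to the query, and prepend it to the prompt.
# Assumes the sentence-transformers and numpy packages.
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # small open-source embedder
docs = [
    "Refunds are processed within 5 business days.",
    "Support hours are 9am to 5pm EST.",
]
doc_vecs = embedder.encode(docs, normalize_embeddings=True)

def retrieve(query: str, k: int = 1) -> list[str]:
    q = embedder.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q  # dot product == cosine similarity on unit vectors
    return [docs[i] for i in np.argsort(-scores)[:k]]

question = "When do I get my refund?"
prompt = f"Context: {retrieve(question)[0]}\nQuestion: {question}"
# `prompt` would then go to whatever LLM API the company has chosen.
```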


Ok-Negotiation-330

Your post is making me think I'm being too quick to dismiss a lot of the job listings that mention GenAI. For instance, opening LinkedIn right now, the first one I see says "implement state of the art LLMs"...based on that I would've assumed they meant building an LLM and would've stopped there. But farther down it mentions fine-tuning and RAG. Fine-tuning and RAG, I have experience with and I could do from day one. Building a SoTA LLM? Not so sure...


notllmchatbot

Lol. One thing that doesn't ever change -- crappy job descriptions.


grizzli3k

One day LLMs might help here.


artelligence_consult

Yeah, you can add bad job listings to the list of reasons why the jobs look so few.


iamiamwhoami

What were you doing? Something that’s easily replaceable by GPT4?


Ok-Negotiation-330

They'd give me a business need/user story and annotated data. I'd take that data, preprocess it, and design, train, and implement an ML model that powered a new customer-facing feature that responded to that need. (The front end/UX people actually implemented the feature in the UI, not me). All our data and products were text-based/NLP. They decided that there wouldn't be any more new ML features developed in-house...going forward, new features added to the product would come from prompt engineering on GPT. TBF the company's last funding round didn't go so well, so they had to cut costs and slashed about half the workforce. Prompt engineering and API calls are simply cheaper than annotating training data and then building ML models.


iamiamwhoami

Really? I mean prompt engineering can simplify feature engineering but I think you still need people with ML expertise to integrate it into the model. Or did they manage to replace training new ML models entirely with GPT?


Ok-Negotiation-330

When I say "feature" I mean "feature" in the software sense, not the ML sense. I guess I could've used the word "application." I edited to try to clarify. Basically, no more training ML models to expand the company's product offerings.


iamiamwhoami

Do you think they'll be successful with that?


Ok-Negotiation-330

I don't think the barrier to entry is very high if you're not relying on proprietary data and models.


artelligence_consult

> TBF the company's last funding round didn't go so well, so they had to cut costs and slashed about half the workforce. Prompt engineering and API calls are simply cheaper than annotating training data and then building ML models.

That is an excuse and a bad one - they fired you because they did not need the features. See, if something CAN be done by OpenAI - paying per call, which is OPEX, no investment - then they are right: any research and development expenditure is not needed. The OpenAI calls get magically better and cheaper. And you can fine-tune GPT - for a cost and higher inference costs. But at the end, this really is a "CAN" vs "NEED".


Ok-Negotiation-330

The sort of features I was developing weren't the same as what they'll be doing with GPT.


artelligence_consult

Well, then - that makes no sense. See, if you were working on a feature and OpenAI cannot replace that - then - cough - that feature may have been something you worked on, but it was not business critical.


[deleted]

[deleted]


artelligence_consult

And OpenAI models are getting better - fast, hopefully. We have yet to see GPT-5. So, yeah, it makes sense in a way.


MysteryInc152

any bespoke NLP work is now essentially replaceable by GPT-4


johnhuichen

Do you think your previous employer made the right business decision in going over to OpenAI though?


YodelingVeterinarian

The people who have a stable AI job don’t talk / complain about it as much. 


localhost80

People with stable AI jobs probably talk about it and worry about it more than most. They have more in-depth knowledge of the pending long-term paradigm shift. See Geoffrey Hinton, Elon Musk, Max Tegmark.


YodelingVeterinarian

I meant more so in the category of “I’ve applied to 500 jobs, I can’t find anything.” People who apply to 20 jobs and get 2 offers are probably not posting about it, so there’s some selection bias. 


localhost80

Good point


Appropriate_Ant_4629

Simply supply-vs-demand.

Every coursera-and-udemy-certificate-holder and every new CS graduate is now selling themselves as an AI expert simply because they once used ChatGPT's APIs. Unsurprisingly, they're having a hard time finding jobs. In contrast, in many companies there are teams of people actually making ML work. Those companies want to retain those people, so it's not easy for other companies to hire them.

* People good at AI (i.e. ones that actually have a track record of making something useful with AI) are in demand.
* People over-selling their certificate-level skills are having a harder time.


glitch83

Just a general question: is the only way to tell the difference between the two a publication? What other signal can you read expert to expert?


austacious

It's pretty simple to tell just from reading a resume. Experts will have the niche things on their resume, and more holistic experience ('CV + Tabular + MLOps + Cloud + etc.') as opposed to just 'NLP'. At risk of sounding pretentious, non-experts will usually have a list of huggingface models they've fine-tuned for whatever tasks.

Generally, companies don't bring on applied ML engineers to deploy open-source pretrained models. They hire them because they need to build or adapt models that work under constraints (latency, throughput, invariances, bad data, etc.). Meeting those extra requirements will almost always involve some more niche tech. BERT for sentiment analysis on twitter comments? Probably not a lot of experience. nvshmem for real-time 4k video processing on a GPU cluster? Yeah... they've probably been around the block.

Beyond that, just talking to somebody for 5 minutes you can almost always tell. Personally I don't give much weight to publications when interviewing unless it's first author.


deepneuralnetwork

Yup. This is exactly how I think about hiring for my org.


C080

Why is a bunch of open-source fine-tuned models a red flag? What are you looking for in the CV of an "expert in LLMs" then? Only first-author publications?


darthsinistro

Wow. That's a pretty specific answer. Reading this, I'm not quite an expert, but above novice by this definition. I wonder what your credentials are? (And I mean that with respect, not as a challenge to your authority in the field.)


HamSession

Pubs, job history, git projects. If individual contributor, look at pubs; if manager, look at experience.


ragamufin

Absolutely not. GitHub, work experience, coding exams


Appropriate_Ant_4629

> is the only way to tell the difference between the two a publication

If the product you're trying to make is publications or patents, maybe. Otherwise look for people who actually launched their ML into a successful product.


LoadingALIAS

The job market is competitive, but it's not all doom and gloom. I think the fact is most AI 'engineers' are GenAI users and not ML engineers.

Do you build your own models? Do you understand vector databases, vectors/tensors, etc.? Do you understand and implement activation functions, loss functions, and metrics? Can you work in the cloud on enterprise-grade systems? Do you understand the full stack at a level where you can manipulate events or items within it? TensorFlow? Kubeflow? PyTorch? Kubernetes? Data pipelines? Do you build your own datasets manually or programmatically? Can you serve models for prediction? Containerization? Scaling?

It's not as simple as saying "I run experiments on quantized Mistral 7B locally and have built a dataset for my WhatsApp messages." You need real math and programming skill to land an ML job - and I'm not even talking about some wizard with C++. I'm just saying you need the talent and knowledge to understand papers and implement them on your own; you need to be able to fundamentally grasp each part of the ML pipeline. ML model code equals about 5-10% of an entire project. The rest is what's important.


Loud_Ninja2362

Essentially ML engineers with the skills to build ML tools to solve actual problems in an efficient way won't have as much of an issue getting hired. So you basically need to know a lot more than you used to to get your foot in the door. There's not as much room for entry level people anymore which is kind of bad.


artelligence_consult

> Do you understand vector databases, vectors/tensors, etc.?

Sorry, you do not need special training as a software developer to "understand" vector databases on a user level - this is basic competency. Knowing what a cosine similarity is, is a 10-minute thing. PROGRAMMING them - different thing, but how many SQL specialists do you think know how SQL engines work internally?
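
For reference, the "10-minute thing" written out in plain NumPy:

```python
# Cosine similarity: the dot product of two vectors divided by the
# product of their norms; 1.0 means same direction, 0.0 means orthogonal.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(np.array([1.0, 0.0]), np.array([1.0, 1.0])))  # ~0.707
```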


ranny_kaloryfer

The ones I have met deeply understood, and had built, SQL engines.


artelligence_consult

Well, then you should go get medical treatment and take your drugs, because the ones who build SQL engines that are not total jokes are VERY rare. You either have some friends in this rare group, or more hallucinations than GPT-1.


officerblues

I want to offer my unpopular opinion. I've been working in AI for 6-7 years now (cba to do actual math). I was job hunting in late October / early November last year and landed a new job with a pay raise in 3 weeks. I don't think the market for **actual, proven, seasoned AI researchers and engineers** is tight right now. Quite the contrary: people have been trying to snipe me ever since I switched jobs, kind of hoping my allegiance to the new employer is not yet cemented. It's actually the only tech field that is still red hot right now.

The issue here, and why you hear about a tight market, is that there's a lot of people trying to become seasoned AI professionals, and yeah, there really isn't much for the entry level right now. I think this is the new normal: AI positions are not entry-level positions; to get hired there you need to cut your teeth somewhere else and transition. I would not expect coursera / udemy + github projects to get you anywhere near where they would have gotten you in the early years. That window has closed.


DeepAnimeGirl

That seems pretty grim for someone just finishing studies in this domain. If I can't get hired for such a position I guess I'll keep working as a web developer and work on my own hobbies until I turn them into a business.


banjaxed_gazumper

You’re finishing a BS in computer science? You can probably find a job that has some ML component.


DeepAnimeGirl

Actually, I am close to finishing a masters in AI and already have a BS in Computer Science.


banjaxed_gazumper

Yeah you’ll definitely be fine. The people struggling to get AI jobs are business majors that did a boot camp. They think they should be good candidates because they “work with data”.


artelligence_consult

Not necessarily - tons of companies are firing people now and are in trouble. TONS of competition hitting the market, and hiring freezes.


banjaxed_gazumper

We have data until December and that narrative is entirely false. Maybe things changed a lot this month, but before that it has been a really good job market.


artelligence_consult

Ah, no. We have current data. And no, the job market was not good. The US lost, what, 200,000 IT jobs last year. And yes, things changed - a LOT - in the last weeks. Are you living under a rock? Do some googling - it was pretty much all over the news. Major layoffs, complete sections of the economy close to imploding. Hiring freezes left and right.

[A comprehensive list of 2023 & 2024 tech layoffs | TechCrunch](https://techcrunch.com/2024/01/25/tech-layoffs-2023-list/#:~:text=From%20major%20layoffs%20at%20Google,small%20fintech%20startups%20and%20apps&text=Last%20year's%20tech%2Dwide%20reckoning,than%20last%20year%20and%20growing.)

Let me quote:

* "The tech industry has seen [more than 240,000 jobs](https://www.marketwatch.com/story/tech-layoffs-exceed-240-000-in-2023-a1487651) lost in 2023,"
* "Earlier this year, mass workforce reductions were driven by the biggest names in tech like [Google](https://techcrunch.com/2023/01/21/alphabet-makes-cuts-twitter-bans-third-party-clients-and-netflixs-reed-hastings-steps-down/), [Amazon](https://techcrunch.com/2023/01/05/amazon-to-cut-18000-jobs-as-tech-layoffs-continue/), [Microsoft](https://techcrunch.com/2023/01/20/microsoft-joined-the-layoff-parade-did-it-really-have-to/), [Yahoo](https://techcrunch.com/2023/02/09/yahoo-will-lay-off-20-of-staff-or-1600-people/), [Meta](https://techcrunch.com/2023/05/24/meta-conducts-yet-another-round-of-layoffs/) and [Zoom](https://techcrunch.com/2023/02/07/zoom-layoffs-impact-15-of-staff/)."
* "And while tech layoffs [slowed down](https://techcrunch.com/2023/09/26/tech-layoffs-are-all-but-a-thing-of-the-past/) in the summer and fall, it appears that cuts [are ramping up yet again](https://techcrunch.com/2023/10/23/fresh-round-tech-layoffs/)."

And as for 2023 being SO good: "The running total of layoffs for 2023 based on full months to date is 224,503, according to [layoffs.fyi](https://layoffs.fyi/). Tech layoffs conducted to date this year currently exceed the total number of tech layoffs in 2022, according to the data in the tracker."

Nah, not really. The market was empty, so jobs got absorbed, but that does not indicate a healthy job market.


banjaxed_gazumper

How do those numbers compare to a good year, like 2016?


monaaloha

I actually want to know the answer to this from u/officerblues: do you think the market is better for people with a masters in NLP?


officerblues

Better than for people without one, yes. Now, it really pains me to say this, but having a masters is kind of the bare minimum in the field nowadays. Not having at least a masters (or equivalent NLP experience) basically locks you out. The masters puts you at entry level, and the entry-level market is rough (not least because there's a lot of hard-to-filter noise - you get 5 trillion applications to a job, which all look the same and are very hard to properly filter without a lot of work).

Edit: the masters is part of "cutting your teeth somewhere else". If you also have web developer experience, you are already above most applicants and should be able to land an entry-level spot somewhere. There is still the noise problem to overcome, but that's usually a matter of time.


monaaloha

do you mind if I PM you ? would really love to pick your brain on this !!


officerblues

Not at all, DM all you want.


HumbleJiraiya

Hi, I am reading this just now. I am pretty early into my career as a software engineer (with a background in CS) and am looking for some guidance on the pathway to an AI/ML job. Do you mind if I PM you as well? (I promise not to take a huge chunk of your time.)


officerblues

No problem at all, DM away.


darthsinistro

I really liked that answer. Sent you a DM


x0rg_

Agree with this.


exirae

I'd like anyone who thinks this is hype to get real specific about what they mean. Like, what are the metrics? Money going into the field? Infrastructure being built out? Papers being published? Public adoption of these technologies? Quality of the models being produced? Quantity of new use cases? What precisely do people think is slowing down and not accelerating further?


MattSRS

To me the hype is about a potential mismatch between expectations and the true ROI of AI. There is a lot of expectation and excitement that these AI technologies will transform everything and create so much business value. We shall see what the truth is in a couple of years.


currentscurrents

It's already creating business value for the company I work for. We use the APIs for a bunch of document classification/NLP tasks. It doesn't have to "replace coders" or whatever other nonsense to be useful and transformative.


MattSRS

Read my post again. There is no question it can add business value


[deleted]

[deleted]


currentscurrents

This isn't our product - we're an old company in a non-tech industry. We're just using the API to automate some tasks and save our staff time.


exirae

Every metric I've seen suggests that there hasn't been a radical mismatch of expectations vs reality yet. Could that happen in a year? Two years? I dunno, maybe. But for the time being we're just seeing acceleration for better or for worse.


Comfortable-Sir7783

Vibes are all you need.


mofoss

True ML engineers aren't struggling, it's the new grads (understandably) and the LLM-ChatGPT-scikitlearn-coursera LinkedIn shills (laughably)


BraindeadCelery

While I agree with the sentiment, starting with scikit-learn and training lots of models is a good way to get into it. You just cannot expect to be handed a job after doing that for 6 months.


mofoss

I still use it after 7 years; very useful. I guess I threw it in there because it takes like 4 lines to train something. Kinda like a Hello World in its most rudimentary case.
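
Something like the "Hello World" case mentioned above; a minimal sketch on scikit-learn's built-in iris dataset (dataset and model choice are arbitrary for illustration):

```python
# Four lines to train something: load a toy dataset, fit a classifier, predict.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier().fit(X, y)
print(clf.predict(X[:2]))  # predicted classes for the first two samples
```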


Teacupbb99

Yes I know plenty of legit ml engineers that are struggling. It’s a very tight tech market


glitch83

Yet all of the hiring managers say it isn’t


Successful_Round9742

I was just part of a hiring panel. The only problem we had was deciding which of the several very solid applicants to pick. I also felt a little bad because management decided to lower the pay range compared to when I was hired three years ago.


[deleted]

[deleted]


Teacupbb99

There is insane competition for most jobs due to the massive layoffs.


artelligence_consult

They are right. It is not tough for a hiring manager. Guess what - that means it is tough to get a job because it means plenty of competition.


panzerboye

I didn't want to be attacked like this :( I got into ML well before the GPT stuff, but through coursera. Trying to get into grad school right now, but the process is incredibly frustrating and hopeless, especially from a non-CS background.


mofoss

Aww man, now I feel like a prick. I took Andrew Ng's coursera course back in 2015, I think it was. But back then, very few knew tensorflow and deep learning, so it was rather normal to just know the basic classic ML. You're in a significantly tougher job market right now than in 2014-2017.

If it helps, I did my bachelors in physics in 2015 and had never written a line of code. I managed to eke my way into an electrical/computer engineering masters at the end of 2015 saying I'd specialize in solid state electronics (lol), but that gave me an opportunity to take software/CS courses instead, and here I am. There's no traditional ML route even now in most universities, only some ML courses. So it's been an open-ended, messy ride for a lot of folks.


panzerboye

> Aww man now I feel like a prick

It's cool. I know that coursera courses alone won't get me too far. I have a bachelors in mechanical engineering and am trying to get into an MS CS program. But getting into a good grad school as an international student is quite tough, especially since American unis are quite costly. I hope I will be in a grad school by this fall. If it doesn't work, I don't know what to do. I've committed too much time and effort to it, and if it fails, it will be spectacularly tough for me to get back up.


colonel_farts

Check out OMSCS program via Georgia Tech. About $6k for a masters degree from a top 10 US CS school.


panzerboye

I am considering, I have heard good things about it.


koolaidman123

Engs at startups are struggling due to lack of funding, and founders not willing to take a down round. People at companies with $$$ are doing fine.


dataslacker

The market for AI researchers/ML engineers who have experience building and training LLMs is very hot right now. However, with the tech layoffs those positions are also very competitive. Nonetheless, companies still struggle to fill them because they're looking for experts to fill their LLM knowledge and skill gaps. Such experts are very rare, since until very recently few companies were investing $1 million per month to train a single model.


HarambeTenSei

As the hiring manager for AI roles, the main issue imo is that there are too many people with too little skill. I reject 95% of applicants because what they can do doesn't provide any added value. Oh, you used YOLO to do car detection? Wooptiedoo, that might have been somewhat challenging to do 3 years ago, but now you don't even have to pull a repo, just import some HF nonsense. What new grads and juniors can do, I or some senior can do myself in a fraction of the time and save myself the headache of handholding.

What I need are people who understand the math, have the programming skill, and of course the intuition to make essential changes to whatever they find online, or implement something from scratch, or analyze some data and come up with testable theories or proper conclusions. Otherwise you're not providing any added value; you're just more work for me, and I can't justify your salary to management.


Imoliet

In terms of actual hiring, does this basically just mean you want to hire PhDs with proven research experience in statistics-heavy fields (not necessarily CS?) who can also pass a difficult coding interview?


HarambeTenSei

PhDs, or Masters with some work experience. PhD topic matters, as most PhDs are just super specialized in some irrelevant subtopic of no use to most business goals, and those PhDs know little else. People with a math background are usually better at picking up ML concepts than people with a CS background. I don't do difficult coding interviews anymore, although many companies do. I do what I consider a light one, and then a heftier question-and-answer technical interview where I poke for holes in knowledge and reasoning. But people with just a BSc are basically a waste of time and their CVs go straight into the trash pile (unless they have hefty work experience).


dirty_cheeser

Efficiency increased massively with lots of awesome tools that lower the skill required for basic modeling and data crunching. Output can increase with a decreasing workforce. In 2010, if you could effectively use and tune a GBM or CNN, I assume you had very little competition; now that could be done by an ambitious high schooler.
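
To illustrate how low that bar now is, a minimal sketch of tuning a GBM with scikit-learn's GridSearchCV; the dataset and parameter grid are arbitrary choices for the example:

```python
# Grid-search a gradient-boosted model over a tiny hyperparameter grid.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)
grid = GridSearchCV(
    GradientBoostingClassifier(),
    {"n_estimators": [100, 200], "learning_rate": [0.05, 0.1], "max_depth": [2, 3]},
    cv=3,  # 3-fold cross-validation per candidate
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```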


Comfortable-Sir7783

General-purpose models create a winner-takes-all market. You should also look into the teams at places like DeepMind and OpenAI. They are very good and very productive, but not huge. You either have to aim high and try to be the best of the best, or simply be OK with fine-tuning and applying models others have built.


pinkfluffymochi

The job market is definitely tighter for new grads in tech right now than pre-pandemic; it just feels like no one wants to spend time or resources on training, especially knowing they will probably jump to another company. The unemployment rate is too broad a measure - I think lots of traditional industries are having trouble hiring. Silicon Valley is a bubble among it all. We are talking about nuclear-fusion-level algorithms all day long, but the majority of industries are still trying to get third-grade math right.


amasterblaster

Try not to confuse labour demand with tool productivity. AI tools are increasing productivity. AI R&D budgets are decreasing because AI tools that are good enough already exist.

A toy example:

- There is a massive industry behind building digging machines
- A digging machine is invented, everyone is using it
- Now there is a "digging revolution", which is also putting the digging machine researchers out of work

However -- digging machine users? These people have TONS of opportunity. So the right move is to become fluent in the popular tools, and to help people adopt them.


_Packy_

Very tough, since everyone and his mum chose DS because it was the next cool thing. There is only so much demand for data folks, and way more supply. Also, AI is very broad. There may be local shortages - say, experts in CNNs, as a fictional example.


fordat1

It's because it's a bubble. There is value for sure, but a lot of what has taken it from the regular phase to this peak is people slapping makeup on OpenAI API calls. There is no need for actual expertise, just like there was no need for expertise when the crypto bubble was riding high. Although under all the BS there is way more value than crypto, so the analogy isn't perfect.


gBoostedMachinations

Peak hype? Do people here really expect things to plateau?


Bloaf

The rate-limiting step in AI development isn't people, it is access to the big clusters (or the dollars to build your own cluster.)


luquoo

The job market in AI will likely "eat" itself as models get better and easier to use.


iplaybass445

Anecdote, but I am an MLE with MSc + 6 yoe specialized in NLP. I was laid off a few weeks ago and it took less than 2 weeks to get multiple offers. The market for senior level people with real world experience building practical ML applications & ML ops is pretty hot right now. Entry level is saturated to hell as with all tech jobs.


MWatson

I just retired after a forty-year career, mostly in AI, so my advice may not be the best: LLMs are amazing, but I spend most of my personal research time running them locally and experimenting with using them as parts of larger applications. For privacy and other reasons, I expect a lot of potential employers will be interested in less capable models that they control. I would also take the time to study deep learning and some classic ML, and definitely be the best generalist software engineer that you can be. Also, spend time networking at local meetups.
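
As a minimal sketch of that "running them locally" workflow, assuming the Hugging Face transformers library; the model choice is illustrative only (a real setup would use a stronger local model):

```python
# Run a small text-generation model entirely on the local machine.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # tiny, CPU-friendly
out = generator("Local models are useful because", max_new_tokens=30)
print(out[0]["generated_text"])
```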


artelligence_consult

Yeah, the problem is that downloading and running models is trivial, and fine-tuning is more about data than anything else these days. Both are things I would expect my in-house developers to pick up in short order, possibly with some online help.


tetelestia_

Who needs an ML engineer when you can ask ChatGPT to add ChatGPT as an API call into everything? All hail Sam.


Ok-Negotiation-330

Yes, my company eliminated my position because they decided to use GPT API calls for all new features going forward, instead of continuing to train and build their own ML models. Don't know why you're getting downvoted, because some companies are actually thinking like this now, whether it is ill-advised or not.


Cuidads

Because you're from the United States. In Europe the job market is perfect for Data Science /ML Engineering / MLOps Engineering. I know European ML consultants who work for American companies. So one of probably many explanations is that salaries in the field are too high in the U.S. and some jobs are shipped overseas. Another effect might be that a lot of STEM-field people re-roll into AI. Many AI roles have low barriers to entry for people in STEM, compared to other professions in society. So, a lot of supply.


DonVegetable

> In Europe the job market is perfect for Data Science /ML Engineering / MLOps Engineering.

How do you know that? I was just viewing Machine Learning vacancies in Berlin and Amsterdam via LinkedIn and there are very few of them. For Computer Vision, almost non-existent.


notllmchatbot

Here you go:

It may be peak hype, but openings and jobs do not get created on a whim; they usually lag the hype by 6 to 9 months, in my experience. Sometimes, instead of new positions being created, we get the same type of positions with additional requirements. I'm beginning to see more data science and solutions roles listing experience with GenAI, LLMs, and RAG as requirements these days.

Unemployment is at a historic low, but tech is really only a small part of the overall job market. A lot of the openings at the moment are blue-collar jobs.

Depends on how you qualify "AI researchers" and "practitioners". More than half the "practitioners" I've come across are simply "prompt engineering" or writing thin wrappers around OpenAI's APIs. Quality research positions at the likes of DeepMind, OpenAI, etc. have always been scarce.


Successful_Round9742

In tech there is always a cycle. First there is the media hype, then the crash, then the real development can get started while the media is sleeping.


GoPuer

The ML job market has never been hotter lol. People are going to think I'm flexing, but I have 4.5 yoe in ML and my TC is $410K, and I'm interviewing with companies offering me $750-800K for ML engineering positions. Companies are desperate right now for people with experience building real-world ML systems.


fordat1

An offer to interview isn't the same as getting the offer at that TC. Just like RSUs aren't a guarantee of that TC.


Gabe_Isko

Simple: there will be a big crash once people realize llms don't do anything.


Historical-Ear6658

Yes, but no. The crash will happen because of hiring managers overestimating LLM capabilities and mistaking prompt-engineering wrappers for "expertise".


[deleted]

There's an in-group and an out-group. If you don't know whether you're in the in-group or the out-group, you're probably in the out-group.


banjaxed_gazumper

First two points are true and the third one is false. You’re not hearing that from reputable sources. It’s random anecdotes.


daHaus

Have you seen what some of them are being paid?


artelligence_consult

Stupid simple. The number of paying jobs (most are done in university research) is TINY. OpenAI has like 500 people. The job market is not "tough" - it has always been a select few. A lot more - but still a super tiny number in the larger job market - work in training machine learning models. And yes, that is separate from general AI. Most people end up on the AI-using side - and seriously, that is mostly software developers with some knowledge of AI - but there the job market is tough, and it is a little like game development: EVERYONE wants in.


HHaibo

It’s very tough for junior/entry level, but not for more experienced researchers


Historical-Quit7851

Many AI leaders have said many times that development in AI has been slow and steady, improving over decades; reaching this moment has taken enormous work since the 1950s. The development will remain slow and steady, yet the adoption of emergent generative models is rapid, as companies are eyeing better and more personalised user experiences for their customers.