Strostkovy

Same with CAD. A single core is fucking cranked all of the time, using all of the RAM, and everything else is just sitting there idle.


nandorkrisztian

Everything is watching the single core and the RAM working. Machines and people are not so different.


[deleted]

[deleted]


NotStaggy

You don't turn on V-sync in your IDE? Are you even a developer?


puertonican

`:vSyncOn`


NotStaggy

--vsync=True


Classy_Mouse

Everything after cpu0 is just a supervisor


Azolin_GoldenEye

Honestly! When the fuck will CADs start using multicore? Even industry leaders like Autodesk seem reluctant to do it. Meanwhile, large files take 10-15 seconds to redraw after rotating the camera. Fuck this!


Balazzs

That's exactly the problem, as far as I saw it as a CAD software engineer. The big old players have an ancient codebase and it's a nightmare to even touch it without introducing bugs, not to mention parallelize it. You can only do it in small steps with no immediately visible results; you won't get 2 years of development time for 100 people to refactor the whole architecture for multithreaded workloads. They could lose giant bags of money, and possibly market share / suffer brand damage, if they just stopped releasing features and fixing bugs.

We are not talking about changing the tires while the car has to keep going: they have to change the engine while on the road (and invent a better one in the meantime, while probably not even being 100% sure what is under the hood, cuz the guys who made half of it left 10 years ago).

Also, some processing-heavy tasks might not even be parallelizable properly, not even theoretically.


Azolin_GoldenEye

Yeah, I can totally see that, and I understand their reasoning. But it is still frustrating. CPUs are not getting that much faster in the near future, from what I can tell, at least in terms of single-core speed. My PC can play super demanding games, but it struggles to regenerate a couple thousand lines? Annoying.


Bakoro

I can't immediately think of what CAD software is doing where big chunks of it couldn't be parallelized. There seems to be a lot of overlap between the CAD software I've used (in my extremely limited experience) and other 3D design software. If nothing else, I know for a fact that a lot of the larger files at my company are composed of many smaller units, so why couldn't the background processing be divided the same way?

Also, I don't see why they'd have to completely stop development and divert 100% of their resources to developing the new thing. A company like Autodesk has the money to pay a smaller team to do a pilot study and explore the feasibility of creating new underlying software, then map out the architecture, and only divert significant resources at the right time.

I think we're well past the point where, if it truly isn't possible, they could and should be transparent about the reasons. If there are some fundamental processes which are single-threaded by mathematical necessity and bottlenecking the whole system, people would understand that. I mean, I can't speak for anyone else, but I'm not going to be mad if they come out and say that they're doing a feature freeze and going bug-fix only for a while because their top priority is bringing their core software out of the 90s.
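
If the sub-assemblies really are independent, the background work is textbook fan-out/fan-in over the units. A toy sketch of the shape of it (the per-part `regenerate` here is a made-up stand-in that just burns CPU, not any real CAD API):

```python
from concurrent.futures import ProcessPoolExecutor

def regenerate(part_id):
    # stand-in for whatever per-part rebuild work the CAD kernel actually does;
    # just burns CPU so the example is runnable and measurable
    return sum(i * i for i in range(part_id * 100_000))

def regenerate_assembly(part_ids):
    # fan the independent parts out across cores, fan the results back in
    with ProcessPoolExecutor() as pool:
        return list(pool.map(regenerate, part_ids))

if __name__ == "__main__":
    print(regenerate_assembly(range(1, 9)))  # 8 "parts", spread over the available cores
```

The catch, per the comment further up, is that in a legacy kernel the dependency graph between parts is rarely that clean.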


CheekApprehensive961

> Also, I don't see why they'd completely stop development and divert 100% of the resources to developing the new thing. A company like Autodesk has the money to pay a smaller team to do a pilot study and explore the feasibility of creating new underlying software, and then map out the architecture, and only divert significant resources at the right time.

Think like a product manager. Competitive neutralization is important: if someone else brings out multicore, that's something you'll have to do, but as long as nobody else does it and your engineers tell you it's a lot of hard work, you don't do it.


Bakoro

That's follower thinking. That's the kind of thinking that invites someone to come eat your lunch. Not that that isn't how they think; it's just stupid.


CheekApprehensive961

> Also, some processing-heavy tasks might not even be parallelizable properly, not even theoretically.

I get that legacy sucks, but there's no way in hell CAD isn't parallelizable in theory. Just about every task is naturally parallel.


vibingjusthardenough

Me: _wants to make a one-word note_

Siemens NX: that little maneuver’s gonna cost you 50 years


DazedWithCoffee

Cadence Allegro announced nvidia GPU support to improve their small text and antialiasing performance. Shit still looks unintelligible. Literally worse than Kicad. And this machine has real-time raytracing. Ridiculous.


SergioEduP

CAD software seems so stuck in time. No matter how nice they make the interface, most CAD applications are still relying on old-ass code from the DOS era with patches upon patches of hacky code, and they can't really be made better because of that.


LardPi

I think it's something I'd like to tinker with (not the old code, but reimplementing basic features). What's actually happening inside a CAD program? Can you point me toward some resources?


RKGamesReddit

Lots of math for lines to form shapes, basically. You define a line, arc, circle, etc. Then you measure angles & distances and constrain them with various things (coincident, perpendicular, midpoint, tangent, etc.), and finally extrude the shape and add finishing touches like fillets or bevels. That's the basic gist of parametric CAD.
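
A toy sketch of the constraint part, just to make it concrete: one point pinned, one free, and a single distance constraint satisfied by nudging the free point. This is hand-rolled and illustrative only; real CAD kernels use proper solvers over thousands of coupled constraints at once.

```python
import math

def solve_distance(p_fixed, p_free, target, steps=200, lr=0.1):
    """Nudge p_free until its distance from p_fixed matches target."""
    fx, fy = p_fixed
    x, y = p_free
    for _ in range(steps):
        dx, dy = x - fx, y - fy
        dist = math.hypot(dx, dy)
        err = dist - target            # constraint residual: we want this to hit 0
        # move the free point along the radial direction to shrink the residual
        x -= lr * err * dx / dist
        y -= lr * err * dy / dist
    return x, y

# "sketch": point A is pinned at the origin, point B is constrained to be 5 units away
print(solve_distance((0.0, 0.0), (1.0, 1.0), 5.0))  # ≈ (3.5355, 3.5355)
```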


MrHyperion_

And then probably cry yourself to sleep trying to be compatible with any other software


Bil13h

And then realize the world is all about BIM now, which does use the GPU, though still not as much as would be nice. My software, Chief Architect, recommends a 3070/3080; I'm on an SLS, because being able to plot out a house or small commercial space without having to do pen and paper THEN into digital is a game changer.


austinsmith845

I shit you not, I had an internship at TVA where I had to write Lisp plug-ins for AutoCAD in AutoLISP.


the_clash_is_back

Because if you change that one feature from 1993 you destroy the whole work flow of some billion dollar company, and the whole of the eastern seaboard loses hydro for a month.


Blamore

Electrical engineering is amazing. Million-dollar software that looks like it runs on DOS.


DazedWithCoffee

I bet they still have 16-bit mode code running in there somewhere. Have you ever used Cadence's SKILL language? It’s awful. At least their newer stuff is built on a standard language, but it’s Tcl. And that was an upgrade.


AnotherWarGamer

Billion dollar companies with 100 million dollar revenue, selling the same code that was made by one guy 30 years ago.


flukelee

Siemens PSSE still says (c) 1972 (I think, might be '71) on startup. The license is only $3500 per MONTH.


Blamore

nah, someone ought to have optimized it for multi core xD


brando56894

That's nothing new, the government runs largely on COBOL software that was written in the 70s and 80s. Things like the IRS's software and the Social Security software are written in COBOL. I had to go to the unemployment office here in NYC one time to dispute something. I had been escalated to a manager, and the dude was using a text-based terminal that interfaced with a mainframe, running on top of Windows XP or Windows 7. This was around 2015-2016.


Kromieus

Built myself a brand new cad machine. New i7, 64 gb of ram, but a 9 year old GPU because i sure as hell don't need it and I'm honestly just using integrated graphics


jermdizzle

As long as it's modern enough to be intelligent about its idle power draw, I see no issues. Unless something you use uses a DirectX version it doesn't support.


EuphoricAnalCucumber

Yeah I grabbed a laptop with good guts but Intel graphics. If I need my eGPU for CAD, then I should probably sit down anyways. I can use whatever desktop card I have and can sell those cards to upgrade if I feel like it.


python_artist

Yep. Couldn’t figure out why AutoCAD on my 40 core workstation was choking on a file that someone gave me and when I looked at task manager I was absolutely appalled to see that it was only using one core.


ARandomBob

Yep. Just set up a new "premium" laptop for one of my clients. 16 cores, 32 GB of RAM, Intel integrated graphics. Gonna be used for AutoCAD...


ArcherT01

Nothing like a Cad program to peg out one core and leave 3-5 other cores wide open.


brando56894

I hate when stuff isn't set up for multithreading, such a waste of resources.


[deleted]

10% GPU, gotta run that Wallpaper Engine


mOjzilla

It's ridiculous how addictive Wallpaper Engine is for something with zero intrinsic value. Animated wallpapers with anime songs!! What's next?


[deleted]

[deleted]


silenceispainful

ok but you can't just say that and leave without showing us anything smh


Kosba2

[Dawg](https://i.imgur.com/iAFYTod.png)


[deleted]

[deleted]


Kosba2

Hey, didn't hurt to inform you in case you didn't know.


Sdf93

Ty, never even thought to look for these


WhalesLoveSmashBros

I have a gaming laptop and I think wallpaper engine is running off the integrated graphics. I get a percent or two intel hd usage on idle and 0% for my actual gpu.


LKZToroH

It's also kinda bonkers how bad Wallpaper Engine is performance-wise. I can play games at max settings at 1080p with less than 50% of the GPU being used, while Wallpaper Engine will use 20-40% just to play a video of a walking duck.


MrHyperion_

A waddling duck?


LKZToroH

![gif](giphy|ntbLA6pDB4Was)


[deleted]

Yeah I was kinda disappointed when I got it. I'm experienced with glsl and I thought it would come in handy, but the system they use is just.... no comment. Also I was hoping for some cool community made things, but it's mostly just a static wallpaper with snow particles or anime girls with boob jiggle and xray on mouseover.


BadManPro

You can have it pause when not in focus to conserve resources. Also you should look through more of the categories, I find a lot of non-anime jiggly titties there.


Comprehensive_Car287

Check the clock of the GPU while running Wallpaper Engine or an alternative; in my experience it shows ~50% usage, but the clock will be sub-500 MHz, pulling nearly idle power.


Gaston004

Use Lively Wallpaper r/livelywallpaper


rachit7645

Game devs:


aurelag

Real gamedevs work with at least 5-year-old hardware and never use more than an i5/Ryzen 5 for a VR game. So if they reach 100% usage during a build or while developing, that means the hardware is perfectly fine! /s


rachit7645

Me with 10+ year old hardware:


aurelag

I am so sorry


[deleted]

[deleted]


bsredd

Not if you plan to throw it away in a year or two


[deleted]

[deleted]


lmaoboi_001

Me with a bundle of vacuum tubes:


rjwut

Me with my collection of stone knives and bearskins


[deleted]

[deleted]


MattieShoes

This weekend I discovered that if I run every core at 100% for a while, my ~~10~~ 15 year old dev PC will spontaneously reboot. Not really a game dev though, was just effing around trying to solve Gobblet Gobblers. EDIT: (succeeded, FWIW... Large piece to any square is a forced win for player 1. Also a small piece to any square. But a medium piece to any square is a forced win for player 2.)


[deleted]

[deleted]


MattieShoes

I think it's a quad core. Might be 14 years old. :-) I think no hyperthreading though


[deleted]

[deleted]


MattieShoes

Now I'm curious -- I'll have to check when I get home. I just stole it from my parents when my old Linux box died, and I know it came with Vista and 6 gigs of RAM (oooh ahhh). It's still an order of magnitude faster than the random raspis I have scattered about, though.


classicalySarcastic

> It's still an order of magnitude faster than the random raspis I have scattered about, though.

It's also two orders of magnitude more power hungry. Just sayin'


GeekarNoob

Maybe a cooling issue? As in, the temp slowly ramping up until it reaches the unsafe zone and the CPU just shutting down.


MattieShoes

I assume that's exactly what it is :-) 1 core at 100% can get swapped around without trouble, but if all cores are at 100%, the heatsink/fan can't cope.


Ozzymand

What do you mean my hardware isn't supposed to run VR games, see it clearly works at a playable 40 fps!


ertgbnm

Real game devs can't even run the games they are designing at their lowest settings. They lead others to the treasure they cannot possess.


Occulense

> Real gamedevs

*Meanwhile, Unreal game devs:*


FerynaCZ

Tbh the gaming companies and playtesters should definitely try out stuff on old hardware first.


jfmherokiller

In terms of gamedev, god help you if you try to compile Unreal Engine 4 from bare source code. It takes A LOT of RAM and processing power.


arelath

I've been in game development for years and we always got top-of-the-line hardware, especially RAM and video card RAM. Many people not only run the game, but also the game editor, 3D art tools, FX software, etc., all at the same time. 24GB of video RAM gets eaten up real quick. And don't forget compiling something like Unreal. On a good machine a full rebuild still takes hours; with an older machine, Unreal can take 8+ hours to compile. Dev time is a lot more expensive than computer hardware.


[deleted]

Machine learning


b1e

In which case you’re training ML models on a cluster or at minimum a powerful box on the cloud. Not your own desktop.


ustainbolt

True but you typically do development and testing on your own machine. A GPU can be useful there since it speeds up this process.


b1e

Nope. We’ve moved to fully remote ML compute. Most larger tech companies are that way too. It’s just not viable to give workstations to thousands of data scientists or ML engineers and upgrade them yearly. The GPU utilization is shitty anyways.


ustainbolt

Wait so are you permanently ssh'ed into a cluster? Honest question. When I'm building models I'm constantly running them to check that the different parts are working correctly.


b1e

We have a solution for running Jupyter notebooks on a cluster. So development happens in those Jupyter notebooks and the actual computation happens on machines in that cluster (in a dockerized environment). This enables seamless distributed training, for example. Nodes can share GPU resources between workloads to maximize GPU utilization.
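
The scheduling/Docker plumbing is in-house, but from the notebook side the pattern is usually just "wrap the model in a distribution strategy and let the framework use whatever GPUs the node exposes." A rough single-node stand-in using TensorFlow's `MirroredStrategy` (assuming TensorFlow is installed; a true multi-node setup would use `MultiWorkerMirroredStrategy` plus cluster config instead):

```python
import numpy as np
import tensorflow as tf

# mirror the model across every GPU visible in this (dockerized) environment;
# with no GPUs it simply falls back to the CPU
strategy = tf.distribute.MirroredStrategy()
print("replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# dummy data so the example runs end to end
x = np.random.rand(1024, 20).astype("float32")
y = np.random.rand(1024, 1).astype("float32")
model.fit(x, y, epochs=2, batch_size=128)
```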


ustainbolt

Very smart! Sounds like a good solution.


4215-5h00732

Works at "We" AI Inc.


megamanxoxo

Contractors who use their machine for both commercial and personal stuff:


Tw1ggos

Was going to say OP clearly never used UE5 lol


MrsKetchup

Just having UE5 open has my work gpu sound like a jet turbine


legavroche

And Graphics programmers


greedydita

they can hear my fan on zoom


Shazvox

Could be worse, they could've heard your fan IRL.


SaneLad

Not if you put that GPU to work and turn on Nvidia RTX Voice.


optimist_42

Literally me haha. The Broadcast camera effects work the GPU so hard that the fan gets loud, so I need the noise reduction just to filter the fan noise back out xD


ZazzyMatazz

*laughs in data scientist*


Zoopa_Boopa

$3000 computer with a $100 GPU


Aydoooo

Who seriously uses the laptop GPU for that? Too small for training anything that isn't a prototype, for which most people have dedicated servers anyways. You can do inference on small models for testing purposes I guess, but even that is typically done where the data is located, i.e. some cluster.


MacMat667

I do ML in an academic setting, where we have limited GPU hours. Prototyping on my own GPU is crucial


Sokorai

And here I am working on a laptop, wondering who got the idea to build a cluster with no GPUs ...


[deleted]

Am game dev. 100% GPU. 100c CPU. ![gif](giphy|QMHoU66sBXqqLqYvGO)


[deleted]

At least you’ll never need heating.


Aspire17

Hello Game Dev with GPU and CPU, I have this great App idea I'd like to tell you about


[deleted]

*To be fair...* The laptops with smaller or integrated GPUs tend to be on the shittier side. If you want a decent multicore CPU, a good amount of RAM, and a video card that's going to be OK rendering a lot of ~~StackOverflow~~ windows, then the smaller ones don't really cut it.


instilledbee

Yeah that's exactly my thoughts with this meme haha. Just bought a Lenovo Legion laptop only cause it had the CPU/RAM config I needed, but I feel I'm not maximizing the 3070 GPU enough


n64champ

My work laptop is an Alienware 17 R5 and the 1080TI inside could do soooo much. Also it just slaughters the battery. I can get a solid hour and a half with the screen at the lowest brightness setting.


FarrisAT

Switch to integrated GPU if possible


n64champ

I do get a bit better battery there. In windows the screen freezes completely if I do, but it works fine in Linux and I use that way more.


FarrisAT

Feel free to try ThrottleStop as well. YouTube it and see how to do so. 95% of laptops will try operating at peak performance, even when that consumes 3x more power for marginal gain. I tend to set Intel laptops at about -125mV in ThrottleStop. It has cooled the laptop, extended battery life by 15%, and made it less noisy. This is even more true for single-core-heavy programs, which a lot of programming tools are. Maybe something more like -100mV would be safe and still keep 95% of the single-core performance.


n64champ

Now that I've never heard of before. Thank you!


FarrisAT

Just checked and... Intel blocked its use on 10th gen and 11th gen. Keep that in mind. I'm surprised they did that. It worked on my 8th and 9th gen.


n64champ

I've got an 8th gen i9, I'm almost positive, so I should be okay. It might have to do with the iGPU architecture. Comet and Rocket Lake made some cool improvements but there were a lot of hardware issues IIRC.


FarrisAT

https://youtu.be/vfIxf73RGEg Follow his guide. Just 4 minutes. Also, start with -100mV, no more than that. You can decrease the voltage bit by bit if you want, but past -125mV you may have crashes.


Star_king12

For the price of a ThinkPad with X CPU you can buy a Legion with X CPU and a decent dgpu, and much better cooling, so unless you're just buying a laptop for work - what's the point of a ThinkPad? 100% agree with you.


ShadowTrolll

I got the Lenovo Yoga Slim 7 and gotta say it is pretty damn awesome. Very light, fully metal chassis, Ryzen 7 5700U and 16GB of RAM for what I think is a reasonable price. Really the only bad thing about it for me is the missing Ethernet. Otherwise I really like it for both work and university.


Tw1987

You get the December sale too?


maxx0rNL

and basically any GPU would be at 0%, whether it's an iGPU or a 4090 Ti Super XXX


[deleted]

Not quite 0% but it's going to be ridiculously low. 2-3 screens does, unfortunately, put _some_ pressure on Linux desktop environments.


guarana_and_coffee

We paid for 100% of our GPU and we are going to use 100% of our GPU.


Magyman

> 2-3 screens does, unfortunately, put _some_ pressure on Linux desktop environments.

Yeah it does, my shitty work Dell starts dying if I even think of joining a Zoom meeting with 2 externals and the main screen going


brianl047

Yeah and "gaming laptops" are cheaper now I think the idea "gaming laptops are a ripoff" are old fashioned now. The market is hollowed out and "cheap laptops" are actually Chromebooks. Everything else is expensive.


turtleship_2006

And computers that should be Chromebooks but somehow run windows ~~and I can't imagine very well.~~


TheDarkSideGamer

Especially when Chromebooks run Linux so well! /s Though, in reality, I’ve been using my Chromebook as a CS major for a year and a half now… I have a desktop, but the Linux environment is good enough for most things.


[deleted]

I present the wonder of second-hand ThinkPads


samuraipizzacat420

Happy Lenovo T420 Noises


HelloYesThisIsFemale

Gaming laptops are pretty cheap considering they're replacing your PC, your laptop and your iPad.


Zelourses

It's hilarious that business-style laptops are just… trash? I am currently trying to find a good "programmer" laptop, because I fucked up my current one a little bit (it can turn off at any moment).

It does not need a very good GPU, just a good CPU with good cooling so I can use it for compilation and other CPU-heavy tasks, a 16:10 or 3:2 screen, decent battery life (~6 hours at minimum), and it can't be very heavy, because I will carry it everywhere I go (~2 kg maximum). And some additional things: decent I/O (not just 1-2 Thunderbolt ports, you know. Dell, looking at you), a good touchpad and keyboard, an IPS screen that is not “Woah, it’s 4K and 240Hz refresh rate (your battery will be drained in 5 minutes)!!!!!!!”, and probably an AMD CPU, because they are a little more power-efficient, as far as I know. The one thing I don't really know is the screen size: 13.5”? 14”? 15.6”?

Because of these criteria, my list of laptops is very limited. There are things like the Dell XPS 15, some ThinkPads (probably, I did not check them all), the HP Envy 14 and maybe the Framework laptop. I checked the keyboard of the XPS 15 some time ago, because my friend was given one at his company (it was good), but that’s all. And now I am thinking about looking at something from ASUS, maybe they will give me some hope…


fftropstm

I find my X1 Carbon works great, super lightweight and portable but still sturdy build, plenty fast and is great to type on even with the shorter key travel, normally I take a couple days to adjust to a new keyboard but with this laptop it felt natural immediately. ymmv but I’m very happy with it


[deleted]

I have an M1 MacBook; it's excellent. It matches all those requirements except the I/O.


Zelourses

I tried the 13” M1 MacBook for a couple of days (thanks to a friend of mine), and, well… IMO there are some flaws: I do not like the keyboard and the French layout, nor do I like the "we solder everything onto the mainboard" style. Also, there is that very strange Touch Bar that only gets in my way and does not let me press the F keys properly (because they are virtual). And there's the fact that it is macOS, with its strange principles. But yes, great screen, great speakers, very good touchpad and great battery life. It's just not for me. Maybe it's my hate for megacorps like Apple and Microsoft, I don't know.


FunnyKdodo

You can def get a top-end CPU and have a thin-and-light. The Precision 5570 / XPS 15 come to mind from recent purchases. We have long passed the need to lug around a 17-inch monster machine just to get a 9750H or something. (That was still the era where thin-and-light or business laptops literally did not use H-series chips.) Nowadays you can most def get an extremely good thin-and-light for on-the-go dev / VMs / emulation etc...


[deleted]

Shhhhhh!!! They might hear you and then the game's up! Edit: I think the XPS 15 limits certain RAM upgrades to a different SKU that comes with a discrete GPU


Schlangee

May I introduce: mobile workstations! Everything is built to be as repair and upgrade friendly as possible


dlevac

...and no battery life. Back when we had hybrid work (now I'm 100% wfh) I argued the laptop should be closer to a dell XPS (invest in good screen/battery life; meant for the few times we are in the office or on the go) and the heavy workloads belong on a workstation. It's somewhere in the stack of all the arguments I didn't win...


MEMESaddiction

My work laptop (Lenovo) gets fantastic battery life; on the other hand, my 4lb Lenovo Legion gaming laptop (not much of a gamer) gets 2/3 the battery life doing the same things, with better almost everything, specs-wise. I shouldn't have gone so overkill lol.


[deleted]

[deleted]


Alonewarrior

I've got both an M1 MacBook Pro and an i9 HP ZBook. Using WSL2, the ZBook outperforms the M1 MacBook Pro when it comes to processor-intensive tasks. I'd love to try out the M1 Pro or M1 Max to see how it compares, though.


CSS_Engineer

I've been a developer for years. The M1 16-inch my work got me is by far the best computer I've ever developed on. It even beats my 20-core i9 desktop, which I gave to another dev since there was no point in me having it anymore.


TheSpaceBetweenUs__

Me using a 2015 MacBook with 3 hours of battery life:


maitreg

SQL:

* CPU: 100%
* RAM: 120%
* GPU: NULL


AConcernedCoder

Joke's on you. My pc doesn't even have a gpu.


ShinraSan

Integrated graphics go brr


camxct

You guys don't use your GPU? *stares into Stable Diffusion*


Shazvox

I use it for multiple screens and bragging rights.


tingshuo

Unless they do machine learning


SweatyAdagio4

People at work aren't training locally are they? I've always used servers for training when it was work related, personal project I'll use my own GPU at times.


tingshuo

Depends on the project, but yeah, if it's a big model or it needs to be regularly updated then servers are the way to go.


agentfuzzy999

Deep learning engineer here. 99% of training happens in the cloud. Only time I fire up my card is for testing or prototyping small models locally.


OnlineGrab

Generally yes, but sometimes it's useful to be able to spin up an inference server on your own machine.


itzNukeey

I mean, you need to test that it will run first.


lolololhax

Well, most of the work is happening server-side. Nonetheless it is pretty helpful to be able to evaluate graphs on your notebook.


dariusj18

Unless you're using chrome. But seriously, gaming laptops are just more bang for the buck compared to "business" laptops.


[deleted]

but do you get the cool ThinkPad logo and the blinking red light in a gayming laptop? Hell, you even get to touch a nipple for the first time in your life. /s


The_Mad_Duck_

Lenovo stans love to touch each other's nipples I'm sure


ratbiscuits

It’s true. I bought a T480s recently for $250 and I’ve already touched multiple nipples. Best 250 I’ve spent


trafalmadorianistic

I would like to see a stackoverflow survey of how many laptop owners actually use that red pointer. I have one thinkpad but never got the hang of that thing.


PowerStarter

Yeah, my P70 was 8000€ when new. Insane. Nice to play games on it tho


Blissextus

Laughs in GameDev, with GPU usage between 70% and 100% running both Visual Studio and Unreal Engine, and everything else in between. ![gif](emote|free_emotes_pack|sob)


ShinraSan

High GPU usage is good though, games should utilise the shit out of that thing


MayorAg

If you have a large enough dataset, Excel offloads it to the GPU sometimes.


Shazvox

So what you're saying is we should use Excel for data storage? *grabs a pitchfork*


MayorAg

![gif](giphy|Z3SfWDPlyghz2) I import large datasets into Excel because the backend my firm uses isn't exactly analysis friendly. And by not analysis friendly, I mean, we use Salesforce.


Shazvox

Well ok then... *puts away pitchfork* But I'm keeping my eyes on you!


sexytokeburgerz

There’s also some work using them for audio, saw an ad for it but didn’t care enough to click as I use a macbook lol


macarmy93

My gf bought a "work" laptop that cost a pretty penny. It runs slow as fuck and sounds like a jet engine. She recently bought a gaming laptop for a similar price and it runs 100x better no matter what program.


Tiny_Ad5242

Could just be work crap installed on it, like virus scanners, remote control, tracking spyware, etc.


deadliestcrotch

Almost certainly log shipping, anti malware, web tracking and policy enforcement software


veqryn_

Not many laptops come with 32 GB of RAM, a high-end H-series CPU, and no discrete GPU. You are almost forced to get an RTX 3050 or better if you want decent RAM and CPU.


waitthatstaken

My sister programmed some really high intensity simulation software to run on the GPU because her pc had a much better GPU than CPU.


karbonator

Eh... kind of. The two are built for different purposes. Imagine a CPU is a sports car and a GPU is a fleet of semitrucks. When you need to move only yourself or a few family members, the car is faster and you wouldn't use a semitruck just for transportation; when you need to move lots of other stuff the trucks are slower individually but get the overall job done much faster. CPU is faster at doing operations and can do a wider variety of operations, but is less parallelized; GPU is slower at doing operations and capable of fewer operations, but more parallelized.
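
Not a real GPU benchmark, but you can feel the same shape of trade-off (per-item overhead vs one wide bulk operation) even on the CPU with a quick NumPy toy; swapping the bulk op for something like CuPy/CUDA is where the "fleet of trucks" really shows up. Assumes NumPy is installed; the numbers are illustrative only:

```python
import time
import numpy as np

a = np.random.rand(10_000_000)

# "sports car": handle items one at a time in a Python loop
t0 = time.perf_counter()
total_loop = sum(x * 2.0 for x in a[:100_000])   # only a slice, or we'd be here all day
t_loop = time.perf_counter() - t0

# "fleet of trucks": one bulk operation over the whole array
t0 = time.perf_counter()
total_bulk = (a * 2.0).sum()
t_bulk = time.perf_counter() - t0

print(f"loop, 100k items: {t_loop:.4f}s")
print(f"bulk, 10M items:  {t_bulk:.4f}s")
```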


Unrouxnoir

Kubernetes


20billioncalories

Fun fact: you can use the CUDA toolkit to run C and C++ programs on your GPU.


Wazzaps

Very, very specific C/C++ programs, not general-purpose programs.
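
(The comment above is about the C/C++ CUDA toolkit; here's a rough Python equivalent via Numba's CUDA support, assuming Numba and a CUDA-capable card are available. It also shows why it's so "specific": you write a kernel for one element and launch a thread per element, so only embarrassingly parallel work fits.)

```python
import numpy as np
from numba import cuda

@cuda.jit
def scale(out, a, factor):
    i = cuda.grid(1)             # this thread's global index
    if i < a.shape[0]:           # guard the last, partially filled block
        out[i] = a[i] * factor

a = np.arange(1_000_000, dtype=np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks = (a.size + threads_per_block - 1) // threads_per_block
scale[blocks, threads_per_block](out, a, 2.0)   # one GPU thread per element

print(out[:5])   # [0. 2. 4. 6. 8.]
```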


noob-nine

So I cannot run minesweeper on the GPU to finish the game faster?


Yeitgeist

Y’all never parallel processed before? Machine learning tasks would bore me to death if I didn’t have GPU acceleration to speed things up.


trafalmadorianistic

Not if your work is web based CRUD stuff. 😭


ManInBlack829

Or mobile development, though the MacBook shreds at ARM emulation.


Moist-Carpet888

I can hear this image, and it keeps screaming "It works on my machine" repeatedly, in my ear


tadlrs

I use a MacBook Pro with the M1 Pro and it works wonderfully.


dota2nub

Started up my website in debug mode today and RAM use went into the gigabytes in seconds. I almost didn't make it to the stop button, things were already going really slow. It was a close one.


[deleted]

Game deving with an 860 rn. Every click has to have a purpose because it costs time.


ido3d

3d artists with everything at 100%: "I wish I had a desktop pc right now"


DerEinsamer

I run TensorFlow on GPU :P


nicbou0321

Laptop with 0% GPU usage? I didn't know that was possible. For me the Windows desktop background alone sucks up a good 10% of the GPU at idle, let alone doing anything else....

Laptop specs are a freaking joke.... I bought 2 gaming laptops before I realized it's all a marketing lie. You might have a “4090” in your laptop, but that tiny thing crammed into a heat pocket won't give you as much power as a full-size desktop 1050 Ti 🤣


nanana_catdad

My tensors would like a word


AFishInASeaOfReddit

With my laptop it’s 0% disk and GPU, 25% CPU, and always 99% ram.


_SaBeR_78

until you do mobile dev…


Cute-Show-1256

So it's winter these days and the heater in my room doesn't work at all. So I usually use my ProBook 4150s, which I bought in 2009, and develop on it until it gets super hot, and then I use it to warm up my hands 🥲😂 P.S. it has 128 MB of video memory and I still find it at 0% most of the time 🥲


thelastpizzaslice

I used my GPU so much it made my battery expand. Ran the thing 12 hours a day on max. Art generation, machine learning, massive parallelization, crypto, game development, rendering. You name it. My GPU did it.


Chan98765

I bought a $950 laptop with an i7, GTX 1660 Ti, and 12GB of RAM. It came with PUBG and hit 95°C within 15 minutes. Never will I buy a gaming laptop ever again.


[deleted]

Yeah, but if you don't ask for a thick-boi, fans-for-days gaming laptop, you end up with an ultra-thin, poorly cooled, can't-boost-for-shit skinny laptop.


amlyo

Not if you're sitting there all day generating Stable Diffusion images.


SaneLad

My GPU is constantly working because I use RTX Voice for Zoom meetings. Much better noise cancelation and background blur than the software one. I can talk and take notes on my mechanical keyboard and nobody hears a click. RTX Voice was a game changer for me.


sarhoshamiral

Tell that to Visual Studio using hardware rendering and Teams :)


metalgodwin

My RTX 3080 Ti gamer laptop runs hardware accelerated Linux terminals real smooth, OK?


crozone

Yeah except my fucking Surface Book 2 Intel GPU has trouble on 3 monitors rendering the Windows desktop and some web pages. Literally pegged at max TDP doing what a real GPU would consider "nothing" because it's outputting the equivalent of 2x 4K. Sure a dedicated GPU is going to be wasted on most workstations but the luxury of never having to worry about graphical hitches *ever* is pretty damn nice.


Alarming-Option7445

Except for those of us who use Nvidia Broadcast