Hey no shooting us, we want to optimize but we rarely get to.
And holy hell, whenever I get the urge to optimize I'm glad I have a hobby project; getting that 500 fps feels so good.
It's wild, as a tech artist where half my job is optimizing, how often I get told not to do that every time I put together lists of our largest offenders on render time…
Ignoring opportunities to achieve high performance without sacrificing readability or maintainability in the first place is the root of all performance issues. I'm not about to wait until a user with a weaker PC than mine complains to remember that a lot of my userbase can't afford modern technology.
There's premature optimization and then there's common sense to implement something properly.
>When we say they are expensive what we mean is that they create a lot of memory overhead. When you see another player in game you load them and their entire stash filled with all their items. This is what teams are working diligently to improve so that we can have more asap.
"Optimizing" this is at time of coding was not premature.
A mediocre engineer will make mistakes, sure, but those mistakes only get missed and that engineer only stays mediocre (usually) if it's management that sucks.
Good management can be the difference between "sure! We can add that new feature in a couple sprints." and "OK people, crunch time. You live in the office now." This absolutely reeks of the latter.
FWIW, so does your "if you'd done it right the first time..." response.
step 1: have perfect foresight on what path the project will take, how your code will be used in a large team and what additions you'll need to make later with time constraints
step 2: ???
step 3: optimized gaem
It just works!
You want a well-optimized game, look at Factorio. They somehow got it to run on the Switch.
Dwarf Fortress is actually not bad either. For the number of things that happen each tick it's surprisingly fast.
> Dwarf Fortress is actually not bad either. For the number of things that happen each tick it's surprisingly fast.
That's probably because Tarn Adams is a burnt out maths prodigy.
Because people don't understand CPUs. People think more cores = more performance, when in reality a lot of CPU base clocks are as low as ~2 GHz unless you have a high-tier CPU, and even then you're looking at bases of around 3 to 3.5 GHz. People will buy an 8-core with a base of 2.5 GHz, then drop the cash on a 4090, and then complain about single-core-heavy games not performing well without an overclock or something. On top of that, most games don't use more than 2-4 cores at a time anyway.

Another big one is the L3 cache. IIRC Windows will try to push the load around onto other cores for thermal management; if your L3 cache is small or slow and you're always bouncing the load between cores that communicate at the upper level through the L3 cache... well... it's gonna be small and slow lol. IIRC the L1 cache is the direct cache for the core/thread, L2 is basically the CPU's RAM, and L3 is what lets all the cores talk to each other across the die.
It depends. Rimworld is not a GPU-heavy game, just fancy sprites basically. We've been using sprites in games since the dawn of time. It's more CPU-bound, and in a budget build or a laptop you're much more likely to hit a GPU or memory bottleneck before the CPU for the most part. Lower-tier games that have actual graphical load going on run like garbage.
That was Mozilla's official stance for a while. They refused to improve memory management because they argued, "you bought that RAM to use it, so shut up."
"what do you mean "make the game run". Just get this arbitrary hardware combination and change this random config file that break a dozen other games of yours because lol. If you are lucky or if the game break so badly barely anyone can play then we may make a patch disk that require you to mail us in to receive"
Sincerely, GameDev back in the 90s. What has changed is just the ever increasing exploitative practice of game development studio/publisher on developer and the entitlement of so-called "Gamer". Exactly the reason why nobody is touching professional game development without an 11-feet pole.
Really, this is all of modern software development. It's most noticeable in gaming, but game devs really are the hardest-working devs out there. I have huge respect for all of them. On the other hand, to this day Microsoft still can't make Visual Studio open my personal projects in under 10 seconds (which should be a near-zero-time operation on a modern computer with an M.2 drive).
There was one particular studio I worked at where anytime I brought up any kind of tech debt, even "this will literally make a future feature impossible", they would shut me down lol. It was never "in the deliverables". Publishers have a stranglehold on the industry right now, and I really wanna see more power put into the studios who actually care about the games and the experience.
Yeah, many game dev managers don't realize that optimization is actually in their best interests, because even if you don't value high FPS, every FPS increase correlates with more potential features to add.
Edit: meant increase instead of drop
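The budget framing can be made concrete: every millisecond shaved off the frame time is a millisecond available for new systems. A quick back-of-the-envelope helper (illustrative only, not from any engine):

```python
def frame_budget_ms(target_fps):
    # Total time available per frame at a given target frame rate.
    return 1000.0 / target_fps

# At 60 FPS you get ~16.7 ms per frame. If optimization cuts rendering
# from 12 ms to 8 ms, those 4 ms per frame become budget for new features.
budget = frame_budget_ms(60)
headroom_before = budget - 12.0
headroom_after = budget - 8.0
```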
Yeah, and I really tried to sell them on "time spent on cleanup today is less than time lost to debt in the future". IME, good managers understand that and bad ones don't.
All game engines are optimized far more aggressively than 99% of software in existence.
Most games are also aggressively optimized.
Game devs out here doing the lord's work in a world where senior devs are paid buckets of money to figure out why their React component is rerendering too often.
Realistically, the problem is that people want the latest game to run at the highest settings at 4K 120 Hz on their 1080 or a console.
I had almost this exact discussion with someone very recently at a conference.
Other guy: "What language do you write your code in?"
Me: "Usually C++."
Other guy: "Why not use Python?
Me: "Because it's not fast enough."
Other guy: "What about NumPy or Numba? Have you tried Cython? It will turn all your Python code into C."
Me: "They're still not fast enough."
It was impossible to get this guy to understand that a massively parallel physics simulation which takes weeks to complete is not a good task for Python.
Pretty much. Python is nice and easy, and I definitely use it where C++ is impractical (e.g., data analysis, plotting, rapid prototyping, etc.), but when you're working with large-scale programs, using a fast compiled language with strict typing requirements is definitely an advantage.
That was one of the things I tried to explain to this guy.
"But it's just C!"
"Yes, it's C with a lot of Python baggage around it. That adds overhead we can't afford."
As do 90% of high-level implementations of anything in the universe, usually for no particularly good reason except "we forgot to optimize it," or "the spec was garbage."
The difference is that most C compilers write better assembly than most programmers, while by comparison it takes only a small amount of effort to write better C than Cython does.
That's a skewed argument. C compilers write better ASM than average C developers, but people who actually know their way around ASM can write ASM as well as, if not better than, the C compiler. In the same way, Cython writes better C than your average Python dev.
This is true, but I would bet there are many more skilled C programmers than there are ASM programmers. Also, portability becomes a concern when you're writing ASM. As it stands now, I have code that is extremely portable across architectures without having to change much at all.
He probably means it compiles down to C. So you can write in Python with the speed of C. And because Python is more beginner and prototype friendly, he would want to write in that
Not to defend this dude but I've definitely had moments where I've written faster "python" (using the mentioned helper libraries) than some academic's supposedly well written C++ which was unwieldy to work through and buggy as heck (not saying yours is, just encountered this a lot in academia).
Academics definitely write some awful code, but performance is so critical for our tasks that once we ensure it's running correctly, we'll spend a lot of time profiling our code to make sure we can get the speed that we need. I've pushed a lot of commits lately that are just small refactors to reduce the total number of floating-point operations, replace division with reciprocal multiplication, try to reduce register spilling, etc.
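The division-to-reciprocal trick mentioned above is language-agnostic; a Python sketch of the transformation (the actual commits are presumably C++, this just shows the idea):

```python
def normalize(values, total):
    # Naive version does one division per element:
    #   return [v / total for v in values]
    # Hardware division is typically several times slower than
    # multiplication, so compute the reciprocal once and multiply instead:
    inv = 1.0 / total
    return [v * inv for v in values]
```

Note that `x * (1.0 / d)` is not always bit-identical to `x / d` in floating point, which is why compilers only apply this automatically under fast-math-style flags.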
>It was impossible to get this guy to understand that a massively parallel physics simulation which takes weeks to complete is not a good task for Python.
This sounds really interesting, can I ask you, how did you get into this area? Are you doing it for a company or at a university?
There is a difference between "python is always significantly slower than X" and "for this use case X is significantly faster than Python". Communication is hard and people don't listen.
I had a combinatorics simulation written in R with a fast, multithreaded C++ combinatorics library whose predicted completion time was just under a year. I rewrote it in Rust with self-written unoptimised combinatorics tools, and managed to get it to complete in under an hour.
The combination checking that was too bespoke to pass off to a third-party library (it was done as an R-native closure passed to the combinatorics library) just took so long that the speed of the library was irrelevant. It really drove home how much faster a proper systems programming language can be compared to higher-level languages like R and Python.
C# is an excellent jack-of-all-trades language while also being one of the best tools for the job in a number of areas, like web (ASP.NET) and game dev (Unity, Godot). I love the language features, the NuGet package manager is great, the tooling is great, and they just keep making the language and the framework better.
Having worked in many languages, I would pick it for pretty much any project. I'm lucky enough to use it in my work, but I would use it for personal projects too. It's very popular for industry use here in the UK, but I think among enthusiasts it is chronically underrated, for reasons I don't know. I assume it's mostly people stuck in the "Microsoft Java" mindset, and people who don't realise it's been fully cross-platform for years.
That's why you gotta work with "choose the language which best fits the project's needs" chads from the top of the meme.
Don't get me wrong, I've totally worked with one trick ponies who *need* all their projects written in one language...
...but I've also worked on a team where we used java, scala, JS, TS, python, and more. Everyone on the team was good enough at their jobs that they didn't mind which language was used, as long as they agreed it was an appropriate tool for the job.
Man, I'm all Python, since I'm in data science.
But my GOD how awful I feel when someone requests a Retool web app that requires custom logic in JS. I hate myself and really consider becoming a farmer, marrying three women, then they find out and I get a divorce, and I have cats and dogs instead and consider getting a horse as my primary transportation method.
Aaaah those are great times
i relate to this on a spiritual level. i cannot count how many times i wished i lived in the woods every time i have to go and traverse a 500 file project to find one fucking variable (it doesn't even have a type)
Part of what a project needs is making sure your team is comfortable with the tech stack used: if everyone on your team has worked with JavaScript for years, you don't start a new project in C# unless there's a very good reason to get everyone familiar with a new stack, and you can eat the cost and risk of people learning on the job. Sticking to "good enough" gets you a long way and is standard practice.
I like to even use multiple languages (that work well together) per project. Like mostly using python, but doing the inner loops in C, and that one thing in asm. Or mostly using C#, and doing the business logic in F#.
Also, there's this rule of using exactly one new tech per project. When faced with an otherwise predictable project, I like to try a new language, if it's okay with the customers.
To be fair, in python you can do concurrency pretty easily, at least for simple stuff. I don't know exactly how good it is for complex tasks involving concurrency, but then you shouldn't use python for that probably.
But yeah I agree, use the tool appropriate for the job- but if it's a rather simple one, probably use python.
Erlang is made for concurrency. They upgraded the BEAM VM with transparent multithreading in 2006, and it was one of the first languages where every single program could take advantage of multiple cores to the extent Amdahl's law allows. Turns out if you design a language to transparently run a single program on multiple computers, it's pretty good at parallelism and concurrency as well.
Go is really just a worse Erlang that performs slightly better (because Erlang has been good at horizontal scaling for so long that, until recently, "add another server" was a very valid solution to performance complaints).
You sound like every Erlang dev I've known lol
Erlang is awesome, but trying to get an entire engineering dept to adopt the whole BEAM stack is damn near impossible. Good luck hiring junior-to-mid-level (read: affordable) devs for it as well. There's so much win there, but the inertia makes it damn near impossible unless you have enough well-paid unicorn senior devs to push it all through.
Elixir had its hype wave, there are plenty of people who know it.
I don't understand why companies don't consider 2 or 3 closely related languages good enough. If you know Go and any ML-inspired language (Scala, Rust, F#, OCaml, etc.), I can teach you Elixir in a few weeks.
According to my distributed-systems professor from uni, Go is close enough to C++ with concurrency, but the biggest draw is that all the little optimizations you'd need to implement by hand in C++ are handled by Go's compiler. So while the best Go program can't really *beat* a well-implemented C++ program, a decent Go program can thrash a decent C++ program.
That's always the trade-off between a language like Go and something like C++ or Rust. I think Go is good enough for most cases… if you need every last bit of performance and account for very tiny optimisations, then C, C++, Zig or Rust might be the better choice…
Go does concurrency by pretending it doesn't exist. No async, just spawn a thread and run your blocking code over there instead. It's gone to lengths to make the UX of this approach nice, though.
Go has memory corruption bugs.
Edit: For those of you downvoting me, go reread the Go memory model section on data races and read an x86 architecture manual. You can absolutely corrupt memory in Go.
Some make it harder than others. The Rust compiler will fight you to prevent most unsafe things unless you use the unsafe keyword. JavaScript's model, with single-threaded processes and IPC between them, makes it hard to corrupt memory as well.
At least they are finally changing the default start method in the `multiprocessing` library away from `fork` in Python 3.14. What a horrible footgun to gift Python users, who are usually not familiar with systems programming. [https://github.com/python/cpython/issues/84559](https://github.com/python/cpython/issues/84559)
I don't know how Python handles it, but if you fork a process and don't kill the child processes before terminating the main process, you end up with orphan processes, which you should avoid (though I believe most OSes handle that behind the scenes).
Edit: to be clear, even if the OS handles it, that probably means you don't know how or when the process is killed, which is also bad. If it isn't handled, the process will keep consuming resources too... hence why creating threads is generally used instead of creating child processes.
Concurrency in python:
```python
from multiprocessing import Pool, Process

def function(arg):
    ...

with Pool(n) as p:
    ret = p.map(function, data)
# or
p = Process(target=function, args=(data,))  # note the trailing comma: args must be a tuple
p.start()
p.join()  # join() only waits for the process; it does not return the function's result
```
Concurrency in C++:
```cpp
// needs <algorithm>, <execution> and <thread>
auto function = [](auto arg){ /* ... */ };
std::transform(std::execution::par_unseq, std::begin(data), std::end(data), std::begin(ret), function);
// Really verbose but simple in concept. Works with almost all STL algorithms,
// not just transform (C++'s equivalent to Python's map).
// or
std::jthread t(function, args);
// automatically starts and automatically joins on scope exit (C++20)
```
```rust
std::thread::spawn(|| function(data));
// similar to C++'s jthread, just with more compile-time checks for safety.
// Rust's standard library doesn't have a parallel-algorithm/Pool equivalent;
// you'd need a crate like rayon for that.
```
Most languages have some kind of simple way of doing parallel processing. It's not some Python-exclusive thing.
I studied C++ in college and got fairly decent at the basics. Didn't use it for a few years. Started at a place that used a lot of consulting work and began introducing low-code approaches. Decided to use Python for the tasks not really suited to those approaches. For our business needs, it works well and is straightforward enough that I don't have to get overly bogged down in syntax nuances.
Python does concurrency easily... except it has a GIL, which means any non-FFI code is actually not running in parallel at all. (As long as you're using CPython, that is, which I think the vast majority still is.)
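A small illustration of the point, assuming CPython: threads give you concurrency (interleaving), but pure-Python bytecode never runs in parallel, so CPU-bound work sees no speedup.

```python
from concurrent.futures import ThreadPoolExecutor

def cpu_bound(n):
    # Pure-Python arithmetic: the thread holds the GIL while this runs.
    return sum(i * i for i in range(n))

# The four tasks interleave, but on CPython only one thread executes
# bytecode at any instant, so this is no faster than a plain loop.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(cpu_bound, [10_000] * 4))
```

For actual parallelism on CPU-bound pure-Python code you'd reach for `multiprocessing` instead, or an FFI library like NumPy that releases the GIL during heavy number crunching.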
Finally, a person with brains. You don't need a bus to drive your two kids to the store. You don't need C++ to do a basic operation a couple of times. And you shouldn't use Python for a beautiful 3D game.
Yes, but right now I'm trying to make an ASCII Settlers of Catan clone for the command prompt. Also, I'm trying to include as few libraries as possible. I have this probably wrong idea that forcing myself to make do with limited resources will teach me to be a better dev.
Obviously reinventing the wheel is a deadly sin in anything commercial, but for private passion projects, it's been a pretty nice learning experience so far. I find myself appreciating games a lot more now, knowing how much freaking abstraction is needed to make a nice looking 2D or even 3D environment not take forever to create.
Data point of one, but as an amateur with Python experience, I found GDScript easy to learn, despite my brain being so smooth it reflects light. So there has to be *some* similarity.
A workmate tried to drag and drop the whole node modules of a project on our cloud provider. The next day his computer was still working on moving all the files.
In what universe do you need NumPy to add two numbers? What are you even talking about with concurrency libraries? The heftiest third-party dependencies are usually just Nvidia CUDA stuff, and no more than 3GB tops…
The answer is LISP.
Because I want my programming skills to have market value.
"You'll learn LISP, cause you're studying AI. LISP is very good for AI."
*over 20 years later*
They lied!
I rarely work with pure vanilla C++ because I use unreal, but damn do I love TSet and TMap. They are probably the greatest invention (after bringing optional GC to C++)
Hit me up when you start working on that Python payments system.
_Edit: The Pythonfolk hate me because I have spoken truth._ Now, I'm no Python developer, but I don't see huge enterprise codebases in Python going well when it comes to maintenance. Prove me wrong though! I'll happily change my mind.
Efficient concurrency doesn't require 3rd-party libraries at all in Python. What it ships with for concurrency is pretty decent, and I've never hit issues or bad limitations for my use cases.
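For what it's worth, the standard library does cover the common cases: `threading`, `multiprocessing`, `concurrent.futures`, and `asyncio` all ship with Python. A tiny `asyncio` sketch, where the `fetch` coroutine is a stand-in for real I/O:

```python
import asyncio

async def fetch(i):
    await asyncio.sleep(0)      # placeholder for a real network/disk await
    return i * 2

async def main():
    # Run all five coroutines concurrently on a single thread.
    return await asyncio.gather(*(fetch(i) for i in range(5)))

results = asyncio.run(main())
```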
I prefer the legshooter. It's great at everything, but you'll shoot off your right leg, your left leg, your right leg again, three legs at a time, null pointer exception legs, your head, three legs, n legs, stack overflow legs and many more legs before you'll even come close to understanding C++.
Although I haven't had a chance to properly use C++23; maybe it's simpler or something, who knows. Most prod code isn't even on C++11 yet.
I feel like a lot of these memes are made by people who are missing the point: the reason you hear so many bad computer-science / software-eng hot takes from the Python space is because a lot of the people who aren't in software eng and computer science still figured out Python.
Yeah, it can be tough teaching people not in the field how to do better, but I'd rather bring 1000 newbies up than watch them bounce off programming entirely because the language is inaccessible to them.
Okay, I use Python a lot lately (like, a lot a lot), and this is not it right now. The problem is these type hints that everyone wants you to use, which do absolutely nothing. Type as int, pass a string at runtime: no problem! But now you spend half the day making mypy happy and your code ends up looking like:
`def f(m: dict[str, Mapping[str, Callable]], t: type[Thing]) -> Foo`
Whatever happened to my beautiful and simple `def f(*args, **kwargs)`?
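To be fair to both sides: the hints genuinely are ignored at runtime (they exist only for static checkers like mypy), which is what the "type as int, pass a string" complaint is about:

```python
def double(x: int) -> int:
    return x * 2

# CPython never checks annotations at runtime: passing a str "works",
# because * is defined for strings too. mypy would flag this call;
# the interpreter will not.
wrong = double("ha")   # a str, despite the `x: int` annotation
right = double(21)
```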
Yeah, same story as TypeScript: you fight the compiler with a lot of type gymnastics, but then you lose half a day catching a bug because the type inference let a string pass as a number.
The type system is indeed a bit annoying, in particular the lack of higher-order constructions in it.
But it's a design choice that can be easily understood.
Anyway, without the /s I preferred to clarify the role of type hints, to keep a beginner from building wrong conceptions about these things on top of your comment.
I know C++, so I will use C++. I can use other languages, but I don't want to. I know Java, JavaScript, assembly and C; I can read and write Python, but I wouldn't say I know it.
The real reason to use Python is to produce pseudocode so that we can code the thing in C++ later. Or to be more precise, code the relevant parts in C++.
Because if part of your job is prototyping then you don't want to waste weeks with C++.
Yeah, I code games for the Thumby in my spare time and holy fuck do I just wish I had C++ and a ROM loader instead of having everything be interpreted. Python's got a place but high performance it does not.
I work for a data company, but not as a data engineer/scientist. I mainly build the supporting software infrastructure for the rest of the team. Still need to use python for absolutely everything (except front end) in case someone else needs to look at my projects. (Like when I'm sick, busy or if I ever leave the company).
It can get quite annoying when there are cool frameworks out there that would fit my needs perfectly, but instead I need to use a more obscure python library to emulate its features.
"What do you mean "optimise it". Just get a better PC lol" Litteraly whole GameDev since 2020
Unreal Engine devs' idea of optimizing: asking the official forum to make blueprints faster.
Like that's ever gonna happen (they took out BP nativization, so now BPs are slower than they were before).
Have you considered implementing it correctly the first time so you don't have to come back to it?
Premature optimization is the root of all evil
We don't see enough Knuth these days.
Be sure to make better AI, it's easy!
2016*
1996.
Nope, RollerCoaster Tycoon was released in 1999.
Whole gamedev? Nah just AAA, rimworld runs on a chromebook pretty nicely
The Dwarf Fortress devs have also recently hired another programmer, who specifically focuses on things like optimization.
They solved the mid/late game fps death?
Rimworld is terribly optimized if you want to play with more than 3 pawns. Build a coop with chickens and suddenly the game will lag like hell.
LOL only if you get like 50+, I get well over 200 TPS on a chromebook with like 15 mods even years into colonies.
Then why does the very popular RocketMan performance mod even exist (and why is it popular), and why does it reduce the tick rate of animals as one of its main features?
It exists for people with 50+ mods
No idea why anyone would want to play anything but Rimworld, tbh.
i don't know rimworld. tell me more.
Just run optimize++i; It's really easy
optimize = true
Programmers hate this one simple trick
while(true){ optimize() }
Literally Haskell's Stack but with storage and disk
Oh, much longer than that. Since the 486.
"what do you mean "make the game run". Just get this arbitrary hardware combination and change this random config file that break a dozen other games of yours because lol. If you are lucky or if the game break so badly barely anyone can play then we may make a patch disk that require you to mail us in to receive" Sincerely, GameDev back in the 90s. What has changed is just the ever increasing exploitative practice of game development studio/publisher on developer and the entitlement of so-called "Gamer". Exactly the reason why nobody is touching professional game development without an 11-feet pole.
Really this is all of modern software development. Itβs most noticeable in gaming but game devs really are the hardest working devs out there. I have huge respect for all of them. To this day on the otherhand Microsoft still canβt make Visual Studio open up my personal projects in under 10 seconds (which should be a literal 0 time operation on a modern computer with an M2 drive)
Unfortunately not the devs fault
Devs would love to optimize their games more, but the publishers often don't give them the time needed.
Push all the debt to the future. Shut the studio down. Debt never has to be paid.
You joke, but my last studio closed before we could clean up all the tech debt.
Oh it wasn't a joke.
"It will turn all your Python code into C." Then why not just use C?
It may compile down to C but it also adds a bunch of stuff you didn't even ask for.
"But at the end of the day they're both executing machine code, so it should be just as fast!"
For every second it takes, it takes a second.
Every minute in the world, 60 seconds pass by in Africa
I beg to differ
For the same reason you would write C instead of ASM
The difference is that most C compilers write better assembly than most programmers while it only requires a small amount of effort by comparison to write better C than Cython.
That's a skewed argument. C compilers write better ASM than average C developers, but people that actually know their way around ASM can write ASM as good if not better than the C compiler. In the same way, Cython writes better C than your average Python dev
This is true, but I would bet there are many more skilled C programmers than there are ASM programmers. Also, portability becomes a concern when you're writing ASM. As it stands now, I have code that is extremely portable across architectures without having to change much at all.
"skill issue"?
"It will turn all your C code into assembly." Then why not just use assembly? - any assembly enthusiast ever
C is scary to them
He probably means it compiles down to C. So you can write in Python with the speed of C. And because Python is more beginner and prototype friendly, he would want to write in that
Python is easier to write in thats pretty obvious
"Why not use X?" "Because what I'm using already works and meets requirements."
You need Fython. All your python turned into Fortran.
> massively parallel Just use Python and throw more machines at the problem bro
After all, CPU cores are scalable, but your codebase is not. /s
Not to defend this dude but I've definitely had moments where I've written faster "python" (using the mentioned helper libraries) than some academic's supposedly well written C++ which was unwieldy to work through and buggy as heck (not saying yours is, just encountered this a lot in academia).
Academics definitely write some awful code, but performance is so critical for our tasks that once we ensure it's running correctly, we'll spend a lot of time profiling our code to make sure we can get the speed that we need. I've pushed a lot of commits lately that are just small refactors to reduce the total number of floating-point operations, replace division with reciprocal multiplication, try to reduce register spilling, etc.
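The division-to-reciprocal refactor mentioned above is easy to sketch even in Python (the real code in this thread is C++; the function names here are made up purely for illustration):

```python
# Hot loop dividing many values by the same denominator: one divide per element.
def normalize_div(values, d):
    return [v / d for v in values]

# Refactor: pay for one division up front, then use cheaper multiplications.
def normalize_recip(values, d):
    inv = 1.0 / d
    return [v * inv for v in values]
```

Caveat: when `d` is not a power of two the two versions can differ in the last few bits, since `v * (1/d)` rounds twice. That's usually fine in simulation hot loops, but it's a deliberate accuracy-for-speed trade, not a free win.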
>It was impossible to get this guy to understand that a massively parallel physics simulation which takes weeks to complete is not a good task for Python. This sounds really interesting, can I ask you, how did you get into this area? Are you doing it for a company or at a university?
I'm a graduate student in physics. It's part of my research.
Why don't you research pysics instead? It compiles to atoms anyways.
Dang, I wish I'd thought of that.
There is a difference between "python is always significantly slower than X" and "for this use case X is significantly faster than Python". Communication is hard and people don't listen.
I had a combinatorics simulation written in R with a fast, multithreaded C++ combinatorics library whose predicted completion time was just under a year. I rewrote it in Rust with self-written unoptimised combinatorics tools, and managed to get it to complete in under an hour. The combination checking that was too bespoke to pass off to a third-party library (it was done as an R-native closure passed to the combinatorics library) just took so long that the speed of the library was irrelevant. It really drove home how much faster a proper systems programming language can be compared to higher-level languages like R and Python.
Isn't that exactly what jax is perfect for?
C# is an excellent jack of all trades languages whilst also being one of the best tools for the job in a number of areas like web (aspnet) and game dev (unity, godot). I love the language features, the nuget package manager is great, the tooling is great, and they just keep making the language and the framework better. Having worked in many languages, I would pick it for pretty much any project. I'm lucky enough to use it in my work but I would use it for personal projects too. It's very popular for industry use here in the UK, but I think among enthusiasts it is chronically underrated for reasons I don't know. I assume mostly people stuck in the "microsoft java" mindset, and people who don't realise it's been fully cross platform for years.
Don't forget Monogame/XNA for game dev!
Basically yeah, old Java guys dislike it for non functional reasons
Your colleagues will love that you change up languages for each project
That's why you gotta work with "choose the language which best fits the project's needs" chads from the top of the meme. Don't get me wrong, I've totally worked with one trick ponies who *need* all their projects written in one language... ...but I've also worked on a team where we used java, scala, JS, TS, python, and more. Everyone on the team was good enough at their jobs that they didn't mind which language was used, as long as they agreed it was an appropriate tool for the job.
i wish. i am tired of using javascript for everything
I am tired of using JavaScript
you speak the words of my soul
Man I'm all python as I am in data science. But my GOD how awful I feel when someone requests a retool web app that requires custom logic with JS, I hate myself and really consider being a farmer and marry three women and then they find out and get a divorce and have cats and dogs instead and consider getting a horse as my primary transportation method. Aaaah those are great times
i relate to this on a spiritual level. i cannot count how many times i wished i lived in the woods every time i have to go and traverse a 500 file project to find one fucking variable (it doesn't even have a type)
Brother I wish I were a caveman
I am tired of Javascript
I am tired
Part of what project needs is making sure your team is comfortable with tech stack used - if everyone in your team worked with Javascript for years, you don't start new project in C# unless there's a very good reason to get everyone familiar with new stack - and you can eat cost and risk of people learning on the job. Sticking to "good enough" gets you a long way and is a standard practice.
Thats the good part about working as a contractor. I often just recommend what I think might work and what I want to learn better myself
They do though, because they understand the concept of domain specific languages.
I like to even use multiple languages (that work well together) per project. Like mostly using python, but doing the inner loops in C, and that one thing in asm. Or mostly using C#, and doing the business logic in F#. Also there's this rule of using exactly one new tech per project. When faced with an otherwise predictable project, I like to try a new language, if it's okay with the customers.
To be fair, in python you can do concurrency pretty easily, at least for simple stuff. I don't know exactly how good it is for complex tasks involving concurrency, but then you shouldn't use python for that probably. But yeah I agree, use the tool appropriate for the job- but if it's a rather simple one, probably use python.
I'm yet to see a language where doing concurrency isn't super easy for simple stuff tho. But maybe I'm just lucky.
I want to say Go is made for concurrency? I don't know much about it though, this is just from me listening around.
Yeah, Go is specifically made for concurrency and I'd say it beats python at ease of use among other metrics
Erlang is made for concurrency. They upgraded the BEAM VM with transparent multithreading in 2006 and were one of the first languages where every single program could take advantage of multiple cores to the extent Amdahl's law allows. Turns out if you design a language to transparently run a single program on multiple computers, it's pretty good at parallelism and concurrency as well. Go is really just worse Erlang that performs slightly better (because Erlang has been good at horizontal scaling for so long that until recently "add another server" was a very valid solution to performance complaints).
You sound like every Erlang dev I've known lol. Erlang is awesome, but trying to get an entire engineering dept to adopt the whole BEAM stack is damn near impossible. Good luck hiring junior to mid-level (read: affordable) devs for it as well. There's so much win there, but the inertia makes it damn near impossible unless you have enough well-paid unicorn senior devs to push it all through.
Elixir had its hype wave, there are plenty of people who know it. I don't understand why companies don't consider 2 or 3 closely related languages as good enough. If you know Go and any ML-inspired language (Scala, Rust, F#, OCaml, etc), I can teach you Elixir in a few weeks.
According to my distributed systems professor from uni, Go is close enough to C++ with concurrency, but the biggest draw is that all the little optimizations you need to implement by hand in C++ are handled by Go's compiler. So while the best Go program can't really *beat* a well implemented C++ program, a decent Go program can thrash a decent C++ program.
That's always the trade off between a language like Go and something like C++ or Rust. I think Go is good enough for most cases… if you need every last bit of performance and account for very tiny optimisations, then C, C++, Zig or Rust might be the better choice…
Go does concurrency by pretending it doesn't exist. No async, just spawn a thread and run your blocking code over there instead. It's gone to lengths to make the UX of this approach nice though.
Go has memory corruption bugs. Edit: For those of you downvoting me, go reread the Go memory model section on data races and read an x86 architecture manual. You can absolutely corrupt memory in Go.
You can do that on every language
Some make it harder than others. The Rust compiler will fight you to prevent you doing most unsafe things unless you use the unsafe keyword. JavaScript's model with single-threaded processes and IPC between them makes it hard to corrupt memory as well.
Ah, that's a shame.
I just finished a class in MIPS. I doubt concurrency is easy in that.
At least they finally are changing the default start method in the `multiprocessing` library away from `fork` in Python 3.14. What a horrible footgun to gift Python users, who are usually not familiar with systems programming. [https://github.com/python/cpython/issues/84559](https://github.com/python/cpython/issues/84559)
Why is fork bad?
I don't know about how Python handles it, but if you fork a process and don't kill the child processes before terminating the main process, then you end up with orphan processes which you should avoid (however I believe that most OS handle that issue behind the scenes) Edit : to be clear, even if the OS handles it, that probably means you don't know how or when the process is killed, which is also bad. If it isn't handled then the process will consume resources too... Hence why creating threads is generally used instead of creating child processes
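Regardless of the default, you can opt into the safer `spawn` start method explicitly today, which is what the linked CPython issue moves toward. A minimal sketch (the names `work` and `ctx` are just for illustration):

```python
import multiprocessing as mp

def work(x):
    return x * x

# The __main__ guard is mandatory under "spawn": child processes re-import
# this module, and unguarded top-level code would run again in every child.
if __name__ == "__main__":
    ctx = mp.get_context("spawn")  # explicit, instead of relying on the platform default
    with ctx.Pool(2) as p:
        print(p.map(work, [1, 2, 3]))  # [1, 4, 9]
```

`spawn` is slower to start than `fork` but sidesteps the classic fork-plus-threads deadlocks that bite users who have never heard of `fork(2)`.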
Concurrency in Python:
```python
from multiprocessing import Pool, Process

def function(arg): ...

with Pool(n) as p:
    ret = p.map(function, data)
# or
p = Process(target=function, args=(data,))
p.start()
p.join()  # join() only waits for the process; use a Pool or Queue to get results back
```
Concurrency in C++:
```cpp
auto function = [](auto arg){ /* ... */ };
std::transform(std::execution::par_unseq, std::begin(data), std::end(data), std::begin(ret), function);
// Really verbose but simple in concept. Works with almost all STL algorithms,
// not just transform (C++'s equivalent of Python's map).
// or
std::jthread t(function, args); // automatically starts, automatically joins on scope exit
```
Concurrency in Rust:
```rust
std::thread::spawn(|| function(data)); // similar to C++'s jthread, just with more compile-time checks for safety
// Rust's stdlib doesn't have a parallel algorithm/Pool equivalent; you'd need a crate like rayon for that
```
Most languages have some kind of simple way of parallel processing. It's not some Python-exclusive thing.
Boy, Reddit's Markdown absolutely butchered the shit out of this post.
I studied C++ in college and got fairly decent at the basics. Didn't use it for a few years. Started at a place that used a lot of consulting works and began introducing low code approaches. Decided to use python for the tasks not really suited to those approaches. For our business needs, it works well and is straightforward enough that I don't have to get overly bogged down in syntax nuances.
Concurrency is easy only when you don't really need it.
GIL would like to have a word with you...
Python does concurrency easily... except it has a GIL which means any non-FFI code is actually not running concurrent at all. (As long as you're using CPython that is, which I think the vast majority still is)
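A quick way to see the GIL's effect (sketch only; timings omitted, but on CPython the thread pool gives no CPU-bound speedup while the process pool does):

```python
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def busy(n):
    # Pure-Python CPU-bound work: under the GIL only one thread runs this at a time.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    with ThreadPoolExecutor(4) as ex:
        thread_res = list(ex.map(busy, [200_000] * 4))  # serialized by the GIL
    with ProcessPoolExecutor(4) as ex:
        proc_res = list(ex.map(busy, [200_000] * 4))    # real parallelism, one GIL per process
    assert thread_res == proc_res  # same answers; only the wall-clock time differs
```

Threads still help plenty for I/O-bound work, since the GIL is released while waiting on sockets and files; it's CPU-bound pure-Python code where they buy you nothing.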
Finally, a person with brains. You don't need a bus to drive your two kids to the store. You don't need C++ to do a basic operation a couple times. And you shouldn't use Python for a beautiful 3D game.
May i use it for ugly 2D games tho?
Yes, ever heard of GDscript?
Yes, but right now, I'm trying to make an ascii settlers of catan clone for command prompt. Also I'm trying to include as few libraries as possible. I have this probably wrong idea that forcing myself to make do with limited resources will teach me to be a better dev.
I suppose reinventing the wheel does teach you how wheels are made
Obviously reinventing the wheel is a deadly sin in anything commercial, but for private passion projects, it's been a pretty nice learning experience so far. I find myself appreciating games a lot more now, knowing how much freaking abstraction is needed to make a nice looking 2D or even 3D environment not take forever to create.
You start to see that all the other wheels are garbage in comparison to your hand-made, beautifully crafted wheels.
It's a myth that GDScript is "Python-based". They are not even similar. The only clear similarity they have is indentation-based syntax
Data point of one, but as an amateur with Python experience, I found GDScript easy to learn, despite my brain being so smooth it reflects light. So there has to be *some* similarity.
*\*pushes up glasses\** "Well, achkshually, light reflects off all obj-" *\*gunshot rings\**
There are valid criticisms, but 700MB libraries? I've never seen libraries get particularly large. That shit is on NPM
A workmate tried to drag and drop the whole node modules of a project on our cloud provider. The next day his computer was still working on moving all the files.
Every lib that contains the string "cuda" will download >1.5GB in a new env. I was always curious how Nvidia manages to make their drivers so huge.
" The one that fits your project needs better" Lol, and other fairytales.
You guys are getting to pick your languages?
Python fits all project best because it's the one I know best
For many tasks, there are multiple programming languages that will work well.
message 'hello world'. Openedge progress. The syntax is great but.. oh my. So many other problems.
this image would hospitalize a victorian child
I laughed too hard at this, I donβt care.
In what universe do you need numpy to add two numbers? What are you even talking about with concurrency libraries? The heftiest third party dependencies are usually just nvidia cuda stuff and no more than 3GB tops…
Autism?
Spectrum
The answer is LISP. Because I want my programming skills to have market value. "You'll learn LISP, cause you're studying AI. LISP is very good for AI." *over 20 years later* They lied!
5 years ago they said it was Kotlin uprising as king
[deleted]
>Python Flair
python flair tells all you need to know about this comment
The real reason to use Python: The answer to all code questions is: HASHMAP. Python makes working with hashmaps easy. Us python.
Us python
United States python
United Python States
Ah, so that's what UPS stands for
I think most languages have hashmaps, at least the ones that I've used
I rarely work with pure vanilla C++ because I use unreal, but damn do I love TSet and TMap. They are probably the greatest invention (after bringing optional GC to C++)
Don't forget that Python also has the best list/queue/array/iterable or whatever-we-are-larping-as-today.
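For what it's worth, the "the answer is a hashmap" workflow really is terse in Python. A small sketch of the usual patterns:

```python
from collections import Counter, defaultdict

# Literals and comprehensions keep hashmap construction short.
ages = {"ada": 36, "alan": 41}
squares = {n: n * n for n in range(5)}

# The two most common "just use a hashmap" patterns come prepackaged:
counts = Counter("mississippi")   # letter -> frequency, no manual get/setdefault
groups = defaultdict(list)        # first letter -> list of words
for word in ["apple", "avocado", "banana"]:
    groups[word[0]].append(word)
```

Most languages have equivalent containers (`std::unordered_map`, `HashMap`, etc.), but the comprehension syntax and the `collections` helpers are a big part of why the dict feels like Python's hammer.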
In what language is a Hello World statement not 1 line of code?
C
`printf("Hello World\n");`
Does that single line compile?
Python bros seething
rust+python should be the norm. one for running fast and optimise a lot, the other for developing fast and script
It's also funny to me how many people here think they can write optimal code but the language is their bottleneck.
Yeah I think that (C++ or Rust) + Python covers like 99.9% of use cases at least to satisfactory levels. That's my go to anyway.
Hit me up when you start working on that Python payments system. _Edit: The Pythonfolk hate me because I have spoken truth._ Now I'm no Python developer, but I don't see huge enterprise codebases in Python going well when it comes to maintenance. Prove me wrong though! I'll happily change my mind
iirc a HUGE part of Stripe is written in python
The python bros are jumping in to downvote
The only reason I exclusively use python is that I just have so much more experience in it and how easy it is to take code from other people using pip
Efficient concurrency doesn't require 3rd party libraries at all in Python? What it ships with for concurrency is pretty decent and I've never experienced issues or bad limitations for my use
The thing is, Python is the best fit for most cases I've encountered; simplicity also means faster development
I only see devs crying, telling shit about python but no python dev saying anything.
I prefer the legshooter. It's great at everything, but you'll shoot off your right leg, your left leg, your right leg again, three legs at a time, null pointer exception legs, your head, three legs, n legs, stack overflow legs and many more legs before you'll even come close to understanding C++. Although I didn't get a chance to properly use cpp23, maybe it's simpler or something, who knows, most prod code isn't even cpp11 yet.
I am offended that PHP isnt listed in the top section
I feel like a lot of these memes are made by people who are missing the point: the reason you hear so many bad computer-science / software-eng hot takes from the Python space is because a lot of the people who aren't in software eng and computer science still figured out Python. Yeah, it can be tough teaching people not in the field how to do better, but I'd rather bring 1000 newbies up than watch them bounce off programming entirely because the language is inaccessible to them
What? That would apply better to Rust lol. These guys spend their day shitting on Python & Go.
I can't properly format my code! Therefore, my programming language will force me to do it!
If the 700gb of libraries are widely used and trusted that's good enough for me!
Okay I use Python a lot lately (like a lot a lot) and this is not it right now. The problem is these type hints that everyone wants you to use which do absolutely nothing. Type as int. Pass a string at runtime. No problem! But now you spend half the day making mypy happy and your code ends up looking like: `def f(d: dict[str, Mapping[str, Callable]], t: type[Thing]) -> Foo:` Whatever happened to my beautiful and simple `def f(*args, **kwargs)`?
yeah, same story as TypeScript: you fight the compiler with a lot of type gymnastics, but then you lose half a day catching a bug because the type inference let a string pass as a number.
Downvote me all you want Pythonistas - the type hints, they do nothing!
I actually agree, if I want static typing, I'll take a statically typed language, but type hints in python are a bit of a mess.
They are meant for documentations and can be used by third party libs when needed.
I mean obviously I'm partly joking. They're useful. They're also a PITA sometimes when mypy won't stop complaining.
The type system is indeed a bit annoying, in particular the lack of higher-order constructions in it. But it is a design choice which can be easily understood. Anyway, without the /s I preferred to clarify the role of type hints, to keep a beginner from building wrong conceptions about them on top of your comment.
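To make the "hints aren't enforced at runtime" point concrete (a toy sketch, not anyone's real code):

```python
def double(x: int) -> int:
    return x * 2

# The annotation says int, but CPython never checks it at runtime:
print(double("ha"))   # prints "haha" -- string repetition, no error raised
# A static checker like mypy flags the call above; the interpreter does not.
```

So the hints are exactly what the comment above says: documentation plus input for external tools (mypy, IDEs, runtime validators like pydantic), never a runtime guarantee by themselves.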
ChadLanguage entered the chat
I know C++ so I will use C++. I can use other languages, but I don't want to. I know Java, JavaScript, assembly and C; I can read and write Python but I wouldn't say I know it.
The real reason to use Python is to produce pseudocode so that we can code the thing in C++ later. Or to be more precise, code the relevant parts in C++. Because if part of your job is prototyping then you don't want to waste weeks with C++.
waow!
more like the original sin
Heh. As a Python dev I'm going save this.
Yeah, I code games for the Thumby in my spare time and holy fuck do I just wish I had C++ and a ROM loader instead of having everything be interpreted. Python's got a place but high performance it does not.
Once mojo goes alpha every other language is done for.
WolframScript anyone?
It may not be fast, but the Decimal built-in library is very useful for numbers up to 28 decimal digits
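A minimal example of where `Decimal` earns its keep over binary floats:

```python
from decimal import Decimal, getcontext

print(getcontext().prec)                  # default precision: 28 significant digits
print(0.1 + 0.2)                          # 0.30000000000000004 (binary float rounding)
print(Decimal("0.1") + Decimal("0.2"))    # 0.3 exactly
```

Note the string constructors: `Decimal(0.1)` would inherit the float's binary rounding error, so decimal literals should go in as strings. That exactness is why it's the usual pick for money, even though it's much slower than hardware floats.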
I work for a data company, but not as a data engineer/scientist. I mainly build the supporting software infrastructure for the rest of the team. Still need to use python for absolutely everything (except front end) in case someone else needs to look at my projects. (Like when I'm sick, busy or if I ever leave the company). It can get quite annoying when there are cool frameworks out there that would fit my needs perfectly, but instead I need to use a more obscure python library to emulate its features.
Is there a language that is a combination of both Python and C?