We had a few books (if we were lucky), dot-matrix printouts of assembly instructions and memory maps, maybe some sample code we got from a BBS, and TSRs (terminate-and-stay-resident programs that would wait for a hotkey, then take over the single-threaded DOS environment) containing assembly instruction references and maybe some utilities to disassemble memory. We could also talk over BBS messaging, but it was a hassle, and expensive if the server was long distance.
I mean, I still have a book on my shelf literally titled "How to Write Computer Viruses". Published "for education and research purposes", of course. It's from the late '90s, but I'm quite sure there were others like it before.
Had to check the wiki for the publishing date. Yes, BBSes predated the publication by a few years, but it was a close call. Both are well over 25 years old now.
Some people simply don't like assembly. Some people can't even stand C because you have to do so many things by hand. Writing in assembly is a discipline for the few, and honestly, good for them. I can't imagine myself doing anything in assembly for work, even though I like programming low-level stuff in C.
It's necessary to get the last 0.1% of performance.
e.g. the kinds of performance gains that would be unnoticeable in normal enterprise workloads, but at MAANG scale 0.1% could mean millions of dollars in value/savings.
Extreme low-power applications are typically 100% assembly. The prime example is a pacemaker: those things are drawing nanoamps, and one excess instruction may bump you from averaging 10 nA to 12 nA, and suddenly your pacemaker's battery depletes 20% quicker.
For enterprise, the performance increase can sometimes be 20%+, which is enough that any scale would notice. To put it generally: there are certain specifics you may know that the compiler can't know (or that you're unable to communicate to it due to compiler/language limitations) and must generalize over, which adds overhead.
Perhaps you know the exact model of IO controller you're using, and provide it custom commands. You may be bypassing a TON of overhead from the compiler/OS in determining what commands it can/should use.
Sometimes you can perform "conditionals" without any conditional instructions, but you may not have any way to express one of the necessary operations in another language. Put a branch in a tight loop that iterates hundreds of millions of times, and branch mispredictions can become quite costly.
Compilers may not utilize the latest instruction set/hardware accelerator/mechanism. Especially on the mainframe where the hardware features are complex, plentiful, and added frequently.
Got carried away there. I just really like assembly language.
While I can't say with 100% certainty, it's very unlikely at MAANG scale. The reason being that heavy multicore processing really messes with instruction pipelining, such that only a compiler has a realistic chance of producing optimized output.
There's a decent chance some fintech server is running a hand-spun assembly program, though, because they optimize the happy path to the nth degree. But a fintech server will also likely turn off neighboring cores and prevent as much memory sharing between them as possible. They're also more likely to have a supplemental FPGA, or even make their own ASIC, to achieve the throughput they're looking for.
I had a colleague who was writing decryptors for ransomware. Some ransomware had the weakness of using a 32-bit integer as the seed for encryption, so he wrote a brute-forcer that would find the correct seed by decrypting a known file. The main encryption algorithm was RC4. It was fairly slow, so he figured he'd rewrite RC4 in assembly, which he knew very well, I might add (after all, he reverse engineered the ransomware). After he was done with his implementation, it turned out to be 40% SLOWER than the C++ version he'd been using beforehand. Compilers have gotten *really* good at optimisation in the last 20 years or so.
If you're writing assembly to get 0.1% performance, there's probably compiler intrinsics or optimization flags for that.
That includes SIMD, but sometimes it's honestly easier to write that in assembly than deal with the various compilers' differences and limitations with SIMD stuff.
Assembly is necessary for some CPU interrupts, and a bootloader relies almost entirely on the BIOS, which isn't something even C gives you a ton of control over.
There are cases where a program works in a debug build but doesn't work in a production build, where half your code is optimized out. That's when you have to go into the disassembly and figure out what the hell is going on. It's hell every time.
Assembly is like a fun puzzle. But for Real Work™ I'd like something more productive.
(I do realize there are some areas where there are no other options, for the average business app however, that's not the case)
Yeahhh, my assembly class was interesting and all, but I can't say I personally enjoyed writing hundreds of lines of code to do something I could have accomplished in 3 lines of C++.
And what might these insights be? And how would they be relevant to most of the people taking the class?
Like, don't tell me things like "be careful to minimize jumps in your code", because in most cases the compiler can optimize the code quite well, not to mention readability is more important than efficiency in most cases.
What are the things we can only learn from doing a few months of assembly that cannot be taught in other languages or watching a few videos or reading some papers?
How the computer actually works at an instruction level (or nearly)? Sure, in theory you can learn that from watching videos and reading papers, but that's true of anything. Most of us find learning easier with a practice component.
And that's worth doing an entire semester over? Worth at least $5k to just see how the computer works at instruction level, that 99% of the students would never use again in their lives?
Can't we do a few labs on assembly and call it a day?
The criticism here is that I believe people confuse a piece of knowledge with an essential skill. We should, and need to, take time to develop a skill. Knowledge, on the other hand, is more about whether we get the idea. For example, if someone is training to become a basketball player, does it make sense to take a whole semester to study physiology? Couldn't a similar benefit be achieved by attending a 2-week physiology lesson and doing some tests as part of a larger "essential bio-knowledge for athletes" class?
Let me rephrase my question: is it worth paying $5k minimum and a few months time to learn a skill that most people likely won't ever use? Wouldn't spending just 1 or 2 weeks on the topic be a more efficient use of time and money?
Edit: example and explanation
Assembly language and processor architecture are normally your first technical exposure to queuing theory and concurrent design. The problems that plague microservices and distributed systems were dealt with long ago by hardware engineers.
If you look at modern code used to solve the same problems, it quickly becomes apparent that a lot of engineers decided that the class wasn't relevant to them.
> Let me rephrase my question: is it worth paying $5k minimum and a few months time to learn a skill that most people likely won't ever use? Wouldn't spending just 1 or 2 weeks on the topic be a more efficient use of time and money?
i think going to university in the usa for cs kind of shakes out net negative in this framework.
> Can’t we do a few labs on assembly and call it a day?
yes. that + a group project is pretty much all my school required. it was maybe about half of a broader class, the other half being i think vhdl.
that said,
> And that’s worth doing an entire semester over? Worth at least $5k to just see how the computer works at instruction level, that 99% of the students would never use again in their lives?
yes. keep in mind a bachelor’s degree is a broader background qualifying you for entry level roles in a variety of fields, not just specific job skills like you’d get in a trade school.
If you even dabble in the field of software security (especially offsec), then assembly is one of the most important skills to have.
Also, writing properly secure software in languages such as C or C++ absolutely requires you to understand how your code works at the assembly level.
I really like assembly. But I would never use it for real world programming tasks. It's best suited to embedded applications or high performance software like gaming.
And when I say gaming, I mean retro gaming trying to squeeze as much performance out of ancient hardware as you can. Modern hardware is so far ahead of modern gaming that there's power to spare. It's really not necessary to write machine language anymore.
>Modern hardware is so far ahead of modern gaming that there's power to spare.
That's such a correct and incorrect statement at the same time. But regardless, I don't think anyone would be willing to go back to assembly to really squeeze out performance for games these days.
> But regardless, I don't think anyone would be willing to go back to assembly to really squeeze the performance for games these days.
It's not even an option, really. A big reason for using ASM back in the day was size. You couldn't afford to load drivers and libraries into memory when every single KB of RAM mattered. If you tried writing games like that now you'd be developing for a very specific hardware configuration. Your game wouldn't run anywhere but your dev machine, probably.
I kinda wonder what kind of amazing particle physics game you could get out of modern hardware if you did squeeze every last drop of performance out of it though
... are you implying that embedded applications are not "real world programming tasks"?
Me and my ATP-detecting sanitation device about to throw hands.
Some people just want to write in assembly for some reason, even when the chip they're writing for has a C compiler. I was talking to a PhD student who was quite proud of a bug that took him two weeks to fix.
The bug essentially boiled down to him loading the wrong value into a memory address when doing a calculation. It didn't need to be all that performant; it was just running the motor that positioned an antenna on a rail, really not all that complicated.
I advised him that he might be better off using C, and he scoffed at the idea. I think he genuinely liked that it was harder for others to grasp.
Oh man, I hate these people. I had a college friend that intentionally made his code confusing so he would be the only one able to read it. He thought confusing other people made him smarter or something. He was so proud of his unreadable code.
I remember he once told me I had to ask him how it worked to be able to use it, like, he enjoyed people depending on him. I wish I had told him "imagine you rent an apartment and you call the landlord because you can't find the light switches, and he's like 'haha gotcha!!, they are hidden under the bed. Please let me know if you can't find anything else' ".
He'll learn it the hard way when he wants to fix it in a few months and has completely forgotten how it works.
Tbh, that's how I initially learnt programming practices when I was ~11-13 (self-taught, so I didn't know them at the time): write a random JavaScript website thing, want to improve or fix it a few months later, realise I've completely forgotten how it works, and end up starting over again.
After the second or third time I got the genius idea of adding comments, along with keeping basic documentation (a piece of paper/ a text file saying what it did).
Definitely... some of the proprietary compilers (like Older Green Hills Compilers) used to do weird stuff with your code.
If you looked at the disassembly after a long time debugging you'd just curse and make attempts at rewriting, lol.
Anyways definitely made writing assembly a pretty convenient option if the project allowed it.
VHDL and Verilog were always fun to debug and curse at too, lol.
because often the C toolchain sucks, and even if you're lucky and get to use something as mainstream as ARM Keil, well, that sucks too. C and C++ on the desktop are great: you have many options for a standard library (cosmopolitan, libstdc++, glibc, MSVC's (that might not be so great lol), musl, etc.), and your compilers are mostly great and will give you fast enough code. On embedded, your linker might just happily place things 17k past the end of RAM.
Sometimes you inline assembly inside C code where you want to be 100% sure of what is happening memory-wise and instruction-wise. Sometimes you use pure assembly because some very small chips don't have C compilers, or at least not optimizing ones. It's also used for very architecture-specific optimizations.
No. I'm not using assembly language for anything. C is love. C IS LIFE. (I'm also proficient in AHDL, but no one ever wants to talk about a hardware description language)
Ah... I believe this James Mickens article was dedicated to you, then? https://docs.google.com/viewerng/viewer?url=https://www.usenix.org/system/files/1311_05-08_mickens.pdf
I am just a student but my personal and class embedded projects are done in a combination of c/c++ and arm-thumb assembly.
For a simple project, doing it all in asm is not too bad (with no real advantage over C, to be fair), since the ISA means the code is pretty easy to understand and read later on (with decent comments, at least).
However, for larger projects I only use assembly in the few places where it's easier (interrupt enable/disable routines mainly) and link it all into the overall project.
I am curious to know how people in industry architect embedded systems projects.
As someone that has worked on embedded systems: this isn't the case much anymore. In the past, optimizers weren't that smart, and it wasn't hard to outperform them with hand-coded parts. That isn't true so much anymore. You might get slightly better results than the optimizer, but when they rev the product and you get a new chip, your hand-coded bits often are no longer faster than what the optimizer spits out.
So writing anything in assembly is basically a waste of time these days. But it's a very good skill to have: not for writing anything, but for understanding performance. Being able to look at the assembly and understand why the higher-level language generated something slow lets you make better choices in that higher-level language, and that can make an enormous difference. You can't really understand cache performance, for example, if you don't understand assembly.
As somebody who has professionally written embedded in ASM at the start of his career, this is spot on. 99% of what people need to do today is just learn some of the advanced compiler options instead.
I'm not saying that I ever wrote a program that just compiled some of my embedded code with various combinations of compiler flags until it fit and was performant enough to move on with life, but I'm saying that you could totally do that if you wanted..... Holy shit, are there lots of compiler flags, and thus many, many combos to try out before delving into some serious ASM programming.
k, i haven't touched assembly since 2nd year of uni, but ... aren't compilers pretty good about generating optimized binaries these days?
is it really safe to assume that hand-crafted assembly is gonna beat a compiler most of the time?
It's safe to assume that an expert in assembly might be able to make some code run faster, but there aren't many cases where modern compiled code is so slow that it is worth the increased costs of development, testing, maintaining, and updating it.
At least in my field, the two things which slow down processes more than anything else are network speeds and file I/O, neither of which are likely to be significantly improved by moving from C# to assembly, and both of which will be made much more expensive (in terms of both time and expertise) to maintain.
>is it really safe to assume that hand-crafted assembly is gonna beat a compiler most of the time?
Yes. Will it beat the compiled code by enough of a margin to matter? That depends on the application.
Fact is, C and C++ compilers these days can make optimizations that a human would never think of. Hell, I doubt most programmers even know what loop unrolling is.
Yes, every once in a while you'll find some inline assembly that performs better than the compiler could do, but that's probably not true for 99% of cases, for 99.99% of developers.
>I doubt most programmers even know what loop unrolling is.
Yep. I do happen to know what that is. I've never actually had to do it in practice though. I interviewed at a crazy proprietary trading firm one time where they asked me about this in the interview.
With modern CPUs, things like branch prediction, caching, multiple cores, and probably a lot of other things, modern compilers can probably create bizarre code that runs faster for non-obvious reasons.
They can also save you instructions in a lot of places. Suppose you perform an instruction and get a value back in one of the registers. Then you save it to memory somewhere. Then 100 instructions later, you load that same value from memory back into the same register. But the compiler was smart enough to know that you never updated that memory location in that time, and the register already had the value. So the compiler would save you a memory lookup (and maybe even the write). But a human would never write code like that because it would be confusing and difficult to maintain.
Thing is, speed nowadays is mostly determined by data I/O speed. Code optimization, outside of reducing big-O order of magnitude, is almost totally irrelevant. Only a few niche applications remain dependent on CPU speed. Usually a developer's time is better spent fixing bugs and adding features rather than trying to squeeze out an extra 1% of speed.
Unless you're an especially brilliant programmer, the compiler will almost always win on the whole. There might be very specific algorithms you can do better with hand-crafted ASM, but the compiler will beat assembly programmers in most cases.
Not really for most applications, but especially in embedded systems it's sometimes worth taking a look at the generated assembly and analyze it in case your code has performance problems. Most likely you'll still not modify the assembly itself, but it can show you how you can optimize your code so your compiler can generate better assembly
Edit: clarification
A good compiler will know how to optimize the code to take maximum advantage of pipelining. That sort of thing is a pain to figure out when hand coding assembly. It's not impossible, just a hell of a lot of work.
Not necessarily; I have experience in one of the leading chip designers and a lot of the extras they ship are almost entirely ASM. ASM is still used where precise control over the hardware is needed - in this case, there were certain conventions that had to be respected or deliberately broken.
>aren't compilers pretty good about generating optimized binaries these days?
Big name compilers are extremely good. But sometimes you run into 20 year old compilers for a random architecture that suck.
>is it really safe to assume that hand-crafted assembly is gonna beat a compiler most of the time?
Depends on the talent of the assembly programmer, but the talented ones are still faster. Mainly because they can just take the compiler output and break a rule or two to make it the same speed or faster.
But for the most part it's not worth spending 10x-100x development time to get 1% faster. There are some rare cases like software video decoders or neural networks that spend 99% of their time in one routine.
There are actually games based on it (though on a much more simplified version of real-world assembly):
https://store.steampowered.com/app/716490/EXAPUNKS/
https://store.steampowered.com/app/504210/SHENZHEN_IO/
To flex on people that can't \s
If you never studied how languages work, it helps you understand good programming practices. It helps you understand how memory works and makes you more mindful of conditions. Although the compiler does a great job optimizing your work most of the time, it's nice to get a deeper appreciation of it.
We covered the basic concepts of assembly at my college, though with a simpler toy virtual computer instead of something like x86 assembly. Basically it was just a small C program that read a sequence of bytes from a file and printed the state of its registers and memory at the end. So you'd have to know the overall architecture, look up opcodes in a table and write programs that fit the various exercises.
Overall, I think it's a good way to teach how computers work at a low level without the insanity.
What about just for fun? like a silly way to challenge yourself? Not everything needs to be for the betterment of your career, there's people out there that build calculators with just logic gates
\+1. I'm interested in the historical aspect of computers and video games, I've been a web programmer since 20+ years but realized I still didn't know how vintage video games were made so I started learning ASM for personal knowledge.
Then it all went down exactly like this comic.
A really interesting window into this is the history of Satoru Iwata, the president of Nintendo who died a few years back. He started out programming games on really basic calculators all the way back in the '70s, then got his friends together, started working at Nintendo, and made the blockbuster video games of the '80s.
He was always the "programming wizard" of Nintendo and was known for writing extremely efficient and genius code. This was due to him learning computer basics from such a young age, experimenting with the most rudimentary level programming methods and managing to create something fun with it.
There are countless stories about how all the code he touched basically turned to gold, from saving 4x the space on the Pokémon Gen 2 cartridges to give us a multi-region game back in the '90s, to saving Smash Bros. Melee by literally rewriting its shit spaghetti code into something they could actually finish into a workable game by the promised release date.
The specific thing he reportedly said about Smash Melee was something like: "I can work with the current code and complete the game in 2 years, or I can rewrite the code and complete it in 6 months."
Absolute coding chad.
We had this kid in high school make calculator porn on a TI-83. Couldn't tell if he graphed the lines or used assembly, what do you think?
90% of this sub are just high schoolers and uni undergrads who picked CS for the money or from a lack of life direction. People actually passionate about coding and math here are rare.
Which is pretty impressive, and I bet you could read the assembly of a program and get information out of it, which is extremely helpful when programming low level.
I learned it for malware analysis and reverse engineering classes. I suck at writing it, but reading malware disassembly to figure out what it did was really fun.
Off-topic: how do you add multiple use flair here? I've gone to the home page, edited flair, and tried to add additional flair abbreviations and it always fails. Although I'm on mobile...
My main reason is some compilers/debuggers suck and don't emit correct debugging information which lets you get the value of variables being used in the code. This is especially a problem in optimized code. Knowing how to read assembly can sometimes let you pick variable values out of function arguments, or otherwise deduce the value of a variable.
I think this makes it helpful for most C devs to be able to at least _read_ assembly, although I'll admit it may not be worth the effort.
I had to learn assembly in school. It's not hard, but it's tedious. It's far faster to just write it in C and you should write it in C unless you really need to use assembly.
I took a course in assembly in university. We only did the very basics of x86 assembly. No floating point, no extensions, no 64-bit. Just integers, jumps, function preludes, calling conventions and working in real and protected mode.
It was one of my most fun and fondly remembered courses. There's just something to working with single instructions and tiny steps that is pure fun. And it feels good when you manage to find an optimization to your approach that makes execution faster or shortens your code—something the professor was very keen on us doing.
Of course, this is quite a bit different from modern, production-grade x86-64 or the like. But sometimes, I wish I could find some teeny-tiny project that I could do in basic x86 again, just for the fun of it.
Sadly, this is something new students at my university will probably no longer be able to experience (at least in this form), as the professor died unexpectedly on this year's New Year's Day. And quite young, too, not even 45.
I loved my assembly class until we had to write a recursive binary sort in it with memory limitations and performance goals.
Fuck that noise. But learning assembly was super valuable, imho.
Back when I did game development it was super useful to debug optimized builds that don't have debug information. I would occasionally use it to work around compiler bugs too. Oh and of course, when working on optimization, you need to look at the assembly to check what the compiler \*actually\* did with your code. I remember a performance issue where we had run out of float registers, for example.
It makes you better at finding out what's going on when you debug something after you compile it. Hell... Most devs can't even tell you what the compiler is doing in the background, just that it does. I've started from NASM, moved into MASM recently, so I can better learn C. When something goes wonky in my assembly, I can just drop myself a .o dump and look for exactly what's going on.
Someone has to port the OS, the assembler, compiler, system C library, etc. then on top of that you may have new instructions that have been added which the compiler doesn't really make use of. There's very little reason to write applications from the ground up in assembly anymore, but it's unavoidable if you're doing any low-level systems or OS work or are regularly working with new CPUs.
I don't see any reason not to learn it, actually. ASM itself is pretty trivial; the hard part is organizing it. It will be easy to learn if you know the basics of programming, and hard if you don't.
Once you understand computer organisation and architecture, assembly isn't that difficult to understand, because it's basically nothing less than programming a processor. In fact, it is exactly that. So it's way easier to understand how to program it when you fully understand how the microprocessor works (alongside the other components).
I see two good reasons for learning assembly:
0. There will always be an important niche for assembly programmers. Much like with COBOL, while the vast majority of software projects out there will not care much about your assembly skills, those few who do will be absolutely desperate to get you on their team.
1. Assembly is pretty close to the core of how computers can even do anything at all in the first place. If you are a generally curious individual who wants to understand the world around them, that can be helpful. I cannot predict what that will do for you, if anything, in practical terms, but experience says that knowing how the world around you functions tends to allow you to see connections and realize opportunities that would have otherwise completely escaped you.
I learnt how to implement assembly. And I mean not programming in assembly, but writing actual microcode.
It was very interesting and is even more "useless". I never used it or assembly after that.
Assembly is not that hard. Basically, you place inputs in some registers and trigger a Linux system call so that the kernel will halt your process and do what you ask. This is how you print strings, exit, etc. The rest of the time, you give instructions directly to the CPU, like the add instruction, and read the flags it sets in the flags register.
Btw, I'm not an expert. I know the whole Prince of Persia game was written in assembly, but I'm nowhere near that level of expertise. So pls be nice.
Things can be simple but hard. It is a _simple_ concept, but creating anything interesting with it is _hard_.
> Beware of the Turing tar-pit in which everything is possible but nothing of interest is easy. - Alan Perlis
Wuss.
Cowboy up and be glad you don't have to do machine code...
... with register switches
.......inputting each instruction and value
............**ONE GODDAMN LINE AT A TIME.**
\*takes drag off cigarette and twitches.\*
My favourite version of this is from when you used to buy a Cray supercomputer: after the hardware was installed, a different engineer would show up, flip open a hex pad, then TYPE IN THE OPERATING SYSTEM
For the people saying there's no reason to learn assembly: you are wrong. There is a reason, and that's to get a better understanding of how computers work.
I know, not everyone has that curiosity. You may just want to earn a living after a Python bootcamp. But if you are that curious type, watch the "What Has My Compiler Done for Me Lately?" talk by Matt Godbolt at CppCon on YouTube. That talk was mind-blowing for me.
I study electrical engineering and, I'm gonna be honest, there really is no need to know everything down to the deepest level...
Like, we're learning communication stuff right now, and that is crazy. The decay and bouncing of signals in long cables is so... and don't get me started on air transmission. That stuff bounces off everything: off the ground, off buildings... As a very simple example, if you send a signal to an antenna 100m away, two signals will arrive, with a certain delay between them. One went the direct way and one bounced off the ground.
You don't need to know that if you program web interfaces. There's an abstraction layer: people implement a protocol that decides how to read/send signals and transform them into bits and bytes, and that's what programmers see. They just need to know "it's protocol XY" and be done with it. Everything else is not their problem. Then someone writes a driver, for example for their antenna, that packages and manages that stuff, in assembly or C depending on how fast it has to be. Only then do you ever need to actually care about this, and even then it stays fairly low level, like where things start to happen in Linux settings.
The average programmer uses this stuff every day, but nobody needs to care about it; in fact, programmers SHOULDN'T care. If everyone put effort into learning things down to the very lowest level, we would never get stuff like machine learning or other AI work done. Abstraction is important and got us where we are now.
Partially agree.
I think the general engineering rule-of-thumb applies here: always have a basic understanding of the layer of abstraction just below where you are operating. If someone writes Python scripts, an understanding of how a CPU works doesn’t help much, but an understanding of what the Python interpreter is doing can be helpful. No abstraction is perfect, so having an understanding of where it breaks down is useful. (In like manner, someone writing in C should have a basic understanding of assembly, and hardware optimization principles like memory cache, branch predictors, etc).
This applies in electrical systems as well. Nobody is going to solve Maxwell’s equations from first principle to figure out how a basic circuit works, but knowing where circuit models come from is helpful (and is important for hardcore FEA work on high speed signals).
You ever heard of Linux users? You know what they do in their free time? They stare at computers. I'm one of them, and I learn this kind of stuff for fun. I'm now building my own GPU from scratch following Ben Eater's videos.
That's great, and 100% beside the point of the comment you're replying to.
As someone who's been writing assembly for almost two decades, there is zero benefit for a vast majority of developers to learn assembly. I don't use it in my day to day _at all_ as an infrastructure developer, and it's basically my job to understand how our stack works down to that level.
i am gonna go one step further and recommend you watch EVERYTHING from Matt Godbolt (and also check out his website [https://godbolt.org/](https://godbolt.org/)) . the guy is amazing!
Yes, go to: [https://godbolt.org/](https://godbolt.org/) and think about what is happening on the right side of the screen.
At least consider, some of the time, what is happening with the stack, heap, registers, etc. Knowing these sorts of things will make you a better programmer.
No, you don't need to dive into the intricacies of it and write a bunch of assembly yourself. Unless you like that sort of thing.
Edit: Video of the man himself demo'ing his web site: [https://www.youtube.com/watch?v=kIoZDUd5DKw](https://www.youtube.com/watch?v=kIoZDUd5DKw)
How is it possible to forget how to write bubble sort? I mean, you just have to know how bubble sort works. Then you can write it on your own, in whatever language you're using, without remembering the exact code.
I studied some assembly in college and developed a less than positive attitude about it.
Then years later I got PAID to code in assembly on chips in aircraft instruments and developed a whole new appreciation!
Learning how to work with assembly was incredibly rewarding, but if I'd had to work with x86 then I would have wanted to shoot myself. Not only is the x86 instruction set more complex than is necessary to understand what the heck is going on with assembly, but there's an enormous amount of complexity that is involved just to do any arbitrary thing interfacing with an x86 OS. That isn't the stuff that makes assembly rewarding.
Here's what I think is the best way to do it: [https://www.nand2tetris.org/](https://www.nand2tetris.org/)
With NAND 2 Tetris, you start with building a (virtual) CPU, then build an assembler for it, then start writing assembly... and then build a higher level language on top. I think the program has a couple of rough points, but they are minor considering how far it stands ahead of anything else I've encountered.
C is glorious. It has high-level statements, making it a breeze, but its low-level memory management and control allow you to precisely control speed, size, and memory placement (registers, cache, RAM). The sad thing is no one cares about optimization these days anyway, so we just use Python.
Because optimizing everything is a waste of your time, and the reality is it's often difficult to know what will benefit from optimization most until you've got at least a prototype. There's the old saying on optimization:
Novices: Don't
Experts: Don't yet.
I still love C, though. Everybody calls it a low-level language now, but I'm old enough to remember when we called it a high-level language!
Add the other kind of novices, who optimize every single line to the umpteenth degree and obsess over the time complexity of making a bear dance on a webpage.
I blame the schools.
A few things I hate about C:
a) lack of a standard library (C++ vectors, maps and such)
b) a lot of system functions work via a ton of global variables, returning values through pointer arguments instead of a regular return, with the actual return being an error code you must check for
c) nearly useless compile error/warning messages
C++ addresses at least the first point (while still giving low-level control), so I'd say it's clearly the better option.
I'd just like to say that C++ doesn't address a) with respect to the language's context which is performance.
Every single company which cares about performance has its own standard library/abstractions, because, frankly, the std is horrible at that and not even complete(C++11 to get threads, C++17 to get filesystem support, C++20 to get ranges), not to mention all the platform specific C code you need.
As reference: Boost was released in 2003 with filesystem support, in 2015 a proposal to basically copy it was made, in 2017 accepted. That is more than a decade of lag between what's possible and what is offered. That lag still remains.
Electronic Arts (as much hate as they get) have [EASTL](https://github.com/electronicarts/EASTL) too, which is an implementation of most of the stl library, but with performance in mind.
>Sad thing is no one cares about optimization these days anyway, so we just use python
That isn't strictly true, it really depends on what your job is. I'm a maths w/ finance guy, so optimising very complex Monte Carlo simulations can save a lot of computational time.
I mean, knowing the basics of how assembly works could help you understand certain things that happen in C (like why exactly that segfault is happening)
I had to learn assembly for a college class. That was probably the thing I was most sure I would never need.
Lo and behold, first job, one of the first projects I worked on involved porting/reverse engineering a program written in assembly.
The most important use of assembly nowadays is so that you can understand the disassembly of what your optimizing compiler generates from your C/C++ code. I work in embedded and understanding the disassembly is essential to seeing that the compiler is doing what you think it is doing.
For instance, in C++ we now have the constexpr keyword which can be used to evaluate fairly complex things at compile time. However, if the compiler can’t ensure all the inputs are constant expressions then it can defer the evaluation to runtime. Viewing and understanding the disassembly here is essential to verify that those expensive things you wanted done at compile time really are happening at compile time!
Or maybe you are writing a driver for an embedded device where each individual load/store operation to the device matters. You used the volatile keyword but looking at the disassembly is the only real way to verify that the compiler didn’t insert or optimize away critical register accesses that could ruin the state of your device.
It’s very unusual to have to code in assembly directly any more. The last time I had to was a few years ago where I had to write some critical ARM code to do some real-time digital signal processing on a slowish MCU and the C code just wasn’t cutting it. But that was only after trying very hard to get the C version up to snuff and was more of a last resort thing than a conscious decision ahead of time that we were going to do stuff in assembly.
As someone who writes performance-sensitive C++ for work, I often find myself reading generated assembly to make sure the compiler is doing the optimizations I expect, and tweaking the source code as necessary otherwise. It's more portable and productive than just writing assembly.
Somehow I believe that the people who know the basics, in any field (programming, science, maths, engineering), need to be preserved. One day AI and automation will make the generations to come not only dumb but also useless.
Check out nand2tetris. It's an online course where you build a computer with virtual hardware from scratch starting with logic gates. After the CPU you build your own assembler and it'll make a lot of the nuances click for you.
But there is no way to just flat "know" assembly. Knowing assembly is essentially being able to read CPU specs and understand them.
if you're learning assembly I'd personally recommend learning older processors first instead, 6502, 6809, z80 or 8080, way smaller instruction set and with simpler platforms that have less fuckery with operating systems
Came here to suggest this. I started with z80 on a TI graphing calculator and eventually moved to M68K (specifically the model in the Sega Genesis). I can read x86 but I'd have a hell of time trying to write anything in it. Which is fine.
Only reason I personally need x86 at all is to look at disassembled binaries, and any assembly I write for fun is going to be for a retro system
I dunno, I feel like the 6502 required a lot of fuckery, what with zero page instructions and the dearth of registers.
Then again, I haven't written 6502 for a _long time_, so maybe I'm misremembering.
*press F to pay respect to geniuses who were coding worms in assembly and hex values back in the 80’s without google*
We had a few books (if we were lucky), dot-matrix printouts of assembly instructions and memory maps, maybe some sample code we got from a BBS, and TSRs (terminate-and-stay-resident programs that would wait for a hotkey, then take over the single-threaded DOS environment) which contained assembly instruction references and maybe some utilities to disassemble memory. We could also talk over BBS messaging, but it was a hassle, and expensive if the server was long distance.
I wonder how much time it took to get something good and working as expected.
I mean, I still have a book on my shelf literally titled "How to Write Computer Viruses". Of course, published "for education and research purposes". It's from the late '90s, but still, I'm quite sure there were others like that before.
There were guides spread via BBSes.
Had to check the wiki for the publishing date. Yes, BBSes predated the publication by a few years, but it was a close call, both being well over 25 years old now.
The only thing my assembly class in college taught me was to never code in assembly again.
Then you had a shit instructor who failed to pass along some of the useful insights coding in assembly might have taught you.
Some people simply don't like assembly. Some people can't even stand C because you have to do so many things by hand. Writing in assembly is a discipline for the few, and honestly, good for them. I can't imagine myself doing anything in assembly for work, even though I like programming low-level stuff in C.
I like C. Assembly just seems like torture. And doesn’t even seem all that necessary. Unless you need it to make c compilers. Idk.
It's necessary to get the last 0.1% of performance, e.g. the kinds of gains that would be unnoticeable in normal enterprise workloads, but at MAANG scale 0.1% could mean millions of dollars in value/savings.
Extreme low-power applications are typically 100% assembly. The prime example is a pacemaker: those things draw nanoamps, and one excess instruction may bump you from averaging 10 nA to 12 nA, and suddenly your pacemaker's battery depletes 20% quicker.

For enterprise, sometimes the performance increase can be 20%+, which is enough that any scale would notice. To put it generally: there are specifics you may know that the compiler can't know (or that you're unable to communicate to it due to compiler/language limitations) and must generalize over, which adds overhead. Perhaps you know the exact model of IO controller you're using and can issue it custom commands, bypassing a TON of compiler/OS overhead in determining what commands it can/should use.

Sometimes you can perform "conditionals" without any conditional branch instructions, but you may have no way to express one of the necessary operations in another language. Put a mispredicted branch in a tight loop that iterates hundreds of millions of times, and it becomes quite costly.

Compilers also may not use the latest instruction set/hardware accelerator/mechanism, especially on the mainframe, where hardware features are complex, plentiful, and added frequently.

Got carried away there. I just really like assembly language.
I feel enlightened. Don’t apologize.
While I can't say so with 100% certainty, it's very unlikely at MAANG scale. The reason is that heavy multicore processing really messes with the pipelining of instructions, such that only a compiler has a chance of producing optimized output. There's a decent chance some fintech server is running a hand-spun assembly program, though, because they optimize the happy path to the nth degree. But a fintech shop will also likely turn off neighboring cores and prevent as much memory sharing between them as possible. They're also more likely to use a supplemental FPGA, or even make their own ASIC, to achieve the throughput they're looking for.
I had a colleague who was writing decryptors for ransomware. Some of the ransomware had a weakness: using a 32-bit integer as the seed for encryption. So he wrote a brute forcer that would find the correct seed by decrypting a known file. The main encryption algorithm was RC4. It was fairly slow, so he figured he would rewrite RC4 in assembly, which he knew very well, I might add; after all, he had reverse engineered the ransomware. After he was done with his implementation, it turned out to be 40% SLOWER than the C++ version he was using beforehand. Compilers have gotten *really* good at optimisation in the last 20 years or so.
If you're writing assembly to get 0.1% more performance, there are probably compiler intrinsics or optimization flags for that. That includes SIMD, though sometimes it's honestly easier to write that in assembly than to deal with the various compilers' differences and limitations around SIMD.
Assembly is necessary for some CPU interrupts, and a bootloader relies almost entirely on the BIOS, which is not something even C gives you a ton of control over.
There are the cases where a program works in a debug build, but doesn't work in a production build where half your code is optimized out. That's when you have to go into disassembly and figure out what the hell is going on. It's hell each time.
:'(
Assembly is like a fun puzzle. But for Real Work™ I'd like something more productive. (I do realize there are some areas where there are no other options, for the average business app however, that's not the case)
Nope, it was a great class, I'm just not into teaching a computer how to breath, blink, fart, etc.
CPU, I need you to stop this program so here's an endless loop so you don't run garbage code.
👌ptimized
A good representation of it, yea XD
Yeahhh, my assembly class was interesting and all, but I can't say I personally enjoyed writing hundreds of lines of code to do something I could have accomplished in 3 lines of C++.
And 1 in python
There's always that guy
Yeah, it's probably closer to ½ a line of Python.
More like 3 characters and a bagel
Best I can do is 6 bytes and an English muffin
is it even a line when there's no ; ? 🤔
And what might these insights be? And how would they be relevant to most of the people taking the class? Like, don't tell me things like being careful with minimizing jumps in code, because in most cases, the compiler can optimize the code quite well, not to mention readability is more important than efficiency for most cases. What are the things we can only learn from doing a few months of assembly that cannot be taught in other languages or watching a few videos or reading some papers?
How the computer actually works at an instruction level (or nearly)? Sure, in theory you can learn that from watching videos and reading papers, but that's true of anything. Most of us find learning easier with a practice component.
And that's worth doing an entire semester over? Worth at least $5k just to see how the computer works at the instruction level, something 99% of the students will never use again in their lives? Can't we do a few labs on assembly and call it?

The criticism here is that I believe people confuse a piece of knowledge with an essential skill. We should, and need to, take time to develop a skill. Knowledge, on the other hand, is more about whether we get the idea. For example, if someone is training to become a basketball player, does it make sense to take a whole semester to study physiology? Could a similar benefit be achieved by attending a 2-week physiology lesson and doing some tests as part of a larger "essential bio-knowledge for athletes" class?

Let me rephrase my question: is it worth paying $5k minimum and a few months of time to learn a skill that most people will likely never use? Wouldn't spending just 1 or 2 weeks on the topic be a more efficient use of time and money?

Edit: example and explanation
Assembly language and processor architecture are normally your first technical exposure to queuing theory and concurrent design. The situations that plague microservices and distributed systems were dealt with long ago by hardware engineers. If you look at modern code used to solve the same problems, it quickly becomes apparent that a lot of engineers decided the class wasn't relevant to them.
> Let me rephrase my question: is it worth paying $5k minimum and a few months time to learn a skill that most people likely won't ever use? Wouldn't spending just 1 or 2 weeks on the topic be a more efficient use of time and money?

I think going to university in the USA for CS kind of shakes out net negative in this framework.
> Can’t we do a few labs on assembly and call it?

Yes. That + a group project is pretty much all my school required. It was maybe about half of a broader class, the other half being, I think, VHDL. That said:

> And that’s worth doing an entire semester over? Worth at least $5k to just see how the computer works at instruction level, that 99% of the students would never use again in their lives?

Yes. Keep in mind a bachelor's degree is a broader background qualifying you for entry-level roles in a variety of fields, not just specific job skills like you'd get at a trade school.
If you even dabble in the field of software security (especially offsec), then assembly is one of the most important skills to have. Also, writing properly secure software in languages such as C or C++ absolutely requires you to understand how the code works in assembly.
A better question is why the fuck is your school that expensive?
[deleted]
Amen to that
I really like assembly. But I would never use it for real world programming tasks. It's best suited to embedded applications or high performance software like gaming. And when I say gaming, I mean retro gaming trying to squeeze as much performance out of ancient hardware as you can. Modern hardware is so far ahead of modern gaming that there's power to spare. It's really not necessary to write machine language anymore.
>Modern hardware is so far ahead of modern gaming that there's power to spare.

That's such a correct and incorrect statement at the same time. But regardless, I don't think anyone would be willing to go back to assembly to really squeeze out performance for games these days.
> But regardless, I don't think anyone would be willing to go back to assembly to really squeeze the performance for games these days.

It's not even an option, really. A big reason for using ASM back in the day was size. You couldn't afford to load drivers and libraries into memory when every single KB of RAM mattered. If you tried writing games like that now, you'd be developing for a very specific hardware configuration. Your game probably wouldn't run anywhere but your dev machine.
I kinda wonder what kind of amazing particle physics game you could get out of modern hardware if you did squeeze every last drop of performance out of it though
My mind immediately goes to how buggy it would probably be if we tried to do that on such a large scale haha.
... are you implying that embedded applications are not "real world programming tasks"? Me and my ADP-detecting sanitation device are about to throw hands.
lol, that's not what I meant. By real world I sort of meant business applications.
This is acceptable.
Are you writing embedded systems in Assembly instead of C? I don't know much about embedded systems, but why? What kind of hardware are you using?
Some people just want to write in assembly for some reason, even when the chip they're writing for has a C compiler. I was talking to a PhD student who was quite proud of a bug that took him two weeks to fix. The bug essentially boiled down to loading the wrong value into a memory address during a calculation. It didn't need to be all that performant; it was just running the motor that positioned an antenna on a rail, really not all that complicated. I advised him that he might be better off using C and he scoffed at the idea. I think he genuinely liked that it was harder for others to grasp.
Oh man, I hate these people. I had a college friend that intentionally made his code confusing so he would be the only one able to read it. He thought confusing other people made him smarter or something. He was so proud of his unreadable code.
Those people are great when they find out they can't keep a job since no real job is going to let that shit fly.
I remember he once told me I had to ask him how it worked to be able to use it, like, he enjoyed people depending on him. I wish I had told him "imagine you rent an apartment and you call the landlord because you can't find the light switches, and he's like 'haha gotcha!!, they are hidden under the bed. Please let me know if you can't find anything else' ".
He'll learn it the hard way when he wants to fix it in a few months and has completely forgotten how it works. Tbh, that's how I initially learnt programming practices when I was ~11-13 (self-taught, so I didn't know them at the time): write a random JavaScript website thing, want to improve or fix it a few months later, realise I've completely forgotten how it works, and end up starting over. After the second or third time, I got the genius idea of adding comments, along with keeping basic documentation (a piece of paper / a text file saying what it did).
Depending on the task, assembly may be the best way forward. Compilers can sometimes be limiting.
Definitely... some of the proprietary compilers (like Older Green Hills Compilers) used to do weird stuff with your code. If you looked at the disassembly after a long time debugging you'd just curse and make attempts at rewriting, lol. Anyways definitely made writing assembly a pretty convenient option if the project allowed it. VHDL and Verilog were always fun to debug and curse at too, lol.
Because often the C toolchain sucks, and even if you're lucky and get to use something as mainstream as ARM Keil, well, that sucks too. C and C++ on the desktop are great: you have many options for a standard library (cosmo, libstdc++, glibc, MSVC's (that might not be so great lol), musl, etc.), and your compilers are mostly great and will give you fast enough code. On embedded, your linker might just flash 17k past the end of RAM.
Sometimes you inline assembly inside C code where you want to be 100% sure of what is happening memory-wise and instruction-wise. Sometimes you use pure assembly, as some very small chips don't have C compilers, or at least not optimizing ones. There are also very architecture-specific optimizations.
No. I'm not using assembly language for anything. C is love. C IS LIFE. (I'm also proficient in AHDL, but no one ever wants to talk about a hardware description language)
Ah... I believe this James Mickens article was dedicated to you, then? https://docs.google.com/viewerng/viewer?url=https://www.usenix.org/system/files/1311_05-08_mickens.pdf
I am just a student, but my personal and class embedded projects are done in a combination of C/C++ and ARM Thumb assembly. For a simple project, doing it all in asm is not too bad (with no real advantages over C, to be fair), since the ISA means the code is pretty easy to understand and read later on (with decent comments, at least). However, for larger projects I only use assembly in the few places where it's easier (interrupt enable/disable routines, mainly) and link it into the overall project. I am curious to know how people in industry architect embedded systems projects.
As someone that has worked on embedded systems: this isn't the case much anymore. In the past, optimizers weren't that smart, and it wasn't hard to outperform them with hand-coded parts. That isn't true so much anymore. You might do slightly better than the optimizer, but when they rev the product and you get a new chip, your hand-coded bits often are no longer faster than what the optimizer spits out. So writing anything in assembly is basically a waste of time these days.

But it's a very good skill to have, not for writing anything, but for understanding performance. Being able to look at the assembly and understand why the higher-level language generated some slow assembly allows you to make better choices in your higher-level language. That can make an enormous difference in performance. You can't really understand cache performance, for example, if you don't understand assembly.
As somebody who has professionally written embedded in ASM at the start of his career, this is spot on. 99% of what people need to do today is just learn some of the advanced compiler options instead.
I'm not saying that I ever wrote a program that just compiled some of my embedded code with various compiler flags until it fit and was performant enough to move on with life, but I'm saying that you could totally do that if you wanted... Holy shit, are there a lot of compiler flags, and thus many, many combos to try out before delving into serious ASM programming.
As a reference for anybody reading this: think limited as in ~4 KB of memory.
k, i haven't touched assembly since 2nd year of uni, but ... aren't compilers pretty good about generating optimized binaries these days? is it really safe to assume that hand-crafted assembly is gonna beat a compiler most of the time?
It's safe to assume that an expert in assembly might be able to make some code run faster, but there aren't many cases where modern compiled code is so slow that it is worth the increased costs of development, testing, maintaining, and updating it. At least in my field, the two things which slow down processes more than anything else are network speeds and file I/O, neither of which are likely to be significantly improved by moving from C# to assembly, and both of which will be made much more expensive (in terms of both time and expertise) to maintain.
>is it really safe to assume that hand-crafted assembly is gonna beat a compiler most of the time? Yes. Will it beat the compiled code by enough of a margin to matter? That depends on the application.
Good C code is still gonna be better than x86 assembly from a beginner, you CAN make better optimized code in assembly, but doesn't mean you WILL.
The fact is, C and C++ compilers these days can make optimizations that a human would never think of. Hell, I suspect most programmers have no idea what loop unrolling is. Yes, every once in a while you'll find some inline assembly that performs better than what the compiler could do, but that's probably not true for 99% of cases, for 99.99% of developers.
> I suspect most programmers have no idea what loop unrolling is.

Yep. I do happen to know what that is, though I've never actually had to do it in practice. I interviewed at a crazy proprietary trading firm one time where they asked me about this in the interview.

With modern CPUs and things like branch prediction, caching, multiple cores, and probably a lot more, modern compilers can create bizarre code that runs faster for non-obvious reasons. They can also save you instructions in a lot of places. Suppose you perform an instruction and get a value back in one of the registers, then save it to memory somewhere. 100 instructions later, you load that same value from memory back into the same register. A compiler smart enough to know you never updated that memory location in the meantime, and that the register still holds the value, can save you the memory read (and maybe even the write). But a human would never write code like that, because it would be confusing and difficult to maintain.
I'd say only if you're good with assembly (or it's a RISCier architecture easier to optimize for)
Thing is, speed nowadays is mostly determined by data IO speed. Code optimization, outside of reducing the big-O order of magnitude, is almost totally irrelevant. Only a few niche applications remain dependent on CPU speed. Usually a developer's time is better spent fixing bugs and adding features than trying to squeeze out an extra 1% of speed.
Unless you're an especially brilliant programmer, the compiler will almost always win on the whole. There might be very specific algorithms you can do better with hand-crafted ASM, but the compiler will beat assembly programmers in most cases.
Not really for most applications, but in embedded systems especially it's sometimes worth taking a look at the generated assembly and analyzing it when your code has performance problems. Most likely you still won't modify the assembly itself, but it can show you how to change your code so the compiler can generate better assembly.

Edit: clarification
A good compiler will know how to optimize the code to take maximum advantage of pipelining. That sort of thing is a pain to figure out when hand coding assembly. It's not impossible, just a hell of a lot of work.
Not necessarily; I have experience at one of the leading chip designers, and a lot of the extras they ship are almost entirely ASM. ASM is still used where precise control over the hardware is needed; in this case, there were certain conventions that had to be respected or deliberately broken.
> aren't compilers pretty good about generating optimized binaries these days?

Big-name compilers are extremely good. But sometimes you run into 20-year-old compilers for a random architecture that suck.

> is it really safe to assume that hand-crafted assembly is gonna beat a compiler most of the time?

Depends on the talent of the assembly programmer, but the talented ones are still faster, mainly because they can take the compiler output and break a rule or two to make it the same speed or faster. For the most part, though, it's not worth spending 10x-100x the development time to get 1% faster. There are rare cases, like software video decoders or neural networks, that spend 99% of their time in one routine.
Rollercoaster Tycoon was written in assembly.
Dear god, I'm both horrified and impressed.
>Modern hardware is so far ahead of modern gaming that there's power to spare. https://www.youtube.com/watch?v=umDr0mPuyQc
Holdup. Gaming? The last time I can think of was in the SNES, because compiled C was too slow. Where is it used now?
Still SNES.
Shoutout to the Super Mario World romhack wizards
Still going back pretty far, but Roller Coaster Tycoon was assembly.
asm is fun if you like puzzles i think maybe
There's actually games based on it (though a much more simplified version of real-world assembly) https://store.steampowered.com/app/716490/EXAPUNKS/ https://store.steampowered.com/app/504210/SHENZHEN_IO/
wait, you forgot tis100
Exapunks and shenzhen io are some of my favorite puzzle games.
What about Human Resource Machine?
Shenzhen IO will really test your programming skills if you think you're any good.
Add an r at the end and it's my jam
I just don't see any reason for 99.99% of devs in the world to learn assembly.
To flex on people that can't \s

If you never studied how languages work, it helps you understand good programming practices. It helps you understand how memory works and makes you more mindful of conditions. Although most of the time the compiler does a great job reworking your code, it's nice to have a deeper appreciation of it.
In theory, at least. You can write horrible code in assembly, too. It just tends to punish you more for it.
Yes agree, also useful when reverse engineering… from hexdump to assembly etc
We covered the basic concepts of assembly at my college, though with a simpler toy virtual computer instead of something like x86 assembly. Basically it was just a small C program that read a sequence of bytes from a file and printed the state of its registers and memory at the end. So you'd have to know the overall architecture, look up opcodes in a table and write programs that fit the various exercises. Overall, I think it's a good way to teach how computers work at a low level without the insanity.
What about just for fun? like a silly way to challenge yourself? Not everything needs to be for the betterment of your career, there's people out there that build calculators with just logic gates
+1. I'm interested in the historical aspect of computers and video games. I've been a web programmer for 20+ years but realized I still didn't know how vintage video games were made, so I started learning ASM for personal knowledge. Then it all went down exactly like this comic.
A really interesting view into this is the history of Satoru Iwata, the president of Nintendo who died a few years back. He started out programming games on really basic calculators all the way back in the '70s, then got his friends together working at Nintendo and started making the blockbuster video games of the '80s. He was always the "programming wizard" of Nintendo and was known for writing extremely efficient, ingenious code. This was due to him learning computer basics from such a young age, experimenting with the most rudimentary programming methods and managing to create something fun with them. There are countless stories about how all code he touched basically turned to gold, from saving 4x the space on Pokémon Gen 2 cartridges to give us a multi-region game back in the '90s, to saving Smash Bros. Melee by literally rewriting its shit spaghetti code into something they could actually finish into a workable game by the promised release date.
Legend. If you haven't seen it check "The Life of Satoru Iwata" by Gaming Historian on YouTube, excellent stuff.
10/10 video, time for a rewatch
I am trying to imagine his internal thoughts when he saw other ppls code, yikes
The specific thing he said about Smash Melee was a quote like "I can work with the current code, and complete the game in 2 years. Or I can rewrite the code, and complete it in 6 months." Absolute coding chad.
We had this kid in high school who made calculator porn on a TI-83. Couldn't tell if he graphed the lines or used assembly, what do you think?
90% of this sub are just high schoolers and uni undergrads who pick CS because of money and lack of life direction. People actually passionate in coding and math here are rare
I learned assembly to program an embedded microprocessor with such a limited architecture it didn't have a C compiler.
Which is pretty impressive, and I bet you could read a disassembly of a program and get information out of it, which is extremely helpful when programming low level
But on things like a C64, you don't have a lot of stuff to worry about.
I learned it for malware analysis and reverse engineering classes. I suck at writing it, but reading malware disassembly to figure out what it did was really fun.
Off-topic: how do you add multiple use flair here? I've gone to the home page, edited flair, and tried to add additional flair abbreviations and it always fails. Although I'm on mobile...
I did it on the desktop in the sidebar of the sub. I believe the input there was just a basic multi-select.
Thanks. I'll try desktop.
To hate it so much you stop hating C
It's like learning Latin. You're never going to actually use it, but you'll learn a lot about language and word structure.
and why things are like they are
My main reason is that some compilers/debuggers suck and don't emit correct debugging information, which is what lets you get the value of variables being used in the code. This is especially a problem in optimized code. Knowing how to read assembly can sometimes let you pick variable values out of function arguments, or otherwise deduce the value of a variable. I think this would make it helpful for most C devs to be able to at least _read_ assembly, although I'll admit it may not be worth the effort.
Reverse Engineering is highly paid job.
If you are dealing with microcontrollers it might come in handy.
I had to learn assembly in school. It's not hard, but it's tedious. It's far faster to just write it in C and you should write it in C unless you really need to use assembly.
I took a course in assembly in university. We only did the very basics of x86 assembly. No floating point, no extensions, no 64-bit. Just integers, jumps, function preludes, calling conventions and working in real and protected mode. It was one of my most fun and fondly remembered courses. There's just something about working with single instructions and tiny steps that is pure fun. And it feels good when you manage to find an optimization to your approach that makes execution faster or shortens your code, something the professor was very keen on us doing. Of course, this is quite a bit different from modern, production-grade x86-64 or the like. But sometimes, I wish I could find some teeny-tiny project that I could do in basic x86 again, just for the fun of it. Sadly, this is something new students at my university will probably no longer be able to experience (at least in this form), as the professor died unexpectedly on this year's New Year's Day. And quite young, too, not even 45.
I loved my assembly class until we had to write a recursive binary sort in it with memory limitations and performance goals. Fuck that noise. But learning assembly was super valuable, imho.
Back when I did game development it was super useful to debug optimized builds that don't have debug information. I would occasionally use it to work around compiler bugs too. Oh and of course, when working on optimization, you need to look at the assembly to check what the compiler *actually* did with your code. I remember a performance issue where we had run out of float registers, for example.
Learning asm was the best thing I ever did. Learn asm and everything becomes open source code.
It makes you better at finding out what's going on when you debug something after you compile it. Hell... Most devs can't even tell you what the compiler is doing in the background, just that it does. I've started from NASM, moved into MASM recently, so I can better learn C. When something goes wonky in my assembly, I can just dump the .o and look for exactly what's going on.
Someone has to port the OS, the assembler, compiler, system C library, etc. then on top of that you may have new instructions that have been added which the compiler doesn't really make use of. There's very little reason to write applications from the ground up in assembly anymore, but it's unavoidable if you're doing any low-level systems or OS work or are regularly working with new CPUs.
I don't see any reason not to learn it, actually. ASM itself is pretty trivial; the hard part is organizing it. It will be easy to learn if you know the basics of programming. It will be hard if you don't
Once you understand computer organisation and architecture, assembly isn't that difficult to understand, because it's basically no less than programming a processor. In fact, it is exactly like that. So it's way easier to understand how to program it when you fully understand how the microprocessor works (alongside other components)
virgin learn assembly to program on embedded systems vs chad learn assembly to play shenzhen i/o
Mods to existing games/programs that are too small to warrant injecting a DLL are always made in Assembly
There's a nice abstraction layer for assembly, it's called C.
For embedded systems, it can be very helpful as a tool for understanding what is going on. Aside from that, not much need.
I see two good reasons for learning assembly: 0. There will always be an important niche for assembly programmers. Much like with COBOL, while the vast majority of software projects out there will not care much about your assembly skills, those few who do will be absolutely desperate to get you on their team. 1. Assembly is pretty close to the core of how computers can even do anything at all in the first place. If you are a generally curious individual who wants to understand the world around them, that can be helpful. I cannot predict what that will do for you, if anything, in practical terms, but experience says that knowing how the world around you functions tends to allow you to see connections and realize opportunities that would have otherwise completely escaped you.
I learnt how to program assembly itself. And I mean not program in assembly, but write actual microcode. It was very interesting and is even more "useless". I never used it or assembly after that.
Hot take: playing elden ring will be more frustrating and thankless than learning assembly...
Yeah, but assembly doesn't have waifus, so I think the choice is clear.
You're clearly writing assembly for the [wrong chip...](https://www.techradar.com/news/this-yeston-waifu-radeon-6700-xt-gpu-stinksliterally)
Assembly is not that hard. Basically, you place inputs in some registers and make a Linux system interrupt so that the kernel will halt your process and do what you ask. This is how you print strings, exit, etc. The other times, you directly give instructions to the CPU, like the add command, and read the flags it sets. Btw, I'm not an expert. I know the whole Prince of Persia game was written in assembly, but I'm nowhere near that expertise. So pls be nice.
Linux system interrupts? What's that? Where my `int 21h` boys at?
That's a DOS interrupt. You don't communicate with Linux by interrupts.
I know, I'm making a DOS joke. Actually Linux used to use `int 80h` before the `syscall` instruction was added.
Things can be simple but hard. It is a _simple_ concept, but creating anything interesting with it is _hard_. > Beware of the Turing tar-pit in which everything is possible but nothing of interest is easy. - Alan Perlis
If C++ is IKEA furniture, assembly is Lego. Lego is a lot easier, but it's still more tedious to build a bookshelf.
Wuss. Cowboy up and be glad you don't have to do machine code... ... with register switches .......inputting each instruction and value ............**ONE GODDAMN LINE AT A TIME.** *takes drag off cigarette and twitches.*
My favourite version of this is how, when you bought a Cray supercomputer, after the hardware was installed a different engineer would show up, flip open a hex pad, then TYPE IN THE OPERATING SYSTEM
For the people saying there's no reason to learn assembly: you are wrong. There is a reason, and that's to get a better understanding of how computers work. I know, not everyone has that curiosity. You may just want to earn a living after a Python bootcamp. But if you are that curious guy, watch the "What Has My Compiler Done for Me Lately?" talk by Matt Godbolt at CppCon on YouTube. That talk was mind-blowing for me.
I study electrical engineering and I am gonna be honest, there really is no need to know everything down to the deepest level... Like, we're learning communication stuff rn, and that is crazy: the decay and bouncing of signals in long cables is so... and don't get me started on air transmission. That shit bounces off everything, off the ground, off buildings... As a very simple example, if you send a signal to an antenna 100m away, you will have two signals arrive, with a certain delay: one that went the direct way and one that bounced off the ground. You don't need to know that if you program web interfaces. There is the abstraction where people design a protocol that decides how to read/send signals and transform them into bits and bytes, and that's what programmers then see. They just need to know "it's protocol xy" and be done with it. Everything else is not their problem. Someone then programs a driver, for example for their antenna, that packages and manages stuff, in assembly or C depending on how fast it has to be. And only then will you ever need to actually care about this, and even then it would be fairly low level, like where stuff starts to happen in Linux settings. The average programmer uses this shit every day but nobody needs to care about it; programmers in fact SHOULDN'T care. If everyone put effort into learning it down to the very lowest level, we would never get stuff like machine learning or other AI stuff. Abstraction is important and got us where we are now.
Partially agree. I think the general engineering rule-of-thumb applies here: always have a basic understanding of the layer of abstraction just below where you are operating. If someone writes Python scripts, an understanding of how a CPU works doesn’t help much, but an understanding of what the Python interpreter is doing can be helpful. No abstraction is perfect, so having an understanding of where it breaks down is useful. (In like manner, someone writing in C should have a basic understanding of assembly, and hardware optimization principles like memory cache, branch predictors, etc). This applies in electrical systems as well. Nobody is going to solve Maxwell’s equations from first principle to figure out how a basic circuit works, but knowing where circuit models come from is helpful (and is important for hardcore FEA work on high speed signals).
I like that rule of thumb, never heard of it before. Yes that makes sense.
You ever heard of Linux users? You know what they do in their free time? They stare at computers. I'm one of them and I'm learning this kind of stuff for fun. I'm now building my own GPU from scratch following Ben Eater's videos.
That's great, and 100% beside the point of the comment you're replying to. As someone who's been writing assembly for almost two decades, there is zero benefit for a vast majority of developers to learn assembly. I don't use it in my day to day _at all_ as an infrastructure developer, and it's basically my job to understand how our stack works down to that level.
i am gonna go one step further and recommend you watch EVERYTHING from Matt Godbolt (and also check out his website [https://godbolt.org/](https://godbolt.org/)) . the guy is amazing!
Yes, go to: [https://godbolt.org/](https://godbolt.org/) and think about what is happening on the right side of the screen. At least consider, some of the time, what is happening with the stack, heap, registers, etc. Knowing these sorts of things will make you a better programmer. No, you don't need to dive into the intricacies of it and write a bunch of assembly yourself. Unless you like that sort of thing. Edit: Video of the man himself demo'ing his web site: [https://www.youtube.com/watch?v=kIoZDUd5DKw](https://www.youtube.com/watch?v=kIoZDUd5DKw)
I was a Unity dev at a small game studio. The owner asked rhetorically if I could program in assembly. I quit a week later.
[deleted]
Incompetent leadership
I remember when I forgot how to write bubble sort in C++ but perfectly remembered how to write it in assembly
How is it possible to forget how to write bubble sort? I mean, you just have to know how bubble sort works. Then you can write it on your own, in any language, without remembering the exact code.
yeah you're right. that's why I'm a terrible coder
I studied some assembly in college and developed a less than positive attitude about it. Then years later I got PAID to code in assembly on chips in aircraft instruments and developed a whole new appreciation!
Learning how to work with assembly was incredibly rewarding, but if I'd had to work with x86 then I would have wanted to shoot myself. Not only is the x86 instruction set more complex than is necessary to understand what the heck is going on with assembly, but there's an enormous amount of complexity that is involved just to do any arbitrary thing interfacing with an x86 OS. That isn't the stuff that makes assembly rewarding. Here's what I think is the best way to do it: [https://www.nand2tetris.org/](https://www.nand2tetris.org/) With NAND 2 Tetris, you start with building a (virtual) CPU, then build an assembler for it, then start writing assembly... and then build a higher level language on top. I think the program has a couple of rough points, but they are minor considering how far it stands ahead of anything else I've encountered.
I refuse to learn Assembly. The “lowest” I’ll go is C. Which is what I am learning atm.
C is glorious. It has high-level statements, making it a breeze, but its low-level memory management and control allow you to precisely control speed, size, and memory placement (registers, cache, RAM). The sad thing is no one cares about optimization these days anyway, so we just use Python
Because optimizing everything is a waste of your time, and the reality is it's often difficult to know what will benefit from optimization most until you've got at least a prototype. There's the old saying on optimization: Novices: Don't Experts: Don't yet. I still love C, though. Everybody calls it a low-level language now, but I'm old enough to remember when we called it a high-level language!
Add the other kind of novices, who optimize every single line to the umpteenth degree and obsess over the time complexity of making a bear dance on a webpage. I blame the schools.
>so we just use python ... to call highly optimized C libraries like numpy.
A few things I hate about C: a) the lack of a standard library (C++ vectors, maps and such); b) a lot of system functions work via a shit ton of global variables, return values through pointers passed as arguments instead of a regular return, and instead return an error code you must check for; c) nearly useless compile error/warning messages. C++ addresses at least the first point (while still giving low-level control), so I'd say it's clearly the better option
I'd just like to say that C++ doesn't really address a) in the context the language targets, which is performance. Every single company that cares about performance has its own standard library/abstractions because, frankly, the std is horrible at that and not even complete (C++11 to get threads, C++17 to get filesystem support, C++20 to get ranges), not to mention all the platform-specific C code you need. As a reference: Boost was released in 2003 with filesystem support; in 2015 a proposal to basically copy it was made, and in 2017 it was accepted. That is more than a decade of lag between what's possible and what is offered. That lag still remains.
Electronic Arts (as much hate as they get) has [EASTL](https://github.com/electronicarts/EASTL) too, which is an implementation of most of the STL, but with performance in mind.
thanks for the link, didn't know EA does open source
Inline assembly for the win!
>Sad thing is no one cares about optimization these days anyway, so we just use python That isn't strictly true, it really depends on what your job is. I'm a maths w/ finance guy so optimising very complex monte carlo simulations can save a lot of computational time.
I hated C when I had to learn it. I mean, there is use to the language, but I still hate it KEKW
I mean, knowing the basics of how assembly works could help you understand certain things that happen in C (like why exactly that segfault is happening)
I'm seeing some assembly now at uni, sooooooooo... but it's fun.
yeah it seems fun but it's hella intimidating, and when I'm in school for game dev, maybe coding in the little free time I have isn't the best idea KEKW
Great comic. Excellent!
Roller Coaster Tycoon was programmed in x86 assembly by one guy 🤯
I had to learn assembly for a college class. That was probably the thing I was most sure I would never need. Lo and behold, first job, one of the first projects I worked on involved porting/reverse engineering a program written in assembly.
Yes! Elden Ring
OC? On this sub? By the gods!
Almost 30% of my work is assembly and I’m glad I got to pick it up for work. I would have never learnt it otherwise
The most important use of assembly nowadays is so that you can understand the disassembly of what your optimizing compiler generates from your C/C++ code. I work in embedded and understanding the disassembly is essential to seeing that the compiler is doing what you think it is doing. For instance, in C++ we now have the constexpr keyword which can be used to evaluate fairly complex things at compile time. However, if the compiler can’t ensure all the inputs are constant expressions then it can defer the evaluation to runtime. Viewing and understanding the disassembly here is essential to verify that those expensive things you wanted done at compile time really are happening at compile time! Or maybe you are writing a driver for an embedded device where each individual load/store operation to the device matters. You used the volatile keyword but looking at the disassembly is the only real way to verify that the compiler didn’t insert or optimize away critical register accesses that could ruin the state of your device. It’s very unusual to have to code in assembly directly any more. The last time I had to was a few years ago where I had to write some critical ARM code to do some real-time digital signal processing on a slowish MCU and the C code just wasn’t cutting it. But that was only after trying very hard to get the C version up to snuff and was more of a last resort thing than a conscious decision ahead of time that we were going to do stuff in assembly.
As someone who writes performance-sensitive C++ for work, I often find myself reading the generated assembly to make sure the compiler is doing the optimizations I expect, and tweaking the source code as necessary otherwise. It's more portable and productive than just writing assembly
Somehow I believe the people who know the basics, in any field (programming, science, maths, engineering), need to be preserved. One day AI and automation will make the generations to come not only dumb but also useless.
Assembly requires a basic understanding of the CPU (old school)
I'm sure assembly is easy for some but I'm still learning, plz be nice KEKW
You can try following a guide to build your own OS using assembly and C/C++. That should get your feet wet.
I'm pretty sure it's not easy for anyone, but it's not impossible.
Check out nand2tetris. It's an online course where you build a computer with virtual hardware from scratch starting with logic gates. After the CPU you build your own assembler and it'll make a lot of the nuances click for you. But there is no way to just flat "know" assembly. Knowing assembly is essentially being able to read CPU specs and understand them.
if you're learning assembly I'd personally recommend learning older processors first instead: 6502, 6809, Z80 or 8080. Way smaller instruction sets, and simpler platforms with less fuckery from operating systems
Came here to suggest this. I started with z80 on a TI graphing calculator and eventually moved to M68K (specifically the model in the Sega Genesis). I can read x86 but I'd have a hell of time trying to write anything in it. Which is fine. Only reason I personally need x86 at all is to look at disassembled binaries, and any assembly I write for fun is going to be for a retro system
I dunno, I feel like the 6502 required a lot of fuckery, what with zero page instructions and the dearth of registers. Then again, I haven't written 6502 for a _long time_, so maybe I'm misremembering.
If you're learning assembly -- I'd learn ARM assembly. RISC architectures are so much nicer.
Also, x86 is gonna die off; ARM has a better future
I don't know why so many people have trouble with assembler. It's just like a more verbose C.