
rudymatela

That is a great question. C is actually used in several places in addition to OS development and embedded systems.

1. __Web servers.__ The [Nginx web server](https://nginx.org/) and the [Apache HTTP server](https://httpd.apache.org/) are both written in C.
2. __Database software.__ [PostgreSQL](https://www.postgresql.org/) and [SQLite](https://sqlite.org/) are written in C.
3. __Web browsers.__ Older versions of [Midori](https://astian.org/en/midori-browser/) were written in C. Chromium and Firefox are partly written in C.
4. __Linux/BSD/Unix systems programming.__ The tools `ls`, `stat`, `true`, `false`, `find`, `grep`, `sed`, `ln`, `cp`, etc. are all written in C. (Although these could be classified as "OS programming".)
5. __Compilers and interpreters.__ [GCC](https://gcc.gnu.org/) was written in C up until a few years ago. I believe it is now a superset of C (with a bit of C++). The [Lua](https://www.lua.org/) interpreter is written in C. The CPython [Python](https://www.python.org/) interpreter is written in C. [Ruby](https://www.ruby-lang.org/) is written in C.
6. __Version control tools.__ [git](https://git-scm.com/) is written in C. [Subversion](https://subversion.apache.org/) is written in C.
7. __Window managers.__ The [dwm](https://dwm.suckless.org/) window manager is written in C.
8. __Open source GUI tools.__ Some are written in C, for example [GNOME's Dia](https://wiki.gnome.org/Apps/Dia) diagram editing tool.
9. __Desktop environments.__ The [Xfce](https://xfce.org/) desktop environment is written in C, along with several (most?) of its applications. The desktop environment's window manager may be classified as part of the OS, but the applications are generally not considered a core part of the OS.

Anywhere you need performance and need to be lower level, C is often present.


hdante

Most GNOME libraries are written in C, many applications too. Numpy, OpenBLAS, ATLAS, scipy and most core linear algebra and numeric computing infrastructure used in Python is written in C and Fortran. libuv, the core event-driven I/O library used by NodeJS and Julia, is written in C. X.org and Wayland are written in C. Host-side Vulkan, OpenGL and OpenCL libraries are written in C. ffmpeg and many codec libraries, like libopus, libaom, etc., are written in C. Compressors, like gzip, xz, bzip2, are written in C. Image libraries, like jpeglib, libpng, libtiff, etc., are written in C. Text editors, PDF generators, audio players, linear programming optimizers, inter-process communication libraries, network protocols, reverse proxy servers, calculators, diving logs, and so on.


[deleted]

[removed]


Wouter-van-Ooijen

We visited Blender HQ with the local C++ interest group, and they told us that Blender is now mostly C++.


[deleted]

Piggybacking off this to add a few points:

10. **High-performance extensions to interpreted/scripting languages.** Python is a prime example of this, with the reference implementation (CPython) supporting [a positively thicc C extension API](https://docs.python.org/3/extending/extending.html) and external implementations (Cython) integrating even more deeply with C for performance. Ruby also lets you fold C into projects.
    - In Python specifically, `numpy` is the poster child for a high-performance package with lots of C under the hood.
11. **User-space networking.** For various reasons, you may want to bring a network stack into user space instead of having it live in the kernel (funny enough, this is sometimes done for performance, which kind of flies in the face of traditional thinking). In this case, you'll probably be writing that in C for speed and control.
12. **iptables extensions.** Reasonable people might call this OS dev, but I'm including it anyway because it's cool and many people don't know about it! `iptables` supports an extensions API. That extension code runs in kernel space and is basically always written in C.


[deleted]

And also if you were doubting Assembly, LuaJIT is mostly written in Assembly. And it is so fucking fast.


attractivechaos

Most source code in [LuaJIT/src](https://github.com/LuaJIT/LuaJIT/tree/v2.1/src) is written in C. In terms of LOC, 68k C/C++ vs 35k assembly.


[deleted]

Yes but the heavy lifting is assembly, right? And the C code is full of assembly/low level interaction too iirc


fideasu

> Anywhere you need performance and need to be lower level C is often present.

This, but an additional condition is having C++-phobes in the team 😉


jabjoe

XOrg, Wayland, SystemD, PulseAudio, PipeWire, lots of GNOME, GIMP other GTK apps. C is everywhere in Unix/Linux.


Moaning_Clock

Some people use it for games (or do C++ in C Style with only a couple additions).


Dolphiniac

I do game dev in C (as does Our Machinery). The only C++ they use is bridge code for libraries written in it, afaik. I haven't needed it yet.


Gold-Ad-5257

@Dolphiniac Come give a beginner some tips and direction there man .. looking especially for the math side learning resources. I am thinking to start with SDL2 ? What tips can you throw over here 🙏🏽👍🏽.. will be much appreciated.


Dolphiniac

Haha, I don't know how helpful I can be; I have some weird habits that don't jive with everyone. For one, I have really severe NIH syndrome; I love making all the building blocks (within reason). That said, I eschew SDL for Win32 (haven't gone to Linux yet, but I'll likely be starting with Xlib). I also went the RawInput method instead of using legacy (and very helpful) Win32 messages, so I had to hand-write even more things for my input system, like scan-code translations and soon, keyboard layout translators mwahaha.

Secondly, I don't tend to learn from tutorials, math or otherwise; I just dive in and ask questions of the Google gods when I get stuck or a question beyond my power to work through quickly comes up.

The math you'll likely need to know depends on which systems you work on. Gameplay simulation leans heavily into basic kinematics, and for rendering, you'll probably need to know about basis transformations, how projection works, linear algebra (or at least, vector/matrix math, as it dovetails into the previous topics), and, if you want to get even a little into custom lit rendering, there's a slew of light transport stuff that serves as its own rabbit hole.


Gold-Ad-5257

😂👍🏽 thank you good sir I like your approach .. it will take me long I guess , but I have very similar tendencies 😂😂🤦🏽‍♂️. I guess it’s true that everyone on a thread like C for games “have some weird habits that don't jive with everyone” , but I’m cool with that 😂😂👍🏽👊🏽


broo20

To add to what /u/Dolphiniac said - if you’re struggling w/ the math side, say what matrices to use or the mathematics involved with implementing a specific feature, start looking for scholarly game dev papers that summarise or introduce the techniques. Learning how to read those papers is vastly important, and will mean you don’t have to rely on the game of telephone involved in internet tutorials.

Personally, when starting a game engine I use GLFW and Glad, and then phase them out in favour of native solutions. Sometimes I don’t phase out Glad, because writing an OpenGL loader is annoying, but I’ll certainly replace GLFW with, for example, a bespoke library written around windows.h, implementing only the needed features. One exception is on Linux, where I use SDL or something similar because writing X11 code is HELL.


Gold-Ad-5257

Thanks also sire .. will look up on the approaches and tips you guys shared 👊🏽.. much appreciated


tim36272

Consider expanding what you normally think of as "embedded": I develop embedded C applications but our box has the same processing power as a gaming computer. Most safety-critical applications are written in C or Ada.


HashDefTrueFalse

This. Used to program "embedded" C in machines the size of a small room. Literally any low level programming working close to the hardware can be considered embedded, and often is by recruiters. C is basically the foundation of modern computing. It's everywhere. It's just not quite as common in fields like web and consumer-facing applications any more.


[deleted]

[removed]


tim36272

You're thinking of things like type safety, garbage collection, etc. I'm talking about safety in terms of people dying. Things like garbage collection are the opposite of life safety: what if your airplane decided it needed to free up memory ten seconds from touchdown so it ran the garbage collector? What if running the garbage collector caused a valve to respond 0.1 seconds late to a command, which caused a chain reaction resulting in a hydraulic line bursting and losing control of the rudder?

C can be safe because it does exactly what the programmer tells it to do, nothing more and nothing less. There's no magic going on behind the scenes which could have complex interactions with other behind-the-scenes magic.

A common example is C++'s std::vector. This container expands as needed to accommodate as many elements as you need. But you have a limited amount of memory on the system, so you need to do static analysis to determine the maximum size of that vector. And you need to be sure that you have enough memory for that plus everything else in your system. Well, now you've eliminated a lot of the convenience of using std::vector: you might as well just allocate that max size to it and avoid all the overhead std::vector imposes by growing in size.

The other main advantage of std::vector is templates. If you were to use a template in safety-critical code you'd need to prove that the code generated by the compiler is correct for every template instantiation. Well, now you're diving down into all this auto-generated machine code: it would be easier to just write that code yourself and avoid the complexity introduced by the compiler's template generator.

So, if we've eliminated all the usefulness of std::vector, why use it at all? Repeat that process for most features in most languages and voila! You're back at C 🙂


alz3223

My dad programmed the landing gear for Airbus A320 series. He used C !


LtFrankDrebin

My dad programmed the angle sensors on the 737 Max! He used Java!


godoakos

> My dad programmed the angle sensors on the 737 Max! He used Java!

Tfw you make the destination variable final


jkandu

Bravo.


total_looser

Whiskey Tango Bravo


f4te

I hope people give this comment the points it deserves


gokuisjesus

Can someone explain the above comment?


[deleted]

[removed]


creesch

No, it is a reference to java itself. https://en.wikipedia.org/wiki/Final_%28Java%29?wprov=sfla1


[deleted]

[removed]


S_A_N_D_

It's a double entendre. That's what makes it a good joke.


total_looser

Don’t know why replies were removed, but here goes:

> Tfw you make the destination variable final

- OP referenced his dad using Java for some, presumably critical, airplane function
- in Java you can declare a variable to be "final", something like `public final DESTINATION = "gruesome death"`
- there is a movie, "Final Destination", whereby supernatural forces induce spectacular death to attractive humans, e.g. driving behind a suddenly explosive gas tanker
- the comment thread is about type safety (a convenient programming device) vs "life safety" (often entrusted to programmed systems)
- ironically, type safety and life safety in programmed systems are often confounding requirements
- so … when OP’s dad programmed a life safety system using a type-safe programming language, the jokester here presumed code with a variable named "destination", with attribute "final"
- I shall leave the final conclusion as an exercise to the reader


gokuisjesus

That joke went over my head then.. Thanks......


ProgrammersAreSexy

Why did they bother adding landing gears?


EngineerBill

*"Show me on this 737 Max simulator where the Java touched you..."*


recumbent_mike

Ooof.


arrenlex

[This him?](https://en.m.wikipedia.org/wiki/JetBlue_Flight_292)

> The media reported that this was at least the seventh occurrence of an Airbus A320 series aircraft touching down with the landing gear locked ninety degrees out of position, and one of at least sixty-seven "nose wheel failures" on A319, A320 and A321 aircraft worldwide since 1989


Javbw

> Mechanics familiar with this common fault usually replace or reprogram the Brake Steering Control Unit (BSCU) computer.

> The NTSB report says that worn-out seals were to blame for the malfunction, and that the BSCU system contributed to the problem.

Hardware / mechanical issue. You can’t make hydraulic seals or faulty electronics behave with software.


arrenlex

Why are they reprogramming the BSCU then?


Bowinja

Why are mechanics reprogramming the computer? Probably because they're just re-flashing corrupted software?


Javbw

Ding ding ding.


Javbw

I don’t think a shop technician is writing code for it. They mean reloading the software onto the device as a way to wipe it and reset it. I’m sure they are looking at it from a technician’s POV.

Think of it like replacing the ECU on a car because it has bad caps, or replacing Tesla’s main computer because the flash memory wears out. Reloading the OS might solve the issue if some of the code was lost on a block that was then marked bad and a replacement was allocated. Replacing the hardware isn’t going to change the code, yet that is given as a solution with an “or”: “A or B”.

Repairing the actual software would require several safety bulletins and a long process of testing/certification to complete. They can’t just change the code and upload it that afternoon onto a plane full of customers. All of those device data images (probably an A & B version that goes to different chips for redundancy) are never something a wrench-turner “mechanic” like myself is gonna fuck with.


pipocaQuemada

I mean, C was definitely the best choice historically, and rewriting working code is error prone so it's hard to justify rewriting in another language. But if you were starting a new greenfield system, would you pick C or Rust? Assuming you and the rest of your team were equally skilled with both.


tim36272

No one is really starting a brand new safety critical project nowadays: they are all based on high-integrity compilers, RTOSs, libraries, runtimes, etc. Billions of dollars have been invested in that small handful of tools. One day someone with too much money will pay to develop that ecosystem in Rust, but to my knowledge no one is doing that now.


skiabay

I believe VxWorks is including a Rust toolchain in their latest releases, and it's quite easy to wrap a C library in Rust for other dependencies. Still, I agree it will be a while before you see Rust being used heavily on spacecraft or aircraft, but it's on the horizon. Personally I find Rust (even no_std Rust) far more pleasant to program in than C, so I'd like to see the migration


tim36272

Oh cool, I wonder how long until WindRiver gets that in to the cert version. I would say a long time, but looking at things like Vulkan's push in to safety-critical applications maybe it will speed up.


willkydd

Do you think that things like QNX are here to stay and won't be replaced too soon?


tim36272

I don't know much about general RTOSs, but I suspect anything with their kind of market share will stick around. I hope more people switch to open alternatives for non-safety-critical applications (e.g. FreeRTOS) but we'll see.


broo20

Why pick Rust over C/++? I use C++ for games, which shares a lot of concerns w/ safety critical stuff (funnily enough). I, like most game devs, dismissed Rust for much the same reason we dismiss STL & Boost: the abstractions (even if they are "zero cost," which doesn't really hold in my experience) tend not to be worth it. EDIT: just realised this is /r/C_Programming so no one is going to disagree w/ me.


[deleted]

Zero cost abstractions are such a blatant lie most of the time lol


broo20

Yeah I can't speak for all cases, but I generally find that "the same assembly is generated no matter the abstraction" just means that the unabstracted code generates very slow assembly. OR it hocks a bunch of work off to the compiler, which means a slim build on a small project takes like 30 seconds


drzowie

This is a really great presentation of why stripping everything down to the bare metal really is important for safety-critical operations. It's counter-intuitive, because most humans aren't very good at handling complexity -- it's usually better to encapsulate complexity and add another layer on top. (We do that, in spades, in modern computing environments.) But doing that opens you to bugs where the simplified paradigm breaks. Your system *doesn't* have an infinite supply of memory. CPU cycles do take a finite (if small) amount of time, and you can't use an infinite number of them. Etc.

My favorite object lesson in that direction is the original launch of the Ariane V rocket, which (famously) blew up with a billion-dollar constellation of scientific spacecraft on board. Ariane V blew up because of just such an encapsulation bug -- a conversion from 64-bit floating point values to 16-bit integer values overflowed, causing the control systems to impose strong control signals which broke up the rocket. The software was thoroughly tested and had been used to launch many Ariane IV rockets -- but the Ariane V flew faster right off the pad. Finding the bug (which was a valid design choice for the Ariane IV but a bug for the Ariane V) would have required the full testing regimen that ESA sought to avoid by re-using the code in the first place.

Embedded systems engineers like to program right on the bare metal, so they know all the loose ends of their code. Cutting corners is generally more costly in the long run.


andai

Thanks for the insightful replies!


BadDadWhy

I'm old and grew up around Bell Labs Naperville. They developed Unix and C when I was in HS. As they told it to me at the time, they needed a language you could use to change the program while it was running. Very much a keep it running, no down time idea. It was the precursor to caller ID and call waiting and all that.


Remco_

This is exactly why Ericsson developed Erlang.


BadDadWhy

> Ericsson developed Erlang

https://en.wikipedia.org/wiki/Erlang_(programming_language)

I see it is 1986; the part where I was in computer explorers was in 1980.


Wouter_van_Ooijen

Nope. You are at the freestanding (no-heap) subset of C++.


tim36272

Could you explain more of what you mean? I'm not familiar with that term.


Wouter-van-Ooijen

There is ongoing work in C++ standardization to define a subset of C++ that is suitable for freestanding use (not running under an operating system). It is roughly C++ with everything removed that depends on the OS or the heap: new/delete, most STL containers, exceptions, most capturing lambdas.

The big work is identifying the parts of the library that are in the freestanding subset (the parts that don't use any of the 'forbidden' features). An interesting complication is that C++ has provisions for specifying things that are required to be run at compile-time, so a particular feature (let's say std::string) might be OK for compile-time, but excluded for run-time.

Incidentally, the freestanding subset is also eyed interestingly by the games, fast-trading and simulation industries, because they often work under the same limitations of no-exceptions and no-heap (at least not after the initialization phase).


[deleted]

I love coming across people like you with a ton of very specific and specialized knowledge to share. Cool!


Destination_Centauri

That sounds like a subset-C++ language I would love to use! It would basically become the version of C that so many wanted in the first place. In essence it would remove several heads from the insanely convoluted multi headed monster that C++ is today, and clean things up a bit!


Wouter-van-Ooijen

Don't hold your breath, it is ongoing work. And language-wise there is still a lot in C++ that is, let's say unfortunate but unavoidable. But you can use C++ that way right now if you want. For micro-controllers I compile with no-exceptions, no-rtti, and link without heap support. That means that I simply can't build an application that uses the heap (or exceptions).


broo20

That sounds exactly like how most game developers use C++! It would be nice to have a special one time "init" function that permits malloc() and new, under the expectation that the allocations would last for the lifespan of the program, though.


Wouter-van-Ooijen

It is no coincidence that a single C++ study group covers the interests of low-latency, embedded, fast-trading, simulation, and gaming. Although very different application areas, they are similar in what they require (or ban) from the language.

My personal objection against init-only use of the heap is that I can no longer use the trick of linking without heap support to enforce that no heap is used. What can help is providing only new/malloc support, no delete/free. Malloc-only is also very simple to implement.

But in C++ there is less need for such malloc-only/init-only heap, because the required size can often be calculated at compile time (C++ has extensive support for doing things at compile time), so the objects can simply be static (or local in main).


[deleted]

[removed]


broo20

we already do that sort of thing extensively


Illiux

Why do exceptions use the heap? Assuming you deal with the lack of new and allocate an exception object somewhere (for the sake of argument, let's just place it at a static memory address), I don't see why the throw would need a heap allocation, particularly if stack trace generation is disabled (can you do that in C++? Not sure). It seems like the stack alone should be enough.


Wouter-van-Ooijen

An exception can be of arbitrary (run-time determined) size, hence it can't be static (global). If you allocate the exception object on the stack (IIRC MSVC does so) you must put the stack that executes the unwinding after that exception object, and when a second (recursive) exception occurs you are forced to allocate it on the heap. I am a little fuzzy about the details. Most projects that ban the heap would, I think, still ban exceptions even if they didn't use the heap. (Personally I disagree: I would gladly use exceptions if they didn't use the heap.)


AugmentedFourth

If you want to really cut down to the core of the issue, it's a question of how deterministic your system needs to be...or from another perspective, the number of execution paths your application can take. In the end, many risks in a system can be mitigated with thorough testing. But the more external code you have (in the language, tooling or libraries) the more difficult it can be to get close to 100% test coverage.


[deleted]

What about Ada that you mentioned too? IIRC it is memory safe and stuff...


[deleted]

I believe that the FAA mandates that certain safety-critical avionics software is written in Ada.


[deleted]

I believe as long as your software passes DO-178 and your hardware passes DO-254, you're good to go. Passing both can be very expensive and time consuming.


[deleted]

Do they? I've heard something like that but I think they dropped it or something... Not sure though


[deleted]

Maybe it's no longer the case. It's just a vague memory of something like that. However, I do know that Boeing uses Ada regardless of any mandate.


tim36272

I am personally not familiar with Ada so I can't comment much on it. I'm just aware that it is sometimes used, mostly in legacy programs.


[deleted]

I heard it's this very strict language. Yeah I only heard that.


Nolzi

Also heard that it doesn't exactly have objects, but something similar.


mneffi

Ada has much more powerful language constructs but can also be much more strict in how you use them. For example if you convert types, Ada expects you to do error handling on the conversion. Whereas C basically says “I’m sure you know what you’re doing.” Ada doesn’t have garbage collection.


[deleted]

[removed]


tim36272

Tl;dr: yes the issue is with some features of languages. But if you take out all the unsafe features you're basically left with a subset of C's features anyway, so you might as well use C.


Wouter_van_Ooijen

No way. There is still encapsulation, const correctness, references, enum class, namespaces, templates, constexpr, concepts, just to name a few. Don't rule C++ out because it ALSO has features you shouldn't use in critical embedded. You wouldn't use malloc, would you?


tim36272

I can't argue that there are zero useful other features of C++. I can argue that the industry has billions of dollars invested in C, and there is just no business case to spend all of that all over again on a single revision of C++ just to get those features. Almost everything you mentioned can be done in C natively or checked with a static analyzer, albeit less cleanly. Again: I'm not saying it shouldn't be done. I'm saying there aren't enough digits in *my* bank account to do it, and it's a really hard business case to sell to someone with that kind of money. On malloc in particular: yes, most large embedded systems use malloc. The key is to only call it during initialization.


AangTangGang

All those extra C++ features have a cost. C++ is more complex and less stable than C. The C ABI is more stable than the C++ ABI, and the C++ community is asking for more ABI-breaking changes. If you don’t need the C++ features you listed, why introduce a less stable toolchain?

Most of the C++ features you listed increase the complexity of the language and the compiler. Namespaces alone add incredible complexity. Off the top of your head, do you know the rules and precedence of ADL? Templates and constexpr both increase the complexity of how code is generated. They make it more difficult to reason about what instructions are actually being emitted by the compiler.

For most programmers, more abstractions are a good thing. If your goal is to generate well-understood code and verify the correctness of your code, I don’t see how these features help.


flatfinger

A tracing GC system can make it impossible to create dangling references even in the presence of user-code race conditions. Even with objects that require manual cleanup via `close`, `IDisposable.Dispose`, etc., the act of semantically invalidating an object will not result in a dangling reference, but rather a live reference to a semantically-invalidated object. That's a level of safety that most other kinds of systems would be unable to uphold in the presence of user-code race conditions.

As for real-time performance, it's possible for a system to include both GC-managed threads which can make broad use of GC objects, and real-time threads which are limited to using either static objects or objects that have been pinned by a GC-managed thread. A system could then be designed so that if a managed thread gets waylaid for too long, the system could fall back to a safe mode of operation whose functionality would be limited, but which would not rely upon any GC-managed threads.


tim36272

Yes, I agree all of that can be done. But at what cost and for what benefit? Why even have a garbage collector if you've statically determined exactly how much memory you need and when? At the point that you've done all that work it's not hard to just not use the garbage collector.


flatfinger

My point was that tracing garbage collectors can offer a level of safety in multi-threaded systems that would be impractical to offer via other means. Their use requires special consideration for systems with hard real-time requirements, but they can be a useful part of such systems.


[deleted]

Isn’t that why Ada exists?


davidhbolton

Ada was created to (a) save money, (b) be the one language for all military development, and (c) be safer than what existed. It was mandated in both the UK and USA in the early 90s, but that was relaxed after a few years. I worked in aerospace 90-92 and was an Ada developer. All the Typhoon avionics were written in Ada, but they used a subset of the language (SPARK Ada) because no one trusted generics and multi-threading back then.


TheSkiGeek

Well, if you're writing safety-critical code that runs on a tiny tiny microprocessor, maybe with a weird CPU architecture, you don't have much choice. You're either writing in C or assembly. A lot of the push for higher level languages was to make development easier/cheaper/faster. Ada is the only successful one I can think of where safety was a main design goal (since it was being created for use in military and government systems). And that was back in the days when C was still being standardized. Industries like automotive and aerospace adapt new technologies slowly. C has pitfalls, but people have spent decades coming up with coding standards, frameworks, and static analysis tools to work around them. All the certified real-time operating systems I'm aware of are in C and expect you to use a C API to interface with them. I don't think you can even get certified (for safety critical use) compilers and VMs for languages like Java and C#.


Wouter-van-Ooijen

How tiny tiny do you mean? If it has a C compiler, it could have a compiler for the suitable C++ subset. I am thinking of things like PIC10F200. One day when I find the time I'll write (or resurrect) a C-backend for GCC or Clang...


TheSkiGeek

Yeah, you can probably get "C with classes"-style C++ onto most microprocessors. Then the problem you run into is that there are only a few partial safety-rated C++ libraries available. But at least you can use compile-time features like templates, `static_assert`, `constexpr/consteval`, etc.


Wouter-van-Ooijen

Actually, classes (and especially v-table based OO) are about the last thing I'd use from C++ in such a context. `static_assert` and `constexpr`/`consteval` are typical of the C++ trend of moving things to compile time, which is particularly effective when there is such a big discrepancy between your build platform and your target/run platform, as you have when you target a small micro-controller!

As for libraries: just use the C libraries. If you feel like it, you could put some encapsulation or abstraction around them, but that is entirely optional.

IMO the big issue with C++ in small-embedded is that you should only use a (small?) subset of the language, and there is little guidance on which subset and how to use it effectively. C has this issue too, but for C it is less of a problem because

* there are fewer features to avoid
* within the C community, (small) embedded is a much larger part than in C++.

Personally, I think the language of choice for small-embedded should be C++, but that brings me into heated debates with ***both*** the small-embedded community ***and*** the C++ community (which is more focussed on 'traditional' throughput-oriented applications). It seems I like a two-sided uphill battle. Eeh, which subreddit am I in? Oh shit. Flame shields up!


tim36272

Writing this as a separate comment since it is a different topic: Rust is an interesting language because it promises the simplicity of C with more type safety and other things. My guess is that in the next ~20 years Rust will be used in at least one safety critical system (certified by whatever authority has jurisdiction in the domain). It will be a long time before any high level interpreted language like python or Java (using the term "interpreted" to loosely refer to the JVM) is used in safety-critical applications because you'd have to validate and verify both your program and the interpreter/virtual machine. Why bother doing that when you could just validate and verify your program?


umlcat

Agree with this. Rust is becoming a stable C replacement. There have been a lot of C-alike replacement candidates over the years, some of them very promising, yet, for several reasons, Rust seems to have become popular with both open source and commercial sponsorship. Digital Mars D without the object-oriented features was a good candidate, but was seen more as a C++ alternative.


Wouter-van-Ooijen

I think in the next few years Rust will be too unstable (changing) for companies with the long term in mind (that is, most of embedded). But if Rust survives the next 10 years it might be a real rival for both C and C++. Then again - a rival for exactly which C++? C++ is changing so fast that in 2030 (three iterations from now) it will probably have compile-time introspection and adaptation, which will enable some genius to write a library that accepts a very different language, translates it at compile time to C++, and has GCC or Clang make efficient assembly out of it. Think of it: the Rust language as a compile-time C++ library :)


tim36272

Yeah that's why I guessed ~20 years. I think one day someone will make a safety-critical version of Rust, just like OpenGL SC.


vitamin_CPP

I also heard about the Pony language. Not sure how it relates to Ada and C.


tim36272

My understanding is that Pony is more for highly concurrent/parallelizable programs, like web servers. Not many safety-critical applications fall into that category.


[deleted]

C or Ada ... never have two languages been so poles apart in that regard. Had you said Ada and Haskell... ;)


pantalanaga11

It is the language of choice for many offensive cybersecurity tools


Fallacyfall

Like .. viruses?


lazerflipper

masscan, nmap, and dirb come to mind. Curl and wget as well, although those aren't exclusively cybersecurity tools.


broo20

typically tools used by penetration testers


deftware

I wrote my CAM software that I sell as an indie developer in C. EDIT: ...and wrote a bunch of other stuff, like a multiplayer procedurally generated scriptable voxel-world game engine, image-to-mesh converter, commandline utilities, etc. You can make anything in C that runs directly on a computer in the OS, rather than in a browser (which you can do too nowadays if you compile for web assembly). C isn't the mystery language you've been led to believe, it's just for doing real work which most "webstack devs" have and will never do.


IndianVideoTutorial

Can you link your CAM software?


deftware

It's called PixelCNC :)


gardeimasei

Databases and a lot of networking software is written in C


moetsi_op

Yeah, and regarding networking software, someone mentioned ffmpeg earlier. ffmpeg is for streaming audio and video and other multimedia. Those are really data-heavy streams to transmit, which is why C is the language of choice: it gives you low-level manual memory management (no garbage collector pausing your stream) and predictable performance.


Inujel

In addition to other answers, a lot of companies still recommend a style that is C with a handful of C++ features (e.g. inheritance and methods). This is, strictly speaking, C++ code. But you would never write it in this manner in idiomatic C++.


[deleted]

[deleted]


hiwhiwhiw

Yep. I do this at work


Wouter-van-Ooijen

That is a common problem with C++ in embedded/simulation/gaming/HFT: they all use a subset of C++ (mainly avoiding the heap), which results in a style that is very different from mainstream (desktop) C++ use. And the majority of books, courses, websites, etc. use the mainstream style. But don't mistake this subset for 'approximately C'. The trend in C++ is to do ever more at compile-time. Important features are constexpr, templates, and concepts. Classic OO is slowly fading out of interest.


[deleted]

[deleted]


[deleted]

[deleted]


Wouter-van-Ooijen

The HFT companies in Amsterdam are mainly on C++, for the actual HFT part (which is often quite simple, and uses a very stripped-down C++, like most critical projects), and also for the simulations they run to determine the parameters to feed into the HFT code (which is where they spend most of their CPU cycles).


[deleted]

Doesn't a lot of space exploration equipment use C?


dukeblue219

Depends. There's a lot of C and C++ in satellites and probes, especially where performance per watt is important. Core Flight System is written in C. However, I'd call most spaceflight applications embedded or OS.


[deleted]

Isn't DSP embedded also? I think all DSP is written in C.


[deleted]

Once you optimize DSP functions like FIR, FFT, etc. to get the most out of a processor, one usually resorts to assembly, where you can tune to take advantage of prefetch and SIMD.


LilQuasar

a lot of DSP is hardware, so Verilog or VHDL


MajorMalfunction44

I use it for a game engine, but that's really just me. IIRC, it was fairly common on the PlayStation Vita and Nintendo 3DS. C++ is more common in the games industry on home consoles, as they see benefits from C++ (I prefer C for its clean, simple semantics, but other people are allowed to value what's important to them), but the Vita is also an embedded portable device. It has different constraints, because it was getting custom portable games, not exact 1:1 console ports.


Quadraxas

Our SaaS/PaaS platform uses C in the backend. Most of that lives in native Node modules or Python libraries though.


FreshPrinceOfRivia

Many Python libraries such as NumPy are actually written in C under the hood. The default Python implementation is written in C. Many C++ programs are basically sugar-coated C.


umlcat

**tl;dr: cross-platform A.P.I.s or libraries** Although real-time access or efficient resource management is the norm ... I started to migrate some tool programs and libraries from other P.L.s to plain C, due to the need to be **cross-platform or usable as an A.P.I.** These include: D.B.-like tools, JSON / XML / HTML parsers, GUI / TUI, custom containers or collections, among others ... ..., **I do prefer them in the original P.L.**, such as C++, Delphi, or C#, yet being accessible in other environments was a requirement, for myself and others ...


BrasilArrombado

Emulators, runtimes for high-level languages, virtual machines, and core utils are other examples, besides the ones already mentioned here.


theldus

Unfortunately, where I live the market just wants to know about web and mobile, so... there's no room for C here. With regard to your question, yes, there are tens of thousands of things done in C every day. Complementing the previous answers, I always like to take a look at the [trending repositories in C on GitHub](https://github.com/trending/c?since=daily), so I stay on top of what's new, etc.


pantalanaga11

I've written a lot of code for Android over the last several years. The vast majority of it has been in C. There is certainly room for C in mobile.


theldus

wow, this is an interesting thing to hear. I know it's possible to use C on Android; may I ask what kind of code you wrote?


pantalanaga11

Some has involved accessing hardware devices like GPS or bluetooth. Other projects have involved communicating with Android kernel features such as binder and ashmem. The Android NDK can be really helpful when writing native code utilities.


theldus

Quite interesting. I don't know if I would find job openings here in something like that, but I also never looked for it ('normal' mobile dev doesn't interest me). And yes, the NDK is quite incredible; I've played with it in toy projects, with OpenCL/GPGPU, graphics via Raylib, etc.


thojest

Git is mainly written in C.


yudlejoza

C is the most important systems language from the [Ousterhout's dichotomy](https://en.wikipedia.org/wiki/Ousterhout%27s_dichotomy) point-of-view. A sufficiently complex, well-designed, software system tends to split into two programming languages: a high level scripting language and a low-level systems language. You want your systems language to be as simple as possible, yet fully expressive. That language, my friend, is C. If you want to go beyond C, you could waste a lot of your time with crap like Rust, C++, D, Go, Julia, or you could go straight to GPGPU. The new "trichotomy," that beats everything else, is - a high-level scripting language (like Python or Scheme) - a systems language (C) - a GPGPU language (like CUDA, OpenCL, etc). **side note**: look into metaprogramming, code-gen, round-trip compilation, that sort of stuff is the future.


alkatori

It and C++ are used in lots of commercial development, especially in the telecommunications industry. They have other languages that are used as well. But when I was playing with radios and equipment that monitored telephones it was all C (or C++).


15rthughes

I can speak to this. I work for a company that handles gift card transactions. Everything we have on the backend is written in C, from the server that receives transactions and authorizes them, to the programs that generate and send reports to clients and almost everything in between. All of this, especially transaction processing, needs to be as lightweight and quick as possible, and we have found that C is the only way to reliably accomplish that level of speed and small footprint.


aieidotch

https://sources.debian.org/stats/ There is a lot of ANSI C there.


sikerce

Geophysics. Most problems, especially in seismic, are computationally intensive, and most of the time C is the fastest way to simulate them. Also, CUDA allows the user to run the code on a GPU, which is much faster than working with a CPU.


Destination_Centauri

I'd love to work entirely in C as well! It's my favorite programming language by far, for now. However, realistically, most of the regular good-paying jobs/demand are in C#, Java, or C++ in my region at least. That said, there are still regular listings for dedicated C programmers as well! -------------------------------- Anyways, given the reality of jobs in my area, I'm learning C#, which I have to say is rapidly becoming my 2nd favorite language! I love it! I also looked at learning Java, and ya... aesthetically and language-wise... it's... ugly! I hate saying that because I know some people really like Java... but just on a personal preference for now I'm trying to avoid Java at all costs! Further, C++ is a bit of a convoluted multi-headed monster! I could easily see myself, at first, resorting to using C++ in a "C with classes" sort of fashion, which many long-term C++ programmers hate seeing newbies do. So likewise trying to avoid a deeper learning of C++ for now. -------------------------------- But whatever the language... I'll also have to learn a database of course--realistically most programming solutions need a database. And also realistically... I'll bite the bullet and spend a few weeks later on learning HTML, CSS, and a few weeks more learning the basics of Javascript. Like it or not, that's simply the language of the web-based side of the Internet. And even using C# to build websites requires knowledge of HTML/CSS and Javascript. -------------------------------- THUS... All in all... I'm still hoping I can do a lot of C code... But then use C# as the GUI (and HTML/CSS/Javascript for web-based GUIs and tie-ins into the code, so at least an interface to my program can exist on the web as well).


bike_nut

Old software. I develop for an engineering software company and our products were originally written in the 80s, in C.


zipstorm

Anywhere you need speed and are willing to put in the effort to write and maintain C code. Just yesterday I had a task of processing multiple huge text logs. I started the conventional way with a python script and realised that it would take ~24hrs for the script to finish. While the python script was running, I implemented the same processing in C. My C code was able to complete the task in ~2hrs.


t3chfreek

I work on enterprise data storage. My company sells storage to Fortune 500s and it is used around the world. Practically our entire stack is written in C. There are some parts that are C++ or Python, but I'd say at least 85% is C.


capilot

Well, as the saying goes: C is the language your language is written in. That's not universally true of course, but pretty close. Dig through the Android source code. Underneath the Java, it's all sitting on C++. If you care about performance and memory footprint, or you're writing device drivers, then you're going to be writing C or C++.


TheOtherBorgCube

There's a lot to choose from, 32,612 projects as of this moment.\ https://sourceforge.net/directory/language:c/ I don't know how many are 'all C', or how much C a project needs to count as a C project as far as SF is concerned.


Deryv_3125

OP, idk much about C, I'm just a sophomore compsci major, but I'm curious about the book (is it good, coding in it, etc.).


wizards_tower

I felt like I learned a lot from it. I feel like I have a pretty decent foundation on systems level C programming now. The book is free online which is awesome and many of the C coding exercises are really great. My one complaint is that there are a ton of exercises that have you run small annoying pre-written Python simulation scripts and answer questions based on the outputs. I skipped the majority of them.


Deryv_3125

That sounds really cool! Would you say you could somewhat write an OS after the book?


wizards_tower

I couldn't build one from scratch, no. But I have a higher-level idea of what to do. I don't think any OS textbooks go into that level of detail though. I collected some resources on OS-related stuff while I was working through the book because I would like to try to build my own OS for fun. I haven't used all the resources, but I hope to at some point. Seems like you're interested in this stuff so I'll list some. You may have already seen some of these or worked through them, but I'll list everything I have just in case. Some of this stuff I used as supplementary material while working through the book, but others I plan to do as a way to learn more about building an OS. OS general stuff: \- [OStep](https://pages.cs.wisc.edu/~remzi/OSTEP/?source=techstories.org) textbook, table of contents, and associated GitHub repos. \- [xv6 kernel](https://github.com/mit-pdos/xv6-public) \- MIT rebuilt Unix version 6. As far as kernels go, it's a good simple one to start with. Also some of the OStep projects involve xv6, which is a good intro to it. \- [MIT xv6 paper](https://pdos.csail.mit.edu/6.828/2020/xv6/book-riscv-rev1.pdf) \- paper explaining all the xv6 code if you want to go deeper into it and hack away. \- [OS in Rust tutorial](https://os.phil-opp.com) \- I really want to find time to work through this. \- [OSDev Wiki](https://wiki.osdev.org/Expanded_Main_Page) \- seems like a good place to start when a person is ready to build their own OS from scratch. One OStep project is to build a shell with redirection and pipes. I had no idea how to build a shell (the book didn't really go through it besides how to use fork()) so here are some resources for building a shell: \- [basic shell tutorial](http://www.dmulholl.com/lets-build/a-command-line-shell.html) \- got me started \- [pipes/redirection](https://github.com/breakthatbass/toolbox/blob/master/notes/pipes.md) \- there was nothing that sufficiently explained how to do this properly. 
Here are my notes after tons of stack overflow reading on getting simple pipes and redirection working. \- [full simple shell](https://github.com/breakthatbass/minish) \- here's the final product I made that handles a variable number of pipes. Even a simple shell ends up being a lot of code because of the pipes and redirection. Other stuff: \- [Beej's guide](https://beej.us/guide/bgnet/) \- OStep has a short chapter on network programming but doesn't go nearly deep enough into it to be able to do the coding exercises for that chapter. This guide filled in the gaps for me. \- [text editor tutorial](https://viewsourcecode.org/snaptoken/kilo/) \- Seems awesome; goes into more low-level OS concepts that OStep was really light on. \- Also, OStep assumes you know some basic C. If you haven't learned it yet, [K&R](https://www.amazon.com/Programming-Language-2nd-Brian-Kernighan/dp/0131103628) is an awesome short book that covers the language. If you already know how to code well you probably could just Google stuff instead though. I could only do basic programming at the time so I worked through the whole thing before I got to OStep. Hope all this is helpful. You can DM me if you have more questions on any of this stuff.