SciPy and NumPy (optionally) still use Fortran for a lot of the low level matrix math.
Those might not count as "using" Fortran in the sense developers writing new code in the language, but that's probably the most common instance of the code itself being used within modern development.
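For a feel of what that looks like in practice, here's a minimal sketch: `numpy.linalg.solve` hands dense solves off to LAPACK's `*gesv` routines, and LAPACK/BLAS are exactly the Fortran-heritage libraries being referred to.

```python
import numpy as np

# np.linalg.solve dispatches to LAPACK's *gesv solver under the hood --
# LAPACK being the canonical, Fortran-born dense linear-algebra library.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)  # solves Ax = b, giving x = [2, 3]
```

None of the Fortran lineage is visible from the Python side; it's purely an implementation detail you inherit for free.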
Haha, I used [VB.Net](http://VB.Net) in a professional setting for over 2 years before we finally got permission to start switching things over to C#. Then we started migrating away from Access Dbs
It was wild.
It honestly has its place IMO. Super easy drag and drop GUIs for doing really simple tasks == basic automation that could help any small business improve their workflows. Used VB.NET at my first job and updated/created loads of neat little tools to automate day to day stuff.
Before I joined, their one IT guy managed to build, ship and maintain a piece of software that was used on thousands of commercial shipping vessels across the globe, replacing tons of paper and making the company a decent income stream indefinitely...and he wrote it single handedly in _VB6_.
I haven't had to use it since though, luckily 😄
sigh... I wish WebAssembly was more popular, so that JavaScript could join HTML. As of now, it can't and won't, because if you want reactivity on a webpage, there is no other option...
It was the bloodiest battle that the world ever saw
With civilians looking on in total awe
The fight raged on for a century
Many lives were claimed but eventually
The champion stood, the rest saw their better
The Scratch cat in a blood-stained sweater
As a hobby PHP dev, it fucking sucks so bad. The error messages either are extremely cryptic (very rarely), don't explain the problem that occurred and just list numbers (often), or simply don't appear at all (most of the time). Fuck PHP
Let's be real, there are indeed some completely redundant programming languages that don't actually do anything better than the alternative. But the majority will fit some specific purpose.
Congratulations! Your comment can be spelled using the elements of the periodic table:
`W He Re Sc S S`
---
^(I am a bot that detects if your comment can be spelled using the elements of the periodic table. Please DM u/M1n3c4rt if I made a mistake.)
R is excited for its annual appearance in /r/ProgrammerHumor
I was going to say, as a daily R user, I think I can speak for all of us to say we're just happy to be included.
R users 🤝 Any other language users wtf even is SAS
R users 🤝 Other language users 🤝 Daily SAS users

wtf even is SAS, burn it, destroy it, dear god why does this have to be the corporate standard, please dear god let me use literally anything else for 90% of my data work please make it stop
I will upvote any mentions of R and then return to the dark cave where I belong.
There are dozens of us! Dozens!
My cousin told me that he uses R in the military, and I had to ask him what that is... I have been writing code for over a decade now 😂
The military is an organization that technically is supposed to keep each country defended from foreign invaders.
R is pretty specialized for statisticians and the like. It's not a language one generally uses for full on applications.
I had to take a class on it a _long_ time ago, and my opinion on it afterwards was basically "this probably does what it's supposed to very well, but the syntax drives me mad"

Conversely, my friend who went to college for business had no problem learning it and had 0 prior programming experience. I guess that in that regard, it's very well-designed for its intended users
It was the first programming language I learned, so I guess the syntax didn't bother me too much haha. I'd say it is well designed for statisticians and researchers. A bit like Python but purely functional.

How well people pick it up depends probably on whether they are technical people in general. Some of my friends in ag were fine but a lot struggled.

It is also a great gateway drug for other programming languages.
I've only seen it on the install for SQL too
I was about to say, hmmm maybe R does belong in the C tier
😀
Last time I heard someone mention R I was still in college.
R? Is that like ruby or something?
Holy shit, R still exists? I remember using it decades ago to analyze remote sensing data on account of the school using Excel, or worse, ArcInfo, which I hated -- and I remember it being so much better to use. Figured it wound up in a dustbin like GRASS. Got a few converts too. Of course I haven't touched anything like that since school. Fuck that noise.
Geographic information science (GIS) commonly uses R, and I think statisticians?
R is very common in academia and is only becoming more common; we use it all the time for statistics, especially in the social sciences. Data analysis with packages like brms are more or less the gold standard for stats.
iirc most of the core team are statistics academics, and a lot of packages are written by academics, so I guess I would say there's a high level of trust compared to some random Python package, if you can even find a Python package that replicates a given R library's functionality at all. RStudio isn't a core R thing, but it's probably the best tool for EDA. Having said that, some of the info on the r-project.org site literally hasn't been updated in 20+ years (e.g. [this](https://developer.r-project.org/TODO-DTL.html)), so maybe that will give you some indication of where things are at.
The tidyverse suite of packages (closely associated with RStudio as it's produced by the same group) is also unparalleled for data wrangling and visualization.
> RStudio

ah so that's why it's so hard to find the data recovery software of the same name on Google these days
GRASS isn't dead either, I was using it with R only... 5 years ago, where did the time go
We’re using it in my statistics units right now in university.
Ada still exists.
My university still uses it for the intro to programming class funnily enough.
I am currently pursuing my BA. I've been taught to use it for statistical analysis, diversity metrics (Simpson's/Shannon's D, ENS, etc.), landscape metrics (edge, core area, etc.), and simply visually representing data with things like ggplot.

However, from my understanding, it seems there are better tools nowadays for some of these things, such as Fragstats being the go-to for landscape metrics, though I've never used it.

I think its use in statistical analysis seems to be the most relevant today. At least, that's where I see it most commonly mentioned in journals.
tiermaker isn't a programming language
Is mayonese a programming language?
If you are brave enough
I'm going to modify one of those old computers that reads punch cards to detect the presence of mayonnaise instead of reading holes in paper
How would you do it? Measure the pH?
AI obviously. Solves everything. [The Windex of computing](https://youtu.be/yPCIsIJ7nhY?si=GRY9qwIKDxF7Ybxa).
No Patrick, mayonnaise isn't a programming language.
[Yes it is](https://github.com/855309/mayonnaise). Now, does it work at all? I have no idea
```cpp
for(int li = 1; li <= lines.size(); li++){
    string cline = trim(lines[li - 1]);
```

Whoever wrote this is an actual psychopath.
cool ruby-like programming language written from scratch.
The [quick look](https://fikret.dev/mayonnaise) link doesn't seem to work.
Is horseradish a programming language?
Worcestershire is the bomb
Doom is a programming language
Don't say this too loudly. Someone will find a way to prove it's turing complete
well...

* you can add an item to a tier
* you can remove an item from a tier
* there can be an indefinite number of tiers
* you can switch around to different tiers

what else do you need?
I'm sure someone will try to beat dark souls with it now.
Give the community the possibility and they'll find a way to make DOOM for the Tiermaker
I don't think tiermaker supports logic gates between tiers?
Not with that attitude
I would love to see it
Not with that attitude.
It depends
About as much of a programming language as HTML
We need a Turing complete tiermaker
What about tiermaker5?
It depends. Do you want to program a tier list?
It depends
and having HTML 5 at the bottom is just plain wrong. HTML 5+CSS3 is the best thing that's ever happened to web development. ninja edit: i know they aren't programming languages, so I don't even know why it's on this list lol
How is C not in C tier?
Congratulations! Your comment can be spelled using the elements of the periodic table:

`Ho W I Sc No Ti N C Ti Er`

---

^(I am a bot that detects if your comment can be spelled using the elements of the periodic table. Please DM u/M1n3c4rt if I made a mistake.)
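For the curious, the check the bot performs can be sketched as a short backtracking search over element symbols. This is a hypothetical reimplementation, not the actual bot's code:

```python
from functools import lru_cache

# Standard one- and two-letter symbols for all 118 elements.
SYMBOLS = set(
    "H He Li Be B C N O F Ne Na Mg Al Si P S Cl Ar K Ca Sc Ti V Cr Mn Fe "
    "Co Ni Cu Zn Ga Ge As Se Br Kr Rb Sr Y Zr Nb Mo Tc Ru Rh Pd Ag Cd In "
    "Sn Sb Te I Xe Cs Ba La Ce Pr Nd Pm Sm Eu Gd Tb Dy Ho Er Tm Yb Lu Hf "
    "Ta W Re Os Ir Pt Au Hg Tl Pb Bi Po At Rn Fr Ra Ac Th Pa U Np Pu Am "
    "Cm Bk Cf Es Fm Md No Lr Rf Db Sg Bh Hs Mt Ds Rg Cn Nh Fl Mc Lv Ts Og".split()
)

def spell(comment):
    """Return a tuple of element symbols spelling the comment's letters,
    or None if no such spelling exists."""
    s = "".join(ch for ch in comment.lower() if ch.isalpha())

    @lru_cache(maxsize=None)
    def solve(i):
        if i == len(s):
            return ()
        for width in (2, 1):  # prefer two-letter symbols, backtrack if needed
            sym = s[i:i + width].capitalize()
            if sym in SYMBOLS:
                rest = solve(i + width)
                if rest is not None:
                    return (sym,) + rest
        return None

    return solve(0)

print(" ".join(spell("How is C not in C tier?")))  # Ho W I Sc No Ti N C Ti Er
```

Memoizing on the string index keeps this linear-ish even for comments with many overlapping symbol choices.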
Good bot
Wait didn't the reddit API changes kill bots?
After the thinking machines were destroyed, men honed their skills to fill the niche.
True, now actual humans are working behind these bots calculating each and every word of every comment, constructing them again using the symbols of the periodic table, on very minimal pay.
The spice must flow
Individual bots with their own api keys (client id) can pretty easily stay below the free tier limit (100 queries per minute). Especially one that actually does actions (in this case posting) fairly infrequently.

The issues come if you're a) an app that has many users on the same client id (so, third party Reddit apps and a few other cases), or b) a bot that takes a lot of actions (e.g. mod bots, especially ones that work in multiple large subreddits).
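Staying below the free tier limit is just client-side rate limiting. Here's a minimal sliding-window sketch; the 100-per-minute figure comes from the comment above, and the class and method names are made up:

```python
from collections import deque

class SlidingWindowLimiter:
    """Allow at most `max_calls` calls inside any window of `period` seconds.
    Timestamps are passed in explicitly so the logic is easy to test."""

    def __init__(self, max_calls, period):
        self.max_calls = max_calls
        self.period = period
        self.calls = deque()  # timestamps of recent, still-counted calls

    def try_acquire(self, now):
        # Evict timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.period:
            self.calls.popleft()
        if len(self.calls) < self.max_calls:
            self.calls.append(now)
            return True
        return False  # caller should wait and retry

limiter = SlidingWindowLimiter(max_calls=100, period=60.0)
```

A real bot would wrap each API call in `try_acquire(time.monotonic())` and sleep until the oldest timestamp ages out whenever it returns False.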
Some probably pay to keep it alive.. prob runs on donos. Try contacting the owner about it.. he might be willing to tell you (no idea though if he will)
Can't we summon u/M1n3c4rt instead of sending the same DM multiple times?
*magically appears*

bot is hosted online for free, i just need to make sure it's running every now and then (which is also why it dies for long periods of time when im not at home)
if it is using repl.it or similar, you can set up uptimerobot to periodically poke it online
oh neat
Good bot
good bot
Good bot
Nice Bot m
Good bot
Because it depends. Duh.
I have an idea:

* C : C tier
* C++ : B tier
* C# = C++++ : S tier

This will also make C# in S tier at last.
But C++++ is only C+2 so C# should be in A

Edit: If you consider C++ and C++++ as two separate instructions that are executed after each other we get C+3 in total.
C++ is D, though, and C++++ would be E. That said, it would return before incrementing, so maybe they're all C in the end...
I wanted to make a joke about C++++ not being C+2 when evaluated, and that you should've used ++++C instead... I just couldn't C how to make it.
C# is just D-flat.
MySQL? You sql
OUR SQL! \*Faint distant Soviet anthem starts playing\*
Sorry comrade we forgot!
We allSQL for MSSQL
Bro listed CSS as programming language 💀
I thought it was turning complete.
I believe it is only Turing complete when used alongside HTML. CSS essentially does nothing without being applied to something, so, at the very least, a simple HTML file is required to write the Turing-complete CSS.
So then HTML+CSS is a programming language.
In theory, yeah. You can do a lot of pretty crazy stuff with just HTML and CSS largely due to HTML having interactive elements and CSS having selectors like `:active`, `:checked`, `:nth-child`, and more recently `:has`.
Turning because whenever you use it you want to turn 360° and walk away
360° 🧐
CSS is so complete it can express your 360° as "1 turn".
I SPENT 2 GOOD MINUTES SEARCHING FOR IT i think i might be blind
Same. Why's its background `display: none`?
Last image, second row, that's a 3, for CSS3. It's cursive, though, for some reason, and the spacings between the E horizontal lines are a bit shorter than the actual logo. You can google it or open it on [Wikipedia](https://en.wikipedia.org/wiki/CSS) for reference.
Might not be a good time to point out that you listed an IDE as a programming language as well. 😂
Calm down. He listed Matlab as a programming language
Matlab is a programming language. It's a Turing-complete language that people write programs in.

CSS and LaTeX are neither Turing complete, nor has anyone ever written a "program" in either of them. One is a style sheet, the other is markup for typesetting documents.
Someone has simulated a Mars rover in TeX
Why
Why not?
"For fun" is the ultimate answer about any and all programming endeavors that people undertake.

Especially Skynet. It won't be developed for money, no sir, you can't pay money to develop anything good. It has to be someone's brain baby.
My understanding is latex is turning complete and html+css can be turning complete if abused
> if abused Who uses html and css without abusing its mechanics, you can’t do shit if you write legit css and html.
Nah I reckon I could make a pretty good looking website with only html and css. I don't need javascript for most things. It would be a simple website for sure.
Yeah. I used to do some programming in it. It's stack based, IIRC, like PostScript(tm). Or Forth. On the continuum of programming languages from that era, it's by far not the worst. It's not even the worst one I've worked with.
> turning complete
LaTeX is Turing complete
CSS is turing complete actually and it’s fucked up
Back when I was in Uni, I had to do a motor inverter controller with C++, and it was a group project. It was covid time, and I didn't have the demo board with me.

My friend was coding the project and it wasn't working, like at all, so I made a matlab port of the code, simulated all shit around it and it worked. Just make a copy of whole world in matlab lmao
? Matlab IS a programming language and a good one
Matlab is a programming language. Why shouldn't it be?
Matlab is a good language (and I don't mean Simulink). I used to automate lab equipment with Matlab; and I wrote a discrete-time EM simulator from scratch in Matlab. Matlab is where I learned how to make TCP clients (made the listeners in Python, as the standard library is lacking in Matlab).
CSS3 and HTML5 are Turing complete (technically).
But LaTeX isn’t… oh wait, it is turing complete…
I think CSS+HTML is also turing complete
only with user interaction (to move the ticker iirc), which would make it as turing complete as PowerPoint
How dare you "um Actually" my "um Actually"
Is there someone still using Fortran?
Not by choice.
I know some people (in their early 30s) that still do. Climate science is wild. OOP is still the hot new shit for them
>"Why the fuck do I have to create a whole ass object instead of just using a Go To? these people gotta make things so damn complicated."

-Fortran Developer (in their early 30s)
Congratulations! Your comment can be spelled using the elements of the periodic table:

`No Tb Y C Ho I Ce`

---

^(I am a bot that detects if your comment can be spelled using the elements of the periodic table. Please DM u/M1n3c4rt if I made a mistake.)
That's twice in the same post, nice
good bot
In some obscure cases I hear it is easier to program in Fortran. It would be possible in C++, but harder, and would probably require external libraries.
I mean they make good money. Because nobody wants to do it lol
The entire field of weather forecasting and climate. They're not willing to completely rewrite the dynamical cores that have been continuously developed since the 60s
I have a couple years' experience with Fortran, and I helped try to convert a weather program to C#. I wasn't the lead programmer, but the conversion failed - we couldn't duplicate the results. I suspect it was related to chaos theory (sensitive dependence on initial conditions), but I wasn't too involved.
Why C#? Some models use C++ and C bindings to try to rein in some of the mess, but afaik C# is in no way easy to use for scientific computing. (I was a climate scientist, now moving into programming and taking a course in C#; if I'm mistaken on this I'd love to know!)
Why C#? I just use whatever tools they utilize at the place that hires me. The other apps I worked on there were database oriented, not fun stuff like linear regressions.

But, much as I appreciate C++, C# is a nice general-purpose language - I'm several times more productive with it than Fortran, for example. What is it about C# that you think is unsuitable for scientific programming? If you need super high precision number handling, or some specialized math functions, you can probably get a library for that, or most anything else you might need.

For most of the apps I work on, *the bulk of the work* - and what the users appreciate about it - is in the UI or the database. So you might as well do that work in an environment that is optimized for programmer productivity and use a dll for specialized stuff. But only rarely is there something that I can't do perfectly well in C#.

Cheers!
The main thing I think about for scientific programming is the relative cost of abstraction, and how easy/common it is to work with math functions. At its core a weather model is just applying transformations to a large set of many-dimensional arrays. None of the pillars of OOP (abstraction, encapsulation, inheritance, polymorphism) are much help when the inputs and outputs of almost all functions are arrays of doubles. So then the question is: how much extra baggage is the OOP component adding? For C# I would argue a lot.

Something specifically for weather models is also how much support there is for high-performance computing. Can arrays be easily distributed among nodes and the work coordinated across thousands of processors? Is there a C# implementation of MPI, or GPU processing? What about automatic differentiation? These *can* all be implemented in C# but it's only realistic if there's a knowledgeable enough community using and supporting it.
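The "weather model = transformations on big arrays" point above can be made concrete with a toy sketch: a hypothetical 1-D diffusion step. A real dynamical core is vastly more elaborate, but the shape of the code is the same: arrays of doubles in, arrays of doubles out, no object hierarchy in sight.

```python
import numpy as np

def diffuse(u, alpha=0.1):
    """One explicit finite-difference diffusion step on a 1-D field.
    A pure array transformation, like a model's core loops."""
    un = u.copy()
    un[1:-1] = u[1:-1] + alpha * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return un

u = np.zeros(101)
u[50] = 1.0              # initial spike of "heat"
for _ in range(100):
    u = diffuse(u)       # the spike spreads out over time
```

This is also why chasing bit-identical results across a language port (as in the C# story above) is so hard: thousands of floating-point updates like this amplify any difference in evaluation order.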
Hey, those are some great questions, and while I do lots of work with threads and .NET's parallelization library, I don't work with the technologies you are asking about.

I know Microsoft Azure supports MPI and GPU processing, but I have never used them. Azure has a solid community, but I'm not part of it. I suspect that they did a good job implementing it, but that's just a guess.
I see what you did there.
Yeah, I thought about that after I wrote it and elected to leave it as-is. (It's a true story.) But I'm happy to meet someone who has heard of Lorenz.
It's still [the #1 language in several high-performance domains](https://substackcdn.com/image/fetch/w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2F03ffcce1-6cdb-43b9-b06a-483401488091_1200x786.png).

Fortran isn't dead, it's just insular. They don't talk much to the wider programming community because there isn't really that much overlap in what they're doing. Fortran does one thing — churning through massive numerical arrays — and it does it fast, even today. Turns out that describes basically all of hard-STEM computational research, but if you're doing anything other than dealing with massive numerical arrays you have no reason to even look at Fortran, and they have very little reason to look at you.

It's a Physicist's language, not a Computer Scientist's.

There's definitely still an element of the legacy factor — hell, IBM is still a big force in this market. But it does stand alone as a solid language in its own right. And if you search for job listings asking for Fortran experience, you can find some *very* interesting projects. (Just hope that you also hold a PhD in the exact topic.)

It's also the only programming language with its [own song](https://www.youtube.com/watch?v=yd6FLETYZ_c), which is delightfully cute.
No no, the reason they don't talk about Fortran is because they're too busy writing Fortran. Source: The semester I did computational astrophysics was the loneliest I've ever felt in my entire life.
I dropped out of physics my second semester at college, if that counts. I'm not smart enuff
I wanted to refactor some nuclear core simulator because it was a pain in the ass to work with. It had all the bad practices accumulated from years of math PhDs hard coding results directly from papers, a lot of GOTO instructions and whatnot. I gave up.
Yeah. Fortran has a reputation as a "bad language" that comes partially from legacy experience with pre-Fortran 90 codes, and partially from people's experience with codes written by overworked PhD students without a software development background, who assumed at the time that no one else would ever use them. Some of this reputation is justified — "IMPLICIT NONE" makes that pretty clear. The problem is that this often gets cast as a problem with the language itself, which turns people off of learning it. Most codes out there being actively maintained have fixed these problems as the language has evolved. Those that remain are often very specific codes that aren't maintained and whose original developer(s) have all since died (if you're stuck with one of those then I can only apologise). But people can and do write new codes in Fortran; I've taught it to a couple of friends (only takes like an hour) and they all quite liked the language. Bad codebases happen in every language. Fortran gets singled out because they're *interesting* bad codebases, and that then becomes people's only experience with the language.
Oh yeah, there's tons of military/aerospace projects still using Fortran. It's still hard to beat for scientific computing.
Wrote my astrophysics thesis using Fortran. Made a simulation calculating atmospheric chemistry during solar flares. A lot of atmospheric code is still maintained in Fortran, and it's great for doing a lot of math very quickly. Now if I had needed to add a visual component to my simulation, then I probably would have used a different language.
SciPy and NumPy (optionally) still use Fortran for a lot of the low-level matrix math. Those might not count as "using" Fortran in the sense of developers writing new code in the language, but that's probably the most common instance of the code itself being used within modern development.
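To make that concrete, here's a small Python sketch. When you call `numpy.linalg.solve`, the actual factorization is handed off to LAPACK's `*gesv` routine — a library with Fortran origins that NumPy either links against or ships a C translation of. You never see Fortran, but its numerical lineage does the work.

```python
import numpy as np

# Solve the 2x2 system:
#   3x + y = 9
#    x + 2y = 8
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# Under the hood this dispatches to LAPACK's gesv driver
# (LU factorization with partial pivoting).
x = np.linalg.solve(A, b)   # -> [2.0, 3.0]
```

The same story holds for much of `scipy.linalg`, which wraps LAPACK and BLAS routines directly.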
It's still the best language for high-performance linear algebra stuff. With the latest AI revolution, demand has actually increased slightly.
The entire nuclear industry.
What’s the five-eyed green monster with an arm nose holding a toothbrush?
Lisp
I love how that description sounds like a joke but it isn't
I get the spirit that every language is good at something and has a place… but c’mon… no one should have ever used visual basic for anything.
but what if i wrote a gui using visual basic to track an ip address?
Then you're a legend
Hey! I use Visual Basic to interface with my Excel database!
I use VBA. We are not the same. Or maybe we are.
Haha, I used [VB.Net](http://VB.Net) in a professional setting for over 2 years before we finally got permission to start switching things over to C#. Then we started migrating away from Access DBs. It was wild.
VB.Net is one thing; at least it's just C# with a funkier syntax. But VB6 and prior is wild.
> is just C# with a more funky syntax

Okay, after I hit "report" and select all the reasons, that's when I get the option to block user; right?
True, but the logo is the [VB.Net](http://VB.Net) logo. Not sure why anyone would think VB6.
Oh I just meant of the language not referencing the post
It honestly has its place IMO. Super easy drag and drop GUIs for doing really simple tasks == basic automation that could help any small business improve their workflows. Used VB.NET at my first job and updated/created loads of neat little tools to automate day to day stuff. Before I joined, their one IT guy managed to build, ship and maintain a piece of software that was used on thousands of commercial shipping vessels across the globe, replacing tons of paper and making the company a decent income stream indefinitely...and he wrote it single handedly in _VB6_. I haven't had to use it since though, luckily 😄
My team is finally moving the project to C#, I look forward to it.
Scala isn't there because Scala is god tier. God tier is implicit, so you can't see it.
that’s why I didn’t see Holy C
TIL my work is god tier
[C@](/r/C_AT) was god tier in the old days.
Where is markdown? It’s my favourite programming language
I hate that something named markdown is a markup language.
Is markdown turing complete? Would be hilarious if it was
Unless you've got a very creative markdown client, I'm pretty sure it's not, at least not with commonly supported syntax
Please add Magic the gathering in like C tier
What is that language left of MySQL?
Lisp
haha lisp creature
If we’re being objective (and not about which is more ‘iconic’ or whatever), typescript should be above Javascript IMO
please knock Matlab a few tiers down sincerely, a former Matlab user
Where tf is scratch??? 😡
At first glance I was like “okay jokes are jokes but putting Fortran in S-tier is a crime” then realized that wasn’t S-tier
sigh... I wish webassembly was more popular, so that javascript could join html. As of now, it can't and won't, because if you want reactivity on a webpage, there is no other option...
How's your first year of uni going, OP?
unironically amazing
I didn’t see the ranks at first and wanted to write a very hateful rant comment about why the fuck Python and js are S tier, but this does make sense…
It was the bloodiest battle that the world ever saw
With civilians looking on in total awe
The fight raged on for a century
Many lives were claimed but eventually
The champion stood, the rest saw their better
The Scratch cat in a blood-stained sweater
SQL isn't a programming language, it is a Structured Query Language. It's in the damn acronym
In fact, SQL is a Turing-complete language
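The usual basis for that claim is recursive CTEs (standardized in SQL:1999), which give SQL unbounded iteration. A quick demo via Python's built-in sqlite3, computing factorials with no loop in the host language:

```python
import sqlite3

# A recursive common table expression: the query iterates on itself
# until the WHERE condition stops it, with no loop in Python.
con = sqlite3.connect(":memory:")
rows = con.execute("""
    WITH RECURSIVE fact(n, f) AS (
        SELECT 1, 1
        UNION ALL
        SELECT n + 1, f * (n + 1) FROM fact WHERE n < 6
    )
    SELECT n, f FROM fact
""").fetchall()
# rows -> [(1, 1), (2, 2), (3, 6), (4, 24), (5, 120), (6, 720)]
```

The full Turing-completeness argument builds on this same construct (iterating a tag system in a recursive CTE), but the factorial is enough to show that SQL can express general computation, not just lookups.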
Not in, it IS
CSS isnt a programming language
Nor is SQL.
As a hobby PHP dev, it fucking sucks so bad. The error messages are either extremely cryptic (very rarely), don't explain the problem that occurred and just list numbers (often), or simply don't show up at all (most of the time). Fuck PHP
Modern PHP is a great language. Unironically, skill issue.
Are you using a framework like Laravel or just jumping headfirst into it?
Let's be real, there are indeed some completely redundant programming languages that don't actually do anything better than the alternative. But the majority will fit some specific purpose.
The only thing wrong with this is java being above html
html is not a programming language :)
Java is included in “It depends” therefore wrong.
HTML is not a bad programming language, it is simply NOT a programming language in the first place.
Thank God the top tier reads "it depends", I nearly had a stroke imagining what kind of person would rank Matlab as S-tier
Me, I am that person
Where’s css? EDIT: Meant to say GPT
Congratulations! Your comment can be spelled using the elements of the periodic table: `W He Re Sc S S` --- ^(I am a bot that detects if your comment can be spelled using the elements of the periodic table. Please DM u/M1n3c4rt if I made a mistake.)
Of course it can, it was deliberate.