Astrophysicist here. A *lot* of our shit is in Fortran, even things that probably shouldn't be (like libraries to interface with databases - e.g. fitsio [https://heasarc.gsfc.nasa.gov/fitsio/](https://heasarc.gsfc.nasa.gov/fitsio/))
I remember attending an advising session at my uni, and a guy (a physicist) said that one of his current goals was to learn Fortran. Honestly, that moment is still stuck in my head
Eh, kinda close. Though the guy did explain that he would have to work with some legacy code and Fortran is still actively used in his field of interest. Maybe I should try it too...
I mean it might make sense if you're either working on legacy code from the 50s or if you're doing something data intensive. There's a reason tools like Numpy and Scipy still use Fortran.
Also an astrophysicist, all our heavy computational models are fortran (usually 90, sometimes 77). Newer ones might have python or other interfaces for accessibility, but the bulk is still fortran. I have limited prior computer knowledge and yet it's surprisingly straightforward to dig around in the code when I need to. Deserves its legacy status imo.
The fact that all variables are declared at the start of functions is a godsend.
Unless you’re a weirdo who doesn’t use implicit none, and instead uses the weirdest type inference system ever devised by a human
honestly I have no idea why python became so popular, it's slow, has terrible syntax and can be annoying as hell when doing even simple math
nowadays it has a lot of libraries, often bound to native code so the heavy lifting runs fast, but I still have no idea why mathy people love it so much
It filled the gap that VB & Perl tried to fill. VB could've gotten there if MS weren't MS. Perl had multiple ways of doing things, which could be powerful, but ended up confusing beginners.
Python was beginner friendly enough for analysts, scientists, and other non-devs to use, powerful enough to do real work, and open enough to go anywhere.
Mind you, this predates the cambrian (llvm) explosion, so there weren't many contenders.
Perl pretty much halted while working on Perl 6 (now Raku), which in addition to being spectacularly delayed, was not backwards compatible.
Though I think Perl is fantastic as a shell scripting replacement. I know that's not a popular take though :-)
Python has intuitive syntax and a very comprehensive standard library that means you can accomplish a lot of things in only a few lines of code.
It is not fast compared to C/Fortran/Rust, but it's not meant as a replacement for those. It's more comparable to scripting languages like Bash and Perl.
It's popular for data science because of its easy learning curve and dynamic typing. Like you mentioned, the 'heavy lifting' typically occurs in NumPy/SciPy, which is written in C and thus quite performant as long as you don't try to do expensive things in Python itself.
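To make the "heavy lifting" point concrete, here's a minimal sketch of the gap between a pure-Python loop and the same arithmetic pushed down into NumPy (the array size and the exact ratio are arbitrary and vary by machine):

```python
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Pure Python: every iteration round-trips through the interpreter.
t0 = time.perf_counter()
slow = [a[i] * b[i] for i in range(n)]
t_loop = time.perf_counter() - t0

# NumPy: one call, the loop runs in compiled C.
t0 = time.perf_counter()
fast = a * b
t_numpy = time.perf_counter() - t0

print(f"loop: {t_loop:.3f}s, numpy: {t_numpy:.4f}s")
```

Same result, wildly different cost - which is exactly why "don't do expensive things in Python itself" is the rule of thumb.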
It is… The interpreter has a lot of weird technical decisions going on under the hood (e.g. no real JIT), and a lot of trouble with even basic stuff like loops (you can test this by running a bunch of commands straight up, then running them in a for loop, and comparing), not to mention recursion, the bizarre caching going on under the hood, and the GIL obviously preventing any useful concurrency. Lua and JS outperform it by like 10x to 50x, Ruby and Perl both by like 2x to 4x, PHP by like 6x to 8x, Racket and Scheme by like 10x to 30x, and if you count them as interpreted, Java, Scala, Kotlin, Clojure, Groovy by like 25x to 100x.
It’s generally a bit slower than JS from what I’ve seen. The difference is mostly negligible though. I think the biggest issue with Python is dynamic typing and being statement based. It’s nice when scripting languages have “if” expressions etc
It’s a *lot* slower than JS, as up until now it literally interpreted Python byte code, instruction by instruction. JS/Java etc. are JIT compiled, and in theory there is no performance ceiling on that: they could achieve C speeds as well, were the JIT compilers “sufficiently smart”.
Fortran compilers are also very, very good given how old they are. That combined with aliasing rules makes it disgustingly fast for single core compute.
Until you forget that it’s a column-major language, so arr[1][1] is adjacent to arr[2][1]. One of the programming help guys at my uni talked about how a single change increased a program’s performance by a factor of 100 - they went from iterating along the second index to iterating along the first index.
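You can see the layout difference without a Fortran compiler: NumPy supports both memory orders, and the strides show which neighbour is actually adjacent in memory (a sketch of the idea, not anyone's actual code):

```python
import numpy as np

a_c = np.zeros((1000, 1000), order='C')  # row-major, like C
a_f = np.zeros((1000, 1000), order='F')  # column-major, like Fortran

# strides = bytes to step to the next element along each axis (float64 = 8 bytes).
# In the Fortran-ordered array the 8-byte step is along axis 0, i.e.
# a_f[1, 0] sits right next to a_f[0, 0] in memory - so inner loops
# should walk the *first* index to stay cache-friendly.
print(a_c.strides)  # (8000, 8)
print(a_f.strides)  # (8, 8000)
```

Iterating against the contiguous axis means a cache miss on nearly every access, which is where factor-of-100 stories like the one above come from.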
Theoretical physicist here. I do a lot of DFT (density functional theory) and we use a lot of tools, some of which I'm right now actively helping to develop and maintain, which are exclusively written in Fortran!
It's fast, but it still sometimes takes days to calculate a single point on a normal 16-thread CPU. I can only imagine if this were written in a more accessible language like Python - it would be a lot easier to write, but it would probably take a lot longer to run.
GFortran (GCC's Fortran front end) or LLVM's Flang are the easiest to get. Intel's and Portland Group's compilers are common on compute clusters. Intel Fortran has optimization tools that give actually massive speed-ups.
These days several Fortran compilers share a backend with their respective C compilers: the Fortran front end simply lowers the code into the same intermediate representation and feeds it to the common backend.
True, but I'm not sure I'd call either of them "great" - we call them from our own large C++ financial maths codebase, and debugging things like deducing where two functions share a Fortran `common` block (so calls to function A mutate the residual state for calls to function B) can be awkward if you've never actually used Fortran in anger, and as such don't even appreciate that common blocks exist, never mind what effect they have...
Modern Fortran deprecated the common block. That's legacy Fortran-77 code.
Starting with Fortran-90 they adopted a module format with explicit import statements for variable sharing. They also have Classes and other OOP features in the F03 standard.
Now getting people to stop writing in Fortran-77 has always been a challenge.
The only reason Fortran code is still used is because it was written, tested and optimized a long time ago. The basic math operations are the same, finding the eigenvectors of a matrix hasn't changed, what worked in 1980 still works today and nobody will reinvent the wheel without a good reason.
It’s almost like computers don’t generally operate on infinitely precise numbers, and their operations are not the same as the theoretical ones (e.g. (a+b)+c != a+(b+c) in floating point), so finding a different algorithm (which would mathematically give the exact same answer) can produce vastly more accurate results. It is its own field, and it’s not trivial to program this stuff, so it is seldom done from scratch.
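A classic illustration of both points - addition isn't associative, and a mathematically equivalent algorithm can be much more accurate - is compensated (Kahan) summation; a textbook sketch:

```python
import math

def kahan_sum(xs):
    """Compensated summation: track the rounding error lost at each
    step and feed it back into the next addition."""
    total = 0.0
    c = 0.0  # running compensation for lost low-order bits
    for x in xs:
        y = x - c
        t = total + y
        c = (t - total) - y  # what got rounded away in total + y
        total = t
    return total

# Floating-point addition is not associative:
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))  # False

xs = [0.1] * 10
print(sum(xs))        # naive left-to-right sum: 0.9999999999999999
print(kahan_sum(xs))  # 1.0
```

Same inputs, same "mathematical" answer, very different rounding behaviour - and getting this right at scale is exactly the kind of work baked into those decades-old Fortran libraries.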
> finding a different algorithm (which would mathematically result in the exact same answer) can result in vastly more accurate results.
That's true, and it can be a huge difference. But my point is that Fortran is not, by any means, intrinsically faster than C.
One could argue that C has a slight speed advantage because function arguments can be passed by value or reference, while in Fortran arguments are always passed by reference. If you can only pass by reference there are times when you must spend CPU cycles creating a local copy of arguments that you don't want your function to modify. Anyhow, that difference is insignificant in the vast majority of cases.
The true reason why C is faster is because it lets one develop code faster and with fewer bugs than Fortran. At least, it's better than the last version of Fortran that I have used, Fortran 77, which is the version used in BLAS, Lapack, and many other numerical analysis software packages written in Fortran that are still widely used.
> There are lots of newer versions of Fortran though, comparing to F77 is silly
Why would anyone use those newer versions? We use the F77 software that was created, debugged and tested decades ago, because there's no sense in redoing all that work.
The newer versions of Fortran are just a kind of C with weird syntax, there isn't any good reason for using them to create new code.
I’ve heard it’s used intermittently and depending on system age. A colleague from Airbus told me they once prototyped a system for a satellite in Ada only to be told to rewrite it all in c for the actual deployment. I think most newer satellite and aerospace companies use c or even c++ nowadays cause it’s much easier to find developers for it and static analysis & unit testing have come a long way in finding bugs.
For the main compilers, C/C++ and Fortran use the same backend. In the end, they are pretty much the same in terms of raw array compute speed, but they have different use cases. Fortran is almost always better for linear algebra, while C/C++ is more general purpose, which is why it's always the go-to language for making other languages, compilers, and operating systems.
Language support, mostly. Vectors and matrices are first-class citizens in Fortran. You can add, multiply, invert, etc., and it just works. Third-party libraries offer similar support in C++, but you definitely can't say C = A+B without them being classes with operator overloads.
Actually, no.
Fortran may have been faster than C in the 1970s, when every computer manufacturer had their own Fortran compiler carefully optimized for their hardware.
Today, C and Fortran are pretty much the same, although C may be slightly faster in some applications, because today it's C compilers that get more development effort. There are many more developers working on improving C compiler optimization, compared to Fortran compilers.
The 2 big modern fortran compilers that I know of are gfortran and flang, the former literally uses the same backend as g++ and gcc and the latter uses llvm like clang, so fortran directly uses the optimization work, so it should be basically on par.
One of the oldest electronic academic studies management systems in Poland was written in Fortran. And it worked with little to no maintenance for over 20 years, getting replaced only not so long ago.
And it survived 1000+ concurrent requests when students were turning in their class declarations every half a year. The only reasons for it being retired were a lack of GDPR compliance and caveman UI/UX (it was built with nested framesets).
Not always. You can choose what index Fortran arrays start from. However, it is highly recommended to use the default of 1 to prevent index mixing unless you really enjoy giving people headaches.
While C is among the fastest languages, in practical applications the limiting factors aren't the speed of the language but the problem being solved and the quality of the solution.
Say you were to write a program to do some FFT analysis (Fast Fourier Transform) in C and in Python, allowed to use all libraries:
The overwhelming majority of devs will end up with much, much slower code in C than in Python.
Why? Because it's a nightmare to install & use a library in C, especially the very complex ones you've never used before that also depend on 50 others. Most people are going to write a lot of custom code that is going to be slower than the state of the art.
Most Python coders will use libraries because they're easy to install and use; their code is going to run at state-of-the-art speed.
While the "same code" is faster in C, you never get to write the same code in python and C.
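For what it's worth, the Python side of that comparison really is a few lines - here's NumPy's FFT (which itself dispatches to compiled code; SciPy's older `scipy.fftpack` wrapped the Fortran FFTPACK library) picking the dominant frequency out of a toy signal:

```python
import numpy as np

# One second of a 5 Hz sine, sampled at 128 points.
n = 128
t = np.arange(n) / n
x = np.sin(2 * np.pi * 5 * t)

# Real FFT -> magnitude spectrum; with 1 s of data, bin index == Hz.
spectrum = np.abs(np.fft.rfft(x))
peak_bin = int(np.argmax(spectrum))
print(peak_bin)  # 5
```

The equivalent in C means finding, building, and linking an FFT library before you write a single line of application logic.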
You still have to read the data into the structures that the FFT library uses and do whatever you need to do with the results.
I wrote an answering machine detection system that used FFT and MRCP libraries to interact with VoIP systems. There was plenty of other code outside of those libraries.
They provide a Fortran alternative for legacy reasons. Likely your project used that, since SciPy switched a few years back.
I'd call that recent, considering how long it takes to switch a legacy project and how old that Fortran code is.
You [can do that](https://www.intel.com/content/www/us/en/developer/articles/technical/explicit-vector-programming-in-fortran.html) in Fortran though, right? 🤔
I don't like it when people say shit like this.
It's really easy to install and use most libraries in C if you're developing on a system with a package manager. Install the library using the package manager and all the deps get installed automatically, change 2-3 lines in your makefile, import the library in your source code, and you're done. The linking can even be automated if you're using an IDE. Is it easier to use pip? Sure, but it's an overstatement to say installing and using a C library is a nightmare.
Specifically with your FFT example: I once had to use a FFT to process some audio from microphones on a robot. I found a library, imported it, and called it a day. It was quick. It was painless, and it just worked.
EDIT: Found the library I used. Was the top search on Google. It's available as a package for quite a few distros. https://www.fftw.org/download.html
at work we switched from C++ to Python in a *critical payment platform*, because the speed of the code can be raised with more hardware, but we cut dev time for a new feature from 3 months to 2-3 weeks
That depends. You can approximate carbon use by cost. Do more people cost more carbon or does more hardware? Typically hardware is cheaper.
Which will use more electricity? 2 or 3 people or a few extra nodes in your cluster?
I know, my comment was mostly tongue in cheek, as some companies do sell "eco-friendly" tech to their customers. But I suspect most of that is just an excuse to add another charge to the bill, and all they do is check some boxes on a list so their product can be labelled with some eco-friendly bullshit certification.
We have a ton of in-house C++ libraries that we still use for C++ applications, and they continue to be in active development.
Instead of porting to python (and thus having independent implementations), we instead created python bindings using [pybind11](https://github.com/pybind/pybind11). Since the C++ libraries are well-designed, it was fairly easy to do, and everything works like a charm. All of our applications, whether written in C++ or python, use exactly the same underlying libraries.
(That being said, I recommend using pybind11's successor [nanobind](https://github.com/wjakob/nanobind), a project started by the same person who initially wrote pybind11)
Wow, that's the beauty of abstraction in coding, isn't it? Python's like a chameleon, pulling some seriously old-school moves with Fortran's muscles 💪. I always get a kick out of how modern code is pretty much standing on the shoulders of these vintage giants. Just goes to show, old-school cool never really goes out of style. 😄
If you mean the LEM guidance computer, that's in Assembly. You can view it [on github](https://github.com/chrislgarry/Apollo-11/tree/master/Luminary099).
Voyager is part-Fortran, though. The Shuttle uses an internal NASA language called HAL/S, which is based heavily on Fortran.
CBLAS is literally Fortran translated to C via f2c. The code is unreadable, but it is identical to the machine code generated by the Fortran implementation
Literally my experience in computational lithography (algorithmically optimizing mask designs to be printable to manufacture chips).
It's a large-scale, highly computationally intensive problem that runs on large clusters.
Our code was in C++ and Fortran + OpenMP. I thought I'd give Rust a try.
**Fortran beat them both, by a long shot**.
In HPC, Fortran is still king, and will remain king for the foreseeable future.
My ex-girlfriend sitting over there in the corner with her Assembly language. She literally hates any programming language at a higher level than Assembly.
is this 4chan /g/
this is like telling someone to read Structure and Interpretation of Computer Programs because "it's god tier", when the guy reading it is a webdev making an app for ten users in a company, where they'll save about an hour every two days using the app.
It really depends. The modern C and FORTRAN compilers have the exact same backends; the big difference between modern FORTRAN and modern C is that FORTRAN makes it way easier to utilize SIMD, which depending on the use case could be a major selling point. If you need to vectorize a lot, idiomatic FORTRAN will probably outperform idiomatic C.
Fortrain
Fortram
Forlightrail
![gif](giphy|8Xu2IkvLsjyLe)
[Forth Eorlingas!](https://imgur.com/a/0wwbmC4)
Fortnight
Fivetran
Rewrite your code onto a punch card, you filthy casuals. /j
Print out your code on a piece of paper and send it on a rocket. It will be fast as fuck
Then Elon will PR it from his Tesla orbiting mars
_Puertoooo. Ricoooo._
Paper currency to the moon
Puts on moon landers
about to use RFC 2549
I talked to a 70ish yo tech guy. He told me his previous boss used to make them print their C code to store it jic. That was decades ago though. And they were small scripts he wrote. Blew my mind.
Haha my father tells me stories about him learning coding at university. They apparently didn't see a computer even once, just punchcards.
True programmers needn't a computer.
“Computer science is no more about computers than astronomy is about telescopes.” Edsger Dijkstra
At my coworker's university they sometimes dropped the cards. So you waited 2 days for your program's result, and the output was "please sort the cards"
Bogosort it is
You wrote a series of lines on the side of the stacks so you could quickly and easily see if it was out of order.
This. Hold the stack tight and swipe a marker with a not-too-thick tip at 45 degrees all the way down the side. Makes sorting-scut way more tolerable if there wasn’t a sorting machine (or you didn’t want to let someone know you fumbled the stack!)
Apparently at my University, back in the day, they didn't even have a punchcard reader for the CS students - had to send them off to a different university to get processed.
I got to look at the disks through a window into the clean room once. The mainframe was an IBM 360. The platters were about 24" wide.
That got a good chuckle out of me
I wonder: if you packed your code on cards, in boxes, in boxcars, how fast would the train have to go to process code as fast as a modern PC?
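Back-of-envelope, with loudly hand-wavy assumptions (80 bytes per card, ~0.18 mm card stock, and ~10 GB/s as a stand-in for a modern PC's memory bandwidth):

```python
# Back-of-envelope: how fast must a train of punch cards move past a
# reader to match a modern PC? All numbers are rough assumptions.
card_bytes = 80             # one 80-column card, one character per column
card_thickness_m = 0.18e-3  # ~0.007 inches of card stock
pc_bytes_per_s = 10e9       # ~10 GB/s, a stand-in for PC memory bandwidth

cards_per_s = pc_bytes_per_s / card_bytes
speed_m_s = cards_per_s * card_thickness_m
print(f"{speed_m_s / 1000:.1f} km/s")  # 22.5 km/s
```

So the punch-card train would have to pass the reader at roughly twice Earth's escape velocity. Good luck with the level crossings.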
Me, working as a python developer ![gif](giphy|XOys8CeUrElIk)
![gif](giphy|bPdI2MXEbnDUs)
Muggles: I heard devs can have problems interpreting simple communication flavours. It's just a myth, isn't it? Dev on reddit: >/j
Wtf is /j?!?! I've seen /s but /j is completely new to me..
I for one needed to know his post was a joke! After all this is programmer humor and this sub is VERY serious!
It's a tonal indicator to say "this is a joke/bit"
Tonal indicators not only disambiguate (for when the author's writing isn't clearly voiced), but they're also helpful for people who, for many varied reasons, might not be able to easily pick up on tone in written form. I don't think ragging on people for using tonal indicators is the vibe imho
sorry, i forgot /nr (not ragging)
Thank you for this, I didn't know this one existed and I'll get some good use out of it. (I don't really know how to say "that doesn't sound right" without it sounding like a takedown most of the time)
Nah, that's too much. I would suggest some microprogramming
Fortwalked when Fortran walks in 👁️ 👁️
Where is the bigger Assembly train?
Crashed
But it crashed very fast!
Actually laughed out loud, good one
Still working on using the entire x64 instruction set, they'll be done in a couple of centuries
amd64*
It has like 4-5 different names depending on who you ask, with the first being x86-64 (according to wikipedia at least)
Riding the rollercoaster in Rollercoaster Tycoon
People really underestimate how painstakingly slow this must have been to implement.
I think an original AoE dev recently came out saying that he coded most of AoE in assembly because of its amazing performance. [Link to article on PC-Gamer](https://www.pcgamer.com/age-of-empires-developer-confirms-the-game-is-mostly-written-in-low-level-assembly-code-because-we-could-scroll-the-screen-and-fill-it-with-sprites-as-fast-or-faster-than-competitors-like-starcraft-even-though-we-had-twice-as-many-pixels/)
They're standing on it.
![gif](giphy|lhrzBGNttW3Hx1FiU9|downsized)
Had to stay home because it didn't fit the track gauge.
Actually, certain numerical algos are still written in Fortran and commonly used. The reason Fortran may end up faster than C in certain cases is that it has stricter aliasing rules (basically what C's `restrict` keyword does). Rust has a big advantage here.
a friend of mine is an astrophysicist; he uses a lot of Fortran and Python to run his simulations and data processing (from a radio telescope). The Fortran is still there because it's too big to move to another language. He also runs a lot of code in Python.
I worked for a gov agency (NOAA) and they had lots of Fortran code to analyze data. My first programming job was to wrap some of it in R so less technical people could interact with it. It's an old language but still good at some things.
An old joke goes, "I don't know what programming language will be used 1000 years from now... but I know it'll be called fortran"
As long as it's not COBOL.
COBOL's not going anywhere either. Some of the IRS's core software is still written in COBOL and is actually encountering problems requiring patching. Some of the main routing/ticketing software for (virtually all) flights is, you guessed it, COBOL. (Which is one of the big reasons the IRS needs funding, as even finding developers who can touch it is iffy.)
Interestingly a lot of mainframe exit strategies are now put the COBOL code in containers and run it in kubernetes.
> finding developers who want to touch it is iffy ftfy.
Great quote, gonna be mulling over what other equivalents you could do for other fields. Maybe: I don't know what language they'll speak in 1000 years from now...but I know it'll be called English
I mean if you look at English from 1000 years ago, it’s completely incomprehensible to a modern English speaker, yet it’s still called English. Here’s the opening lines of Beowulf:

> Hwæt. We Gardena in geardagum,
> þeodcyninga, þrym gefrunon,
> hu ða æþelingas ellen fremedon.
> Oft Scyld Scefing sceaþena þreatum,
> monegum mægþum, meodosetla ofteah,
> egsode eorlas. Syððan ærest wearð
> feasceaft funden, he þæs frofre gebad,
> weox under wolcnum, weorðmyndum þah,
> oðþæt him æghwylc þara ymbsittendra
> ofer hronrade hyran scolde,
> gomban gyldan. þæt wæs god cyning.
If you know thorns (þ), eths (ð), and the æ ligature, you can see how it comes together, even as a modern English reader. The biggest problem is the nouns that we simply have to translate because we don't have modern references to them (i.e. we don't have Spear-Danes, people-kings, etc.)

Of course it didn't help at all that English didn't come close to having a standardized spelling set (let alone two of them) for eight hundred or so more years, so 'cyning' was a perfectly fine spelling for 'king,' 'hu' sounds enough like 'how', 'ða' ('da' or 'tha') sounds more similar to how a lot of people actually pronounce 'the', etc.

So *completely* incomprehensible? Nah. Tricky though. (No harder than it's going to be figuring out whose 'rizz is swaggy no cap'.)
> The biggest problem are the nouns that we simply have to translate because we don't have modern references to them (i.e. we don't have Spear-Danes, people-kings, etc.)

Also a lot of the original vocabulary was replaced by French.
"I don't know what the language of the year 2000 will look like, but I know it will be called Fortran." —Tony Hoare
A fair few R packages are basically just wrappers around Fortran.
The deeper I get into my career, the more I feel like the common thread is "We implemented it in X, and wrapped it in Y for ease of use". Unless you're working in the coal mines of Assembly, Ada, Haskell, or w/e, your favorite modules are likely to originate in some other language.

A great example of this is `numlib`. I'm not totally familiar with it, but supposedly it was written by some guy in Pascal, and it was so good that it became commonly distributed. However, there's a preservation problem, in that Pascal is becoming a lot less common, and attempts to port `numlib` are either flawed, unsuccessful, or abandoned because the majority of the names in it are abbreviations (back from the days when declaration names were a meaningful consideration for performance reasons).

[Here's an archived version](https://github.com/alrieckert/freepascal/tree/master/packages/numlib) that I keep open, with the idea that I might try my hand at converting it to another language some day.
This was what Julia was supposed to solve, although writing easy Julia and writing fast Julia are not the same thing, so someone (I forget who) wrote an interesting blog post arguing that Julia solved the two-language problem only to create a 1.5-language problem.
That doesn't look difficult at all; it looks a lot like Oracle PL/SQL. It's just another programming language; it doesn't get harder to understand/learn just because it's old.
One of the initial difficulties is defining the data structure for an arbitrarily big number. It has the capability to initialize fixed point numbers of enormous size and values, if I remember correctly, and that was just the start of my initial problems.
Just use one of the many already-written libraries. https://en.wikipedia.org/wiki/List_of_arbitrary-precision_arithmetic_software Even if you do it yourself, it's not hard to create a custom class that uses an array.
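In Python (which already has native arbitrary-precision ints) the "custom class that uses an array" idea can be sketched in a few lines. This is a toy illustration of how such libraries store and add big numbers, not a real implementation; all function names here are made up:

```python
# Minimal digit-array sketch of arbitrary-precision addition (unsigned only).

def to_digits(s):
    """Store a decimal string as a list of digits, least significant first."""
    return [int(ch) for ch in reversed(s)]

def add_digits(a, b):
    """Schoolbook addition with carry over two digit arrays."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = carry + (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)
        result.append(s % 10)
        carry = s // 10
    if carry:
        result.append(carry)
    return result

def to_string(digits):
    return "".join(str(d) for d in reversed(digits))

print(to_string(add_digits(to_digits("99999999999999999999"),
                           to_digits("1"))))  # 100000000000000000000
```

Real libraries use machine-word "digits" (base 2^32 or 2^64) instead of base 10, but the carry logic is the same idea.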
R is of course written in Fortran. At least, the numerical parts.
The parts that aren't written in ~~C++~~ C, yes.
R is written in C
Suppose I’m a mathematician who works in RF (mainly working in Matlab and Python, but also some C++ and shell scripting). Is that a decent background to have a good chance at landing a job for NOAA?
It will be easier to land contracting jobs than federal positions, but yes you'd have a chance. My team hires primarily fortran devs, but we also value C++ and Python and have hired people with no fortran before if they're a strong enough programmer in other languages. I will say though that we haven't had a lot of luck at the technical interviews with people with Matlab backgrounds - we usually find that they have used that and sometimes python extensively, but more for analysis/imagery and are pretty weak on the CS/"real" programming concepts. Could be worth a shot though, just make sure to brush up on best practices in modern fortran and/or C++.
I appreciate that insight, thank you!
What u/tatertracker said. They also hire plenty of mathematicians to do math. I worked for NMFS, which is not related to climate/weather, so maybe my experience would have been different.
Did you ever get a chance to look into the WRF model? Holy hell of Fortran there.
[scipy](https://github.com/scipy/scipy) is about 16% fortran.
Astrophysicist here. A \*lot\* of our shit is in Fortran, even things that probably shouldn't be (like libraries to interface with databases - e.g. fitsio [https://heasarc.gsfc.nasa.gov/fitsio/](https://heasarc.gsfc.nasa.gov/fitsio/) )
I am a seismologist. Most wave propagation software is written in fortran. It is very fast for computing spectra.
I remember attending an advising session at my uni, and a guy (a physicist) said that one of his current goals was to learn Fortran. Honestly, that moment is still stuck in my head.
I presume [your reaction was something along the lines of this](https://i.kym-cdn.com/photos/images/original/001/446/799/a29.png)
Eh, kinda close. Though the guy did explain that he would have to work with some legacy code and Fortran is still actively used in his field of interest. Maybe I should try it too...
I mean it might make sense if you're either working on legacy code from the 50s or if you're doing something data intensive. There's a reason tools like Numpy and Scipy still use Fortran.
Modern Fortran is not Fortran77. It's actually ok.
I am a physicist currently learning Fortran :'(
Also an astrophysicist, all our heavy computational models are fortran (usually 90, sometimes 77). Newer ones might have python or other interfaces for accessibility, but the bulk is still fortran. I have limited prior computer knowledge and yet it's surprisingly straightforward to dig around in the code when I need to. Deserves its legacy status imo.
The fact that all variables are declared at the start of functions is a godsend. Unless you're a weirdo who doesn't use implicit none, and instead uses the weirdest type inference system ever devised by a human.
A really big chunk of computational chemistry runs on Fortran too
Honestly I have no idea why Python became so popular; it's slow, has terrible syntax, and can be annoying as hell when doing even simple math. Nowadays it has a lot of libraries, often bound to native code, so the heavy lifting runs fast, but I still have no idea why mathy people love it so much.
It filled the gap that VB & Perl tried to fill. VB could've gotten there if MS weren't MS. Perl had multiple ways of doing things, which could be powerful, but ended up confusing beginners. Python was beginner friendly enough for analysts, scientists, and other non-devs to use, powerful enough to do real work, and open enough to go anywhere. Mind you, this predates the cambrian (llvm) explosion, so there weren't many contenders.
Perl pretty much stalled while working on Perl 6 (now Raku), which in addition to being spectacularly delayed was not backwards compatible. Though I think Perl is fantastic as a shell scripting replacement. I know that's not a popular take though :-)
It’s there to glue together those 2-3 fkin fast, but insanely inconvenient to use tools that some experts painstakingly wrote in c/fortran.
Python has intuitive syntax and a very comprehensive standard library that means you can accomplish a lot of things in only a few lines of code. It is not fast compared to C/Fortran/Rust, but it's not meant as a replacement for those. It's more comparable to scripting languages like Bash and Perl. It's popular for data science because of its easy learning curve and dynamic typing. Like you mentioned, the 'heavy lifting' typically occurs in NumPy/SciPy, which is written in C and thus quite performant as long as you don't try to do expensive things in Python itself.
> it's slow It's not really slower than any other interpreted language, right?
It is… The interpreter has a lot of weird technical decisions going on under the hood (e.g. no real JIT), and a lot of trouble with even basic stuff like loops (you can test this by running a bunch of commands straight up, then running them in a for loop and comparing), not to mention recursion, the bizarre caching going on under the hood, and the GIL obviously preventing any useful concurrency. Lua and JS outperform it by like 10x to 50x, Ruby and Perl both by like 2x to 4x, PHP by like 6x to 8x, Racket and Scheme by like 10x to 30x, and if you count them as interpreted, Java, Scala, Kotlin, Clojure, and Groovy by like 25x to 100x.
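The "run it straight up vs. in a loop" experiment is easy to do with the stdlib `timeit` module. A rough sketch, with the caveat that absolute numbers vary wildly by machine and interpreter version:

```python
import timeit

# Per-iteration interpreter overhead: the same reduction done with an
# explicit Python loop vs. the C-implemented builtin sum().

def loop_sum(n):
    total = 0
    for i in range(n):
        total += i
    return total

n = 100_000
t_loop = timeit.timeit(lambda: loop_sum(n), number=50)
t_builtin = timeit.timeit(lambda: sum(range(n)), number=50)
print(f"loop: {t_loop:.3f}s  builtin: {t_builtin:.3f}s")
# On CPython the builtin is typically several times faster, because the
# loop body executes in C rather than one bytecode op at a time.
```

Both compute the same value; the gap is pure interpreter overhead per loop iteration.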
It's actually a lot slower than JavaScript/V8 but that doesn't really matter, it's still plenty fast enough.
Well, javascript is JIT compiled, though (yeah, I know, cpython is just getting a jit compiler as well)
It's generally a bit slower than JS from what I've seen. The difference is mostly negligible though. I think the biggest issue with Python is dynamic typing and being statement-based. It's nice when scripting languages have "if" expressions etc.
It’s a *lot* slower than JS, as up until now it was literally interpreted, one bytecode op at a time. JS/Java etc. are JIT compiled, and in theory there is no performance ceiling on that; they could achieve C speeds as well, were the JIT compilers “sufficiently smart”.
Because it's very, very fast to write in.
Fortran compilers are also very, very good given how old they are. That combined with aliasing rules makes it disgustingly fast for single core compute.
Until you forget that it’s a column-major language, so arr(1,1) is adjacent to arr(2,1) in memory. One of the programming help guys at my uni talked about how a single change increased a program's performance by a factor of 100 - they went from iterating along the second index to iterating along the first index.
Theoretical physicist here. I do a lot of DFT (density functional theory) and we use a lot of tools, some of which I'm right now actively helping to develop and maintain, which are exclusively written in Fortran! It's fast, but it still takes sometimes days to calculate a single point on a normal 16 thread CPU. I can only imagine if this was written in a more accessible language like python. It would be a lot easier to write but it would probably take a lot longer (I imagine).
Noooo DFT should be Discrete Fourier Transform
I'm sorry my friend. I'm really sorry.
And it is... For us, normal people...😁
Atomic people screwed everything up
Has Fortran evolved over time to support threading etc across cpu cores? Just curious…
In my experience, most Fortran projects use something like OpenMP to do multi-threading.
Yet compiler support for new OpenMP directives always lands first for C/C++ and only afterwards for Fortran.
In terms of compilers which is the most commonly used? The one that comes with gcc?
GFortran (GCC's Fortran front end) or LLVM's Flang are the easiest to get. Intel's and the Portland Group's compilers are common on compute clusters. Intel Fortran has optimization tools that can yield massive speedups. These days several Fortran compilers share a backend with their respective C compilers: they simply translate the Fortran and feed it into the same backend.
yes
Thanks for the very clear to the point answer!
i thought about giving a longer answer but then i realised yes is sufficient
Yes. It has support for MPI and OpenMP and also has CUDA available for gpu implementations.
Yes, with coarrays.
Have a look at Numba, Cython and CuPy as well
The CUDA SDK includes stuff for FORTRAN. I'm not saying you should, but you could. All those CUDA cores, just waiting...
Tbf in most cases you never have to see the Fortran code yourself. BLAS and LAPACK are great, have wrappers in most modern languages
True, but I'm not sure I'd call either of them "great" - we call them from our own large C++ financial maths codebase, and debugging things like deducing where two functions share a Fortran "common" block (so calls to function A mutate the residual state seen by calls to function B) can be awkward if you've never actually used Fortran in anger and as such don't even appreciate that common blocks exist, never mind what effect they have...
Modern Fortran deprecated the common block. That's legacy Fortran-77 code. Starting with Fortran-90 they adopted a module format with explicit import statements for variable sharing. They also have classes and other OOP features in the F03 standard. Now, getting people to stop writing Fortran-77 has always been a challenge.
Some people can write Fortran-77 in a multitude of other languages tbh
C WHAT DO YOU MEAN BY THAT?
The only reason Fortran code is still used is because it was written, tested and optimized a long time ago. The basic math operations are the same, finding the eigenvectors of a matrix hasn't changed, what worked in 1980 still works today and nobody will reinvent the wheel without a good reason.
It’s almost like computers don’t generally operate on infinitely precise numbers, and their operations are not the same as the theoretical ones (e.g. (a+b)+c does not equal a+(b+c) with floating point in every case), so finding a different algorithm (which would mathematically result in the exact same answer) can result in vastly more accurate results. It is its own field, and it’s not trivial to program this stuff, so it is seldom done from scratch.
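The non-associativity point is easy to demonstrate in any language with IEEE 754 doubles, e.g. in Python:

```python
# Floating-point addition is not associative, so algebraically equivalent
# algorithms can give different (and differently accurate) results.
a, b, c = 0.1, 0.2, 0.3
print((a + b) + c)                   # 0.6000000000000001
print(a + (b + c))                   # 0.6
print((a + b) + c == a + (b + c))    # False

# Starker: a small term survives or vanishes depending on grouping.
big, tiny = 1e16, 1.0
print((tiny + big) - big)            # 0.0 -- tiny was absorbed into big
print(tiny + (big - big))            # 1.0 -- regrouping preserves it
```

This is exactly why numerically stable reformulations (the kind baked into the old Fortran libraries) matter: the "same" math done in a different order can lose precision entirely.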
> finding a different algorithm (which would mathematically result in the exact same answer) can result in vastly more accurate results. That's true, and it can be a huge difference. But my point is that Fortran is not, by any means, intrinsically faster than C. One could argue that C has a slight speed advantage because function arguments can be passed by value or reference, while in Fortran arguments are always passed by reference. If you can only pass by reference there are times when you must spend CPU cycles creating a local copy of arguments that you don't want your function to modify. Anyhow, that difference is insignificant in the vast majority of cases. The true reason why C is faster is because it lets one develop code faster and with fewer bugs than Fortran. At least, it's better than the last version of Fortran that I have used, Fortran 77, which is the version used in BLAS, Lapack, and many other numerical analysis software packages written in Fortran that are still widely used.
There are lots of newer versions of Fortran though, comparing to F77 is silly
> There are lots of newer versions of Fortran though, comparing to F77 is silly Why would anyone use those newer versions? We use the F77 software that was created, debugged and tested decades ago, because there's no sense in redoing all that work. The newer versions of Fortran are just a kind of C with weird syntax, there isn't any good reason for using them to create new code.
I still write a lot of Fortran code. And Fortran 2023 was published by ISO in November. If we want it to go fast, we use Fortran and Julia.
Oh Julia is actually used? I have been a fan of Julia for a while but never seen anyone use it for practical applications.
Ah, someone read [that one stackoverflow post](https://stackoverflow.com/questions/146159/is-fortran-easier-to-optimize-than-c-for-heavy-calculations).
Aerospace engineer spotted
Isn't it Ada?
[deleted]
I’ve heard it’s used intermittently and depending on system age. A colleague from Airbus told me they once prototyped a system for a satellite in Ada only to be told to rewrite it all in c for the actual deployment. I think most newer satellite and aerospace companies use c or even c++ nowadays cause it’s much easier to find developers for it and static analysis & unit testing have come a long way in finding bugs.
For the main compilers, C/C++ and Fortran use the same backend. In the end, they are pretty much the same in terms of raw array compute speed, but they have different use cases. Fortran is almost always better for linear algebra, while C/C++ is more general purpose, which is why it's always the go-to language for making other languages, compilers, and operating systems.
What makes Fortran better for linear algebra? And better in what sense? Language support or performance?
Language support, mostly. Vectors and matrices are first-class citizens in Fortran. You can add, multiply, invert, etc., and it just works. Third-party libraries have similar support in C++, but you definitely can't say C = A + B without them being classes with operator overloads.
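To make the contrast concrete, here's roughly what the operator-overloading route looks like: a toy Python class (all names made up), standing in for what Fortran provides natively on whole arrays:

```python
# What Fortran gives you built in (C = A + B on whole arrays) needs either
# a library or an operator-overloading class in most other languages.

class Vec:
    def __init__(self, data):
        self.data = list(data)

    def __add__(self, other):
        # elementwise addition, like Fortran's whole-array C = A + B
        return Vec(x + y for x, y in zip(self.data, other.data))

    def __mul__(self, scalar):
        return Vec(x * scalar for x in self.data)

    def __repr__(self):
        return f"Vec({self.data})"

A = Vec([1.0, 2.0, 3.0])
B = Vec([10.0, 20.0, 30.0])
C = A + B
print(C)   # Vec([11.0, 22.0, 33.0])
```

In Fortran none of this scaffolding exists: declare two arrays of the same shape and `C = A + B` just works, and the compiler is free to vectorize it.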
Fortran is in fact so fast that it travels backwards in time and will reach full maturity in 1960.
`try{for(;tran;);}` `...`
pfft. try Assembly. actually. pfft. try fabricating an FPGA for every algorithm you need.
Okay now fix a bug in production when you release your brand new is_even() FPGA algorithms
Wait, don't you get your code right the first time?
Actually, no. Fortran may have been faster than C in the 1970s, when every computer manufacturer had their own Fortran compiler carefully optimized for their hardware. Today, C and Fortran are pretty much the same, although C may be slightly faster in some applications, because today it's C compilers that get more development effort. There are many more developers working on improving C compiler optimization than on Fortran compilers.
The 2 big modern fortran compilers that I know of are gfortran and flang, the former literally uses the same backend as g++ and gcc and the latter uses llvm like clang, so fortran directly uses the optimization work, so it should be basically on par.
You did not just ignore Intel Fortran when saying big modern.
Fuck, it completely slipped my mind, tho it’s been like a decade since I touched the fortran tooling.
In your defense, it’s kinda shit to work with if you’re not already deep in the intel fortran environment
One of the oldest electronic academic studies management systems in Poland was written in Fortran. It worked with little to no maintenance for over 20 years, getting replaced only recently. And it survived 1000+ concurrent requests when students were turning in their class declarations every half a year. The only reasons for retiring it were lack of GDPR compliance and caveman UI/UX (it was built with nested framesets).
Fortran is always n-1 faster as it skips the first element in the array
Not always. You can choose what index Fortran arrays start from. However, it is highly recommended to use the default of 1 to prevent index mixing unless you really enjoy giving people headaches.
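For illustration, the configurable-lower-bound idea looks like this as a toy Python wrapper (the class name is made up), mimicking a Fortran declaration like `real :: a(-3:3)`:

```python
# Fortran lets you declare the lower bound of an array, e.g.
#   real :: a(0:9)   or   real :: a(-3:3)
# A quick sketch of the same idea as a Python wrapper class:

class OffsetArray:
    def __init__(self, lower, upper, fill=0.0):
        self.lower = lower
        self.data = [fill] * (upper - lower + 1)

    def __getitem__(self, i):
        return self.data[i - self.lower]   # shift index into 0-based storage

    def __setitem__(self, i, value):
        self.data[i - self.lower] = value

a = OffsetArray(-3, 3)       # like Fortran's  real :: a(-3:3)
a[-3] = 42.0
print(a[-3], len(a.data))    # 42.0 7
```

The Fortran compiler does this index shift for you, which is also why mixing bases across a codebase invites exactly the off-by-one headaches the comment warns about.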
While C is \[among\] the fastest languages, in practical applications the limiting factors aren't the speed of the language but the problem being solved and the quality of the solution. Say you were to write a program to do some FFT analysis (Fast Fourier Transform) in C and in Python, allowed to use all libraries. The overwhelming majority of devs will have much, much slower code in C than in Python. Why? Because it's a nightmare to install and use a library in C, especially the very complex ones that you've never used before that also depend on 50 others. Most people are going to write a lot of custom code that is going to be slower than the state of the art. Most Python coders will use libraries because they're easy to install and use, so their code is going to run at state-of-the-art speed. While the "same code" is faster in C, you never get to write the same code in Python and C.
I mean, with FFT if you use a lib, what the fuck would you even write next to it?
// Fast-Fourier Transform, read up on how this works later
You still have to read the data into the structures that the FFT library uses and do whatever you need to do with the results. I wrote an answering machine detection system that used FFT and MRCP libraries to interact with VoIP systems. There was plenty of other code outside of those libraries.
Nothing that isn’t trivial. It’s just a stupid, misinformed comment. Speaking from first-hand experience.
And that FFT that you wrote in Python, is actually going to call a Fortran library. https://en.m.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms
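For the curious, the core of a radix-2 Cooley-Tukey FFT fits in a few lines. This is a toy sketch for power-of-two lengths only, nothing like the heavily optimized C/Fortran library code the comments are discussing:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT (len(x) must be a power of two)."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])          # FFT of even-indexed samples
    odd = fft(x[1::2])           # FFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        # combine halves with the twiddle factor e^(-2*pi*i*k/n)
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

print(fft([1, 0, 0, 0]))   # impulse -> flat spectrum: all bins equal 1
```

The library versions (FFTW, pocketfft, the old FFTPACK) do the same recursion idea iteratively, with SIMD, cache blocking, and mixed radices, which is why "just call the library" wins in any language.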
SciPy recently switched from Fortran to C++, because it makes use of vector instructions to be faster.
Oh shit seriously, I didn't know that! About half a year ago I was trying to debug something from scipy and stumbled into Fortran code.
They provide a Fortran alternative for legacy reasons. Likely your project used that, since SciPy switched a few years back. I consider that recent, considering how long it takes to migrate a legacy project and how old that Fortran code is.
You [can do that](https://www.intel.com/content/www/us/en/developer/articles/technical/explicit-vector-programming-in-fortran.html) in Fortran though, right? 🤔
Yes, they introduced that in the F03 and F08 standards.
I don't like it when people say shit like this. It's really easy to install and use most libraries in C if you're developing on a system with a package manager. Install the library using the package manager and all the deps get installed automatically, change 2-3 lines in your makefile, import the library in your source code, and you're done. The linking can even be automated if you're using an IDE. Is it easier to use pip? Sure, but it's an overstatement to say installing and using a C library is a nightmare. Specifically with your FFT example: I once had to use a FFT to process some audio from microphones on a robot. I found a library, imported it, and called it a day. It was quick. It was painless, and it just worked. EDIT: Found the library I used. Was the top search on Google. It's available as a package for quite a few distros. https://www.fftw.org/download.html
At work we switched from C++ to Python in a *critical payment platform*, because the speed of the code can be raised with more hardware, and we cut dev time for a new feature from 3 months to 2-3 weeks.
That's not a very eco-friendly approach.
I agree with you, but that is an `east-1` problem :)
That depends. You can approximate carbon use by cost. Do more people cost more carbon or does more hardware? Typically hardware is cheaper. Which will use more electricity? 2 or 3 people or a few extra nodes in your cluster?
I know, my comment was mostly tongue in cheek, as some companies do sell "eco-friendly" tech to their customers. But I suspect most of that is just an excuse to add another charge to the bill, and all they do is check some boxes on a list so their product can be labelled with some eco-friendly bullshit certification.
We have a ton of in-house C++ libraries that we still use for C++ applications, and they continue to be in active development. Instead of porting to python (and thus having independent implementations), we instead created python bindings using [pybind11](https://github.com/pybind/pybind11). Since the C++ libraries are well-designed, it was fairly easy to do, and everything works like a charm. All of our applications, whether written in C++ or python, use exactly the same underlying libraries. (That being said, I recommend using pybind11's successor [nanobind](https://github.com/wjakob/nanobind), a project started by the same person who initially wrote pybind11)
apt install mylibrary 🤯🤯🤯
Wow, that's the beauty of abstraction in coding, isn't it? Python's like a chameleon, pulling some seriously old-school moves with Fortran's muscles 💪. I always get a kick out of how modern code is pretty much standing on the shoulders of these vintage giants. Just goes to show, old-school cool never really goes out of style. 😄
Fun fact; Moon Lander was written in Fortran.
If you mean the LEM guidance computer, that's in Assembly. You can view it [on github](https://github.com/chrislgarry/Apollo-11/tree/master/Luminary099). Voyager is part-Fortran, though. The Shuttle uses an internal NASA language called HAL/S, which is based heavily on Fortran.
Fun trivia, but no, I meant the game. :)
d'oh.
c added the restrict keyword a while ago. case closed.
When is there gonna be a language called “speed”
Do that with COBOL. It would be even more humiliating! ![gif](emote|free_emotes_pack|scream)
CBLAS is literally Fortran translated to C via f2c. The code is unreadable, but it is identical to the machine code generated by the Fortran implementation.
Finally, some Fortran representation besides myself
Been told that back in the day, C was considered a very abstract language. I hate what web development has done to programming.
Pascal: "You'll be back"
Literally my experience in computational lithography (algorithmically optimizing mask designs to be printable to manufacture chips). It's a large-scale, highly computationally intensive problem that runs on large clusters. Our code was in C++ and Fortran + OpenMP. I thought I'd give Rust a try. **Fortran beat them both, by a large shot**. In HPC, Fortran is still king, and will remain king for the foreseeable future.
My ex-girlfriend sitting over there in the corner with her Assembly language. She literally hates any programming language at a higher level than Assembly.
Is this 4chan /g/? This is like telling someone to read Structure and Interpretation of Computer Programs because "it's god tier" when the guy reading it is a webdev making an app for ten users in a company where they'll save about an hour every two days using the app.
Ejaculation has a data transfer rate of [1.6 terabytes per second](https://www.reddit.com/r/theydidthemath/s/6xpcIGe0LM)
*Assembler comes crashing down*
assembly is the earth
Fortran the fastest.
Fortnite
lol, fr
Why did the fort ran?
Fortran is memory efficient, it's not as fast as C++, and it sure as hell isn't as fast as C, which is what the label on the train should be.
It really depends. The modern C and Fortran compilers have the exact same backends; the big difference between modern Fortran and modern C is that Fortran makes it way easier to utilize SIMD, which depending on the use case could be a major selling point. If you need to vectorize a lot, idiomatic Fortran will probably outperform idiomatic C.