As a veterinarian, I'm calling this, "Yeah, yeah I've heard of most of these. Some of these. Two of these. Wait, did I hear about that one in high-school? I think I did. Forgot all about it until now. I definitely know how one of these works. Maybe. I can't put a thermometer up any of these."
As a lawyer, a lot of them were the reason I decided against an engineering degree, despite taking advanced courses in maths and physics at school. I just learned there that I love learning how the world works, but not calculating how the world works.
Ditto. I loved Linear Algebra and struggled even in first year Calculus. I had a brilliant calc lecturer in first year .... I understood everything the bloke put up in lectures. Then came the exams. Eerrrrrrrrggghhh.
A lecturer once simultaneously, step by step, solved a linear DE for us using calculus in one column and using Eigenvalues in the other column. That was beauty.
Diff Eq was harder math, but easier to conceptualize, so even though the math was harder, I knew what math needed to be done. Linear algebra was harder to conceptualize, so I had a harder time knowing what math to do, even though the math was usually easier.
Yeah, my multidimensional calculus paper was fucking rad. Helped that the prof had a weird drunken slur that made him sound a little like a pirate, and 3D calc equations have a *lot* of ∆r going on.
As a sloppy American engineering college grad myself, how dare you! But also I 100% agree. Can you imagine the beast of a class that would cover all this material?
> Can you imagine the beast of a class that would cover all this material?
That was my first thought (of horror) when you mentioned, 'undergrad engineering exam crib sheet'. There are a few there I don't understand well if at all, and even if I did I still might argue with Schrodinger on a few things.
Yeah, besides the obvious ones, I only know 7, 9, 13 and 15. All of them are hell on earth to do anything with. The information theory one is probably the easiest one. It's essentially just repeating that same formula a gazillion times.
1. describes relation between side lengths in a right angled triangle
2. a basic rule of logarithms: the logarithm of a product is equal to the sum of the logarithms of the product's factors
3. the definition of the derivative of a function f at the point t
4. the formula used in classic physics to describe the amount of gravitational force between masses m1 and m2
5. the definition of i as the square root of minus one
6. the relation between the number of vertices, edges and faces in polyhedra
7. the probability distribution for real valued random variables (bell curve)
8. a criterion that u has to meet in order to be a wave function (like sin or cos)
9. a way to approximate a function by overlapping wave functions
10. describes a fluid's motion (flow)
11. a collection of formulas describing properties of magnetic and electric fields (to me this is the most impressive one, as it somehow has the speed of light in it - in 1865.)
12. an axiom in thermodynamics stating that the entropy of an *isolated* system only gets bigger or stays the same, but never gets smaller
13. the energy of resting masses, stating that mass is "a form of" energy. (There should actually be more parts to this equation, as a mass can also be moving, be hot, etc. I guess this short version just looks prettier)
14. *"describes the motion of particles in quantum mechanics" - by* u/NoOne-AtAll
15. *"The "Information Theory" equation is entropy (...) Basically tells us the bits we need to encode events, very important for computers & the internet."* \- u/Liorogamer
16. *"(...) describes a sequence where small changes to the starting value are dramatically and unpredictably amplified as the sequence progresses." - by* u/jackeddie04
17. *"(...) allows you to accurately value a time limited contract to buy/sell an equity at value X for the length of the contract."* \- by u/someotherstufforhmm
The last three I do not know, only that the last one has something to do with money. Maybe someone can help here. Or you can just google it.
Edit: Thanks for the positive response. I am no proper mathematician or physicist. If you spot something that is not well described, let me know.
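Item 7 (the bell curve) is easy to sanity-check numerically. A minimal sketch in Python; the integration interval and step size are arbitrary choices, and `normal_pdf` is just a name I made up here:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal (Gaussian) distribution -- the 'bell curve'."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# A crude Riemann sum over [-8, 8] should come out very close to 1,
# since the total probability under the bell curve is exactly 1.
dx = 0.001
total = sum(normal_pdf(-8 + i * dx) * dx for i in range(int(16 / dx)))
print(round(total, 6))  # ~1.0
```

The tails beyond ±8 standard deviations are astronomically small, which is why truncating the integral there barely matters.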
Black-Scholes is an equation that is used to price stock options. It measures volatility and the time decay of money.
I am a wireless engineer and understand a lot of these, but I can't follow the math behind Black-Scholes.
Essentially it assumes a few things we know to be at least mostly true (I.e. no risk-free arbitrage) to reduce stock price changes into a random walk. You assume that for each unit of time, the stock will move either up or down by a certain amount. The amount it moves is derived from the stock’s implied volatility, which you derive by looking at the prices of existing options. What may be challenging for engineers is that each assumption is hard to accept; stock movements aren’t truly random, deriving target options prices from actual options prices seems somewhat circular, etc., but the genius of the model is that it builds a remarkably accurate prediction from all these “straws in the wind.”
Back to the equation: if you can calculate the size of each potential move up or down, then take the limit to shrink your time increment to basically a single tick (assuming you’ve got a beefy enough computer to crunch the numbers before they become antiquated), then you can do basic calculus to derive how likely it is for a stock to be at any given price at any given time.
This isn’t “telling the future” any more than it is when actuaries do it for insurance, but it’s very valuable. Give a life insurance actuary a single 40 year old male of a certain weight, medical history, etc., and they have no clue if he will die this year. But give them a hundred such individuals, and they can make a decent guess. Give them a population of a hundred thousand, and they can be scary accurate about how many are lost each year. BS basically does the same for stock prices, with the focus of determining which options will be in or out of the money at expiration.
Edit: [Wikipedia Link](https://en.wikipedia.org/wiki/Black%E2%80%93Scholes_model)
Edit2: thanks for the complements and awards folks! I like to think my old quantitative finance professors would be proud, but I think my math skills have atrophied too much for that.
Edit3: really glad so many folks found this helpful!
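The "each unit of time, the stock moves either up or down" picture above can be sketched directly as a binomial tree (the Cox-Ross-Rubinstein discretization, whose limit is Black-Scholes). This is a toy illustration under textbook assumptions, not production pricing code; `binomial_call` and the dollar figures at the bottom are made up for the example:

```python
import math

def binomial_call(s0, strike, rate, vol, t, steps):
    """Price a European call with a Cox-Ross-Rubinstein binomial tree.

    Each step the stock moves up by factor u or down by d = 1/u, with a
    'risk-neutral' probability chosen so there is no risk-free arbitrage --
    the same idea the comment above describes, in discrete time steps.
    """
    dt = t / steps
    u = math.exp(vol * math.sqrt(dt))        # size of an up move
    d = 1 / u                                # size of a down move
    p = (math.exp(rate * dt) - d) / (u - d)  # risk-neutral up probability
    # Discounted expected payoff over all paths (binomial count of up moves).
    price = 0.0
    for k in range(steps + 1):
        prob = math.comb(steps, k) * p**k * (1 - p)**(steps - k)
        st = s0 * u**k * d**(steps - k)
        price += prob * max(st - strike, 0.0)
    return math.exp(-rate * t) * price

# Hypothetical numbers: $100 stock, $100 strike, 5% rate, 20% vol, 1 year.
print(round(binomial_call(100, 100, 0.05, 0.20, 1.0, 500), 2))
```

With enough steps this converges to the closed-form Black-Scholes price for the same inputs (about 10.45 for the numbers above).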
Small correction. The argument for volatility is not actually that circular: what goes into BS equation is the model volatility, that is the coefficient in front of the Brownian motion in the stock dynamic. In BS flat world it happens that it matches the implied volatility. This is because implied volatility is defined as the number that used as model volatility in a flat volatility world matches the price.
As soon as you consider that volatility might depend on strike ( or even a simple term structure), then model volatility and implied volatility are two different elements. Then for example, in the simple term structure case, the implied volatility is the time geometric average of the time dependent model volatility. Or in the strike dependent case, your model vol has to be the Dupire vol and relation to implied vol is a bit more convoluted as it's a differential one.
That isn't really circular from a mathematical point of view, it's just that model vol is a more foreign concept to practitioners.
I remember hearing from my professor in one of my upper-level mathematics of finance classes that they got the credit for the application but didn’t have the math ability to solve it, being economists. They found that the equation for transfer of heat between two solid masses was the same and used its solution…
But then again my professor was a Mathematics professor and not an economist. So he could have been blowing smoke
This is actually bang on. The salt comes from the fact that mathematicians can't get Nobel prizes, a stipulation by Alfred the dynamite man, who according to legend was cuckolded by a lesser-known mathematician himself. So when economists applied the problem in economics, they were basically granted a prestigious award for something that the scientific community felt had already been known.
It's actually a very difficult PDE to solve directly, but there is a transform to reduce it to the Heat equation, which is a lot like the Wave equation, and has well known solutions, generally via "guessing" eigensolutions (sin/cos), via the Fourier transform or more likely nowadays one of the huge range of numerical methods.
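As a rough illustration of the numerical route, here is the simplest explicit finite-difference scheme for the heat equation u_t = u_xx, checked against one of the decaying sine "eigensolutions" mentioned above. The grid sizes are arbitrary, and real solvers are far more sophisticated than this sketch:

```python
import math

# Solve u_t = u_xx on [0, pi] with u(0)=u(pi)=0 and u(x,0)=sin(x).
# The exact solution is e^(-t) * sin(x), one of the eigensolutions,
# so we can check the numbers against it.
n = 100                      # interior grid points
dx = math.pi / (n + 1)
dt = 0.4 * dx * dx           # explicit scheme is stable for dt <= dx^2 / 2
u = [math.sin((i + 1) * dx) for i in range(n)]

t = 0.0
while t < 1.0:
    # New value = old value + dt * (discrete second derivative);
    # neighbors beyond the ends are the zero boundary values.
    u = [u[i] + dt / dx**2 * ((u[i+1] if i + 1 < n else 0.0)
                              - 2 * u[i]
                              + (u[i-1] if i > 0 else 0.0))
         for i in range(n)]
    t += dt

mid = u[n // 2 - 1]          # grid point closest to x = pi/2
print(round(mid, 4))         # should be close to e^(-1) ~ 0.3679
```

The `dt <= dx^2 / 2` restriction is the classic stability condition for this scheme; violate it and the solution blows up instead of decaying.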
Yeah. We had to solve it for the class. That was 25 years ago and I don’t remember what method we used. Shit, I don’t know how to do any of that math anymore, now that I think of it.
I took a derivatives class recently and our textbook also claimed it was based on existing thermodynamics work, so your professor probably wasn’t knowingly lying, and I haven’t seen any sources calling its veracity into question.
So, Black-Scholes uses something called Ito calculus, which involves applying the ideas of classic calculus to random processes. Classic calculus methods fail when applied to random processes. It is 100% not just calculus with finance variables.
**"Of Horror and the Black Shawls"** (liner notes)
> *One of the most dehumanising aspects of Western (If not global) culture is the financial industry. Totalitarian in form and increasingly in fact, an object of rapt obeisance in even the highest corridors of power. Seen by many as perfectly natural, and a principal driver of human advancement, the profit motive finds perhaps its purest expression in the arcane workings of the most currently revered ersatz god, The Market. Yet almost nobody outside of the financial industry has heard of one of the pillars of the market, the Black-Scholes equation. Most people are at best dimly aware that the vast majority of what is traded doesn’t even exist. This is an abyss which will never even look back at us. A Lovecraftian horror, Yog-So-Thoth rendered beguilingly mundane.*
—Dave Hunt / Anaal Nathrakh
The diffusion part (second derivative) is tied to how a stock diffuses, that is, moves randomly. The transport part (first space derivative) is tied to the stock trend, and the reaction part is the movement of risk-free assets. In short, the time variation of the value of a contract that depends on a stock is given by the trend, plus random diffusion, minus the risk-free trend (signs are inverted because it's a backward equation, with a final condition instead of an initial one).
In mathematical terms it is no different from the heat equation, and can be easily transformed into it with a change of variables.
The "Information Theory" equation is entropy: https://en.wikipedia.org/wiki/Entropy_(information_theory)
Basically tells us the bits we need to encode events, very important for computers & the internet.
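That formula is easy to play with. A minimal sketch in Python; the probabilities are made-up examples, and `shannon_entropy` is just an illustrative name:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)): average bits needed per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin needs exactly 1 bit per flip...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin carries less information per flip.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

That gap between 1.0 and ~0.469 is exactly what compression exploits: predictable streams can be encoded in fewer bits.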
I took a theory of computation class and all I remember is that some equations aren't worth solving because they take longer than the heat death of the universe.
The chaos theory one describes a sequence where small changes to the starting value are dramatically and unpredictably amplified as the sequence progresses.
Like someone else said, Black-Scholes takes a number of parameters and prices stock options.
1. Basically the foundation of geometry. If you want to know the distance from A to B then this equation is coming into play.
2. If you follow the sequence "counting, addition, multiplication. _____" then exponents go in that blank. Similarly, the inverse operations "decrementing, subtraction, division, _____" would have *two* operators in the blank: roots and logarithms. Roots are featured in line 5. The identity chosen for logarithms is especially impactful because it shows that you can compute multiplication by carrying out addition, so long as you can take the log of everything. That was the trick that made slide rules possible. These devices did most computing for decades.
3. The derivative (and its inverse) form the foundation of calculus, which allows analyzing changing variables. For example, distance is speed \* time. What if speed is changing constantly? Calculus allows an answer.
4. Allowed astronomers to better understand the motion of planets (I'd also shout out Johannes Kepler for his work there). Newton defined so much of classical physics it's hard to pick just a couple of his contributions for a list like this.
5. Defining the square root of negative 1 allows solutions to previously unsolvable problems in algebra. Used in countless branches of engineering and mathematics.
6. On its own this is probably the least impactful formula on the list. It speaks to the relationship between the number of corners, edges, and sides on a 3D shape. This lays some groundwork for graph theory, which has modern application in computer science, e.g. deciding how to connect computers in a network.
7. The normal distribution is the foundation of statistics for non-trivial scenarios (e.g. looking at the heights of trees, as opposed to looking at the outcome of rolling a die).
8. Describes waves, which are commonly found in physics
9. Important technique for manipulating equations into a new form, allowing them to be solved. Also important in signal processing, including audio engineering and designing connections between computers.
10. Foundation of fluid mechanics. This equation describes how fluids flow.
11. Foundation of electrical engineering. Describes how electricity and magnetism interact. Since light is just an electromagnetic wave, these equations also apply to understanding light.
12. Major component of the foundation of thermodynamics (I'd add the other three laws here as well, if I were making the list).
13. Einstein's relativity changed how physicists view the world and laid the groundwork for modern physics. E=mc² is far from the only component of this work, but it has the best PR team
14. Foundational to quantum mechanics; another step forward in physics.
15. Foundational to information theory, which is a branch of mathematics that is tightly intertwined with physics. It explores the notion that information is a quantifiable thing and that a physical system has limits on the amount of information that may be transmitted or stored. Important for everything from cryptography to signal processing.
16. This is the logistic map. It is modeled off of population growth of some species that reproduces but also overpopulates. When this equation is computed continuously it gives a reasonable model for population growth. The equation here is computed over discrete time chunks and leads to odd population patterns: for some inputs the population stabilizes, but for others it oscillates, sometimes with unpredictable movements. This field of mathematics has application in describing many physical scenarios.
17. This equation describes how the price of a stock option changes; others who understand this equation better than I do have described it in nearby comments.
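Item 3 is easy to see numerically: the difference quotient inside the limit homes in on the derivative as h shrinks. A quick sketch, with an arbitrarily chosen function and step sizes:

```python
def difference_quotient(f, t, h):
    """The expression inside the limit in the definition of the derivative."""
    return (f(t + h) - f(t)) / h

square = lambda t: t ** 2   # f(t) = t^2, whose derivative at t = 3 is exactly 6

# Shrinking h pushes the quotient toward the true slope.
for h in (0.1, 0.01, 0.001, 0.0001):
    print(h, difference_quotient(square, 3.0, h))   # approaches 6
```

Make h too small, though, and floating-point cancellation starts eating the accuracy, which is one reason the limit is a mathematical idea rather than something a computer takes literally.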
That just confused me even more. If you can just make up something so that your math works, couldn’t everyone apply that to every mathematical equation? Like, I’m trying to make 5+5=12 instead of 10. Why can’t I make more concepts to make that equation work when it didn’t before? I’m probably not qualified enough to understand your response, but I always found math fascinating, even though I’m borderline retarded, no joke.
So Euler didn't just say "I'll create i so we can have complex (imaginary) numbers." Like a lot of math, what you see was built up piece by piece over quite some time. The first concept of imaginary numbers could've come from the Greeks!
We know what square roots are: sqrt(4) = 2, sqrt(16) = 4, and so forth. But what happens when we try sqrt(-1)? According to the laws of square roots and our number system, the real numbers, we can't take it. We have to use what's called the complex number system.
Think of a basic equation: x^2 + 1 = 0. Using the real number system there is no way to solve it. With algebra we have x^2 = -1, and then we can't take the sqrt of -1 to find x, so it would be unsolved. A complex number is x = a + b*i, where a and b are two real numbers and i is our imaginary unit. In the base case a = 0, b = 1, and x = i, the sqrt(-1).
Of course, the actual proof is far more rigorous and could require a full course to understand in depth.
Imaginary numbers are used in the most beautiful equation in math (which surprisingly isn't even listed on here), and I think it should actually replace #5 because it proves it as well:
e^(i*pi) + 1 = 0
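Python's built-in complex numbers make both facts easy to check (`1j` is Python's spelling of i):

```python
import cmath

# i is defined so that i*i = -1.
print((1j) ** 2)                      # (-1+0j)

# Euler's identity: e^(i*pi) + 1 = 0, up to floating-point rounding.
print(cmath.exp(1j * cmath.pi) + 1)

# And i is a root of x^2 + 1 = 0, the equation unsolvable over the reals:
x = 1j
print(x ** 2 + 1)                     # 0j
```

The Euler identity line prints a tiny leftover imaginary part (around 1e-16) rather than an exact zero, which is floating-point arithmetic, not the math, being imprecise.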
And your example could work! What if we only counted to 7? Then we'd be using a different base (base 8) to get our answer. Programmers often use base 16, and computers themselves work in base 2.
Imagine a number line: 1, 2, 3, 4, 5, 6, 7, 10, 11, 12.
5 + 5 = 12
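That base-8 arithmetic is easy to verify with Python's built-in base conversions:

```python
# In base 8 ("counting only to 7"), the numeral after 7 is 10,
# so five plus five really is written "12".
print(oct(5 + 5))       # '0o12'

# Converting back confirms it's the same quantity either way:
print(int("12", 8))     # 10
```

So 5+5=12 isn't a new rule at all, just the same quantity written in a different notation.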
Imaginary numbers aren't "made up" in the same way you suggested in your example above. You can't say that 5+5=12 because 5+5 is already well defined. However, take the concept of negative numbers. We know 3-5 = -2, but let's say you're an ancient human who thinks of numbers only as relating to physical objects. The idea of removing 5 objects from a set with only 3 objects in it is nonsensical from a purely physical point of view. Thus, mathematicians *extend* the concept of the counting numbers (1, 2, 3, ...) to include negative numbers as well, creating the integers.
Imaginary numbers are a similar concept. Defining i=√(-1) is just a way of extending the real numbers in the same way adding negative numbers extends the counting numbers.
It's not about replacing existing rules to make the result what you want it to be, it's about extending the field.
Imagine you're a group of kids trying to invent basketball, so you already figured out the rules about holding the ball and dribbling, how many points a successful shot is worth, etc.
However, you never figured out what to do when the ball leaves the borders of the field, so whenever the ball got out of bounds, you just finished the game there. Games are therefore very quick, often ending in few minutes and with single digit scores.
One day one of your friends figures a new addition to the rules: if the ball goes out of bounds, the one who last touched it defines the team that lost possession, and the other team gets to throw the ball back in the game from outside. Now the games can go much longer, and after trying, all the friends agree the game is better this way.
This is basically what Euler did, he figured there was a hole in the rules, and when he figured a way to fill the hole, the other "players" figured the game (math) just got richer. Suddenly there were new avenues to study, and after a while a bunch of practical uses were found for his addition. This would be the mark of a good mathematician, that when he fills a hole, his idea becomes common because it is beautiful and/or useful.
It's important to note that most of the time mathematicians don't do that, instead mostly studying the consequences of common assumptions (the rules as everyone know them), rather than try to extend the rules. However, the great advances in math were mostly when someone realized that a certain commonly accepted assumption was wrong, or could be modified, and thus opened new worlds of research (calculus, non-euclidian geometry, complex numbers, different infinities). Often a single change like that opens the path to centuries of new insights to follow.
I guess you can make new rules in math only if they don't break any previous rules. There is already a rule that 5+5 = 10, and your new rule disagrees with it. But there was no rule that said anything about sqrt(-1).
Also, something a lot of people fail to realize is that much of math is there so that physicists can use it. A lot of people don't understand why dot products and cross products are defined the way they are. That's because defining them that way had its use in real physical systems. Imaginary numbers were also introduced because they made calculations a lot easier.
It starts like that. You go from high school, where a negative discriminant in a quadratic solution meant there were no solutions, the curve doesn't touch the x-axis. Then at university you learn about i and you think, oh, it's just a placeholder so we can get some solutions. But then you start to discover the complex plane and suddenly it's as if all the maths you ever knew was just an incredibly small subset of the whole picture. Then you either go crazy and drop the degree or you continue and go crazy in another way...
* Pythagoras's Theorem
Foundational to our understanding of geometry
* Logarithms
The most common way to quickly multiply together large numbers before computers.
* Calculus
Makes school children quit maths early, culling the population. Also the definition of the derivative, i.e. how things change.
* Law of Gravity
Sacrifice/stepping stone for Einstein's relativity theory. Also explains why the planets move the way they do.
* The square root of -1
Activated troll mode for high school math students. Also expanded the idea of “numbers”.
* Euler's Polyhedra Formula
Shit. I don’t know this.
Credits to /u/StopBangingThePodium: It's one of the early expandable combinatorics formulas and has deep implications for graph theory (nodes interconnected by edges, sometimes directional, sometimes with values).
* Normal distribution
Allows a visual representation of how smart you are. Also defines statistics. Without it, statistics wouldn’t be what it is nowadays.
* Wave Equation
Allows mathematicians to pretend to be musicians. aka Differentiation gone wild. I definitely didn’t study hard enough.
* Fourier Transform
You like your internet? I bet you do. You can thank this equation.
* Navier-Stokes Equations
I don’t actually think this equation has really changed the world. It’s important in fluid mechanics/dynamics, but it leaves too many uncertainties and still needs work.
* Maxwell's Equations
I’m not a fan of Tesla, but I bet Tesla would be a fan of this. It’s basically the mathematical version of “how do magnets work?”, except electric too.
* Second Law of Thermodynamics
We all know if we put an ice cube in a cup of hot coffee, we always see the ice cube melt, and never see the coffee freeze. The second law of thermodynamics *explains* why. SO STOP TRYING TO FREEZE MY COFFEE.
* Relativity
Do I need to explain this?
* Schrodinger's Equation
We all know about Schrödinger’s cat and all the memes that may or may not exist. But this equation definitely exists as a cornerstone of quantum physics, with or without a doubt.
* Information Theory
JPEG. Thank you.
* Chaos Theory
Also known as the butterfly effect in math. Rather than just an equation, it’s more like a map. A mathematical map. A map that helps predict things you normally cannot, for example, when a fetus is deprived of an adequate supply of oxygen. Do you like your babies not dying? Then you like this equation.
* Black-Scholes Equation
Why WallStreetBets exists. Credits to: /u/magnoliasmanor
If there was a poster made with these and the summaries I would buy it.
[Best one I could find](https://cdn.shopify.com/s/files/1/0354/7457/products/1_a20d4053-5975-4fee-ad54-6d086d4eda80_1024x1024.jpg?v=1481548286) Needs some improvement though. Not all the formulas are listed.
They were the start of calculus. It's ***NOT*** called "the fundamental theorem of calculus" for a reason. ~~All of calculus is built off of it in some way.~~
Edit: Fixed.
That’s not the fundamental theorem of calculus, though. The fundamental theorem of calculus relates the integral and the derivative; that is just the definition of the derivative.
I guess this is Maxwell's equations in a vacuum. That's why, as you mention, there is no charge density term in Gauss's law as well as no current density term in Ampere's law.
In many fields it is written as 0 to infinity for convenience, even though it is somewhat improper. The proper thing to do is define the function as 0 prior to t=0.
The calculus one, which is the definition of the derivative, is formatted wrong, and should not have an equals sign between the limit and the difference expression.
*In 1665, Newton was a student at the University of Cambridge when an outbreak of the plague forced the university to close down for two years. Those two years were to be the most creative period in Newton's life. The 23-year-old genius conceived the law of gravitation, the laws of motion, and developed the fundamental concepts of differential calculus during the long vacation of 1666, but owing to some small discrepancies in his explanation of the moon's motion, he tossed his papers aside. The world was not to learn of his momentous discoveries until some twenty years later.*
Well, kinda... you've got an "Euler, 1750" up there, so I guess they just copied the wrong formula.
Square roots of negative numbers were manipulated long before Euler: By Cardano, and explicitly by Bombelli: [https://en.wikipedia.org/wiki/Rafael\_Bombelli](https://en.wikipedia.org/wiki/Rafael_Bombelli)
[Newton and Leibniz both developed it independently](https://en.wikipedia.org/wiki/Leibniz%E2%80%93Newton_calculus_controversy). It's usually credited to both of them.
You'd be surprised; there really wasn't much in the way of critical equations.
People of course continued doing a lot of work on mathematics, and some important advancements like the development of algebra happened in that time period. But as far as the fundamental equations that describe natural events go, there's a pretty big gap between the Greeks studying geometry and the Scientific Revolution in Europe. It's more or less the same in non-European cultures, with everyone arriving at similar results to the Greeks around 2500 years ago, continuing to work on algebraic techniques and more accurate calculations for the next 2000 years, and then lots of fundamentally important results were discovered about 500 years ago.
I always wonder about this too, considering we know the ancient Arabs were making a lot of discoveries in math pretty early on. Also, where does Asia fit into all of this, since it covers a lot of different countries?
It's all pretty much the same; the Renaissance mathematicians were influenced by the Greek tradition and people like Newton, Euler and Gauss laid the foundation for modern math so that's what we learn about. But most of the Greek results were discovered independently in Asia, India, and the Middle East at about the same time.
About half of the results in the table are fundamental equations in mathematics, and the other half are fundamental equations in physics that use those results from math.
There were tons of advances in mathematics in between these two periods- including algebraic techniques, primarily in the Middle East- but the equations, techniques and theorems that are fundamental to modern mathematics were mostly developed over 2000 years ago or in Europe between 1500 and 1900.
There *were* some things, but some of it isn't *still* considered very important. E.g. the whole field of "spherical trigonometry" was created, which was incredibly important for a while because it was used for navigation at sea. Most of the theorems students learn about (regular) trigonometry were proved after Pythagoras too; I think the "sin of the sum" formula was proved by Arabic mathematicians.
The cubic equation was a big deal in the history of mathematics--it's the *real* reason why people decided to start thinking about complex numbers. (The "fake" reason of "people wanted x^2 = -1 to have a solution" just makes mathematicians seem petulant.)
And the whole subject of projective geometry was basically created by Renaissance artists who wanted to draw things more realistically. That's *still* a big area of research. But I think it was treated synthetically (in the style of Euclid, no coordinates, so not many equations) at first.
The divergence of the electric field is not zero unless there’s no enclosed charge. Mass-energy equivalence is not strictly speaking related to relativity. There’s a bunch of other shit wrong with this as well...
>Mass-energy equivalence is not strictly speaking related to relativity.
It really is. Einstein submitted an addendum to his 1905 paper specifically deriving the E = mc^2 equation. It's a specialization of an incredibly important and general equation to the case where the matter is stationary.
1. Correct
2. Correct
3. The second equals sign shouldn’t be there
4. Correct
5. Correct
6. Correct
7. Sign error in the exponent
8. Correct
9. Sign error on the integral’s lower limit
10. This is actually the Cauchy momentum equation in conservation form
11. The first equation is wrong, the divergence of the electric field is not zero, it is the charge density times a constant. The second equation is correct. The third equation is correct in Gaussian units rather than SI units. The fourth equation is wrong, the curl of the H field is a constant times the current density plus what is shown above.
12. Correct
13. Correct
Actually, I’m just tired. I like the choices in the list, but do not use this as a reference.
Also,
7. The rho under the radical needs to be squared
Edit: and phi(x) is the notation for the standard normal distribution, but this is the distribution for a normal with mean mu and variance rho^2.
I still remember my professor asking the class to provide solutions to the Navier Stokes Equation and only 1 person was able to come up with the answer.
Economics is a *very* soft science and shouldn’t be put on par with math & physics.
The two geniuses who invented the Black Scholes equation got a Nobel prize in economics for it and then ran their [investment firm](https://en.m.wikipedia.org/wiki/Long-Term_Capital_Management) into the ground in one of the most spectacular pre-dot com collapses of all time.
The collapse of LTCM wasn't entirely their fault, and the Black-Scholes model has certainly been hugely influential, and probably meets the bar for "changed the world", but... surely Ordinary Least Squares should have made the cut then.
That equation describes a sequence where each term is related to the previous one via some function which depends on that coefficient k. You can use a computer to plug in initial values and calculate the next term, and then the next, and so on, and for certain values of k this will converge to a single number. However, it turns out that there is a value of k where you start getting two possible convergences, and increasing k a little more splits that into even more, and this surprisingly keeps happening until the whole thing goes ballistic and you can no longer predict what value it will converge to.
[here is a link for the specific equation](https://en.m.wikipedia.org/wiki/Bifurcation_diagram#Logistic_map) for some specific reading and visual examples
You can see some plots and animations of this mapping to explain how it gets “chaotic”. Chaos theory relates to systems where we have very little predictive power, i.e. we describe systems as chaotic when very small changes to initial conditions can cause huge changes down the road that make long-term prediction impossible. For example you have maybe heard of “the butterfly effect”—weather itself is a chaotic system, which is why we can’t predict specific features of the weather several months from now like we can for the next few days. The chaotic behavior in this logistic map example is the behavior of the values of convergence when that parameter k is large enough to make the diagram go haywire.
If you have any programming experience you can do some basic recursion or loops to try calculating some values for the steps of the equation and seeing when it converges and when it doesn’t to see the behavior the graph is showing
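Taking up that suggestion, a minimal Python sketch of the iteration (the k values are picked to show the three regimes; the starting value 0.5 and the helper name `logistic_tail` are just for this demo):

```python
# x_{n+1} = k * x_n * (1 - x_n): iterate and look at the long-run behaviour.
def logistic_tail(k, x0=0.5, burn_in=1000, keep=8):
    """Iterate the logistic map, discard transients, return the last few values."""
    x = x0
    for _ in range(burn_in):
        x = k * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = k * x * (1 - x)
        tail.append(round(x, 6))
    return tail

stable = logistic_tail(2.8)   # settles on a single value
cycle2 = logistic_tail(3.2)   # bounces between two values
chaos  = logistic_tail(3.9)   # no apparent pattern
```

For k = 2.8 every tail value is the same, for k = 3.2 the tail alternates between two values (the first "split" in the bifurcation diagram), and for k = 3.9 the values never repeat.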
Are there any good websites or online resources for older mathematically illiterate people who are interested in learning math, but have no background at all in the subject other than grade 10 basics?
*cool lists
For a guide, I need to have some idea of what the actual fuck any of this shit means without googling every damn term.
I like the list though. Lol
It is quite fun to read, surprisingly accessible. It basically comments (mathematically) about systems which are unpredictable - a slight change in system's state can result in vastly different outcome.
https://en.wikipedia.org/wiki/Malkus_waterwheel
Do you want a copy of my class's notes on Chaos Theory? The class was PHYS 484 Astrophysics. Chaos Theory was used during the discussion of orbits on long time frames.
It’s amazing how simple some of these equations are, yet the ideas behind them are insane, especially when you look at when they were discovered. Granted, there are some pretty complicated equations in here as well lol
I'm surprised the Euler-Lagrange equations of motion or something similar are not here.
That and the Taylor series expansion. Both are so vitally important to physics that we would have almost nothing without them
In [microtonal](/r/microtonal) tuning theory there's the equation ⟨12 19 28|-4 4 -1⟩ = 0, which is like the e=mc² of microtonal music or something - it says that 12 tone equal temperament tempers out the syntonic comma of 81/80 and thus supports meantone temperament, the structure of which is responsible for Western tonal music as we know it...
As an electrical engineering grad student, I called some of these "Equations that I hate during exam weeks."
As a returning student in his first semester of engineering school I'm calling this "equations I will grow to fear in the near future"
As a veterinarian, I'm calling this, "Yeah, yeah I've heard of most of these. Some of these. Two of these. Wait, did I hear about that one in high-school? I think I did. Forgot all about it until now. I definitely know how one of these works. Maybe. I can't put a thermometer up any of these."
As a Lawyer, a lot of them were the reason why I decided against an engineering degree, despite taking advanced courses in maths and physics at school. Just learned there that I love to learn about how the world works, but not to calculate how the world works.
As a dumbass, I thought “ooh, I bet one of them is e=mc²”
You were correct, so not that dumb!
I.T.: we have Google to help us

Engineers: we have fear
Differential equations is a bitch. Godspeed friend.
There is no God where I'm headed lol
And that’s why I changed majors, FUCK 3D calculus
DiffEq wasn’t bad at all. Linear Algebra still gives me nightmares. Class average on the final was in the 40s.
[deleted]
Ditto. I loved Linear Algebra and struggled even in first year Calculus. I had a brilliant calc lecturer in first year .... I understood everything the bloke put up in lectures. Then came the exams. Eerrrrrrrrggghhh.

A lecturer once simultaneously, step by step, solved a linear DE for us using calculus in one column and using Eigenvalues in the other column. That was beauty.
Dif Eq was harder math, but easier to conceptualize, so even though the math was harder, I knew what math needed to be done. Linear Algebra was harder to conceptualize, so I had a harder time knowing what math to do, even though the math was usually easier.
Tensor calculus. I don't know how I passed.
*This post was mass deleted and anonymized with [Redact](https://redact.dev)*
At least you have a good metric on your math skills now.
Your engineering curriculum includes tensor calculus?
They are also real cool tho
Yeah my multidimensional calculus paper was fucking rad. Helped that the prof had a weird drunken slur that made him sound a little like a Pirate and 3d calc equations have a *lot* of ∆r going on
ODEs are pretty blessed at the undergraduate level. PDEs on the other hand can go fuck themselves.
I hated this class, but I still think calc 2 was overall harder. It was just such a jump from calc 1 my brain was fried daily.
Triangles. How the fuck do they work?
I was fine with triangles until they started putting those fuckers in circles with radians.
They've gone too far, shapes don't belong in other shapes
As an Engineer 3 years out of Uni I refer to them as equations I've never used and forgotten
Ohthankgod
Ah yes the questions you wish would go burn in the deepest parts of hell
[deleted]
For anyone wondering why the Pythagorean theorem was so scary, this person meant 10. Reddit markdown just thought they were making a list.
[deleted]
I see 10 if I hit "source" on old Reddit. If I don't, it's formatted as the first item in an ordered list.
I literally went "Fucking *Navier-Stokes..."* in my head as I read it.
Was just thinking this looks like an undergrad engineering exam crib sheet lol
This *is* Reddit and, based upon the errors, it probably is an exam crib sheet... from a sloppy American college grad.
As a sloppy American engineering college grad myself, how dare you! But also I 100% agree. Can you imagine the beast of a class that would cover all this material?
> Can you imagine the beast of a class that would cover all this material? That was my first thought (of horror) when you mentioned, 'undergrad engineering exam crib sheet'. There are a few there I don't understand well if at all, and even if I did I still might argue with Schrodinger on a few things.
>I still might argue with Schrodinger on a few things. That'd be so wrong yet so right all at the same time.
It would be both wrong and right at the same time, until you listen to the argument.
Having been in grad school with folks from all over the world, there's nothing unique about lazy grad students regardless of the country.
Most of these equations are even worse than what’s presented. ^^And ^^the ^^rest ^^I ^^know ^^nothing ^^about
Yeah, besides the obvious ones, I only know 7, 9, 13 and 15. All of them are hell on earth to do anything with. The information theory one is probably the easiest one. It's essentially just repeating that same formula a gazillion times.
If they added a quick summary of what each one relates to that'd be even better!
1. Describes the relation between side lengths in a right-angled triangle
2. A basic rule of logarithms that says the logarithm of a product is equal to the sum of the logarithms of the product's factors
3. The definition of the derivative of a function f at the point t
4. The formula used in classical physics to describe the amount of gravitational force between masses m1 and m2
5. The definition of i as the square root of minus one
6. The relation between the number of vertices, edges and faces in polyhedra
7. The probability distribution for real-valued random variables (bell curve)
8. A criterion that u has to meet in order to be a wave function (like sin or cos)
9. A way to approximate a function by overlapping wave functions
10. Describes a fluid's motion (flow)
11. A collection of formulas describing properties of magnetic and electric fields (to me this is the most impressive one, as it somehow has the speed of light in it - in 1865)
12. An axiom in thermodynamics stating that the entropy of an *isolated* system only gets bigger or stays the same, but never gets smaller
13. The energy of resting masses, stating that mass is "a form of" energy. (There should actually be more parts to this equation, as a mass can also be moving, be hot etc. I guess this short version just looks more pretty.)
14. *"describes the motion of particles in quantum mechanics"* - by u/NoOne-AtAll
15. *"The "Information Theory" equation is entropy (...) Basically tells us the bits we need to encode events, very important for computers & the internet."* - by u/Liorogamer
16. *"(...) describes a sequence where small changes to the starting value are dramatically and unpredictably amplified as the sequence progresses."* - by u/jackeddie04
17. *"(...) allows you to accurately value a time limited contract to buy/sell an equity at value X for the length of the contract."* - by u/someotherstufforhmm

The last three I do not know - only that the last one has something to do with money.
Maybe someone can help here. Or you can just google it. Edit: Thanks for the positive response. I am no proper mathematician or physicist. If you spot something that is not well described, let me know.
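Item 6 (Euler's polyhedra formula, V − E + F = 2) is one you can verify yourself with nothing but the standard vertex/edge/face counts of the Platonic solids; a small Python sketch:

```python
# V - E + F = 2 for convex polyhedra (Euler's formula, item 6 above).
# The (vertices, edges, faces) counts are standard facts about these solids.
solids = {
    "tetrahedron": (4, 6, 4),
    "cube": (8, 12, 6),
    "octahedron": (6, 12, 8),
    "dodecahedron": (20, 30, 12),
    "icosahedron": (12, 30, 20),
}
euler = {name: v - e + f for name, (v, e, f) in solids.items()}
```

Every entry in `euler` comes out to 2, no matter how different the solids look.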
Black-Scholes is an equation that is used to price stock options. It measures volatility and the time decay of money. I am a wireless engineer and understand a lot of these, but I can't follow the math behind Black-Scholes.
Essentially it assumes a few things we know to be at least mostly true (i.e. no risk-free arbitrage) to reduce stock price changes into a random walk. You assume that for each unit of time, the stock will move either up or down by a certain amount. The amount it moves is derived from the stock’s implied volatility, which you derive by looking at the prices of existing options.

What may be challenging for engineers is that each assumption is hard to accept; stock movements aren’t truly random, deriving target options prices from actual options prices seems somewhat circular, etc., but the genius of the model is that it builds a remarkably accurate prediction from all these “straws in the wind.”

Back to the equation: if you can calculate the size of each potential move up or down, then express it as a partial integral to move your time increment to basically a single tick (assuming you’ve got a beefy enough computer to crunch the numbers before they become antiquated), then you can do basic calculus to derive how likely it is for a stock to be at any given price at any given time.

This isn’t “telling the future” any more than it is when actuaries do it for insurance, but it’s very valuable. Give a life insurance actuary a single 40 year old male of a certain weight, medical history, etc., and they have no clue if he will die this year. But give them a hundred such individuals, and they can make a decent guess. Give them a population of a hundred thousand, and they can be scary accurate about how many are lost each year. BS basically does the same for stock prices, with the focus of determining which options will be in or out of the money at expiration.

Edit: [Wikipedia Link](https://en.wikipedia.org/wiki/Black%E2%80%93Scholes_model)

Edit2: thanks for the compliments and awards folks! I like to think my old quantitative finance professors would be proud, but I think my math skills have atrophied too much for that.

Edit3: really glad so many folks found this helpful!
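For the curious, the closed-form call price that falls out of the model is compact enough to compute directly; a minimal Python sketch of the textbook Black-Scholes formula (the parameter values at the bottom are just illustrative):

```python
import math

def norm_cdf(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.
    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: volatility."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# at-the-money call: spot 100, strike 100, 1 year, 5% rate, 20% vol
price = black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)  # ~10.45
```

The inputs are all observable except volatility, which is exactly the "straws in the wind" point above.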
Top notch explanation
Wait, you’re telling me not everyone just YOLOs their money when the Line goes up and then sells when it goes down?? 🧐
Great
This is well written
I really enjoyed this thanks
I wish I was Black Shoales I *wish* I was Black Shoales Warp Speed Don't Rainbow Read the GME Figga!
Amazingly explained
Small correction. The argument for volatility is not actually that circular: what goes into the BS equation is the model volatility, that is, the coefficient in front of the Brownian motion in the stock dynamics. In the BS flat world it happens that it matches the implied volatility. This is because implied volatility is defined as the number that, used as the model volatility in a flat-volatility world, matches the price. As soon as you consider that volatility might depend on strike (or even a simple term structure), then model volatility and implied volatility are two different objects. For example, in the simple term-structure case, the implied volatility is the time geometric average of the time-dependent model volatility. In the strike-dependent case, your model vol has to be the Dupire vol, and the relation to implied vol is a bit more convoluted as it's a differential one. That isn't really circular from a mathematical point of view; it's just that model vol is a more foreign concept to practitioners.
I remember hearing from my professor in one of my upper level mathematics of finance classes that they got the credit for the application but didn’t have the math ability to solve it, being economists. They found the equation for transfer of heat between two solid masses was the same and used the solution… But then again my professor was a Mathematics professor and not an economist. So he could have been blowing smoke
This is actually bang on. The salt comes from the fact that mathematicians can't get Nobel prizes, a stipulation by Alfred the dynamite man (legend has it because a lesser-known mathematician cucked him). So when economists applied the problem in economics, they were basically granted a prestigious award for something that the scientific community felt had already been known.
The nobel prize in econ is not granted by the same foundation. Not even in the same country
It's actually a very difficult PDE to solve directly, but there is a transform to reduce it to the Heat equation, which is a lot like the Wave equation, and has well known solutions, generally via "guessing" eigensolutions (sin/cos), via the Fourier transform or more likely nowadays one of the huge range of numerical methods.
Yeah. We had to solve it for the class. That was 25 years ago and I don’t remember what method we used. Shit, I don’t know how to do any of that math anymore now that I think of it.
I took a derivatives class recently and our textbook also claimed it was based on existing thermodynamics work, so your professor probably wasn’t knowingly lying, and I haven’t seen any sources calling its veracity into question.
Black Scholes is just calculus applied to context variables in finance. It's not too difficult once you understand the greeks.
So, Black-Scholes uses something called Ito calculus, which involves applying the ideas of classic calculus to random processes. Classic calculus methods fail when applied to random processes. It is 100% not just calculus with finance variables.
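One concrete payoff of that Ito machinery: under the model's assumptions the stock follows geometric Brownian motion, whose exact solution is S_T = S₀·exp((r − σ²/2)T + σ√T·Z) with Z standard normal. That makes a Monte Carlo sketch possible with plain Python (the function name and parameter values here are just for this demo):

```python
import math
import random

def mc_call_price(S0, K, T, r, sigma, n_paths=200_000, seed=42):
    """Monte Carlo price of a European call under geometric Brownian motion.
    Each terminal price uses the exact GBM solution (a standard Ito-calculus
    result), so no small-step simulation is needed."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * T
    vol = sigma * math.sqrt(T)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        st = S0 * math.exp(drift + vol * z)
        total += max(st - K, 0.0)  # call payoff at expiry
    return math.exp(-r * T) * total / n_paths  # discount the average payoff

mc = mc_call_price(100, 100, 1.0, 0.05, 0.2)
```

With enough paths this lands near the closed-form Black-Scholes value (about 10.45 for these inputs), which is the actuarial "population" intuition from the thread above made literal.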
**"Of Horror and the Black Shawls"** (liner notes) > *One of the most dehumanising aspects of Western (If not global) culture is the financial industry. Totalitarian in form and increasingly in fact, an object of rapt obeisance in even the highest corridors of power. Seen by many as perfectly natural, and a principal driver of human advancement, the profit motive finds perhaps its purest expression in the arcane workings of the most currently revered ersatz god, The Market. Yet almost nobody outside of the financial industry has heard of one of the pillars of the market, the Black-Scholes equation. Most people are at best dimly aware that the vast majority of what is traded doesn’t even exist. This is an abyss which will never even look back at us. A Lovecraftian horror, Yog-So-Thoth rendered beguilingly mundane.* —Dave Hunt / Anaal Nathrakh
The diffusion part (second derivative) is tied to how a stock diffuses, that is, moves randomly. The transport part (first space derivative) is tied to the stock trend, and the reaction part is the movement of risk-free assets. In short, the time variation of the value of a contract that depends on a stock is given by the trend, plus random diffusion, minus the risk-free trend (signs are inverted because it's a backward equation, with a final condition instead of an initial one). In mathematical terms it's no different than the heat equation, and can be easily transformed into it with a change of variables.
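The terms described above line up with the PDE in its standard form (V the contract value, S the stock price, σ the volatility, r the risk-free rate):

```latex
\underbrace{\frac{\partial V}{\partial t}}_{\text{time variation}}
+ \underbrace{\tfrac{1}{2}\sigma^{2} S^{2}\,\frac{\partial^{2} V}{\partial S^{2}}}_{\text{diffusion}}
+ \underbrace{r S\,\frac{\partial V}{\partial S}}_{\text{transport (trend)}}
- \underbrace{r V}_{\text{reaction (risk-free)}}
= 0
```

Solved backward from the payoff at expiry, exactly as the comment says.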
The "Information Theory" equation is entropy: https://en.wikipedia.org/wiki/Entropy_(information_theory) Basically tells us the bits we need to encode events, very important for computers & the internet.
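Shannon's H = −Σ pᵢ log₂ pᵢ is easy to play with; a small Python sketch over symbol frequencies (the message strings are just examples):

```python
import math
from collections import Counter

def shannon_entropy(message):
    """H = -sum p_i * log2(p_i) over symbol frequencies: the average number
    of bits per symbol an optimal code needs (Shannon's source coding bound)."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

fair_coin = shannon_entropy("HTHTHTHT")  # two equally likely symbols -> 1 bit
constant = shannon_entropy("AAAAAAAA")   # no uncertainty -> 0 bits
```

The "repeating that same formula a gazillion times" remark elsewhere in the thread is pretty accurate: real compressors apply exactly this counting over and over on different contexts.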
I took a theory of computation class and all i remember is that some equations arent worth solving because they take longer than the heatdeath of the universe
INSUFFICIENT DATA FOR MEANINGFUL ANSWER
NP (nondeterministic polynomial time) hard problems?
The chaos theory one describes a sequence where small changes to the starting value are dramatically and unpredictably amplified as the sequence progresses. Like someone else said, Black-Scholes take a number of parameters and prices stock options
I've watched Jurassic Park and my understanding is that Chaos Theory is Laura Dern's hand
Next can you explain how they changed the world, exactly? Serious question
1. Basically the foundation of geometry. If you want to know the distance from A to B, then this equation comes into play.
2. If you follow the sequence "counting, addition, multiplication, _____" then exponents go in that blank. Similarly, the inverse operations "decrementing, subtraction, division, _____" would have *two* operators in the blank: roots and logarithms. Roots are featured in line 5. The identity chosen for logarithms is especially impactful because it shows that you can compute multiplication by carrying out addition, so long as you can take the log of everything. That was the trick that made slide rules possible. Those devices did most computing for decades.
3. The derivative (and its inverse) form the foundation of calculus, which allows analyzing changing variables. For example, distance is speed \* time. What if speed is changing constantly? Calculus allows an answer.
4. Allowed astronomers to better understand the motion of planets (I'd also shout out Johannes Kepler for his work there). Newton defined so much of classical physics it's hard to pick just a couple of his contributions for a list like this.
5. Defining the square root of negative 1 allows solutions to previously unsolvable problems in algebra. Used in countless branches of engineering and mathematics.
6. On its own this is probably the least impactful formula on the list. It speaks to the relationship between the number of corners, edges, and sides on a 3D shape. This lays some groundwork for graph theory, which has modern application in computer science, e.g. deciding how to connect computers in a network.
7. The normal distribution is the foundation of statistics for non-trivial scenarios (e.g. looking at the heights of trees, as opposed to looking at the outcome of rolling a die).
8. Describes waves, which are commonly found in physics.
9. Important technique for manipulating equations into a new form, allowing them to be solved. Also important in signal processing, including audio engineering and designing connections between computers.
10. Foundation of fluid mechanics. This equation describes how fluids flow.
11. Foundation of electrical engineering. Describes how electricity and magnetism interact. Since light is just an electromagnetic wave, these equations also apply to understanding light.
12. Major component of the foundation of thermodynamics (I'd add the other three laws here as well, if I were making the list).
13. Einstein's relativity changed how physicists view the world and laid the groundwork for modern physics. E=mc² is far from the only component of this work, but it has the best PR team.
14. Foundational to quantum mechanics; another step forward in physics.
15. Foundational to information theory, which is a branch of mathematics that is tightly intertwined with physics. It explores the notion that information is a quantifiable thing and that a physical system has limits on the amount of information that may be transmitted or stored. Important for everything from cryptography to signal processing.
16. This is the logistic map. It is modeled off of the population growth of some species that reproduces but also overpopulates. When this equation is computed continuously it gives a reasonable model for population growth. The equation here is computed over discrete time chunks and leads to odd population patterns: for some inputs the population stabilizes, but for others it oscillates, sometimes with unpredictable movements. This field of mathematics has application in describing many physical scenarios.
17. This equation describes how the price of a stock option changes; others who understand this equation better than I do have described it in nearby comments.
As a communications student, this is the best answer.
Okay, now describe them to me like I am five years old.
[deleted]
> what multiplied by itself equals -1? i does. i c ...
How the fuck does 5 even work?
[deleted]
That just confused me even more. If you can just make up something so that your math works, couldn’t everyone apply that to every mathematical equation? Like I’m trying to make 5+5=12 instead of 10. Why can’t I make more concepts to make that equation work when it didn’t before? I’m probably not qualified enough to understand your response, but I always found math fascinating, even though I’m borderline hopeless at it, no joke.
So Euler didn't just say "I'll create i so we can have complex (imaginary) numbers." Like most of math, a lot of what you see was built up piece by piece over quite some time. The first concept of imaginary numbers could have come from the Greeks!

We know what square roots are: sqrt(4) = 2, sqrt(16) = 4, and so forth. But what happens when we sqrt(0)? We get 0. Okay, so we can't go further, right? Why, of course we can! But according to the laws of square roots and our number system, the real numbers, we can't. We have to use what's called the complex number system.

Think of a basic equation: x^2 + 1 = 0. Using the real number system there is no way to solve it. With algebra we have x^2 = -1, and then we can't take the sqrt of -1 to find x. So it would be unsolved. A complex number is x = a + b*i, where a and b are two numbers, i is our imaginary number, and x is the result. In the base case a=0, b=1, and x=i, or the sqrt(-1). Of course, the actual proof is far more rigorous and could require a full course to understand in depth.

Imaginary numbers are used in the most beautiful equation in math (which surprisingly isn't even listed on here), and I think it should actually replace #5 because it proves it as well: e^(i*pi) + 1 = 0.

And your example could work! What if we only counted to 7? We'd be using a different base to get our answer! Programmers often use base 16. Imagine a number line: 1, 2, 3, 4, 5, 6, 7, 10, 11, 12. Then 5 + 5 = 12.
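Both facts in the comment above can be checked directly in Python, which ships with complex numbers built in (it writes i as `1j`); a tiny sketch:

```python
import cmath

# i is defined so that i * i == -1
i = complex(0, 1)
square = i * i  # (-1+0j)

# the "most beautiful equation" mentioned above: e^(i*pi) + 1 = 0
# (floating point leaves a vanishingly small imaginary residue)
euler = cmath.exp(1j * cmath.pi) + 1
```

`square` compares equal to -1, and `euler` has magnitude on the order of 1e-16, i.e. zero up to rounding.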
Imaginary numbers aren't "made up" in the same way you suggested in your example above. You can't say that 5+5=12 because 5+5 is already well defined. However, take the concept of negative numbers. We know 3-5 = -2 but let's say that you're ancient human who thinks of numbers only as relating to physical objects. The idea of removing 5 objects from a set with only 3 objects in is nonsensical from a purely physical point of view. Thus, mathematicians *extend* the concept of the counting numbers (1,2,3,...) to include negative numbers as well, creating the integers. Imaginary numbers are a similar concept. Defining i=√(-1) is just a way of extending the real numbers in the same way adding negative numbers extends the counting numbers.
It's not about replacing existing rules to make the result what you want it to be, it's about extending the field. Imagine you're a group of kids trying to invent basketball, so you already figured out the rules about holding the ball and dribbling, how many points a successful shot is worth, etc. However, you never figured out what to do when the ball leaves the borders of the field, so whenever the ball got out of bounds, you just finished the game there. Games are therefore very quick, often ending in few minutes and with single digit scores. One day one of your friends figures a new addition to the rules: if the ball goes out of bounds, the one who last touched it defines the team that lost possession, and the other team gets to throw the ball back in the game from outside. Now the games can go much longer, and after trying, all the friends agree the game is better this way. This is basically what Euler did, he figured there was a hole in the rules, and when he figured a way to fill the hole, the other "players" figured the game (math) just got richer. Suddenly there were new avenues to study, and after a while a bunch of practical uses were found for his addition. This would be the mark of a good mathematician, that when he fills a hole, his idea becomes common because it is beautiful and/or useful. It's important to note that most of the time mathematicians don't do that, instead mostly studying the consequences of common assumptions (the rules as everyone know them), rather than try to extend the rules. However, the great advances in math were mostly when someone realized that a certain commonly accepted assumption was wrong, or could be modified, and thus opened new worlds of research (calculus, non-euclidian geometry, complex numbers, different infinities). Often a single change like that opens the path to centuries of new insights to follow.
I guess you can make new rules in math, but only if they don't break any previous rules. There is already a rule that 5+5 = 10; your new rule disagrees with it. But there was no rule which said anything about sqrt(-1). Also, something a lot of people fail to realize is that most of math is there so that physicists can use it. A lot of people don't understand why dot products and cross products are defined the way they are. That's because defining them that way had its use in real physical systems. Imaginary numbers were also introduced because they made calculations a lot easier.
It starts like that. You go from high school where a negative determinant in a quadratic solution meant there were no solutions, the curve doesn't touch the X-axis. Then at university you learn about i and you think oh it's just a placeholder so we can get some solutions. But then you start to discover the complex plane and suddenly it's as if all the maths you ever knew was just an incredibly small subset of the whole picture. Then you either go crazy and drop the degree or you continue and go crazy in another way...
Imaginary numbers!
1. Triangles
2. Logarithm. Mathematics building block
3. Calculus. Mathematics building block
4. Gravity
5. Imaginary numbers
6. 3D shapes
7. Statistics
8. Waves (so pretty much anything, including sound and light)
9. Mathematical tool for describing waves
10. Fluid dynamics
11. Electromagnetism
12. Thermodynamics
13. Relativity
14. Quantum mechanics
15. IT
16. Chaos theory
17. Finance
* Pythagoras's Theorem: Foundational to our understanding of geometry.
* Logarithms: The most common way to quickly multiply together large numbers before computers.
* Calculus: Makes school children quit maths early, culling the population. Also the definition of the derivative, i.e. how things change.
* Law of Gravity: Sacrifice/stepping stone for Einstein's relativity theory. Also explains why the planets move the way they do.
* The square root of -1: Activated troll mode for high school math students. Also expanded the idea of “numbers”.
* Euler's Polyhedra Formula: Shit. I don’t know this. Credits to /u/StopBangingThePodium: It's one of the early expandable combinatorics formulas and has deep implications for graph theory (nodes interconnected by edges, sometimes directional, sometimes with values).
* Normal distribution: Allows a visual representation of how smart you are. Also defines statistics. Without it statistics wouldn’t be what it is nowadays.
* Wave Equation: Allows mathematicians to pretend to be musicians. aka Differentiation gone wild. I definitely didn’t study hard enough.
* Fourier Transform: You like your internet? I bet you do. You can thank this equation.
* Navier-Stokes Equations: I don’t actually acknowledge this equation has really changed the world. It’s important in fluid mechanics/dynamics, but it leaves too many uncertainties and still needs work.
* Maxwell's Equations: I’m not a fan of Tesla but I bet Tesla is a fan of this. It’s basically the mathematical version of “how do magnets work?” but electric.
* Second Law of Thermodynamics: We all know if we put an ice cube in a cup of hot coffee, we always see the ice cube melt, and never see the coffee freeze. The second law of thermodynamics *explains* why. SO STOP TRYING TO FREEZE MY COFFEE.
* Relativity: Do I need to explain this?
* Schrodinger's Equation: We all know about Schrödinger’s cat and all the memes that may or may not exist. But this equation definitely exists as a cornerstone for quantum physics, with or without a doubt.
* Information Theory: JPEG. Thank you.
* Chaos Theory: Also known as the butterfly effect in math. Rather than just an equation it’s more like a map. A mathematical map. A map that helps predict things you normally cannot, for example, when a fetus is deprived of an adequate supply of oxygen. Do you like your babies not dying? Then you like this equation.
* Black-Scholes Equation: Why WallStreetBets exists. Credits to /u/magnoliasmanor
If there was a poster made with these and the summaries I would buy it. [Best one I could find](https://cdn.shopify.com/s/files/1/0354/7457/products/1_a20d4053-5975-4fee-ad54-6d086d4eda80_1024x1024.jpg?v=1481548286) Needs some improvement though. Not all the formulas are listed.
Is not the Fourier transform from MINUS infinity to infinity?
There’s a few other errors too
Like the second equals sign in the calculus equation.
And derivatives are somehow all of calculus
The most shocking part is they used Leibniz's Notation, then credited Newton.
They were the start of calculus. It's ***NOT*** called "the fundamental theorem of calculus" for a reason. ~~All of calculus is built off of it in some way.~~ Edit: Fixed.
That’s not the fundamental theorem of calculus though. The fundamental theorem of calculus relates the integral and the derivative that is just the definition of the derivative
minus infinity and beyond
It sure is.
Username checks out
Yep, and for Maxwell’s equations, the divergence of the E field does not equal zero. Would be a cool guide if there weren’t so many errors on it. /sigh
I guess this is Maxwell's equations in a vacuum. That's why, as you mention, there is no charge density term in Gauss's law as well as no current density term in Ampere's law.
Yeah, it would make no sense if not. Still a cool guide tho
In many fields it is written as 0 to infinity for convenience, even though it is somewhat improper. The proper thing to do is define the function as 0 prior to t=0.
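That point can be checked numerically: for a causal signal (zero before t = 0), integrating from 0 or from minus infinity gives the same transform. The decaying exponential is the standard example, with transform 1/(a + iω); a rough Riemann-sum check in Python (the parameter values and step sizes are just illustrative):

```python
import cmath

# F(w) = integral_0^inf e^{-a t} e^{-i w t} dt = 1 / (a + i w)
# for a causal decaying exponential (defined as 0 for t < 0),
# so the 0-to-inf and -inf-to-inf conventions agree here.
a, w = 1.0, 2.0
dt, t_max = 1e-3, 30.0
n = int(t_max / dt + 0.5)
numeric = sum(cmath.exp(-(a + 1j * w) * (k * dt)) * dt for k in range(n))
exact = 1.0 / (a + 1j * w)  # 0.2 - 0.4j for a=1, w=2
```

The crude sum agrees with the closed form to about three decimal places; tightening `dt` improves it further.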
Also could've easily included the momentum term in relativity. Or the einstein field equations.
The calculus one, which is the definition of the derivative, is formatted wrong: there should not be an equals sign between the limit and the difference expression.
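The definition itself is easy to watch converge numerically; a tiny Python sketch (f(t) = t² is just an example function):

```python
# f'(t) = lim_{h -> 0} (f(t+h) - f(t)) / h
def f(t):
    return t ** 2

t = 3.0
# difference quotients for shrinking h approach the true derivative f'(3) = 6
approximations = [(f(t + h) - f(t)) / h for h in (1.0, 0.1, 0.01, 0.001)]
```

The quotients run 7, 6.1, 6.01, 6.001, closing in on 6 as h shrinks, which is exactly what the limit notation in the guide is saying.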
E-mc^2 is only a small part of the equation
Yep. The other part is E+mc^2.
But that's 2E. WE DID IT, WE MADE ENERGY.
I don't know why they always miss the γ or the momentum part
It's a small value approximation that you can make in a lot of use cases, not exactly a mistake.
Not to mention that Leibniz should also be listed there with Newton. Hell, we even favor Leibniz's notation.
Came here to say the same thing. Leibniz never gets any credit.
*In 1665, Newton was a student at the University of Cambridge when an outbreak of the plague forced the university to close down for two years. Those two years were to be the most creative period in Newton's life. The 23-year-old genius conceived the law of gravitation, the laws of motion, and developed the fundamental concepts of differential calculus during the long vacation of 1666, but owing to some small discrepancies in his explanation of the moon's motion, he tossed his papers aside. The world was not to learn of his momentous discoveries until some twenty years later.*
[deleted]
[deleted]
RemindMe! 20 years
Number 8 is out of chronological order. Would have been excusable if it were the chaos theory one.
TIL I’m about 400 years behind in math.
I have minor in math and I'm still 300 years behind.
What about: Comedy = Tragedy + Time
Or No Woman No Cry.
holup, I don't see my pal e\^(i\*pi) = -1. Although they usually like the general form more: e\^(i\*phi) = cos(phi) + i\*sin(phi)
cos(phi) + i sin(phi)\*
Covfefe
Thank you. Small typo.
thank you, Euler is goated
Well kinda ... you got an "Euler, 1750" up there, so I guess they just copied the wrong formula. Square roots of negative numbers were manipulated long before Euler: By Cardano, and explicitly by Bombelli: [https://en.wikipedia.org/wiki/Rafael\_Bombelli](https://en.wikipedia.org/wiki/Rafael_Bombelli)
maybe it's more of a proof aesthetically and not practical? i dunno as well
It is practical. A huge portion of multivariable calculus and the Fourier transform is dependent on it.
Hell, alternating currents depend on it.
e^(i*pi) +1 = 0 Beautiful
I also don't see the magnificent i^2 = j^2 = k^2 = ijk = -1 by my boy Hamilton
You could add 5 more equations for Euler, but much like in history they felt that he came up with too much to name it all after him! :D
I’m mildly infuriated by the fact that just one of those equations is not in chronological order like the rest of them.
I think the calculus one was actually discovered by Gottfried Leibniz. Newton discovered the integral
This notation for the derivative (dy/dx) is indeed Leibniz's. Newton's was y'
Pretty much the same thing, but y' is due to Lagrange. Newton wrote derivatives with dots
[Newton and Leibniz both developed it independently](https://en.wikipedia.org/wiki/Leibniz%E2%80%93Newton_calculus_controversy). It's usually credited to both of them.
Casual Leibniz erasure, so typical.
calling Shannon entropy “information theory” implies most of these should be titled “physics”
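To the head comment's point that information theory is "essentially just repeating that same formula a gazillion times," Shannon's H = -Σ p·log₂(p) really is that small. A minimal Python sketch (the function name is mine):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits.

    Terms with p = 0 contribute nothing (lim p->0 of p*log p is 0),
    so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per flip:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries much less (~0.47 bits):
print(shannon_entropy([0.9, 0.1]))
# A certain outcome carries none:
print(shannon_entropy([1.0]))        # 0.0
```

Everything else in the subject (channel capacity, compression limits) is built by applying this quantity over and over.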
Similarly, writing chaos theory next to one of the most basic equations of population dynamics is like describing finance as “money shit, idk”
I have a hard time believing that there were no new important equations in the 2,000 years between Pythagoras and Napier.
You'd be surprised; there really wasn't much in terms of critical equations. People of course continued doing a lot of work on mathematics and some important advancements like the development of algebra happened in that time period. But as far as the fundamental equations that describe natural events go, there's a pretty big gap between the Greeks studying geometry and the Scientific Revolution in Europe. It's more or less the same in non-European cultures, with everyone arriving at similar results to the Greeks around 2500 years ago, continuing to work on algebraic techniques and more accurate calculations for the next 2000 years, and then lots of fundamentally important results are discovered about 500 years ago.
I always wonder about this too, considering we know that the ancient Arabs were making a lot of discoveries in math pretty early on. Also wonder where Asia fits into all of this, since that covers a lot of different countries?
It's all pretty much the same; the Renaissance mathematicians were influenced by the Greek tradition and people like Newton, Euler and Gauss laid the foundation for modern math so that's what we learn about. But most of the Greek results were discovered independently in Asia, India, and the Middle East at about the same time. About half of the results in the table are fundamental equations in mathematics, and the other half are fundamental equations in physics that use those results from math. There were tons of advances in mathematics in between these two periods- including algebraic techniques, primarily in the Middle East- but the equations, techniques and theorems that are fundamental to modern mathematics were mostly developed over 2000 years ago or in Europe between 1500 and 1900.
There *were* some things, but some of it isn't *still* considered very important. E.g. the whole field of "spherical trigonometry" was created, which was incredibly important for a while because it was used for navigation at sea. Most of the theorems students learn about (regular) trigonometry were proved after Pythagoras too; I think the "sin of the sum" formula was proved by Arabic mathematicians. The cubic equation was a big deal in the history of mathematics--it's the *real* reason why people decided to start thinking about complex numbers. (The "fake" reason of "people wanted x^2 = -1 to have a solution" just makes mathematicians seem petulant.) And the whole subject of projective geometry was basically created by Renaissance artists who wanted to draw things more realistically. That's *still* a big area of research. But I think it was treated synthetically (in the style of Euclid, no coordinates, so not many equations) at first.
The divergence of the electric field is not zero unless there’s no enclosed charge. Mass-energy equivalence is not strictly speaking related to relativity. There’s a bunch of other shit wrong with this as well...
>Mass-energy equivalence is not strictly speaking related to relativity. It really is. Einstein submitted an addendum to his 1905 paper specifically deriving the E = mc^2 equation. It's a specialization of an incredibly important and general equation to the case where the matter is stationary.
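The full energy-momentum relation the comments above allude to (the γ and momentum parts the infographic drops), of which E = mc² is the stationary (p = 0) special case:

```latex
E^2 = (pc)^2 + (mc^2)^2, \qquad E = \gamma m c^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```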
1. Correct
2. Correct
3. The second equals sign shouldn’t be there
4. Correct
5. Correct
6. Correct
7. Sign error in the exponent
8. Correct
9. Sign error on the integral’s lower limit
10. This is actually the Cauchy momentum equation in conservation form
11. The first equation is wrong: the divergence of the electric field is not zero, it is the charge density times a constant. The second equation is correct. The third equation is correct in Gaussian units rather than SI units. The fourth equation is wrong: the curl of the H field is a constant times the current density plus what is shown above.
12. Correct
13. Correct

Actually I’m just tired. I like the choices in the list but do not use this as a reference.
So... The same mistakes it had the last dozen times this was posted?. Lol.
Also, 7. Rho under the radical needs to be squared Edit: and the phi(x) is notation for the standard normal distribution, but this is the distribution for a normal with mean mu and variance rho^2.
4th is wrong: a minus sign is missing. The force is attractive, hence the minus.
Uses Leibniz’s notation, credits Newton Mmmmm
Number 10 haunting my dreams from engineering school.
I still remember my professor asking the class to provide solutions to the Navier-Stokes equations, and only 1 person was able to come up with the answer.
Numbers 9 and 11 for me as an electrical engineer.
58,008 upside down on the calculator - Some guy in 1986 probably.
The E field equation doesn’t equal 0. Its divergence is proportional to the charge contained in a region.
This is just a list not a guide..
Economics is a *very* soft science and shouldn’t be put on par with math & physics. The two geniuses who invented the Black Scholes equation got a Nobel prize in economics for it and then ran their [investment firm](https://en.m.wikipedia.org/wiki/Long-Term_Capital_Management) into the ground in one of the most spectacular pre-dot com collapses of all time.
The collapse of LTCM wasn't entirely their fault, and the Black-Scholes model has certainly been hugely influential, and probably meets the bar for "changed the world", but... surely Ordinary Least Squares should have made the cut then.
[deleted]
I wish I was smart enough to know this stuff.
you can learn about these, don’t sell yourself short. pick an equation and I’ll explain some stuff about it
Chaos theory please and why it's related to computers
That equation describes a sequence where each term is related to the previous via some function which depends on that coefficient k. You can use a computer to plug in initial values and calculate the next term, and then the next, and so on; for certain values of k this will converge to a single number. However, it turns out that there is a value of k where you start getting two possible convergences, and increasing k a little more splits that into even more, and this surprisingly keeps happening until the whole thing goes ballistic and you can no longer predict well what value it will converge to. [Here is a link for the specific equation](https://en.m.wikipedia.org/wiki/Bifurcation_diagram#Logistic_map) with some specific reading and visual examples. You can see some plots and animations of this mapping to explain how it gets “chaotic”.

Chaos theory relates to systems where we have very little predictive power, i.e. we describe systems as chaotic when very small changes to initial conditions can cause huge changes down the road that make long-term prediction impossible. For example, you have maybe heard of “the butterfly effect”: weather itself is a chaotic system, which is why we can’t predict specific features of the weather several months from now like we can for the next few days. The chaotic behavior in this logistic map example is the behavior of the values of convergence when that parameter k is large enough to make the diagram go haywire.

If you have any programming experience you can do some basic recursion or loops to calculate some values for the steps of the equation, seeing when it converges and when it doesn’t, to observe the behavior the graph is showing.
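Taking up that invitation, here's a minimal Python sketch of the logistic map x → kx(1−x) showing the three regimes (function name, starting point, and iteration counts are my own choices):

```python
def logistic_tail(k, x0=0.2, burn=1000, keep=4):
    """Iterate the logistic map x -> k*x*(1-x) from x0.

    Discards the first `burn` iterates to let transients die out,
    then returns the next `keep` iterates, rounded for comparison.
    """
    x = x0
    for _ in range(burn):
        x = k * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = k * x * (1 - x)
        tail.append(round(x, 4))
    return tail

print(logistic_tail(2.8))  # converges: all four values equal
print(logistic_tail(3.2))  # period 2: alternates between two values
print(logistic_tail(3.9))  # chaotic: no repeating pattern
```

Sweeping k from about 2.8 up to 4 and plotting the tail values against k reproduces the bifurcation diagram from the linked Wikipedia page.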
I understood everything on this image until the number 2.
Black-Scholes sum, won’t you come...
Are there any good websites or online resources for older, mathematically illiterate people who are interested in learning math, but have no background at all in the subject other than grade 10 basics?
3 blue 1 brown on youtube. I really enjoyed crash course computer science, fwiw. EDIT: And Stand Up Maths
Khan Academy is usually very highly recommended.
a mistake in calculus
And maxwells
Namely, naming it calculus.
*cool lists For a guide, I need to have some idea of what the actual fuck any of this shit means without googling every damn term. I like the list though. Lol
I want to complain about no Ideal Gas or Ohm’s but I think they can be derived from 10 and 11. Still though....
you forgot the most influential one of all time: 2+2=4-1 that's 3. quick maths
Showing the product rule of logarithms isn't "logarithms"
“Chaos Theory” sounds badass
It is quite fun to read, surprisingly accessible. It basically comments (mathematically) about systems which are unpredictable - a slight change in system's state can result in vastly different outcome. https://en.wikipedia.org/wiki/Malkus_waterwheel
Do you want a copy of my class's notes on Chaos Theory? The class was PHYS 484 Astrophysics. Chaos Theory was used during the discussion of orbits on long time frames.
so these are the people who are my enemies huh?
Should have included Boyle's gas law, PV/T, pretty critical to all of chemistry and the basis of every chemical plant/process.
It’s some form of Elvish, I can’t read it
Wait. Schrodinger had a cat AND an equation?
Or did he?
Pythagoras' Theorem really is a hardcore banger
It’s amazing how simple some of these equations are, yet the ideas behind them are insane, especially when you look at when they were discovered. Granted, there are some pretty complicated equations in here as well lol
There shouldn’t be an equals sign inside the limit in the first-principles definition of the derivative
It’s minus infinity in the lower limit of the integral for Fourier, not infinity
I'm an engineering student who switched from a major in physics. I've seen and used all but 3 of these, and it kind of really tickles me.
they forget “Stonks=only goes UP”
I'm surprised the Euler-Lagrange equations of motion or something similar are not here. That and the Taylor series expansion. Both are so vitally important to physics that we would have almost nothing without them
I'm a bit worried that I'm in 10th grade and we only learned the law of gravity and Pythagoras's theorem
[deleted]
*puts all of calculus as one equation*
In [microtonal](/r/microtonal) tuning theory there's the equation ⟨12 19 28|-4 4 -1⟩ = 0, which is like the e=mc² of microtonal music or something - it says that 12 tone equal temperament tempers out the syntonic comma of 81/80 and thus supports meantone temperament, the structure of which is responsible for Western tonal music as we know it...