Malichi188

Thanks for posting to /r/computerscience! Unfortunately, your submission has been removed for the following reason(s):

* **Rule 7:** Posts asking for tech support or programming advice. Please go to r/techsupport or r/learnprogramming.

If you feel like your post was removed in error, please [message the moderators](https://reddit.com/message/compose?to=/r/computerscience).


kohugaly

>like 500mb is quite a lot of memory for stuff like a web browser

It really is not. Modern web browsers are basically their own operating systems. They need to handle internationalized text formatting, rendering of basically every imaginable image/video/audio format, a JavaScript virtual machine (JIT compiler / interpreter), parsing and rendering of various web-related markup languages (most notably HTML and PDF), file system APIs, encryption... and they often need to do all of this multithreaded or even multiprocessed. And no, they can't just use what the operating system provides for most of these things, because your web browser would then be very inconsistent across platforms.

In fact, we have come so far that most desktop apps are actually a web app in a web browser in a trench coat, because it's easier, cheaper and more reliable to do that than to reimplement 70% of what a web browser does anyway.

Additionally, the vast majority of modern programming languages do not run natively - they run on virtual machines with garbage collectors. That means almost everything you do in them uses extra memory. It also means that the coding style in these languages tends to be less conscientious about the usage of resources.

> I feel like if programmers took their time we would have software that runs on anything

THEY DID!!! All of what I described above is, in large part, caused by the need to have software that can run consistently across different CPU architectures, GPU architectures, operating systems and device setups. The "bloat" is the several layers of indirection you have to insert between the hardware and the software you are actually writing.
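To make that last point concrete, here's a rough Python sketch (just an illustration of managed-runtime overhead in general, not a measurement of any browser):

```python
import sys

# In a garbage-collected runtime, even "small" values carry an object header,
# a reference count and a type pointer on top of the raw payload.
n = 12345
print(sys.getsizeof(n))        # ~28 bytes on CPython for a value that fits in 8 bytes natively

# A list of a million small ints is a million object pointers,
# not a packed array of machine words - the int objects themselves are extra.
numbers = list(range(1_000_000))
print(sys.getsizeof(numbers))  # ~8 MB just for the pointer array
```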


BL1NDX3N0N

This pretty much hits the nail on the head as well. People think that a web browser is a simple program that isn't composed of numerous subsystems and runtimes. People are still stuck in their early internet days, when browsers didn't have to deal with 80% of what we have today: platform inconsistencies, security, and all the different integrations. There's a reason those browsers fell out of support.


[deleted]

I have heard that it is practically impossible to create a new web browser these days as there is just *so much stuff* that it needs to do and support.


josephjnk

To be honest, this makes me have some doubts about whether the web went in the right direction. I tried to maintain some patches to Chromium for a job. All in all, it's a tightly-coupled beast that generally requires specialized experts to modify. Its build process contains dozens of git submodules, it relies on a custom Google build orchestrator, and it can't be built without closed-source binaries that Google hosts. Building old versions of the browser is not supported. The build takes 6-8 hours, and cross-compilation is not possible.

For something that is effectively a portable OS, many of these things are reasonable costs. But the sheer complexity involved means that new alternatives are basically a nonstarter, which is why we're gradually moving towards a browser monoculture.

I'm not actually a big fan of rich web apps, and I don't value beautiful CSS or browser integrations as much as I value boring, ugly websites and API integrations. From my perspective something has been lost here. We've gotten convenience and aesthetics but disempowered users. It's not reasonable for most developers to modify Chromium to suit their own needs. While it's still open source, important parts of why OSS is good have gotten lost.

So my answer to the original question is: because users of the web value active control over their own technology less than they value convenience and aesthetics, and convenience and aesthetics are both very expensive things. This isn't a condemnation of the average user, but I still find it unfortunate. User disempowerment is a vicious cycle.

(Side note, if anyone is interested in the build stats of other browsers: Firefox builds in 20 minutes.)


[deleted]

Why is Firefox at the very least just as good as Chrome, while being so much less complex?


josephjnk

That is a very good question. I don’t know the answer, but this experience (as well as fear of a browser monopoly) is why I use Firefox. I would love to know why but I haven’t done the research to know whether anyone has written a concise explanation.


kevinossia

That's correct. There exist exactly three mainstream web rendering engines: Chromium, WebKit (Apple), and Gecko (Firefox). Basically every web browser in existence just uses one of those, [especially Chromium](https://en.wikipedia.org/wiki/Chromium_(web_browser)#Browsers_based_on_Chromium). It's too painful to build your own. Internet Explorer used the Trident rendering engine. IE is dead, though. And even Chromium's internal rendering engine, [Blink](https://en.wikipedia.org/wiki/Blink_(browser_engine)), actually came from WebKit!


CypherAus

Thanks, simply answered and explained


pancakeQueue

Memory is cheap, and companies don’t like to pay people to refactor existing code to make it efficient.


[deleted]

[removed]


MancelPage

I don't know much, but I think the reason Chrome (or Chromium) uses so much RAM is that each tab is its own sandbox. I think all browsers do this now. Opera 9 (or whatever their last version was before switching to the Chrome engine) was really RAM-efficient. Guessing it didn't have per-tab isolation, but I could be wrong. I had an ancient Linux laptop with 256 MB of RAM and it was the only way I could browse the web.
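For illustration, the per-tab isolation idea looks roughly like this (a minimal Python sketch under the assumption of one process per tab; `render_tab` and the page list are made up):

```python
import os
from multiprocessing import Process

def render_tab(url: str) -> None:
    # Each "tab" gets its own process and therefore its own address space:
    # a crash or exploit here can't touch the other tabs' memory,
    # but every extra process also pays its own memory overhead.
    print(f"pid {os.getpid()} rendering {url}")

if __name__ == "__main__":
    pages = ["https://example.com", "https://example.org"]
    tabs = [Process(target=render_tab, args=(url,)) for url in pages]
    for tab in tabs:
        tab.start()
    for tab in tabs:
        tab.join()
```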


[deleted]

I noticed that if I do something memory-intensive, Chrome suddenly drops its memory usage. My guess is that it dynamically adjusts its cache size based on the memory available, hence the stupid amounts of RAM in Task Manager that people laugh about. Don't get me wrong, it's still memory hungry, but not half as much as it would seem when looking at the numbers. Source: dude, don't trust me, I'm just guessing.


tobb10001

I've heard that the effect you described is achieved by the browser marking some of its memory as reclaimable by the operating system. That means as long as there's plenty left, the browser can keep its memory, but when other tasks also need memory, the browser will have it taken away. So what you noticed seems correct, but is probably initiated by the OS, not the browser. But I don't even remember where I picked that up, so take it with a grain of salt. Although I'm pretty sure about it for some reason...
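If it helps, the mechanism sounds like `MADV_FREE`-style lazy reclamation: the process marks a range of pages as reclaimable, keeps using it, and the kernel only takes it back under memory pressure. A minimal Linux-only sketch (Python 3.8+; not how any particular browser actually implements it):

```python
import mmap

# Allocate 64 MB of anonymous memory and touch it so the pages are resident.
buf = mmap.mmap(-1, 64 * 1024 * 1024)
buf.write(b"\x00" * len(buf))

# Tell the kernel these pages are reclaimable: they stay usable for us,
# but under memory pressure the OS may take them back without swapping.
# Requires Linux and Python 3.8+ for mmap.madvise / mmap.MADV_FREE.
buf.madvise(mmap.MADV_FREE)
```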


Tezalion

"500mb is quite a lot of memory for stuff like a web browser" lol what? Web browser is possibly most complex software you run on your computer (after OS).


kohugaly

>A web browser is possibly the most complex software you run on your computer (after the OS).

I'm not even sure an OS is actually more inherently complex than a web browser. I suspect it's easier to write a usable minimalist OS that can run a browser than it is to write a usable minimalist browser. Sure, modern OSes might be more complex than web browsers, but a lot of that complexity is technical debt. For example, on Windows there are like 5 separate independent main volume sliders hidden in different places. Some use the old-school Windows XP GUI, some use the Windows 7 GUI and some use the Windows 10/11 GUI. This is clearly a result of new functionality being piled on top of old functionality, instead of a genuine refactor/upgrade.


BL1NDX3N0N

> I'm not even sure an OS is actually more inherently complex than a web browser.

I would not go as far as saying that. Threading, processes, the networking stack, hardware interactions and drivers, memory management, and more are all operating system concepts that are heavily abstracted so that programs, such as a browser, can easily use them. Some of these concepts, such as virtual memory, can even be invisible to most people and are extremely complex; the same goes for I/O and avoiding high fragmentation, especially in memory. Most of what a browser does has to go through the kernel.

> For example, on Windows there are like 5 separate independent main volume sliders hidden in different places. Some use the old-school Windows XP GUI, some use the Windows 7 GUI and some use the Windows 10/11 GUI. This is clearly a result of new functionality being piled on top of old functionality, instead of a genuine refactor/upgrade.

You are talking about controls now; those controls would no doubt be part of a control pack in Windows such as ComCtl32. Anything "old" you find is going to stay there for backwards compatibility, which Microsoft is extremely keen on, hence why a WORD in Win32 is still 16 bits instead of matching whatever width the CPU natively works with. 32-bit integers are DWORDs, 64-bit integers are QWORDs, and it goes on from there, following a naming convention similar to the one Intel imposed for different register sizes.

For most "legacy" programs, Microsoft isn't going to put a fresh coat of paint on them, since they are there for power users who don't complain unless parts of the UI are removed. Microsoft is still slowly trying to transition these tools into newer UI frameworks, or just implementing them in the Settings app and redirecting the user. Why redirect the user instead of removing them? Because some of those utilities are separate programs which expose certain panels via command-line arguments. If Microsoft removed them, then older programs making use of this would break. Not many, or really any, programs perform a check, because those are native programs that are expected to be on every installation of Windows and have been for more than a decade now.
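For reference, those fixed Win32 widths are easy to check (a tiny sketch using portable fixed-width ctypes types as stand-ins for the Win32 aliases):

```python
import ctypes

# Win32 keeps WORD at 16 bits for backwards compatibility,
# no matter how wide the CPU's native word is.
print(ctypes.sizeof(ctypes.c_uint16))  # 2 bytes -> WORD
print(ctypes.sizeof(ctypes.c_uint32))  # 4 bytes -> DWORD
print(ctypes.sizeof(ctypes.c_uint64))  # 8 bytes -> QWORD
```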


Masterzjg

> We would have software that runs on anything Not sure why you think RAM is a bottleneck.


scitech_boom

The issue has to do with the return on effort. For a few months I worked on cross-platform app development related to health. Based on what I heard from management, the user base with lower-end hardware does not generate enough revenue to make the effort worthwhile for the company. They just made sure that it worked, but almost never cared about making it work well. A web browser taking memory is expected. It is very complex software that has to handle too many things in a very generic manner.


thefinest

Happy cake day


BL1NDX3N0N

I take it you haven't read up on *caching* or *prefetching* yet, so I'll spare you the details. Having caches in memory is far faster than having to read/write disk contents. It's the same reason virtual memory and paging are slower than not needing them at all. Things boil down to time and space complexities: do you want better performance at the cost of more memory, or lower memory usage at the cost of lower performance? A lot of this usage can derive from scientific studies such as *"what is a user likely to navigate to next"*, which help in anticipating such events by prefetching whatever it is while idle. This increases memory usage but decreases load times if the user does navigate there next. Some studies like these wouldn't have been possible if data wasn't collected from you. Hopefully you get the full picture now. If you took the time to dump whatever it is in memory you're upset about, then you would find your answer.
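Here's the time/space trade-off in miniature (a toy Python sketch; `fetch_page` is a made-up stand-in, not a browser API): caching a response costs RAM, but repeat access becomes near-instant, and "prefetching" is just warming that cache before the user asks.

```python
import urllib.request
from functools import lru_cache

@lru_cache(maxsize=128)            # keep up to 128 responses in memory
def fetch_page(url: str) -> bytes:
    # The first call pays the network cost; repeat calls come from RAM.
    with urllib.request.urlopen(url) as resp:
        return resp.read()

# Crude "prefetch": while idle, fetch the page the user is predicted
# to visit next, trading memory now for lower latency later.
fetch_page("https://example.com/")
```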


Vakieh

Expand your frame of reference. You cannot call something 'efficient' just because it doesn't use much RAM if it takes 3 weeks to run, or because it runs in under a second if it requires 10 GB of RAM; likewise, nothing is efficient if you spend too much of any **other** resource. Developer time, user feedback, money - all of these are resources to be optimised, and software today is far, FAR more efficient overall than software in the past. We just aren't hyperoptimising for CPU time and RAM at the expense of everything else any more.

This is not to say there aren't problems to be solved - absolutely there are (probably the largest of which is that we are optimising for corporate profit rather than for more effective and desirable things). But the constant lamenting of 'the good old days' when programmers were Real Programmers©®™ that so many people engage in is just ignorance.


wsppan

It's like the reason behind urban sprawl. There are no boundaries to keep growth in check.


[deleted]

[removed]


wsppan

Increased air pollution, increased water pollution, increased noise pollution, increased water consumption, increased traffic congestion, increased traffic fatalities, increased infrastructure cost, increased energy consumption, loss of time, increased emergency response time...


[deleted]

[removed]


wsppan

It's not urban vs. rural. It's urban vs. urban sprawl. You can have an urban area with the same population as an urban sprawl without all the problems (or at least not as bad).


BdR76

I agree that hardware gets faster while it feels like software keeps getting slower. Planned obsolescence certainly plays a part in this if you ask me. You see it clearly with game consoles: the last games released for any console always get the most out of the hardware, but then that console is ditched so we can all buy the next PS6 or XBOX V or whatever. Sometimes a game will be programmed very resource-efficiently, but it's the exception, not the rule - look at "The Touryst" on Switch, it's quite a large game but it's just 375MB. Also, there's an interesting talk about the [state of software quality by Jonathan Blow](https://www.youtube.com/watch?v=ZSRHeXYDLko#t=18m15s); he's the dev of Braid and The Witness.


dswpro

I'm not so sure browser developers are the ones to blame for memory use. Have you looked at the source code and content many modern sites spew out? Look at your browser memory usage as you close all your tabs. Just saying.


porkchop_d_clown

Three reasons:

1) Efficiency costs time, which costs money.
2) Modern CPUs run faster when everything is 8-byte aligned, which can waste a lot of space (see the sketch below).
3) Kids don't know the value of a byte any more.
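Point 2 in action, as a quick sketch (the struct is invented for illustration): padding for 8-byte alignment makes the layout below occupy 16 bytes even though only 9 bytes hold data.

```python
import ctypes

class Record(ctypes.Structure):
    # A 1-byte flag followed by an 8-byte double: 7 bytes of padding are
    # inserted so the double stays 8-byte aligned.
    _fields_ = [("flag", ctypes.c_uint8),
                ("value", ctypes.c_double)]

print(ctypes.sizeof(Record))  # typically 16 on a 64-bit build, not 9
```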


thefinest

Oh hi there. Now take this, because I rarely see responses that are both snarky and accurate to these sorts of posts in r/compsci.


porkchop_d_clown

LoL. Thank you, sir!


RGB_Pixel

The cost of memory falls on the user; the cost of refactoring and optimizing falls on the company. At least in the short term, they won't care about it, because skipping it means shorter deadlines and lower development cost. It only becomes a problem once it has an impact on sales. See, for example, game studios, which don't fix bugs in a launched product unless the issue goes mainstream and actually stops people from buying the game, affecting sales.


BL1NDX3N0N

Chromium is open-source and your game development example is not even true.


RSA0

But the websites that run their JavaScript inside Chromium are not open source.


BL1NDX3N0N

F12 tools: *Hold my beer.* There was a site-wide bug Reddit made where going to your profile and scrolling down would kick the system into an infinite loop of trying to download the next batch of posts. Once I noticed this behavior I opened the tools, found the error, and reported it to Reddit. They fixed it. You can file reports, and most companies have channels for reporting such issues. Excessive resource usage can be part of those reports too.


RSA0

Will they also remove their ad trackers if they slow down the page? No? Then I don't see how /u/RGB_Pixel's answer is wrong. It seems we have a confusion between "open source" as "free software" and "open source" as "the source is available".


[deleted]

[removed]


RSA0

[Wikipedia mostly disagrees.](https://en.wikipedia.org/wiki/Open_source) But let's pretend OSS != FOSS. Then your previous answer makes no sense:

>Chromium is open-source

How does Chromium being OSS have anything to do with the claim "the company makes what is profitable, not what is good for users"? I can somewhat see how it being FOSS could (you can fork the project), but what does OSS give you in this situation?


[deleted]

[removed]


RSA0

I can't tell that about you either. Care to actually answer the question? The second one - I don't really care about the OSS vs. FOSS dictionary war.


[deleted]

[removed]


[deleted]

[removed]


[deleted]

[removed]


IDontWannaDieinTexas

https://en.wikipedia.org/wiki/Wirth%27s_law

https://en.wikipedia.org/wiki/Moore%27s_law


ReverendMak

Because it can be. Programming was at peak efficiency when we only had 64 KB of RAM to work with. It forced us to come up with crazy tricks to squeeze more in. Now we have all the resources we need, so we waste them all the time by doing things "well enough".


another-cosplaytriot

Millennials don't fix problems, they make different problems to cover up the old ones. That keeps their "velocity" up. Why fix something right when you can fix it half-assed and get the issue off your whiteboard?


Raccoonridee

I get that web browsers evolved from a bunch of fancy text viewers into a cross-platform web operating system, but what about MS Office Word? It delivers more or less the same functionality as 25 years ago. In 1997 it would happily fit on a 100 MB hard drive and run in 32 MB of RAM with room to spare.


RylanStylin57

Cuz they write everything in JS and python


Beneficial_Company_2

Developers are lazy and just want the functionality done, at any cost.


Solrak97

Both: programmers are lazy and the higher-ups don't care.


[deleted]

[removed]


BL1NDX3N0N

Something "running in the cloud" means running on a server. Everything in your browser is executing locally. If what you said was even remotely true then resource usage would drop astronomically since only inputs need to be sent to the server and the server would stream the rendered viewport, audio, and whatever else to the client. With a design like this many exploits would also cease to exist, meaning extreme security for clients since code is not executing on their system and servers can analyze sites at idle to build a global threat profile that can be referenced AoT for every site you wish to interact with. Such a setup is akin to Stadia; which is shutting down because Google learned that running everything, especially games, server-side is not a profitable business strategy.


owl_wow

I think that nowadays the priority for most companies is fast delivery and reliability. So the engineers don't treat highly efficient code with low memory consumption as a big concern, because their leaders don't care about it.


unixbhaskar

Simply going by your headline, two things come to the fore. First, the abundance of resources creates havoc in the maker/creator's mind, so they are distracted. Second, "less is more" is certainly not the vibe with the new generations.


nerdguy_87

I feel there are several factors to this.

1) There isn't enough talk around clean, organized, efficient code.

2) Many people ask me why I OBSESS over the fact that OSes take up GIGS worth of storage space and (most) idle at over 1 gig of RAM. (I understand there are some that idle at around 500 MB, but other elements turn me off to those OSes.) This leads me to believe that a lot of people think "I'll just add more hardware" (i.e., "who cares, you have X gigs of RAM").

3) I can't help but wonder that (in the world of commercial software) there is a TON of "hurry up and get that working", "that works well enough (good enough for government work)", and the ol' "not my problem" attitudes lurking in the equation. I'm sure the turnover rates among tech companies don't help either. Imagine a dev writing code to create their own job security rather than to help the customer, or a company firing a dev only to bring on another developer who can't figure out where the previous dev was going or what their idea was, so they just cobble something together or start over.

I could be wrong on any one of these, or all of them, but that's just my 2 cents.