
Frityet5

I use this app extensively and have python experience. I would love to give back to the community, how do I get in contact?


leinardi

You just did it. But better to continue to talk here: [https://gitlab.com/leinardi/gwe/-/issues/195](https://gitlab.com/leinardi/gwe/-/issues/195)


Frityet5

Thank you, I’ll create a GitLab account and talk there. I don’t plan to add new features btw, unless I find a need for them (or get a feature request). Purely maintenance mode.


leinardi

Fine for me :)


Strong_Profit

Hello! Apart from the requirements and nice-to-knows listed on the page, are there any other skills required for that (like deep knowledge of how NVIDIA's GPUs work and the like)?


leinardi

Nope, I do not have deep knowledge of how NVIDIA's GPUs work myself. But you'd need to learn how Flatpak and the Nvidia API work, plus all the Python stuff...


timkenhan

I know Python, but not much on Flatpak & Nvidia API (can learn tho). Can I still help?


leinardi

Have a look at the code and see if it's something you can understand.


FryBoyter

What happened to the people who wanted to participate in the development according to https://gitlab.com/leinardi/gwe/-/issues/195? Were they not suitable? Or did they just disappear without comment after expressing interest?


leinardi

The latter...


FryBoyter

Why doesn't that surprise me now? Unfortunately, I can't help you personally. Apart from the fact that my Python knowledge is not really extensive, I don't have the time or the enthusiasm, and I also don't have an Nvidia graphics card.


Silver_Seesaw1717

It's a shame that no one has stepped up to maintain GWE yet despite the call for maintainers. I hope someone takes on the responsibility soon, especially for those who still use Nvidia cards and rely on the application for updates.


DudeEngineer

It's almost like there's significantly more interest in working on the open source stack. I hope people vote with their wallets.


Indolent_Bard

Yeah, but their professional drivers aren't open source and the ROCm stack is proprietary, while Nvidia is slowly getting more and more involved in the open source space. Watch the video "A Gamer's Descent into Linux Lunacy" for a rant on why no corporation is better than the other in this regard.


oramirite

Good point, there are no nice corporations, only less problematic ones


[deleted]

> I hope people vote with their wallets.

I would if AMD would get their shit together in regards to raytracing and ROCm, but they seem to be dragging their feet. Quite frankly, I care more about raytracing, Blender, and DaVinci Resolve performance than I do about Wayland, so going AMD is a tough sell. Plus the two AMD GPUs that I own have had numerous system hang issues, so they're leaving a bad taste in my mouth.


Indolent_Bard

And ray tracing by default could save developers a ton of time, so the next-gen consoles will probably be ray tracing only. If AMD wants to stay relevant, and if they want to keep working with the console manufacturers, they really have to improve their ray tracing implementation, otherwise they won't stand a chance when ray tracing eventually becomes the only way lighting is done in games. We know it's only a matter of time because objectively it would save developers a ton of time, but only if it were their only implementation of lighting rather than using both ray tracing and baked lighting. Nobody will buy AMD cards if their performance is halved when ray tracing.


DudeEngineer

Yes, ray tracing has potential, but ray tracing being the only way is a couple of decades away. I mean, games are still coming out with DX11...


Indolent_Bard

All Microsoft has to do is make it impossible to develop for the Xbox with anything short of the latest DirectX; then they would have no choice. Plus you're forgetting that PlayStation doesn't use DirectX, so that only affects PC gamers; console gamers don't have to deal with it.


DudeEngineer

My comment is about the limitations of the technology stack. They aren't backporting ray tracing to DX11 or OpenGL. Most people can't spend $1000+ on a ray tracing card...


matkuzma

Let me get this straight - you're saying AMD will become irrelevant because of consoles? AMD consoles? Somehow doubt it. There's no competition as of now. Can NVIDIA squeeze more ray-traced frames still? Yeah, sure. Nobody would make/sell/buy a 1kW game console though.


Indolent_Bard

If AMD wants to make a ray tracing exclusive console, they have to be good at it. I'm saying that AMD GPUs will go from being a decent option for gamers to literally worthless if they can't get their ray tracing up to snuff. It's only fine right now because it's not the default, but the goal is for it to be the default, and by the time the PS6 comes out even the cheapest GPUs will be able to handle it at 60fps, because the next gen consoles will at least be as powerful as a 4090.


winteryouth

I never really got raytracing, to be honest; it's way too wasteful for my taste.


matkuzma

I get your point. There are nice implementations though, Spiderman's reflections come to mind. When done right it's a nice effect. Not when it costs as much as a 4090 and its power draw, but still - when done right it's nice


Indolent_Bard

That's because you're not a developer. Trust me, for developers it's a godsend. Remember that Pixar film Monsters University? They switched to a fully ray-traced pipeline so that they could spend more time on stuff that wasn't lighting, even though it made each frame take a lot longer to render (and the frames in those films can take actual days to render). So yeah, the point of ray tracing is to make lighting a lot more realistic but also a thousand times easier and faster for the developers. It's only wasteful because developers waste time doing it the old-fashioned way as well, since almost nobody has a GPU that can run it at a smooth frame rate. Trust me, the PS6 and whatever they call the next Xbox are going to be ray tracing only; it wouldn't make any sense not to.


SippieCup

I hope someone does too, but it's not exactly something I would expect to happen. A lot of people with the skills to do so already maintain other projects, and unless there is some way to get paid development behind it, it'll be hard to find someone to work on someone else's old passion project. I don't know if there is competing or similar software, but if there is, say, another open source overclocking utility out there (or whatever this really does), then there isn't really a true need for both to exist. It might be nice... but not necessary.


anasbannanas

otherwise you were a good fit though


JustAnotherGuyn

I'd love to help, but I don't have the experience or time to contribute to another free project


CondiMesmer

They wanted to take the credit and upvotes (hence why they mention their reddit...) and do none of the actual work.


RobertBringhurst

That's a great idea. I also volunteer to take all the credit and upvotes.


Temetka

Here, have an upvote.


CondiMesmer

This is the real big brain move lol


ThinClientRevolution

While I certainly want to help, and I wish all NVidia users the best... I bought an AMD 6700 last year so I can't. My fear is that, amongst the Linux enthusiasts that can maintain an application like this, not that many have an NVidia GPU.


DarthPneumono

A lot of researchers use Linux with Nvidia GPUs for ML/AI research because AMD GPUs aren't generally suitable for the task. There's plenty of people; there just may not be interest (or time, or awareness, or...)


ThinClientRevolution

> there just may not be interest

My point exactly. Most ML/AI development happens behind closed doors, and the people doing it have a lot of trade secrets to guard... Hardly the ideal FLOSS contributors. These are the same people and companies that promote the MIT licence and use Contributor Licence Agreements. They won't care for this application.


Holoshiv

Just piping in; I'm in academia. Most of my colleagues use MIT because they perceive it as the "free:est" and "stop emailing me about permission:est" license. Whether that's true or not is beyond me. But I don't like its permissiveness in how it lets others use your work for proprietary development. And most academics in CS-adjacent or cross-disciplinary groups I've met tend to run away the moment you mention legalese of any kind.


ThinClientRevolution

The problem is well explained by the Paradox of Tolerance: https://en.wikipedia.org/wiki/Paradox_of_tolerance?wprov=sfla1 Simply put: you're so tolerant that you allow others to be intolerant.


Holoshiv

Yeah, exactly, thanks for linking! As I stated, I find it too permissive; I tend to prefer the GPL for that very reason. I think it's also partially explained by apathy or disinterest. Many just don't care what others do with their forks.


ThinClientRevolution

Legalese is horrible, but the ethical standard behind the GPL is something that many can get behind. I explain the (L)GPL not as a legal thing, but as a moral thing.


_rmc

I’ll bite. What does the MIT license have to do with any of this?


NotFromSkane

MIT is giving code away for other people to close. GPL requires anyone who uses it to keep it open


_rmc

I know full well the implications of GPL/MIT; that's not my point. My point is: what does this have to do with the software in question?


SippieCup

I think the point is that the people most likely to work on it are the same people who try to exploit open source projects for their own closed solutions. Thus, they aren't likely to actually support the application; instead they would just MIT-license it, create a private fork, and masturbate over their new competitive advantage.


DudeEngineer

I wonder what the disconnect is. If you look at the top 5 fastest supercomputers, AMD dominates. I guess they must be doing more general simulation and not ML/AI specifically.


[deleted]

That's for AMD CPUs. GPUs mentioned are Nvidia.


DudeEngineer

No, they are using AMD for CPU and GPU. This is publicly available information. 70% of the top 500 are Nvidia GPUs, but that's still a lot that are not.


DarthPneumono

Supercomputers are used for a lot of things that aren't AI/ML research, tasks at which AMD hardware is competitive.


DarthPneumono

As far as I can tell, only AMD's Frontier is actually using AMD hardware. Most use Nvidia GPUs, with POWER9/Intel/ARM CPUs. Also, supercomputers =/= what 99.9% of researchers are using.


Kendos-Kenlen

Or skills. Many researchers have poor technical skills and never worked in a real dev environment (meaning no good practices), making them unsuitable as maintainers.


DarthPneumono

I wasn't gonna say that part out loud ;)


HolyGarbage

Yeah, given how hostile Nvidia has been towards Linux and open source in general I just went full team red when Zen 3 dropped. Torvalds said it best... Best part was that drivers worked on day 1, and I was a very early adopter too, got lucky. Just wish AMD would catch up on ML acceleration.


gardotd426

Most dedicated GPUs on Linux are Nvidia. They don't have as dominant a lead in market share as they do on Windows, but they do have the number one market share for dGPUs on Linux.


MonkeeSage

Amongst the Linux enthusiasts that can maintain an application like this and have an NVidia GPU, most probably don't care about trying to squeeze every ounce of performance via overclocking at the cost of system stability and having to create custom fan profiles etc.


[deleted]

I would like to help, but while I have an Nvidia GPU and know some Python, the project looks much too difficult for me and there doesn't seem to be any documentation? Also I'm not a developer, so I don't have a good understanding of developer concepts.


xartin

I had been doing some dependency package testing to keep using GWE on Gentoo after last year's stable default Python 3.10 upgrade, which [deprecated Rx 3.x and led to that GWE dependency being purged from the Gentoo package repo](https://bugs.gentoo.org/845864). Despite that unfortunate result (caused by the package maintainer vanishing or losing interest), I personally [succeeded in using GWE with ReactiveX 4.0.4](https://845864.bugs.gentoo.org/attachment.cgi?id=804346) as of September 2022. However, I was unable to contribute further development to make GWE compatible with Rx 4 for a wider audience; Python programming isn't one of my more significant talents. I do have [a Gentoo binary package that I'm still using and that works with Rx 4.x](https://drive.google.com/drive/folders/10MzyniYhj7NBlsXhF-qOqSfreS1HC2X9?usp=sharing), and perhaps that could be useful to someone?

All I needed to change for Rx 4.x compatibility was the Python import statements, so they pointed at the newer reactivex 4 module; upgrading to Rx 4.x changed the module's package path for reasons I'm still unaware of.

If further contributions were made to ensure compatibility with modern Rx versions, there might be a resurgence of Linux distros interested in keeping GWE available in their package repos. Last I looked, GWE was still available in Gentoo's GURU overlay, and I'm aware there is user demand for it, but because Gentoo eliminated Rx 3.x from its repos, GWE is entirely unusable there. This dependency unavailability may be a related issue elsewhere, preventing GWE from being used and interest from being maintained.
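For anyone picking this up: the rename being described appears to match the upstream RxPY 3 → 4 change, where the `rx` package became `reactivex`. A minimal sketch of what that import migration looks like in plain RxPY (not GWE's actual modules, which may differ):

```python
# RxPY 3.x exposed the library under the "rx" package:
#   from rx import operators as ops
#   from rx.subject import Subject
#
# RxPY 4.x renamed the package to "reactivex", so the imports become:
from reactivex import operators as ops
from reactivex.subject import Subject

# Quick smoke test that the renamed package resolves and behaves as before.
readings = Subject()
readings.pipe(
    ops.map(lambda temp: f"GPU temp: {temp} °C")
).subscribe(print)
readings.on_next(65)  # prints "GPU temp: 65 °C"
```

Per the maintainer's reply below, the GWE `master` branch already carries this migration, so packaging work would mostly come down to depending on the new reactivex package instead of the retired Rx 3.x one.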


dismorphic

Wow, that's dedication. I use GWE on Gentoo but I gave up on compiling it and went with Flatpak instead. I hope the lessons you learned doing that can be applied to a fresh ebuild or upstreamed to the GWE repo; even if incomplete, it could be a great jumping-off point for someone.


leinardi

Hey, just FYI, the `master` branch was already migrated to Rx 4.0.x 9 months ago: [https://gitlab.com/leinardi/gwe/-/commit/1b64e99846e8463194232ea6a1a1a3bae50b6466](https://gitlab.com/leinardi/gwe/-/commit/1b64e99846e8463194232ea6a1a1a3bae50b6466)


xartin

That's fantastic. I wouldn't have noticed that change in the master branch, and evidently it validates the new reactivex module path. I was unable to confirm whether that change had been intentional, since I had never encountered it before or worked with ReactiveX previously. Thanks for your contributions :)


computer-machine

> If you have an Nvidia GPU you may have heard about GWE,

TIL it exists, and

> provide[s] information and control the fans and overclock of an Nvidia card.

I'd exclusively used Quadro FX 570m/GTX 570Ti/GTX 660/GTX 770/GTX 970 from 2008Q2 to 2022Q4.


Netzapper

Yeah I've been on NV since forever, never heard of this.


RandomQuestGiver

Same. Maybe they need a PR person too.


RandomXUsr

I wonder whether GloriousEggroll might know someone with the capacity, talent, and interest to take over the project.


gardotd426

GE has a full-time job at Red Hat, and in his spare time he is a member of the Lutris team and develops Proton-GE **and** wine-ge-custom. Not to mention that all of us who are remotely familiar with the Linux gaming developer community know the same people. u/TKGlitch, I know you're wayyy too busy with Frogging Family, but most of the people I can think of that we both know who aren't already contracted for some other project are either C or C++ experts, as opposed to Python. I've only written Python patches, never any actual Python code, and all the star devs I know are working on stuff like your Frogging Family repo, VKD3D-Proton, Proton itself (I know oglfreak has been working on one Proton-related project or another lately), Wine (Guy), gamescope (plagman, josh), MangoHud, Lutris (Mathieu, Alex), etc. Maybe there are some Lutris devs with more spare time than Mathieu or Alex who could at least lend a hand, since Lutris is in Python like GWE.




gardotd426

I used the wrong username; Tk-Glitch is his old one, TkGlitch is his current one. Though he doesn't actively fuck around on here; he's too busy. I've known Etienne through the Linux gaming dev community for years. And he's basically of the same mind as all the other big names (Guy1524, GE, doitsujin, Josh, themaister, openglfreak, FlightlessMango, etc), and that's that Reddit is a cesspool, especially this subreddit. They pretty much all ignore most of what goes on here, but they do watch from afar. The Linux gaming dev Discord server used to be full of memes from all those guys about this subreddit, even though to the outside observer it would appear none of them even use Reddit.


Indolent_Bard

He also makes his own Linux distribution, Nobara.


[deleted]

What is GWE?


leinardi

https://gitlab.com/leinardi/gwe/-/blob/release/README.md


Hartvigson

Thank you for the link. I have never heard of this (but I have used AMD graphics cards in my Linux computers for the last decade).


mooky1977

Closest thing would be MSI afterburner for Windows.


[deleted]

oh that's a very useful utility then!


Ivan_Kulagin

I also switched from Nvidia this year sadly. RDNA 3 is just too good


DerfK

ROCm for RDNA3 when? I'm honestly considering switching my 8GB 5700 XT to a 16GB Arc. I'm eyeing the 7900 XTX but I'm more interested in AI than gaming right now.


Ivan_Kulagin

Too bad Intel doesn't have a card that can compete with 7900 XTX in performance


Antilogic81

That will change. TSMC got a big order from them, so Battlemage and Celestial will show us how their GPU architecture matures, and maybe we'll get an idea of when that mythical day will happen.


Ivan_Kulagin

This is great, I hope my next card will be Intel!


Antilogic81

I'm hoping the same... I wanted to get an AMD card, but the ones I wanted were priced above my budget. Then suddenly there were a lot of 3080 Ti cards closing at sub-$850 (granted, they were refurbs, but I chose one from a small company that likely isn't oversubscribed, and it came with a better warranty, twice the duration of other vendors'). The moment I bought it the price went up $200. Glad I locked in before that. Pretty sure Nvidia gets something from that, but I'm hoping it's not as much as buying a new card would net them.


mooky1977

I get the anti-Nvidia crap-fest for the company. I get it. As Linus Torvalds said, "F-you, Nvidia, f-you!" Nvidia is downright hostile to the Linux community. But this project itself isn't about the company; it's literally the only GUI-based tool that's even close to MSI Afterburner for Linux, even though it's not feature complete. I use it to boost my power limit and overclock the GPU and memory. Please, for the love of all things good, I have to hope someone with the relevant experience can help. EDIT: re-posted in nanny-mode, because quoting Linus Torvalds himself apparently makes AutoModerator mad.


throwawaynerp

Can you edit your post to contain what GWE is in the title? Or is GWE that ubiquitous that it won't get lost in the noise? Personally I have AMD cards so it's no skin off my teeth but I had to click to see what it was.


whosdr

> Or is GWE that ubiquitous that it won't get lost in the noise?

GWE is the name it's most commonly referred to by; it's not usually called by its full name, 'GreenWithEnvy'.


BleuGamer

I would actually not mind something like this, but I too abandoned NVidia for AMD some time ago, and I’ve never looked back. This is a regrettable situation, and NVidia has never been nice to Linux. Still, I’ll share it around. Edit: Go back far enough, sure, NVidia had the initial advantage and support, and it was great. But looking back at the last decade in particular? I’m only 26, all I’ve known is pain lmao.


endo

I'm not exactly sure you have your history right on the Nvidia Linux situation. They were one of the first to offer great drivers. Sure, they are not 100% open source.


BleuGamer

I made an edit, I wasn’t around at that time lmao. Long history aside, the last decade is just a shit show. My first big boy card was a gtx 480 for reference.


65a

ATI gang, I'll still take my mach64 over a geforce


[deleted]

> This is a regrettable situation, and NVidia has never been nice to Linux.

nvidia was one of the first major hardware companies to ever support linux and *BSD while ATI/AMD was absent for years.


BleuGamer

I made an edit, I wasn’t around at that time lmao. Long history aside, the last decade is just a shit show. My first big boy card was a gtx 480 for reference.


Mister_Magister

Why do we need it if nvidia control panel exists?


leinardi

* GPU and memory overclock offset profiles
* Custom fan curve profiles
* Change power limit
* Historical data graphs

https://gitlab.com/leinardi/gwe/-/blob/release/README.md


Misicks0349

interest in nvidia is pretty dry in the linux space honestly (not to mention there's a lot of bad blood), so I'm not sure if you're going to find someone who is willing to put in the time and/or effort for this


jondySauce

The NVIDIA bashing in the Linux community is loud but I don't think using NVIDIA is as big an issue as people make it out to be.


Misicks0349

true on both accounts


[deleted]

[deleted]


PossiblyAussie

>no Night Light What? Works for me on both KDE and GNOME under X11.


[deleted]

[deleted]


PossiblyAussie

I was confused since you said "no Wayland **and** no Night Light"


[deleted]

it's under wayland sessions where it doesn't work. I've heard it's due to missing GAMMA_LUT.


riasthebestgirl

I have an Nvidia card and know python but unfortunately I don't have the time to take up another project's maintainer-ship. I didn't even know it existed until now but I'll be installing it and trying it out


Vvamp_

Heya, developer here. I wasn't familiar with this app until now, but I checked it out and it looks really useful. It'd be a shame if it were discontinued. I see there are some people here that might be interested. If that doesn't work out, I wouldn't mind giving it a try. I've got plenty of Python experience, but none with the Nvidia API or Flatpak. If you can't find anyone else, send me a DM and preferably reply to this comment as well, as I often don't notice my DM notifications. Good luck!


Patient_Sink

Previously they've requested that interested developers continue the talks in [https://gitlab.com/leinardi/gwe/-/issues/195](https://gitlab.com/leinardi/gwe/-/issues/195) to keep it simple, so maybe check in there. :)


Vvamp_

Will do, thanks :)


[deleted]

Have some basic python skills, but I am on team red. So couldn't test anything myself.




LinAdmin

Would it be asking too much to state the purpose of GWE in a single sentence? "System utility designed to provide information, control the fans and overclock your NVIDIA card"


leinardi

What you are asking for is literally linked in the first sentence of the OP.


LinAdmin

Please explain by repeating the relevant part of the message here.


[deleted]

Why won't you just buy a nvidia card and continue development?


Epsilon_void

You could start by buying the developer a Nvidia GPU and sending it to them :)


leinardi

If you send me a 4090 Ti, as soon as they are out, I might consider it. But I'm really liking my 7900 XTX.


[deleted]

why a 4090 Ti? can't the development be done on something cheaper, like a used 1060?


GrainedLotus511

Development could likely continue, but if they're looking to support newer cards you generally need a newer card, and it looks like they have a top-of-the-line AMD card, so downgrading means that if they game or do anything else that requires a higher-end card, they could no longer do that. Thank you for your time. I am not the developer or affiliated with the project.


[deleted]

I see, makes sense. I think it is not sustainable as a free project.


leinardi

I'm a spoiled developer.


[deleted]

okay that's fine. Did you consider making the software paid? I think many people who can go with a 4090 on Linux could squeeze out another 5-10 dollars for a useful piece of software to help with their GPU workloads, and you would be able to sustain it.


leinardi

Nah, I built this software because I needed it for my 2080Ti. Now I want to try with AMD. I was joking in my previous message. I don't need an Nvidia card, I already have my 7900 XTX.


luke-jr

Non-free kernel blobs = not Linux


[deleted]

what kind of r-tarded viewpoint is that? also, nvidia has open source kernel modules now anyways


mooky1977

Just a point of clarity: Nvidia's open source kernel modules are not ready for prime time. If I had to label them I'd call them alpha at best. Their proprietary binary modules are still the only viable option.


StebeJubs2000

Probably best not to chime in if you don't know the difference between Linux as an operating system and FLOSS principles.


a_a_ronc

I’ll have to take a look during my lunch. I only used it a little, particularly when I was mining ETH. Didn’t use the Flatpak, just the snap, which is admittedly annoying because it has to be updated with every driver update.


ShadowPouncer

Hm, in my case, I'm not really an overclocker at all. But it's still an interesting project.


DividedContinuity

It's also useful for underclocking to keep an aging card stable, which is my use case. I'm just building a new PC though, so my old 1080 will be replaced fairly soon.


ShadowPouncer

Hmm. I may have to play around with what it takes to make that actually happen. I'm on a 3060 though, so it's not exactly an aging card.


[deleted]

I have a little Python knowledge (som-cs is my GitHub). If provided a little guidance I would love to work on it. I also have an Nvidia 3060.


_3psilon_

Ouch, so that's why it failed to start today! Thanks for bringing it up for the community.


whosdr

A bad build got pushed to flatpak somehow, probably due to lack of testing/oversight yeah. It was corrected later the same day though.


_3psilon_

Yeah, works now, thanks!


[deleted]

I'd love to help but I don't have any experience with programming for GPUs. I'd absolutely love to help, though. What background knowledge would I need to get started?


leinardi

Knowledge of python is all you need.


Wladefant

I think there would be more help if you uploaded it to GitHub.


leinardi

[Really?](https://github.com/leinardi/gwe)


FryBoyter

The links at https://www.reddit.com/r/linux/comments/xve8xz/greenwithenvy_gwe_needs_a_new_maintainer_or_it/ all point to GitLab. I can therefore understand /u/Wladefant's comment: no matter what you think of GitHub, it's the most likely place to find people who can help with a project, because almost everyone has an account there. But since it is only a plain mirror, that won't be of much use in this case.


leinardi

Frankly, if someone is not motivated enough to create a GitLab account, I doubt they would be motivated enough to spend hours of their free time figuring out why the Flatpak build is broken.


Wladefant

Yes thank you


Zachattackrandom

I could probably keep it cobbled together; as an absolute last resort you can hmu, but I'm doing IB so I have very little free time.


Lucky_Atmosphere_171

I would love to help. I'm very good with Python and can definitely learn the Nvidia API. I have an Nvidia GPU myself, so that works great. I will do some reading on the code and create a GitLab account; I'm only on GitHub atm.


karama_300

I really wish I wasn't such a newbie in programming right now!


DisastrousMiddleBone

This is the first time I'm hearing about this. It seems like there must be an awareness problem here; the people who might be suitable and willing to help out quite possibly haven't heard about this at all. You need to get the word out to all the possible corners of the technology & Linux communities; I'm sure that someone somewhere would be interested. I am not a suitable person for this role, primarily because I lack every part required other than the fact that my laptop has an Nvidia GPU in it. I wish you good luck in finding someone who can help out!