Somewhere, somebody’s having a meltdown because Rust is spreading more and more in the kernel.
Probably more than just one somebody, based on the drama in these last few weeks. 😜
Good to see that NVIDIA is writing open-source drivers (or starting to). I guess it’s too much to ask for support for old graphics cards, with NVIDIA mostly caring about money and a Linux driver being an incentive for some to choose NVIDIA over AMD.
It’s too bad that there’s still a proprietary binary layer that this driver will talk to. (I’m assuming, rightly or wrongly, that it’s not open source, since it’s binary.)
Best to support AMD if you game on Linux. Really wish Intel would step up their GPU game.
I must’ve missed that in the post. Do you have more information on that?
The article mentions the following …
the NOVA driver is intentionally limited to the RTX 20 “Turing” GPUs and newer where there is the NVIDIA GPU System Processor (GSP) with the firmware support to leverage for an easier driver-writing experience.
Also in the same article, there’s a link to another article that mentions it a little bit more …
“… serving as a hard- and firmware abstraction layer for GSP-based NVIDIA GPUs.”
I’ve also read about it in a few other articles as well …
The GSP is binary-only firmware loaded at run-time. The open-source kernel driver explicitly depends upon the GSP-supported graphics processors.
Basically, some (or a lot) of the Nvidia “magic” is in their hardware/firmware, and those parts are not open source.
Feel free to double-check me on this, though; that’s just my interpretation from quickly reading some articles over the last 6 months or so.
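For what it’s worth, you can poke at this yourself. A rough sketch, assuming a typical distro where the linux-firmware package installs blobs under /lib/firmware (the exact chip directory names and kernel log wording vary by GPU generation, distro, and kernel version):

```shell
#!/bin/sh
# Sketch: check whether the binary-only GSP firmware is present / was loaded.
# Assumes the common /lib/firmware layout from the linux-firmware package;
# the per-chip directory names (tu102, ga102, ...) depend on your GPU.

# The redistributable (but closed) GSP blobs, if installed:
ls /lib/firmware/nvidia/*/gsp/ 2>/dev/null

# If the kernel driver pulled in GSP firmware at boot,
# the kernel log usually mentions it (may need root):
dmesg | grep -i gsp
```

On a Turing-or-newer card driven this way, the first command typically lists large .bin files, which is the binary-only piece being discussed here.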
I’ve bought two AMD GPUs in the last two years, but I still have three Nvidia GPUs that I use. The cost of moving everything over to AMD is high, so even in the best case it just takes time to phase out the old hardware.
Totally understand. I hang on to my current GPU for as long as I can before switching to a new one (five-ish years), especially these days.
Having said that, if your goal is to move to Linux for gaming, best to go with a whole AMD setup if possible. Also a distro that updates often but is not bleeding edge. (For me, Fedora/KDE.)
One of the (now ex-)maintainers, Christoph Hellwig, said that they don’t want multiple languages in their area of the kernel because it becomes hard to maintain, and specifically called out that this wasn’t targeted at Rust: they would have rejected Assembly too. The Rust developer pushing the change, Hector (can’t remember his last name), took it as a personal attack, flipped his shit, and quit after trying to attack Christoph and get him removed for describing the introduction of another language as akin to a “cancer.”

Then Linus came in, noticed that the change wasn’t actually pushing any non-C code into the kernel, and told the maintainer that it wasn’t his area to block in the first place, and that he has no place telling others what to do outside of the kernel.

So we lost a kernel maintainer and a Rust developer over one issue.
You’re not wrong, but that’s not the part they quoted :)
Ahh, you’re right, I misread and thought it was about the Rust drama. I need more coffee.
This is true. I spend most of my gaming time on Bazzite with a 7800 XT Nitro+ GPU. Works great, 10/10.