October 8th, 2024

Rust GPU: The future of GPU programming

Rust GPU is a framework for writing GPU software in Rust. It aims to improve safety and performance, supports a unified CPU/GPU codebase, ensures memory safety, and integrates with the Rust ecosystem for code sharing.


Rust GPU is an innovative framework that allows developers to write and run GPU software using the Rust programming language, capitalizing on its safety and concurrency features to improve performance and reliability. It enables a unified codebase for both CPU and GPU development, eliminating the need for a separate GPU-specific language. The Rust GPU compiler backend produces code compatible with Vulkan, facilitating cross-device functionality, while a reboot of the Rust CUDA project aims to integrate with Rust GPU for NVIDIA users.

Rust's ownership model and type system ensure memory safety and enable fearless concurrency, which is crucial for performance on parallel GPUs. The framework also offers powerful abstractions, allowing developers to write high-level, reusable code without compromising performance, thus enhancing maintainability and productivity.

Additionally, Rust GPU leverages the existing Rust ecosystem, including cargo and crates.io, to streamline GPU programming and facilitate code sharing. This integration allows for the use of no_std libraries in GPU code, broadening the scope of reusable resources and enhancing the development experience.

- Rust GPU enables GPU programming using the Rust language, enhancing safety and performance.

- It supports a unified codebase for both CPU and GPU, eliminating the need for separate languages.

- The framework ensures memory safety and concurrency through Rust's ownership model.

- Rust GPU allows for high-level abstractions, improving code maintainability and productivity.

- It integrates with the existing Rust ecosystem, facilitating code sharing and reuse.
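To make the code-sharing point concrete, here is a minimal, hypothetical sketch in plain Rust. In an actual Rust GPU project, allocation-free logic like this would live in a `no_std`-compatible crate and be compiled both natively and to SPIR-V; the snippet below only demonstrates the shared-logic idea on the CPU, and the function name and formula are illustrative, not taken from the project.

```rust
// Sketch of the "one codebase for CPU and GPU" idea.
// In a real Rust GPU project this function would sit in a no_std-compatible
// crate so the same source compiles natively and to SPIR-V. Here we only
// exercise it on the CPU (note: `powf` comes from std on the CPU; the GPU
// side would use the math support provided by the shader toolchain).

/// Shared logic: convert a linear color channel to sRGB gamma.
/// Pure, allocation-free code like this is the kind that can be reused
/// unchanged between host and shader code.
fn linear_to_srgb(x: f32) -> f32 {
    if x <= 0.0031308 {
        12.92 * x
    } else {
        1.055 * x.powf(1.0 / 2.4) - 0.055
    }
}

fn main() {
    // On the CPU the same function is directly callable and testable.
    println!("{:.4}", linear_to_srgb(0.5)); // ≈ 0.7354
}
```

Because the function has no allocation, no I/O, and no panicking paths, it is exactly the shape of code the article claims can be shared via cargo and no_std crates.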

6 comments
By @flumpcakes - 4 months
Rust does not seem well suited for GPU programming. Current shader languages could be improved and are probably C-like only for ergonomics.

What's the benefit here? The only one I can see is if you don't want to learn a shading language/compute platform and you are already writing in Rust and you want your codebase to be in a single language.

It's cool that this exists, but it really is oversold, and calling it "the future" of GPU programming is a bit off-putting.

By @brianhorakh - 4 months
Whoa, was checking the rust GPU examples out this morning, but didn't end up using it.

Needed to find a way to bicubic-resize images fast on an embedded Nvidia GPU.

Was looking at both LLVM PTX and Vulkan SPIR-V as ways to reduce dependency on Nvidia hardware. We ended up using NPP with auto-generated C++ bindings.

The rust GPU foundation is a great idea. Lots of interesting possibilities for rust devs.

Rust patterns, because of its advanced type checking (part of memory safety), make a lot of functional tasks faster for embedded devs. It's also a dream to cross-compile and deploy with (cargo), several times faster and easier than C++.

You need fewer tests and less (often neglected) runtime error handling code because of the contract with the compiler.

I also love how I can write crates for Python using pyo3 that the data science team can use.

By @DrNosferatu - 4 months
Great trailblazing! Now we just need Zig GPU ;)
By @krogue - 4 months
From a PR standpoint, this page encourages the viewpoint that Rust fans are too often Rust fanatics.

As for ultimate usefulness, I am not really convinced. The big selling point of Rust is no memory leaks, no use after free, and so on. These issues do not exist in shaders; one cannot really allocate memory in shaders, so that point of Rust seems, well, pointless. I would also never put bounds checking into a shader either; that harms performance a great deal. I guess this is useful if you like Rust's syntax, though doing things that are natural in typical Rust would likely be far from ideal in GPU code. This happens with current shading languages too, but I strongly suspect this makes it worse.
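The bounds-checking trade-off raised here can be illustrated with a small, hypothetical CPU-side sketch (whether and how Rust GPU elides checks in generated SPIR-V is not established by this thread): safe indexing in Rust carries a per-access check, which `unsafe` code can opt out of when the caller guarantees the indices are valid, which is effectively what shading languages do by default.

```rust
/// Safe gather: every `v[i]` is bounds-checked at runtime and
/// panics on an out-of-range index.
fn gather(v: &[f32], idx: &[usize]) -> f32 {
    idx.iter().map(|&i| v[i]).sum()
}

/// Unchecked gather: the caller must guarantee all indices are in range.
/// No per-access check is emitted, mirroring typical shader semantics.
fn gather_unchecked(v: &[f32], idx: &[usize]) -> f32 {
    idx.iter().map(|&i| unsafe { *v.get_unchecked(i) }).sum()
}

fn main() {
    let v = [1.0_f32, 2.0, 3.0, 4.0];
    let idx = [0_usize, 2, 3];
    // Both paths compute the same result when indices are valid.
    assert_eq!(gather(&v, &idx), 8.0);
    assert_eq!(gather_unchecked(&v, &idx), 8.0);
    println!("both paths agree");
}
```

The difference only matters on hot inner loops; whether a compiler can prove indices in range and drop the checks automatically is situation-dependent.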

As a point against usefulness: it is somewhat useless on Apple platforms, since one will need to translate into Metal (and yes, there are tools for that), but that drops many of the capabilities of targeting Apple devices. Apple GPUs, because of their nature, can do things other GPUs cannot (and also the other way around), and making this useful to me (or any performance-minded project that targets Apple GPUs) will require shoehorning those features somehow into this.

But I guess, it is nice for those who like the syntax of Rust (I confess I do not like Rust's syntax) and are targeting Vulkan as it gives one another shading language to write in; which I guess means a point for SPIR-V (and the interface language idea in general).

By @WhereIsTheTruth - 4 months
A language that is slow to compile can't be the future of anything

Waiting for your code to compile in order to see how a new color looks, or how a new font, a new title, or a new game player speed feels, is just BAD, very BAD

You don't have to trust what I say

You can however trust facts

Here's a real-world example everyone can test at home to verify that fact

Clone this popular open source game written in Rust:

https://github.com/veloren/veloren

Compile it, `cargo build` easy

Insert "... an hour later .." meme

Nice, you got it to compile

Now change any value, just like a gamedev would do when iterating on their game

Here for example, the strength of the lighting effect:

https://github.com/veloren/veloren/blob/master/server/src/cm...

Again, `cargo build`

Let us know, how long it took to compile on your machine

It took 17 seconds for me, 17!!! seconds!!!, just because I wanted to change the look of the lighting effect

Is this how you view the future of GPU/gamedev programming?

GPU/gamedev programming deserves better

Even gamers are sick of slow compilers

https://gameworldobserver.com/2023/04/07/shader-compilation-...

When fraudsters and propaganda take over tech, that's what you get