Nvidia Rolls Out Its Fix For PC Gaming's 'Compiling Shaders' Wait Times (arstechnica.com)
- Reference: 0181205724
- News link: https://tech.slashdot.org/story/26/04/02/042250/nvidia-rolls-out-its-fix-for-pc-gamings-compiling-shaders-wait-times
- Source link: https://arstechnica.com/gaming/2026/04/nvidias-new-app-lets-you-precompile-gaming-shaders-during-machine-idle-time/
> Nvidia's new Auto Shader Compilation system promises to "reduc[e] the frequency of game runtime compilation after driver updates" for users running [3]Nvidia's GeForce Game Ready Driver 595.97 WHQL or later. When the feature is active and your machine is idle, the app will automatically start recompiling DirectX shaders for your games so they're all set to roll the next time they launch.
>
> While the feature defaults to being turned off when the Nvidia App is first downloaded, users can activate it by going to the Graphics Tab > Global Settings > Shader Cache. There, they can set aside disk space for precompiled shaders and decide how many system resources the compilation process should use. App users can also manually force shader recompilation through the app rather than waiting for the machine to go idle.
>
> Unfortunately, Nvidia warns that users will still have to generate shaders in-game after downloading a title for the first time. The Auto Shader Compiler system only generates the new shaders needed after subsequent driver updates following that first run of a new title.
[1] https://www.nvidia.com/en-us/geforce/news/nvidia-app-dlss-4-5-dynamic-multi-frame-generation-available-now/
[2] https://arstechnica.com/gaming/2026/04/nvidias-new-app-lets-you-precompile-gaming-shaders-during-machine-idle-time/
[3] https://www.nvidia.com/en-us/drivers/details/265874/
um ok, but... (Score:3)
Steam does this already and most of my games are delivered via steam, so most of my games have this already.
I think Steam does run the compile processes at a slightly higher nice level, but I don't think they change the ioprio, so it can still have a negative impact on systems without fast storage. (I have mirrored NVMe SSDs, so this is only a problem to any degree when it's done for infrequently played games, which are stored on HDD. That's a 3-way mirror too, though.)
Re: (Score:2)
Steam does this for Vulkan shaders, not DirectX.
Re: (Score:2)
Sucks for folks using a legacy, business desktop OS instead of something modern.
Re: (Score:2)
Heh, Steam does that at nice 20. Unless you're still running dnetc, it's going to yield to everything else.
Why can't the pre-compiled ones be distributed? (Score:2)
Yes, I guess that is a stupidly simplistic question. On Steam, even on Linux, I've had that 'vulkan shaders' background thread running for what seems like tens of minutes any time a driver or game is updated - that's just stupid.
Re: (Score:2)
It takes tens of minutes here, too. It has to be updated when the game is changed because the assets which include the shaders have changed. It has to be updated when the driver is changed because the driver is what runs the shaders. If you don't precompile then the compilation has to be grunted out on demand, and your game will likely have chokes and stutters while it's done in realtime. IME for most titles it's not that bad and resolves itself in a few minutes.
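The invalidation logic described above can be sketched as a cache key: anything that affects the compiled output (GPU model, driver version, the shader IR shipped in the game's assets) goes into the key, so a driver or game update silently orphans the old cached binaries. A minimal illustration in Python; the function name and fields are hypothetical, not any vendor's actual scheme:

```python
import hashlib

def shader_cache_key(gpu_model: str, driver_version: str, shader_ir: bytes) -> str:
    """Derive a cache key from everything that affects the compiled binary.

    If the driver updates (new compiler backend) or the game updates
    (new shader IR in its assets), the key changes, so the old cached
    binary is never looked up again and the shader must be recompiled.
    """
    h = hashlib.sha256()
    h.update(gpu_model.encode())
    h.update(driver_version.encode())
    h.update(shader_ir)
    return h.hexdigest()

key_before = shader_cache_key("RTX 4070", "595.97", b"DXBC-blob")
key_after = shader_cache_key("RTX 4070", "596.10", b"DXBC-blob")
assert key_before != key_after  # a driver update alone invalidates the cache
```

Hashing all inputs together is the usual design choice here because it makes invalidation automatic: nothing has to track *which* entries a given update affected.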
Re: (Score:2)
> your game will likely have chokes and stutters while it's done in realtime. IME for most titles it's not that bad and resolves itself in a few minutes
It's one of those issues that often are easily ignored until suddenly it's game-breaking.
For many games that stream assets and build shaders on the fly, if there's a bit of blurriness and stuttering when you first enter an area there are many players who can forgive that. Having that same experience walking into a boss's lair and suddenly the game is choking and stuttering as resources are processed, that's a fatal flaw that can make it difficult to play, or even outright kill the player while loading.
Re: (Score:2)
Particularly for PvP competitive games that require constant FPS (think CoD, Battlefield, Fortnite, etc) runtime shader compilation is a nonstarter. CoD won't even let you matchmake without compiled shaders, even tho the engine supports compiling them on demand.
Re: Why can't the pre-compiled ones be distributed (Score:2)
My limited understanding is that the compiled shader is specific to the card and driver. So there are probably a lot of combinations.
Re: (Score:2)
Yes, this. There are many *many* combinations. Distributed compilation and a remotely hosted shader cache would cost a lot of money to host. I don't think the technical considerations are as prohibitive as simply the cost of hosting the service.
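The combinatorial explosion can be put in back-of-envelope terms. All of the numbers below are illustrative guesses, not real counts, but they show how quickly the variant space grows for even one game:

```python
# Back-of-envelope: how many distinct precompiled shader-cache variants
# a central host might need to build and serve for a single title.
# Every number here is an illustrative guess, not real data.
gpu_models = 60       # distinct GPU SKUs still in active use
driver_versions = 12  # driver builds players are realistically running
game_builds = 8       # recent patch versions of one popular game

variants = gpu_models * driver_versions * game_builds
print(variants)  # 5760 distinct cache variants -- for one game
```

Multiply that across a storefront's whole catalog and the hosting bill, plus the compile farm needed to keep every variant fresh after each driver release, adds up fast.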
Re: (Score:2)
Microsoft announced Advanced Shader Delivery, which does that: delivering precompiled shaders, or partial ones when it's not possible to enumerate all combinations. They did it for the ROG Ally (obviously easier for fixed hardware) and now plan to extend it to PC. [1]https://devblogs.microsoft.com... [microsoft.com]
[1] https://devblogs.microsoft.com/directx/advanced-shader-delivery-whats-new-at-gdc-2026/
Re: (Score:2)
Oh, that's pretty neat. Microsoft is definitely the right level to address this at - they already have permission to enumerate the HW, own the hardware and software infra to tackle this, enjoy economies of scale other players are not privy to, and can deliver a solution in a vendor-agnostic way. Thanks for the heads up. It's the right thing to happen.
Re: (Score:2)
> Yes, this. There are many *many* combinations. Distributed compilation and a remotely hosted shader cache would cost a lot of money to host. I don't think it's the technical considerations that are as preventative as simply the cost of hosting the service.
What is the power consumption for doing this tens of millions of times, and how does the greenhouse gas emissions from that compare with the power consumption of running the servers? It seems like there are a lot of hidden costs in the current approach.
Re: (Score:2)
Of course there are. Tragedy of the commons. My point is that no single entity is likely to absorb the costs unless they're already enjoying economy of scale advantages and there are business experience/optic benefits to doing so. The poster above you pointed out that Microsoft seems to be addressing this, which makes a lot more sense to me than doing it at the 3d HW vendor level.
Re: (Score:2)
It's worth noting that many game studios/engines do support shared shader caches in their local studio pipelines, but the hardware config spread is much more limited, and the costs for lost productivity waiting for shaders is far greater than hosting a shader cache on premises.
Top of Kathy Ireland's Forehead Sexy Enough (Score:2)
So now, instead of taking a moment while the shaders compile to relax and eat some chips, maybe enjoy the local ambience of Mom's basement, you have to pause the game, eat some chips, then unpause? And you don't even get to read any helpful tips while you're eating? Sounds like a pain in the ass.
Re: (Score:2)
Bold of you to assume my mom isn't living in my basement!
Re: (Score:2)
If gaming is a pain in the ass, you're obviously using the game controller wrong!
Console master race stays winning (Score:2)
Console master race stays winning, you filthy PC gaming peasants.
BitTorrent (Score:3)
They need to implement BitTorrent or something. There's no reason everyone has to compile this shit themselves.
Re: (Score:2)
They can't be bothered to maintain any binary level compatibility within their drivers and instead make it your problem. But then that is exactly how the AI build-out works, they make it our problem and expense.
Re: (Score:2)
Asking people to host and serve a non-trivial amount of content to other players is a non-starter. (Compiled shaders for CoD can range from a couple gigs to 10 gigs.) A torrent-like network would have to be opt-in, and many people would just opt out (justifiably or not), minimizing the point of such a network.
You can probably assume that if you've thought of something, they've thought of it too. They simply have constraints and considerations - both technical and business oriented - you don't need or want to account for.
Re: (Score:2)
> Opting in to a torrent-like network would have to be opt in - many people would just opt out
Sure, but many people would opt in, especially if you explained that they would benefit.
> They simply have constraints and considerations - both technical and business oriented - you don't need or want to account for.
Yeah, it's added complexity they would have to support and maintain. That alone is sufficient reason not to do it frankly.
Re: (Score:2)
Sure, but many people would opt in, especially if you explained that they would benefit.
Maybe. Maybe not. Before committing to developing such a thing, you'd have to at least do some research and analysis to find out if that's true and how the likely opt in/out ratios would impact the business case. Remember, this is hosting content in a daemon on your machine... I think that'd be a non-starter for a lot of people, despite the upside of shorter shader updates. (I'm not super up on what the US ISP market/landscape looks like.)
Re: BitTorrent (Score:2)
Data caps are still a thing but they mostly control download. Most users are on cable now, this is generally asymmetric, so the upstream is mostly just limited by practical considerations. (Upstream and downstream frequencies must differ in DOCSIS, and they dedicate more bandwidth to downstream for obvious reasons.)
Re: (Score:2)
Also, a torrent-like network would be absolutely loaded with cache misses. You need to fetch a shader from somebody who has the exact same hardware/driver/game version combination as you do, and they need to have opted in. I highly suspect the majority case for many would be to cache miss and end up compiling locally.
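The miss-then-fall-back behavior being described is essentially an exact-match lookup with a local-compile fallback. A minimal sketch, with a plain dict standing in for the peer network and hypothetical names throughout:

```python
def fetch_or_compile(cache: dict, gpu: str, driver: str, game_build: str,
                     compile_locally) -> bytes:
    """Look up a precompiled shader blob by exact configuration.

    Any mismatch in GPU model, driver version, or game build is a miss,
    so peers only ever help players on the identical combination.
    """
    key = (gpu, driver, game_build)
    blob = cache.get(key)
    if blob is None:          # cache miss: fall back to compiling locally
        blob = compile_locally()
        cache[key] = blob     # seed the cache for identical peers
    return blob

peers = {("RTX 4070", "595.97", "1.2.3"): b"peer-blob"}
hit = fetch_or_compile(peers, "RTX 4070", "595.97", "1.2.3", lambda: b"local")
miss = fetch_or_compile(peers, "RTX 4070", "596.10", "1.2.3", lambda: b"local")
assert hit == b"peer-blob" and miss == b"local"
```

Because the key is an exact tuple, the hit rate depends entirely on how many opted-in peers share your precise configuration - which is the fragmentation problem the comment points out.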
Re: BitTorrent (Score:2)
That's a lot more common than you would think though because of automatic game and driver updates, and the march of upgrades necessary to play modern games. Most players of a particular game are on similar hardware.
Re: (Score:2)
> They need to implement BitTorrent or something. There's no reason everyone has to compile this shit themselves.
Technically it has to be done for every graphics card model out there.
Shaders are real programs, and your graphics driver ships with a compiler (usually based on LLVM) that takes those shader programs and produces the final binary for your specific video card. Now, usually the source code to the shaders is not shipped - instead they are in IR (intermediate representation), which is basically a portable bytecode that the driver's backend lowers to your card's native instructions.
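That two-stage model (source compiled once to a portable IR, then lowered per-device by a driver backend) has the same shape as Python's own source-to-bytecode split. The following is only an analogy - real shader IRs are formats like DXIL or SPIR-V, not Python bytecode:

```python
# Analogy only: shader pipelines ship a portable IR (e.g. DXIL, SPIR-V)
# that each driver's backend compiles to GPU-specific machine code.
# Python's front-end similarly compiles source to bytecode, which the
# interpreter (the "backend") then executes.
import dis

src = "x * x + 1"
ir = compile(src, "<shader-analogy>", "eval")  # front-end: source -> IR
dis.dis(ir)                                    # inspect the portable "IR"
result = eval(ir, {"x": 3})                    # backend: IR -> execution
assert result == 10
```

The key consequence carries over: the IR is shareable across machines, but the final lowered binary is tied to the specific "backend" that produced it.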
"switchable graphics" (Score:1)
i doubt this helps laptop users going between a dedicated nvidia gpu and amd integrated graphics on the cpu (my 890m on the hx370 proart is plenty capable for most of my games, and it's quieter; but if i plug in, the system switches to the nvidia and games have to do their recompile any time i launch after a swap). i hope infinity nikki gets its mac port this year, hopefully with the next update, 2.5... and i hope infold optimizes it enough to work as well on neo as it does on iphone
Would Rather Have SSD Longevity (Score:2)
I generally shut off shader caches. I have a Samsung 840 that has been used for video editing. Oddly, it still works. Although shader caches are a pittance compared to video editing, every little bit wears SSDs out. BTW, the 4TB Samsung 990 Series NVMe I was looking at six months ago went up from $385 to $621. In the end, even excessive logging wears SSDs, as a whole block is written for even a one-byte update.
Been there, done that? (Score:2)
Microsoft Windows has been doing something similar with the .NET Framework for years. Whenever there is a .NET update, a background task runs the JIT compiler across various files to reduce future app startup times. Android's modern ART AOT system does something similar. Improving the user experience (faster startup times) is a laudable goal.
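The same ahead-of-time idea can be demonstrated in miniature with Python's own bytecode precompilation: writing the .pyc during "idle time" means the first import skips the compile step. This is only an analogy for what NGen and ART's AOT compiler do - not their actual APIs:

```python
# Precompile Python source to cached bytecode ahead of first use,
# the same pay-the-compile-cost-early idea as .NET's background
# optimization task and Android ART's AOT compilation (analogy only).
import os
import py_compile
import tempfile

src = os.path.join(tempfile.mkdtemp(), "mod.py")
with open(src, "w") as f:
    f.write("ANSWER = 42\n")

pyc = py_compile.compile(src)        # the "precompile during idle" step
assert pyc is not None
assert os.path.exists(pyc)           # cached bytecode ready before first run
```

On the next import of the module, Python finds the up-to-date .pyc and skips recompilation - exactly the startup win the comment describes.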
SimCity (Score:1)
Reticulating splines...
That's a great way to defer driver updates! (Score:2)
So you update your drivers and then your PC inexplicably grinds for half an hour. This will only teach people to stop updating their drivers.
RealPlayer G2 (Score:2, Funny)
Buffering....