News: 0176698621


Legacy 32-bit PhysX Removal Cripples Performance On New GPUs (youtube.com)

(Thursday March 13, 2025 @06:00AM (BeauHD) from the would-you-look-at-that dept.)


Longtime Slashdot reader [1]UnknowingFool writes:

> Gamers Nexus performed tests on the effect of removing legacy 32-bit PhysX on the newest generation of Nvidia cards with older games, and the [2]results are not good. With PhysX on, the latest-generation Nvidia card was slightly beaten by a GTX 580 (released 2010) in some games and handily beaten by a GTX 980 (2014) in others.

>

> With the launch of the 5000 series, Nvidia dropped 32-bit CUDA support going forward. Part of that change was dropping support for 32-bit PhysX. As a result, older titles that use it perform poorly on 5000 series cards, since PhysX calculations fall back to the CPU. Even the latest CPUs do not perform as well as 15-year-old GPUs when it comes to PhysX.

>

> The best performance on the 5080 came from turning PhysX off entirely; however, that removes many effects, such as smoke, breaking glass, and rubble, from scenes. The second-best option was to pair a 5000 series card with an older card like a 980 to handle the PhysX computations.



[1] https://slashdot.org/~UnknowingFool

[2] https://www.youtube.com/watch?v=h4w_aObRzCc



Could this be solved by a wrapper? (Score:3)

by Racemaniac ( 1099281 )

How hard would it be for Nvidia to release a 32-bit PhysX driver that just uses the 64-bit PhysX under the hood to provide some backwards compatibility?

Re: Could this be solved by a wrapper? (Score:2)

by Carewolf ( 581105 )

Sure. We just need Nvidia to open source their proprietary PhysX protocol.

You know what to do kids (Score:2)

by greytree ( 7124971 )

Stick with your 3000 series until they fix this shit.

Re: (Score:2)

by Luckyo ( 1726890 )

It's unlikely to get fixed. NV has heavily deprioritized the GPU sector and moved developers to the AI segment over the last couple of years. Dropping 32-bit CUDA is likely part of a wider effort to dump old, largely unused features, with the likely goal of simplifying driver support (i.e., providing working drivers with less work, since fewer engineers are allocated to the GPU sector).

The fewer features you need to support in drivers, the less work it takes to keep churning out new drivers.
