Legacy 32-bit PhysX Removal Cripples Performance On New GPUs (youtube.com)
(Thursday March 13, 2025 @06:00AM, BeauHD,
from the would-you-look-at-that dept.)
- Reference: 0176698621
- News link: https://hardware.slashdot.org/story/25/03/13/0020246/legacy-32-bit-physx-removal-cripples-performance-on-new-gpus
- Source link: https://www.youtube.com/watch?v=h4w_aObRzCc
Longtime Slashdot reader [1]UnknowingFool writes:
> Gamers Nexus performed tests on the effect of removing legacy 32-bit PhysX support on the newest generation of Nvidia cards with older games, and the [2]results are not good. With PhysX on, the latest-generation Nvidia card was slightly beaten by a GTX 580 (released 2010) in some games and handily beaten by a GTX 980 (2014) in others.
>
> With the launch of the 5000 series, Nvidia dropped 32-bit CUDA support going forward. Part of that change was dropping support for 32-bit PhysX. As a result, older titles that use it perform poorly on 5000-series cards, as PhysX defaults to the CPU for calculations. Even the latest CPUs do not perform as well as 15-year-old GPUs when it comes to PhysX.
>
> The best performance on the 5080 came from turning PhysX off; however, that removes many effects like smoke, breaking glass, and rubble from scenes. The second-best option was to pair a 5000-series card with an older card like a 980 to handle the PhysX computations.
[1] https://slashdot.org/~UnknowingFool
[2] https://www.youtube.com/watch?v=h4w_aObRzCc
You know what to do kids (Score:2)
by greytree ( 7124971 )
Stick with your 3000 series until they fix this shit.
Re: (Score:2)
by Luckyo ( 1726890 )
It's unlikely to get fixed. NV has heavily deprioritized the GPU sector and moved developers to the AI segment instead over the last couple of years. Dropping 32-bit CUDA is likely part of a wider effort to dump old, largely unused feature support, with the likely goal of simplifying driver support (i.e. providing working drivers with less work, given fewer engineers allocated to the GPU sector).
The fewer features you need to support in drivers, the less work is needed to keep churning out new drivers.
Could this be solved by a wrapper? (Score:3)
How hard would it be for Nvidia to release a 32-bit PhysX driver that just uses the 64-bit PhysX under the hood to provide some backwards compatibility?
Re: Could this be solved by a wrapper? (Score:2)
Sure. We just need Nvidia to open source their proprietary PhysX protocol.