Forum Discussion
i7 6700 and RTX 2080, and I'm getting 30 fps at every setting.
November 12 - up-to-date driver.
CPU usage: 100%, GPU usage: 30%.
On an i5 6500 and a 1060 it runs way better than on modern PCs.
- TonicPlayer · 5 years ago · New Adventurer
I'm using an RTX 2080 at 1440p and I'm getting low fps (around 70-80) most of the time, with 100 fps only in wide open spaces (even with DLSS enabled, which I don't think is working anyway).
My whole PC has already frozen twice (which forced me to restart it), and I also had a DirectX crash (see the screenshot).
I'm also using a Ryzen 3700X and 32 GB of RAM, so I shouldn't have many problems running this game.
I suspect this could be due to a bad implementation of DX12 and DLSS... I honestly don't know what to do. I didn't have these problems in previous games.
- 5 years ago
i7 9700K, 32 GB RAM, RTX 3090. 720p, 1080p, or 1440p at ultra, high, or low settings: same FPS, 45 to 90 depending on how graphically heavy the area is. GPU usage only goes up to 65%.
- Hrafnsvart · 5 years ago · Rising Novice
Horrible 2042 performance today.
This is the same issue I had in the beta. My game is eating up my CPU (R7 2700X) and not even touching my GPU (RTX 2080), resulting in heavy throttling at ~35 fps no matter what graphical settings I choose. I understand that my CPU is a little behind when it comes to top-of-the-line tech, but it shouldn't be THIS bad, right? The main menu performs completely fine, as expected, at 120 fps or more, with normal CPU and GPU usage. But as soon as I get into a match, it drops the GPU and only wants to use my CPU.
The attached screenshot was taken while I was in a Conquest match on Hourglass with the "Medium" graphics preset. I don't mean to talk trash about the game or the devs, but this has to be the worst optimisation I've ever seen in a game, much less an AAA title. I have never in my life had a video game use literally none of my GPU. Even L4D2 uses a chunk of my GPU, and that game is 12 years old!
Is there any sort of workaround to this issue? Or am I just waiting for the devs to fix it?
- 5 years ago @Hrafnsvart I have the same problem on an Intel i7 9700K + RTX 2070 Super. 35-40 fps on any graphical settings. I don't understand how that's possible.
- Hrafnsvart · 5 years ago · Rising Novice @Sirinitar Devs cutting corners, I guess.
- 5 years ago
@kiki01hun 2080 Ti and 8700K here. 60-80 FPS on all low settings at 1080p, with dips to 35-50. Super choppy, feels clunky as hell. It blows lmao. Terrible performance, I can't believe they released it. On top of that, server issues now.
- NetskySmile · 5 years ago · New Traveler @kiki01hun Same bottleneck issue - 2080 Ti, i7 9700K. It was the same in the beta too.
- AndersonLoko08 · 5 years ago · Seasoned Newcomer
I'm having performance issues too, but I found some console commands that help get more FPS.
In the game I set everything to minimum and the resolution to 1024x768 at 60 Hz.
Open the game console and type the following (I have to enter them every time I open the game; see the user.cfg sketch at the end of this post for a possible way to automate that):
Render.ResolutionScale 0.25 - Sets a fixed resolution scale as a fraction; 0.25 means 25%, for 50% use 0.5
Render.ForceDx11 true - As far as I can tell, this forces DX11 instead of the native DX12
WorldRender.LightTileCsPathEnable false - Turns off some lights, making the game darker
WorldRender.TransparencyShadowmapEnable false - This causes the shadows to be non-transparent, making some places too dark
WorldRender.MotionBlurRadialBlurMax 0 - Disables motion blur
WorldRender.MotionBlurQuality 0 - Drops motion blur quality to the lowest level
Thread.MaxProcessorCount 6 - The default value was 8, but my processor has 6 cores, so I changed it to match.
My config:
i5-9400F
16 GB DDR4 dual channel (8 GB + 8 GB) 2666 MHz
NVIDIA GALAX GTX 1060 3 GB OC
Windows 10 Pro
7200 RPM HDD
I get some of my ideas from LowSpecGamer.
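A minimal sketch of automating those commands, assuming Battlefield 2042 still reads a user.cfg file at launch the way earlier Frostbite Battlefield titles (BF4, BF1, BFV) did; the file name and location are carried over from those older games and may not apply to 2042. Create a plain-text file named user.cfg in the game's install folder (next to the game executable) containing the same commands, one per line:

Render.ResolutionScale 0.25
Render.ForceDx11 true
WorldRender.LightTileCsPathEnable false
WorldRender.TransparencyShadowmapEnable false
WorldRender.MotionBlurRadialBlurMax 0
WorldRender.MotionBlurQuality 0
Thread.MaxProcessorCount 6

If the game ignores the file, typing the commands in the console each session, as described above, still works.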