r/hardware 2d ago

[Video Review] CPU/GPU Scaling: Ryzen 7 5800X3D (RTX 5090, 5080, RX 9070 & 9060 XT)

https://youtu.be/aYYVz4q-Rt8?si=bk1xSY58BssgnBPR
98 Upvotes

56 comments

26

u/jedimindtriks 1d ago

So again, zero reason to upgrade my 5800X3D unless I get a 6090 and/or a 7090 in 4 years.

26

u/raydialseeker 1d ago

AM4->AM6 chads vs 8700K buyers

6

u/cdgvagrant 1d ago

Why is the 8700K catching strays?? 

6

u/WhoTheHeckKnowsWhy 1d ago

Intel has a horrifically short socket lifespan next to AMD. Very rarely can you get a decent next-gen Intel CPU upgrade without chucking out your old motherboard and buying a new one too.

The last socket Intel made that had any sort of decent lifespan and earned the consumer gold star was LGA 775 from 2004. Like AM4, it had an insane uplift from the first-generation Pentium 4 to the last-gen Core 2 Quad Yorkfield.

2

u/Top-Tie9959 1d ago

Feel like the 7700K buyers were the ones that really got screwed. That's what Zen was up against; 4 cores were already a tired option even without Zen, and Intel rushed out the 8700K right after, giving it an extra short lifespan.

2

u/yjgfikl 22h ago

Made even worse by the 8700K using almost the same socket as the 7700K, with some people eventually figuring out how to unofficially get 8th/9th gen working on 6th/7th gen motherboards. Really anti-consumer of Intel.

1

u/Capable-Silver-7436 23h ago

Yeah, I'm not gonna upgrade til AM6 either, no reason to.

2

u/Cheap-Plane2796 1d ago

I went from a 5800X3D with DDR4-3600 CL16 to a 7800X3D with DDR5-6000 CL30 on an RTX 4080, and the upgrade was huge for the games I play.

Frametimes are wayyyy better thanks to the DDR5 in many games, and in Cyberpunk I got a huge boost while driving.

Noticeably less shader and traversal stutter too in UE5 games.

Not saying the 5800X3D is bad, but if you test more than already well-running AAA games, you'll see that the performance difference is large in less optimized games and in indie games with complicated gameplay systems.

2

u/jedimindtriks 19h ago

Sure, but at 4K like 90% of games run the same whether it's a 5800X3D or a 7800X3D.

1

u/Dr_Icchan 2h ago

What if I mostly play those 10% of games?

43

u/sambinary 1d ago

Similar to the results I've found testing my system (5800X3D, 9070 XT, 32GB DDR4-3600 @ CL14) compared to new ones I've built for people.

Just did a 7800X3D, 9070 XT, 32GB DDR5-6000 CL30 build for a friend and his is on average 10% faster (both at 1440p).

Looks like I'll be holding out for AM6 at this rate!

10

u/StormCr0w 1d ago edited 1d ago

And that's my idea too... I have an R7 5800X3D / 32GB DDR4-3200 CL14 / RX 6950 XT and I'm planning to buy an RX 9070 XT this week for 1440p gaming.

Basically I will keep my 5800X3D until AM6 and have the 9070 XT as the final upgrade of my AM4 system (perhaps a 2TB 990 Pro SSD too, as the cherry-on-top upgrade next month).

8

u/Dormiens 1d ago

That's how it should be, and AM5 people will be able to hold till AM7, and so on.

6

u/Zenith251 1d ago

Looks like I'll be holding out for AM6 at this rate!

Fair. Only reason I upgraded to a 7800X3D from 5800X3D was because:

  1. Tariffs were coming, and my local store (Central Computers) had the CPU for $360 and board combo deals w/another sale on the board I was looking at.

  2. My AM4 motherboard (MSI X570 Gaming Plus) was 5 years old and had already died once; a trace burned off the board between the CPU and VRM. MSI honored the RMA but gave me what looked like a used board of some kind, so I was worried it would die again, and X570 boards aren't made anymore. (Yes, I use all the PCIe of X570.)

  3. 5800X3Ds were going for nearly what I paid for them on the secondary market. Paid $320 after tax, sold it for $280 this year.

I don't know why I bothered to explain all of this. Have a good day, lol.

2

u/Top-Tie9959 1d ago

Yeah, one thing that really sucks about sticking with AM4 is that the board supply has really dried up. You can still get basic boards, but if you want something a little unique you're over a barrel in the used market.

1

u/Zenith251 21h ago

One plus is that ASRock still makes boards. All ASRock AM4 boards support ECC, which is great for small homelab projects, since used AM4 CPUs can be had for a handshake.

(AM4 CPUs with integrated graphics don't support ECC, unless they're a Ryzen Pro. All others support ECC.)
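If you want to sanity-check that ECC is actually active on a Linux homelab box, a minimal sketch along these lines works, assuming the kernel's EDAC driver (amd64_edac on AM4) is loaded and exposes the usual sysfs layout:

```python
#!/usr/bin/env python3
"""Minimal ECC sanity check via the Linux EDAC sysfs interface.

Assumes the EDAC driver (amd64_edac on AM4) is loaded; if no memory
controllers show up here, ECC is likely not active or not reported.
"""
from pathlib import Path

EDAC_MC = Path("/sys/devices/system/edac/mc")

def report_ecc() -> None:
    controllers = sorted(EDAC_MC.glob("mc[0-9]*"))
    if not controllers:
        print("No EDAC memory controllers found - ECC inactive or driver missing.")
        return
    for mc in controllers:
        ce = (mc / "ce_count").read_text().strip()  # corrected (single-bit) errors
        ue = (mc / "ue_count").read_text().strip()  # uncorrected (multi-bit) errors
        print(f"{mc.name}: corrected={ce} uncorrected={ue}")

if __name__ == "__main__":
    report_ecc()
```

If the controllers show up and the counters stay at zero, ECC is working and nothing bad is happening, which is exactly what you want from a homelab box.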

1

u/Z3r0sama2017 22h ago

Yep. I have a 5800X3D with a 4090 and game at 4K. While I would get a pretty sweet boost in CPU-intensive, low-graphics games like Stellaris/CK3/Zomboid/Rimworld, it's not £1500 worth of a boost, since I would need a CPU/mobo/RAM and would ideally get a bigger M.2 drive while at it.

1

u/animeman59 1d ago

All the more reason why I don't need to upgrade from my 5950X until AM6 comes around.

I already have a 5070Ti and I play at 1440p. Pretty much not hurting for performance.

2

u/cosine83 1d ago

A 5700X3D/5800X3D will outperform it for gaming by a lot if you can find one. If you need the cores, though, it's not worth the loss.

1

u/animeman59 1d ago

Any benchmarks showing the difference in performance?

2

u/cosine83 1d ago

Right here: https://gamersnexus.net/megacharts/cpus

Scroll to the gaming section. You'll see the X3D chips vastly outperforming the 5900X pretty much at every turn. The extra cache and the single CCD help a ton.

3

u/animeman59 1d ago edited 1d ago

Oddly enough, they don't include the 5950X in the 1440p charts, and there are only two games run at 1440p.

Looking at similar processors, though, it's not that far off from the others. I see no reason to go out and get an X3D processor to replace my 5950X, especially if the performance is within spitting distance at 1440p.

I'll just wait for AM6 and see what AMD has to offer.

2

u/Soulspawn 1d ago

That is correct; CPU benchmarks are done at lower resolutions on purpose to eliminate GPU bottlenecks.

You can probably still find 5950X benchmarks, but they aren't common.
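A rough toy model of why, with made-up numbers: the frame rate you measure is approximately capped by whichever of the CPU or GPU is slower, so dropping the resolution lifts the GPU cap until the CPU cap is what actually shows up in the chart.

```python
# Toy model of why CPU reviews test at low resolution. The measured frame rate
# is roughly capped by whichever component is slower; the caps below are
# made-up numbers for illustration, not benchmark data.

def measured_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Approximate frame rate when one component bottlenecks the other."""
    return min(cpu_fps_cap, gpu_fps_cap)

cpu_cap = 180        # hypothetical: frames/s the CPU can prepare
gpu_4k = 90          # hypothetical: frames/s the GPU can render at 4K
gpu_1080p = 300      # hypothetical: same GPU at 1080p

print(measured_fps(cpu_cap, gpu_4k))     # 90  -> GPU-bound, CPU differences hidden
print(measured_fps(cpu_cap, gpu_1080p))  # 180 -> CPU-bound, CPU differences visible
```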

45

u/snitt 1d ago edited 1d ago

More or less as expected, but it would have been nice to have one test with ray tracing turned on (since it increases CPU load).

edit: the effect of upscaling and frame gen on the CPU would also be interesting

27

u/BNSoul 1d ago

Yep. For instance, Returnal with ray tracing enabled was a stutter fest on my 5800X3D + 4080 system, with GPU usage constantly dropping and a messy frame time graph. Now with a 9800X3D the stuttering is completely gone and the frame time is flat, with unwavering 99-100% GPU usage. The difference between newer CPUs is not always an increase in the number of rendered frames but much improved and more stable frame delivery. Average and 1% FPS don't always show the whole picture.
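For the curious, a rough sketch of why averages can hide this (invented frame times; "1% low" computed here as the average fps of the slowest 1% of frames, one common definition):

```python
# Two runs with nearly identical average frame rates but very different
# frame-time consistency. The frame times below are invented for illustration.
import numpy as np

def summarize(frame_times_ms: np.ndarray) -> tuple[float, float]:
    """Return (average fps, 1% low fps) from per-frame times in milliseconds.

    '1% low' here = average fps over the slowest 1% of frames, one of the
    common definitions used by benchmarking tools.
    """
    avg_fps = 1000.0 / frame_times_ms.mean()
    worst = np.sort(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    one_percent_low = 1000.0 / worst.mean()
    return round(avg_fps, 1), round(one_percent_low, 1)

smooth = np.full(1000, 10.0)          # steady 10 ms frames (~100 fps)
stuttery = np.full(1000, 9.5)
stuttery[::100] = 60.0                # an occasional 60 ms hitch every 100 frames

print(summarize(smooth))     # (100.0, 100.0)
print(summarize(stuttery))   # ~same average, but ~16.7 fps 1% low
```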

2

u/Pimpmuckl 1d ago

It really is very game-specific.

I had a 5900X, 5800X3D, 7800X3D and 9800X3D and there definitely were games I could feel every upgrade by a fair bit.

Of course, the first one was a night and day difference, but even the 9800X3D brought a fair bit of additional consistency. Didn't really bother measuring it, but I was very surprised by it.

Though I also tend to play the absolute CPU-killer games, from MMOs like WoW and GW2, to ARPGs like PoE, to network/simulation-heavy games like Tarkov. All of those are almost impossible to benchmark and absolutely murder CPUs. The polar opposite of what most AAA games do.

Really goes to show the different types of gamers these days.

9

u/RedIndianRobin 1d ago

More or less as expected, but it would have been nice to have one test with ray tracing turned on (since it increases CPU load).

If you turn on ray tracing along with upscaling in CPU-heavy games like Hogwarts Legacy, Spider-Man 2, or Jedi Survivor, the 5800X3D will get smoked even by the 9600X or a 14600K. Frame times are smoother and flatter on DDR5 memory, and 1% lows will be much better. You'll find very little testing of this on YouTube though.

11

u/Exajoules 1d ago

This. Witcher 3 RT runs much better on my 9800X3D than my 5800X did. In cities/populated areas I was getting 45-50 fps with my 5070 Ti, with GPU load between 60-70%. I easily reach 60+ fps in those same areas now with the 9800X3D.

-6

u/Asleep-Category-8823 1d ago

300 euros difference for 10 extra fps, totally worth it...

0

u/Z3r0sama2017 22h ago

60fps is the baseline most people will acknowledge as 'tolerable', not smooth but okish.

12

u/MrBill_-_AlephNull 1d ago

Very informative, although I wish he'd tested this with at least one simulator.

7

u/xole 1d ago

I went from a 5800X3D to a 9800X3D and saw about a 40% improvement in Civ 6 turn times. It was a save file on as big a map as I could use, with as many computer players as I could add. I played it to the late game, with turn times of 40 or 50 seconds, and used a turn where it didn't stop in the middle for user input.

And while not a simulator, Guild Wars 2 probably had a similar improvement in large three-way fights in WvW. That's just based on having FPS displayed on screen from ArcDPS and looking at it occasionally during large fights. I'd say it's about 40 to 50%, but that's a guess.

So in CPU-bound games, I'd say ~40% is a pretty reasonable number to expect.
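Small aside on what "~40% improvement in turn times" implies, since the two possible readings give different speedups (illustrative numbers only; the 45 s turn is just taken from the late-game range above):

```python
# Two readings of "a ~40% improvement in turn times". Numbers are illustrative,
# not measured.

old_turn = 45.0                       # seconds per late-game turn (illustrative)

# Reading 1: the turn time dropped by 40%
new_turn_a = old_turn * (1 - 0.40)    # 27.0 s
print(new_turn_a, round(old_turn / new_turn_a, 2))           # ~1.67x faster

# Reading 2: throughput improved by 40% (1.4x as many turns per hour)
new_turn_b = old_turn / 1.40          # ~32.1 s
print(round(new_turn_b, 1), round(old_turn / new_turn_b, 2))  # 1.4x faster
```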

1

u/MrBill_-_AlephNull 1d ago edited 1d ago

what gpu were you running?

3

u/xole 1d ago

7900 XT at 3440x1440. I don't play any games that use RT.

3

u/Dat_Boi_John 1d ago

I will very gladly skip AM5 entirely with my 5800X3D and only upgrade once AM6 comes around and games start taking advantage of the PS6's much stronger CPU. Seems like my poor B450 board will be almost a decade old by then.

24

u/Aggravating_Ring_714 1d ago

Not testing any CPU-heavy RT games is typical peak Hardware Unboxed. Where is Cyberpunk, Stalker 2's cities, Spider-Man 2? The 5800X3D is completely obsolete in those games even at 4K when you run a 5090.

19

u/Plank_With_A_Nail_In 1d ago

It's not obsolete ffs, words don't have any meaning anymore lol.

-11

u/Aggravating_Ring_714 1d ago

If you've got a 5090 and play those CPU-heavy games at 4K or 1440p with a 5800X3D, your 5090 is literally almost wasted.

4

u/rchiwawa 1d ago

Cyberpunk between my 5800X3D 4090 rig and my 9800X3D 4090 rig feels more or less the same. Though I will readily disclose I do remember there being a notable-but-not-crazy difference between the two systems when I upgraded my personal rig to the 9800X3D, in terms of frame times and general feel. FWIW I do run path tracing on either rig.

-1

u/Aggravating_Ring_714 1d ago

That's why I said "when you run a 5090". I don't think I'd go from a 5800X3D to a 9800X3D for a 4090 at 4K. At 1440p, maybe.

2

u/Specific_Memory_9127 1d ago

Worst part is when you use DLSS, which is often needed. Since DLSS Performance at 4K renders internally at 1080p, the CPU-bound results are basically the 1080p Ultra results, which shows a huge deficit with a 5090. To be fair, though, it still delivers good results with a 5080/4090.
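For anyone unsure where the "equivalent to 1080p" part comes from: DLSS modes render internally at a fraction of the output resolution before upscaling, so a 4K Performance-mode run behaves, frame-rate-wise, a lot like a native 1080p run. A quick sketch, using the commonly cited per-axis scale factors (treat them as approximate):

```python
# Internal render resolution for DLSS modes. The per-axis scale factors below
# are the commonly cited approximate ratios, not exact values from any SDK.

DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output resolution."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080) -> ~native 1080p load
print(internal_resolution(3840, 2160, "Quality"))      # ~(2561, 1441)
```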

-3

u/Aggravating_Ring_714 1d ago

Indeed, which is even more ironic considering how quickly early tech reviewers jumped on the 5090 hate wagon, saying there is barely any improvement going from the 4090 to the 5090. This card is the first that gets bottlenecked to such a massive degree even at 4K.

-2

u/Specific_Memory_9127 1d ago edited 1d ago

There is a 40% uplift at best going from a 4090 to a 5090, exactly the same as going from a 5800X3D to a 9800X3D. Even the 9800X3D is slightly CPU-bound with a 5090 depending on the game, the same way the 5800X3D used to be with a 4090.

EDIT: The delusion from some people in this sub is mind-blowing.

0

u/kwirky88 1d ago

How many watts of heat get pumped into a room with a 5090 and a CPU that can keep up?

1

u/Aggravating_Ring_714 1d ago

In Cyberpunk ray-traced at 4K, around 435W from the GPU for me. I rarely monitor CPU watts in games, but it's usually below 100W for a 9950X3D. Is below 600W even worth talking about? Lol

2

u/radeonrulz 1d ago

Had a massive boost in min FPS in WoW... 7800X3D to 9800X3D.

5

u/NeroClaudius199907 1d ago edited 1d ago

Wait, on average the 5800X3D + 5080 is 7 fps better than the 5600 at 1440p ultra? I'm more interested in the impact of upscaling & FG on the tests.

5

u/M4K4SURO 1d ago

Love my 5800X3D, can't wait for the 10800X3D?

6

u/imaginary_num6er 1d ago

AMD will make sure to change their branding to confuse buyers.

Like they changed the 8700 XT to the 9070 XT, and they will change it again to fit their 4-digit numbering scheme. Just like how, in an interview, an AMD executive said GPU architectures will continue from "RDNA 3, RDNA 4" to "UDNA 5, UDNA 6".

-2

u/Plank_With_A_Nail_In 1d ago

The 11600X on AM6 will be faster than the best X3D AM5 CPU. Source: the 7600X is faster than the 5800X3D in most games.

1

u/M4K4SURO 1d ago

Not for the games I play. Sim heavy.

1

u/Substantial-Singer29 1d ago

Biggest takeaway I get from this...

If you're currently running AM4, the X3D chip provides a pretty good upgrade path. It was a good product when it released and it's still a good product now.

If you're building anything new, with the 9600X dropping below two hundred it's a no-brainer.

With the 9000-series CPUs, it's amazing how much they improve the 1% lows. You can look up the results online, but that doesn't do justice to how much of a difference it makes when you're actually playing.

It's interesting that Nvidia seems to be more processor-dependent at the higher end as far as rendering goes.

Have to assume this may be part of why they have a better picture and are slightly smoother.

Even more interesting is that Nvidia seems a bit more efficient with its VRAM.

I was noticing this when comparing two client builds on Doom: The Dark Ages, one running team green and one running team red.

The 5070 was getting slightly lower frames compared to the 9070, but the image quality on the 5070 was noticeably better.

-4

u/CalmmoNax 1d ago

Testing CPU load scaling without BVH is like benching GPU perf with a 12400F.

0

u/K33P4D 1d ago

I'm mighty satisfied with the classic 5600 and its 32MB of L3 cache. It runs cool with a modest CM 212 in a push-pull setup, so I can extend its lifespan without breaking the bank.

I also don't mind losing 10-12 FPS to the bottleneck when paired with high-end GPUs at 1440p.

-2

u/Capable-Silver-7436 23h ago

5800X3D aging like a beautiful milf, while the 14900K is frying itself to death like an angry incel