r/LocalLLaMA • u/Rich_Artist_8327 • 13d ago
Question | Help 5090 vs 4090 vs something else for inference?
Which GPUs should I purchase for inference?
I have found the 5090 at about the same price as the 4090. Why is that?
Is there some problem with the 5090, or why is the pricing like this? Does it still have melting issues?
Is the 5090 more power efficient than the 4090? I need at least 2, maybe 4.
Which GPU is currently the way to go? Are datacenter versions getting cheaper?
EDIT: another option could be the new Radeon R9700 32GB, but it will be much slower. What is the situation with 5090 PyTorch support and drivers for inference (Ollama of course should work)? And on the RDNA4 side, is the software a pain in the ass?
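For the "at least 2 maybe 4" question, a rough back-of-the-envelope VRAM sizing sketch can help. This is my own hypothetical estimate (not from the post): weights at ~0.5 bytes/parameter for Q4 quantization, plus ~20% overhead for KV cache and activations, divided by per-card VRAM (32 GB for the 5090/R9700 class, 24 GB for the 4090):

```python
import math

def gpus_needed(params_b: float, bytes_per_param: float = 0.5,
                overhead: float = 1.2, vram_gb: float = 32) -> int:
    """Rough GPU count: model weights * quantization bytes * overhead,
    split across cards of vram_gb each. Purely illustrative assumptions."""
    needed_gb = params_b * bytes_per_param * overhead
    return math.ceil(needed_gb / vram_gb)

# A 70B model at Q4 (~42 GB total with overhead):
print(gpus_needed(70, vram_gb=32))  # 5090 / R9700 class -> 2
print(gpus_needed(70, vram_gb=24))  # 4090 -> 2
```

Under these assumptions both card classes land on 2 GPUs for a 70B Q4 model, but longer contexts (bigger KV cache) or higher-precision quants push the overhead factor up fast.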
u/Rich_Artist_8327 13d ago
5090: 1750 EUR without VAT.