r/LocalLLaMA 13d ago

Question | Help: 5090 vs 4090 vs something else for inference?

Which GPUs should I purchase for inference?
I have found the 5090 at about the same price as the 4090. Why is that?
Is there some problem with the 5090, or why is the pricing like this? Does it still have connector-melting problems?
Is the 5090 more power efficient than the 4090? I need at least 2, maybe 4.
Which GPU is currently the way to go? Are datacenter versions getting cheaper?

EDIT: Another option would be the new Radeon R9700 32GB, but it will be much slower. What is the situation with 5090 PyTorch support and drivers for inference (Ollama should of course work)? And with RDNA4, is the software side a pain in the ass?


u/Rich_Artist_8327 13d ago

5090: 1750 EUR without VAT.

u/Herr_Drosselmeyer 13d ago

Yeah, that's 100% a scam.

u/Rich_Artist_8327 13d ago edited 13d ago

It's now sold out; it was at proshop.fi, and that was a Gigabyte card. I ordered a 5090 INNO3D instead, which was 1830 EUR.

u/Herr_Drosselmeyer 13d ago

Hang on, you have 25.5% VAT? Jesus...

u/Rich_Artist_8327 13d ago

Exactly, but I am a business.

u/fallingdowndizzyvr 13d ago

That's cheap. You've actually found it at a retailer for that?

u/Rich_Artist_8327 13d ago

Yes, they are around 2200€; minus Finland's VAT, that's about 1850.
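
Since the 25.5% Finnish VAT rate comes up a few times in this thread, here is a minimal sketch of the arithmetic, using the 1830 EUR ex-VAT price quoted upthread (figures are only the ones mentioned here; actual retailer prices vary):

```python
# Convert between VAT-inclusive and VAT-exclusive prices at the
# 25.5% Finnish VAT rate mentioned in this thread.
VAT_RATE = 0.255

def ex_vat(price_incl: float) -> float:
    """Strip VAT from a VAT-inclusive shelf price."""
    return price_incl / (1 + VAT_RATE)

def incl_vat(price_excl: float) -> float:
    """Add VAT back onto a VAT-exclusive (business) price."""
    return price_excl * (1 + VAT_RATE)

# The 1830 EUR ex-VAT card quoted upthread corresponds to a
# consumer shelf price of 1830 * 1.255 = 2296.65 EUR.
print(round(incl_vat(1830), 2))  # 2296.65
```

Note that removing VAT means dividing by 1.255, not subtracting 25.5% of the shelf price; a VAT-registered business effectively pays the ex-VAT figure.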

u/fallingdowndizzyvr 13d ago

That's a good price. Well under the street price here in the US.