Why use consumer graphics cards to run such a narrow, AI-specific workload? There are dedicated cards built to accelerate machine learning that are highly potent with far less power draw than 3090s.
Well yeah, but 10x the price…
Not if it’s for inference only. What do you think the “AI accelerators” they’re putting in phones now are? Do you think they’d be as expensive or as power-hungry as an entire 3090 if they were putting them in small devices?
Yeah, show me a phone with 48GB RAM. That’s a big factor to consider. Actually, some people are recommending a Mac Studio because you can get it with 128GB of RAM or more, and it’s shared with the AI/GPU accelerator. Very energy efficient, but it sucks as soon as you want to do literally anything other than inference.
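To put those RAM numbers in perspective, here’s a rough back-of-envelope sketch (the function and the 70B-parameter example are my own illustration, not from the thread; real memory use also includes KV cache and runtime overhead):

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate memory needed just to hold the model weights.

    Ignores KV cache, activations, and framework overhead, so treat
    the result as a lower bound.
    """
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A hypothetical 70B-parameter model at 4-bit quantization:
print(weight_memory_gb(70, 4))   # 35.0 GB -> fits in 48GB (e.g. 2x 3090), not in any phone
# The same model at 16-bit precision:
print(weight_memory_gb(70, 16))  # 140.0 GB -> exceeds even a 128GB Mac Studio
```

That’s why total (unified) memory, not just compute, ends up deciding what you can run locally.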
Ok,
Show me a PCIe board that can do inference calculations as fast as a 3090 but costs less than a 3090.
I’d be interested (and surprised) too
Forget the board – can your wimpy-ass power supply handle the load?
No :(
I have a separate gaming PC and am considering just using that hardware for my NAS and creating a VM for gaming.
You have put yourself into this black hole lol.
“I might just get a- Oh god my gaming rig is now my secondary PC and my credit card hurts. How did this happen?!”
3090s snicker evilly in the background