- cross-posted to:
- hardware@lemmy.ml
I wish they would release cheapish cards with huge cheap VRAM for consumer AI. But I guess even the cheapest memory is too expensive for that.
28GB? Are they really so stingy that they can't pull the trigger on 32GB for their top-of-the-line card? The 3090 had 24GB, the 4090 still had 24GB, and now they add a measly 4GB more? This tells me they are probably going to keep VRAM mostly the same on the lower cards as well.
I believe this is tied to the memory bus size (which is tied to the overall architecture design).
I believe a 448-bit memory bus doesn't actually allow 32GB. With one 2GB chip per 32-bit channel you get 14 × 2GB = 28GB, and the next step up (two chips per channel) would have to be 56GB.
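The arithmetic behind that works out like this (a rough sketch, assuming one 32-bit channel per chip and 2GB GDDR7 chips; 3GB chips would change the numbers):

```python
def vram_gb(bus_width_bits, gb_per_chip=2, clamshell=False):
    # One memory chip per 32-bit channel on the bus.
    chips = bus_width_bits // 32
    if clamshell:
        # Clamshell mode puts two chips on each channel, doubling capacity.
        chips *= 2
    return chips * gb_per_chip

print(vram_gb(448))                  # 28 -- the rumored config
print(vram_gb(448, clamshell=True))  # 56 -- the only way to exceed 28GB on this bus
print(vram_gb(384))                  # 24 -- matches the 3090/4090
```

So with a 448-bit bus there's simply no chip arrangement that lands on 32GB; that figure would require a 512-bit bus or non-standard 2.29GB chips.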