cm0002@lemmy.world to memes@lemmy.world · 16 days ago
Is 8GB a lot? Depends on the context. (lemmy.ml)
71 comments · cross-posted to: [email protected]
Anivia@feddit.org · 15 days ago

> Afaik for consumers only the 5090 has 32GB VRAM

Only if you don't count Apple Silicon with its shared RAM/VRAM. Ironically, a Mac Mini / Studio is currently the cheapest way to get a GPU with lots of VRAM for AI.