Thank you, very insightful.
Really the big distinguishing feature is VRAM. We consumers just don't have enough. If I could have a 192GB VRAM system I could probably run a local model comparable to what OpenAI and others offer, but here I am with a lowly 12GB.
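For a rough sense of why 12GB falls short, here's a back-of-envelope sketch (the helper name and the weights-only simplification are my own; real usage adds KV cache, activations, and runtime overhead on top):

```python
# Rough VRAM needed just to hold model weights at a given quantization.
# Ignores KV cache, activations, and framework overhead, which add more.
def weights_vram_gb(params_billions: float, bits_per_weight: int) -> float:
    # 1 billion params * bits/8 bytes each, reported in GB (1 GB = 1e9 bytes)
    return params_billions * bits_per_weight / 8

for params in (12, 70, 180):
    for bits in (16, 4):
        print(f"{params}B params @ {bits}-bit: ~{weights_vram_gb(params, bits):.0f} GB")
```

Even at aggressive 4-bit quantization, a 70B model needs on the order of 35GB for weights alone, so a 12GB card tops out around the 12–13B class, while 192GB would comfortably fit much larger models.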

And the Tesla