No!
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
Yes it is!
Ooh!
Local AI kind of sucks right now, and everything is massively over-branded as "AI-ready" these days.
There aren't many compelling local use cases, and the memory constraints of local hardware mean you end up with fairly weak models.
You need a high-end, high-memory local setup to get decent token rates, and right now I'm finding 30–70B-parameter models are the minimum viable size.
That doesn't compare with the speed of online models running on GPUs that cost more than luxury cars.
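The memory squeeze the comment above describes is easy to put numbers on. A rough back-of-envelope sketch (my own figures, not from the comment): weight memory is roughly parameter count times bytes per parameter, ignoring KV cache, activations, and runtime overhead, which only make it worse.

```python
# Approximate memory needed just to hold model weights at common
# quantization levels. Ignores KV cache and runtime overhead.
def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 2**30

for params in (30, 70):
    for name, bpp in (("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)):
        print(f"{params}B @ {name}: ~{weights_gib(params, bpp):.0f} GiB")
```

Even a 30B model at 4-bit quantization (~14 GiB) barely fits a 16 GiB consumer card, and a 70B model at fp16 (~130 GiB) is out of reach for anything short of multi-GPU server hardware, which is why "AI-ready" consumer boxes end up running small, weak models.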
AI bro here. The reason their shit ain't selling is that it's useless for any actual AI application. AI runs on GPUs; even an "AI CPU" will be far slower than what an Nvidia GPU can do. Of course no one buys it. Nvidia's GPUs still sell very well, and not just because of the gamers.
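The CPU-vs-GPU gap above mostly comes down to memory bandwidth: autoregressive decoding streams essentially all the weights once per token, so throughput is roughly bandwidth divided by weight size. A rough sketch, with assumed ballpark bandwidth figures (dual-channel DDR5 vs a high-end GDDR6X card):

```python
# Ballpark only: decoding is memory-bandwidth bound, so
# tokens/sec ~= memory bandwidth / bytes of weights streamed per token.
# Bandwidth numbers below are assumed, illustrative figures.
def tokens_per_sec(bandwidth_gb_s: float, weights_gb: float) -> float:
    return bandwidth_gb_s / weights_gb

model_gb = 17.5  # assumed: ~30B params at 4-bit, plus some overhead
for name, bw in (("CPU, dual-channel DDR5", 80.0),
                 ("GPU, GDDR6X", 1000.0)):
    print(f"{name}: ~{tokens_per_sec(bw, model_gb):.0f} tok/s")
```

On those assumed numbers the GPU comes out more than 10x faster for the same model, with no extra cleverness needed, which is the whole "of course no one buys it" argument in one division.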