this post was submitted on 07 May 2025

TechTakes

[–] dreugeworst@lemmy.ml 25 points 1 day ago* (last edited 1 day ago) (2 children)

afaict they're computers with a GPU that has some hardware dedicated to the kind of matrix multiplication common in inference in current neural networks. pure marketing BS, because most GPUs come with that these days, and some will still not be powerful enough to be useful
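
To see why "hardware for matrix multiplication" covers most of inference: a forward pass through a plain neural network is essentially a chain of matmuls with cheap elementwise ops in between. A minimal sketch in NumPy (toy layer sizes chosen only for illustration, not any specific NPU or vendor API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-layer MLP weights; the sizes are arbitrary.
W1 = rng.standard_normal((784, 256)).astype(np.float32)
W2 = rng.standard_normal((256, 10)).astype(np.float32)

def forward(x):
    """One inference pass: two matmuls plus a cheap elementwise ReLU.

    The matmuls (x @ W1, h @ W2) are the work that GPU tensor cores
    and "AI" accelerators are built to speed up; the ReLU is trivial
    by comparison.
    """
    h = np.maximum(x @ W1, 0.0)  # matmul + ReLU
    return h @ W2                # matmul

batch = rng.standard_normal((32, 784)).astype(np.float32)
logits = forward(batch)
print(logits.shape)  # (32, 10)
```

The same pattern scales up to transformers, which is why dedicated matmul units (and not much else) are what the marketing is pointing at.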

[–] blarth@thelemmy.club 1 points 21 hours ago

This comment is the most important one in this thread. Laptops already had GPUs. Does the Copilot button actually have you conversing with an LLM locally, or is inference done in the cloud? If the latter, it's even more useless.