this post was submitted on 29 Mar 2024
1014 points (98.1% liked)

Curated Tumblr

5680 readers
769 users here now

For preserving the least toxic and most culturally relevant Tumblr heritage posts.

Here are some OCR tools to assist you in transcribing posts:

Don't be mean. I promise to do my best to judge that fairly.

founded 2 years ago
MODERATORS
 
[–] Hawk@lemmynsfw.com 6 points 1 year ago (1 children)

They were running inference on a CNN on a mobile device? I have no clue, but that would be costly, battery-wise at least.

[–] didnt_readit@sh.itjust.works 1 points 1 year ago* (last edited 1 year ago) (1 children)

They’ve been doing ML locally on devices for like a decade, since way before all the AI hype. They’ve also had dedicated ML inference cores in their chips for a long time, which helps with the battery-life situation.

[–] Hawk@lemmynsfw.com 1 points 1 year ago

It couldn’t quite be a decade; a decade ago we’d only just gotten VGG. But sure, broad strokes, they’ve been doing local stuff, cool.
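
To put the VGG point in perspective, here's a rough back-of-the-envelope sketch of why CNN inference was considered costly on phones: counting multiply-accumulates (MACs) in VGG-16's convolutional layers. The layer configuration below follows the standard published VGG-16 architecture; the figure is approximate (it ignores biases, activations, and the fully connected layers, which add a comparatively small amount).

```python
# Rough multiply-accumulate (MAC) count for VGG-16's conv layers.
# Layer configuration follows the standard VGG-16 architecture
# (3x3 convs, stride 1, "same" padding, 2x2 max pooling between blocks).
# Figures are approximate: biases, activations, and FC layers are ignored.

def conv_macs(h, w, c_in, c_out, k=3):
    """MACs for a k x k convolution that preserves spatial size."""
    return h * w * c_out * (k * k * c_in)

def vgg16_conv_macs(size=224):
    # Channel counts per conv layer, grouped by block;
    # each tuple starts with the block's input channels.
    blocks = [
        (3, 64, 64),
        (64, 128, 128),
        (128, 256, 256, 256),
        (256, 512, 512, 512),
        (512, 512, 512, 512),
    ]
    total = 0
    for block in blocks:
        for c_in, c_out in zip(block, block[1:]):
            total += conv_macs(size, size, c_in, c_out)
        size //= 2  # 2x2 max pooling halves the spatial grid
    return total

total = vgg16_conv_macs()
print(f"~{total / 1e9:.1f} GMACs per forward pass")  # ~15.3 GMACs
```

Fifteen-plus billion MACs for a single 224x224 image is why VGG-class models weren't practical on 2014-era phones, and why later mobile-oriented architectures and dedicated inference hardware made such a difference.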