
Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

Resources:

Any issues with the community? Report them using the report flag.

Questions? DM the mods!


As per the title, really. The whole AI revolution has largely passed me by, but the idea of self-hosting something on a small box like this appeals to me. I don't have an Nvidia GPU in my PC and never will, and as far as I can tell that pretty much rules out doing anything AI there.

I guess I can run it as a headless machine and connect over SSH, or through whatever web interface the AI models provide? I'm assuming running Proxmox on it won't work that well.
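For example (just a sketch of what I'm imagining, not something I've tested), if I ran Ollama on it headless, I think I could query it from my desktop over its HTTP API; the hostname and model name below are placeholders:

```python
# Minimal sketch: query a headless Ollama box from another machine over its HTTP API.
# Assumes Ollama is listening on the network (by default it only binds to localhost,
# so OLLAMA_HOST would need changing). Hostname and model name are placeholders.
import json
import urllib.request

OLLAMA_URL = "http://jetson.local:11434/api/generate"

payload = {
    "model": "llama3.2",    # whatever model has been pulled on the box
    "prompt": "Summarise what a Jetson Orin Nano is in one sentence.",
    "stream": False,        # return one JSON object instead of a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```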

My main idea for AI is identifying photos with certain properties to aid in tagging tens of thousands of photos spanning more than 20 years.
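To make that concrete, something like this zero-shot tagging sketch (CLIP via the Hugging Face transformers library) is roughly what I have in mind; the model name, tag list, and photo directory are placeholders, not a tested recipe:

```python
# Rough sketch of zero-shot photo tagging with CLIP; runs on CPU, just slowly.
from pathlib import Path

import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

TAGS = ["beach", "mountains", "birthday party", "pets", "documents and receipts"]

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

for path in Path("~/Pictures").expanduser().rglob("*.jpg"):
    image = Image.open(path).convert("RGB")
    inputs = processor(text=TAGS, images=image, return_tensors="pt", padding=True)
    with torch.no_grad():
        # logits_per_image gives one score per candidate tag for this photo
        probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]
    print(f"{path.name}: {TAGS[int(probs.argmax())]} ({probs.max():.0%})")
```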

top 2 comments
just_another_person@lemmy.world 5 points 2 weeks ago

You can use the majority of "AI" things with non-Nvidia hardware, so don't feel boxed in by that. Some projects just skew towards the Nvidia toolchain, but there are many ways to run them on AMD if you feel the need.

The "Super" board is just an Orin Nano with the power profiles unlocked. There is literally no difference except the software, and if you bootstrap an Orin Nano with the latest Nvidia packages, they perform the same. At about 67 TOPS.

For your project, you could run that on pretty much any kind of CPU or GPU. I wouldn't pay $250 for the Super devkit when you can get a cheaper GPU to do this, and a CPU would work too, just a bit slower.
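As a rough illustration (not tied to any particular project), most PyTorch-based tools just use whatever device they find, so the same script runs on an Nvidia card, an AMD card via ROCm, or plain CPU:

```python
# Illustration only: pick whatever accelerator PyTorch can see and fall back to CPU.
# A ROCm build of PyTorch exposes AMD GPUs through the same "cuda" device name.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")      # Nvidia CUDA, or AMD via a ROCm build
elif torch.backends.mps.is_available():
    device = torch.device("mps")       # Apple Silicon, for completeness
else:
    device = torch.device("cpu")       # works everywhere, just slower

print(f"Running on: {device}")
x = torch.randn(1024, 1024, device=device)
print((x @ x).sum().item())            # trivial matmul just to exercise the device
```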

dukatos@lemm.ee 1 point 2 weeks ago

I work on the 64 GB version. It ships with Ubuntu Linux, and you'll have to stick with it because of DeepStream and the other Nvidia packages. You can run a GUI on it if you like; by default it runs GNOME, if I remember correctly. It comes with the Triton Inference Server, but it will run Ollama as well. It is very fast, powerful, and a bit expensive. You might try to find a cheaper version; they announced one recently.