pezhore
For you and @Stalinwolf@lemmy.ca, from llama3.2:
So, approximately 8,144,000 football fields would weigh around 273 billion metric tons.
But the prices will come down if tariffs are removed, right? Right!?
Does it offer alternative routes if there's an accident/congestion?
Thanks for the heads up. I intend to terminate the Cat6A myself after successfully wiring the rest of the main house (I overbought a giant spool and have been looking for ways to use it). I have a cable snake that I've used in the past, but it's quite annoying to deal with: sometimes the snake comes loose, and since it's a rolled fiberglass/plastic snake, it's hard to get it going in a straight line.
That's where I'm hoping the conduit will make things easier: it's a straight run, so there shouldn't be much room for the snake to torque or bend.
I'm doing that with Docker Compose in my homelab, and it's pretty neat!
```yaml
services:
  ollama:
    volumes:
      - /etc/ollama-docker/ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama
    ports:
      - 11434:11434
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['0']
              capabilities:
                - gpu

  open-webui:
    build:
      context: .
      args:
        OLLAMA_BASE_URL: '/ollama'
      dockerfile: Dockerfile
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    volumes:
      - /etc/ollama-docker/open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434/'
      - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
```
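Since the compose file publishes Ollama on port 11434, anything else on the box (or LAN) can hit its HTTP API directly, not just Open WebUI. A minimal Python sketch, assuming the port mapping above and that you've already pulled a model like llama3.2:

```python
import json
import urllib.request

# Host port mapped in the compose file above (11434:11434)
OLLAMA_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the stack running and a model pulled (e.g. `docker exec ollama ollama pull llama3.2`):
#   with urllib.request.urlopen(build_generate_request("llama3.2", "Hi")) as resp:
#       print(json.loads(resp.read())["response"])
```

This is also the same endpoint Home Assistant's Ollama integration talks to, so it's a handy smoke test that the container and GPU passthrough are actually serving requests.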
Sure! I mostly followed this random YouTuber's video for offloading the Wyoming protocols (Whisper/Piper), though he didn't get Ollama to use his GPU: https://youtu.be/XvbVePuP7NY.
For the Nvidia/Docker GPU passthrough, I used this guide: https://www.bittenbypython.com/en/posts/install_ollama_openwebui_ubuntu_nvidia/.
It's working great at this point!
It's probably not required, but wouldn't it be easier for future runs to just push the cable down a pipe, versus blind-feeding it or using a snake through a wall?
I know it goes against privacy concerns, but I miss the traffic/congestion features of Google Maps. Sure, taking the interstate may normally be the fastest/most direct route, but today there's an accident blocking the two left lanes and everything is fucked.
Feel like drawing my dearly departed dog Murphy?
I spun up a new Plex server with a decent GPU - and decided to try offloading Home Assistant's Preview Voice Assistant TTS/STT to it. That's all working as of yesterday, including an Ollama LLM for processing.
Last on my list is figuring out how to get Home Assistant to help me find my phone.