fenndev

joined 1 year ago
[–] fenndev@leminal.space 7 points 1 week ago

"Someone mentions a distro they like" ≠ shilling. I use Bazzite and have been for months. Before that, used Nobara, EndeavourOS, and vanilla Fedora, along with a number of others I tried when I was distro-hopping. Wholeheartedly believe that Bazzite is currently the best generally-available Linux distro for gaming and is up there for general use. It's not perfect, but nothing is - it gets close for the use-cases I mentioned, though.

[–] fenndev@leminal.space 2 points 2 weeks ago (1 children)

Question: Does the green globe icon always indicate that it's working?

[–] fenndev@leminal.space 2 points 2 weeks ago

Sorry, I realized as I was pasting it that I had typo'd in my config (consistently, as in it was functional) and started to correct it. My bad.

 

I'm trying to get qBittorrent set up within Docker on my home server and want to configure port forwarding through my VPN for all of those Linux ISOs. Ideally, I'd also like to get a pipeline going with the *arr stack. I've heard the easiest way to do this is with Gluetun, but I can't for the life of me figure it out or work out how to test it. Has anyone been through something similar?

Here is my current Docker Compose for reference:


services:
  gluetun:
    image: qmcgaw/gluetun:latest
    container_name: gluetun
    cap_add:
      - NET_ADMIN
    environment:
      - VPN_SERVICE_PROVIDER=airvpn
      - VPN_TYPE=wireguard
      - WIREGUARD_PRIVATE_KEY=[redacted]
      - WIREGUARD_PRESHARED_KEY=[redacted]
      - WIREGUARD_ADDRESSES=10.131.184.14/32
      - FIREWALL_VPN_INPUT_PORTS=8069
      - SERVER_COUNTRIES=United States
    devices:
      - /dev/net/tun:/dev/net/tun
    volumes:
      - /home/fenndev/.config/gluetun:/config
    ports:
      - 9091:9091  # WebUI
      - 6881:6881
      - 6881:6881/udp
    restart: unless-stopped

  qbit:
    image: lscr.io/linuxserver/qbittorrent:latest
    container_name: qbit
    network_mode: "service:gluetun"
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=America/Los_Angeles  # Timezone set to Los Angeles
      - WEBUI_PORT=9091  # qBittorrent WebUI port
    volumes:
      - /home/fenndev/.config/qbit:/config  # Configuration
      - /home/fenndev/torrents:/downloads  # Torrent data
    depends_on:
      gluetun:
        condition: service_healthy
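On the "how to test it" part: a few quick checks, assuming the container names from the compose file above (and that the qBittorrent image ships wget or curl):

```
# 1) The public IP seen from inside qBittorrent's network namespace
#    should be the VPN exit, not your ISP address:
docker exec qbit sh -c 'wget -qO- https://ipinfo.io/ip; echo'

# 2) Compare against the host's public IP; the two must differ:
wget -qO- https://ipinfo.io/ip; echo

# 3) Gluetun has a built-in healthcheck; "healthy" means the tunnel is up:
docker inspect --format '{{.State.Health.Status}}' gluetun

# 4) For the forwarded port, set 8069 as qBittorrent's listening port
#    (Settings -> Connection), then check reachability from outside,
#    e.g. with an online open-port checker against the VPN exit IP.
```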
[–] fenndev@leminal.space 1 points 2 weeks ago (1 children)

May I ask what services you're running, and to see your Quadlet files? I'm about to make the same move.

[–] fenndev@leminal.space 3 points 3 weeks ago (1 children)

They have their own config generator, and port forwarding is really easy to set up IMO. You need to be logged in to see both, though.

[–] fenndev@leminal.space 6 points 3 weeks ago (3 children)

The interface - both the GUI and the website - is straight out of 2008, and the documentation could be better, but otherwise it works just fine for torrenting and browsing. No complaints there.

[–] fenndev@leminal.space 17 points 3 weeks ago (8 children)

Happily using AirVPN for port forwarding.

[–] fenndev@leminal.space 8 points 1 month ago (1 children)

Would love some guides on torrenting over I2P.

[–] fenndev@leminal.space 1 points 2 months ago (1 children)

I appreciate the offer, but unfortunately, I'm an American.

 

I'm running a rather small homelab and am hunting for a good UPS to help keep everything running smoothly. My top priorities are:

  • Just enough battery life to keep things running until they can be shut down
  • Compatible with open source software for monitoring and automated shutdown
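For the second point, NUT (Network UPS Tools) is the usual open-source choice, and most APC/CyberPower/Eaton units with a USB port work with its generic usbhid-ups driver. A minimal sketch (UPS name and password are placeholders; older NUT versions use "master" instead of "primary"):

```
# /etc/nut/ups.conf -- define the UPS
[homelab-ups]
  driver = usbhid-ups
  port = auto

# /etc/nut/upsd.users -- the account upsmon logs in with
[monuser]
  password = changeme
  upsmon primary

# /etc/nut/upsmon.conf -- shut the host down when the battery runs low
MONITOR homelab-ups@localhost 1 monuser changeme primary
SHUTDOWNCMD "/sbin/shutdown -h now"
```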

Would I have better luck getting a used unit and a new battery, or a brand-new unit altogether? On that note, anyone have one they don't need anymore? 👀

Thanks for the advice!

 

I've got a Lenovo M720q running as my main server in my home and it's more than powerful enough for anything I could be doing right now. However, I also have a Le Potato lying around that I'd like to do something with. Any suggestions?

[–] fenndev@leminal.space 2 points 3 months ago

I just spun up a FreshRSS container and it is working flawlessly for that purpose so far. I appreciate the suggestions.

[–] fenndev@leminal.space -3 points 3 months ago (5 children)

It's hosted, but not self-hosted.

[–] fenndev@leminal.space 1 points 3 months ago

Linkwarden doesn't appear to support RSS, which is a massive bummer.

39
submitted 3 months ago* (last edited 3 months ago) by fenndev@leminal.space to c/selfhosted@lemmy.world
 

I'm looking for a self-hosted alternative to Omnivore. To keep it short and sweet, I'm looking for an app that I can subscribe to RSS feeds from and maintain Reader Mode-esque archives of news articles and interesting things I've read. Obsidian integration would be nice but is not a priority; however, the ability to save from Android is a must.

Hoarder is something I've recently spun up on my home server, but despite looking great, it doesn't do what I'd like it to do. Clicking on an article doesn't present me with a Reader Mode archive; it takes me to the actual webpage, and I have to click on something else to get the cached version (and even then, it doesn't format things the way I'd like). I feel this order of operations should be reversed. On the mobile app, you can't access the cached version at all.

I've used Wallabag before but disliked the mobile interface. I wasn't self-hosting it, however, so I'm not sure how difficult that is. Barring anything better, I'll likely try to self-host Wallabag.

Shiori looks fantastic, but I'd rather not resort to using Termux on my Android phone to share content; the lack of a mobile app makes things difficult.

Any suggestions?

SOLVED

Following numerous suggestions, I spun up a FreshRSS container and will be looking into both Shiori (which has a third-party mobile app) and Linkwarden. Thanks, everyone!

 

TL; DR: Is it possible (and if so, desirable) to configure my OPNsense router to handle non-standard traffic instead of needing to configure each client device manually? Examples of what I mean by 'non-standard traffic' include Handshake, I2P, ZeroNet, and Tor.

 

I'm new to electronics and looking to assemble an array of components and tools for working on and designing electronics & circuits. Something immediately apparent is that all of the widely available kits orient you towards working with microcontrollers and SBCs; these kits are cool, but I want to have a halfway decent understanding of the underlying analog components and circuit design before I go digital.

With that in mind, what should I get? If anyone could point me toward specific components to look into, I'd really appreciate it! Thanks for the help.

Current list

  • A decent breadboard
  • Jumper wires
  • Multimeter
  • Batteries
  • Variable Power Supply?
  • Assorted resistors (1Ω-?)
  • Capacitors (Electrolytic and ceramic?)
  • Various ICs?
  • Transistors?
  • Diodes, probably?
  • Potentiometers
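Not on the list but worth learning on day one: Ohm's law for sizing those resistors, e.g. the series resistor that protects an LED. A quick sanity check (the component values below are just examples, not a recommendation):

```python
# Series resistor for an LED: R = (V_supply - V_forward) / I_desired.
def led_resistor(v_supply: float, v_forward: float, i_amps: float) -> float:
    """Return the minimum series resistance in ohms."""
    if i_amps <= 0 or v_supply <= v_forward:
        raise ValueError("supply must exceed the LED forward voltage")
    return (v_supply - v_forward) / i_amps

# A red LED (~2.0 V forward drop) at 15 mA from a 9 V battery:
r = led_resistor(9.0, 2.0, 0.015)
print(round(r))  # ~467 ohms; the next standard E12 value up is 470
```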
8
submitted 9 months ago* (last edited 9 months ago) by fenndev@leminal.space to c/selfhosted@lemmy.world
 

Edit: Thanks for the help, issue was solved! Had Traefik's loadbalancer set to route to port 8081, not the internal port of 80. Whoops.
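Concretely, since Vaultwarden listens on port 80 inside the shared Podman network (8081 is only the host-published port), the one-line fix in the .container file was:

```
# Before - Traefik dials 8081 inside the container network, where nothing listens:
Label=traefik.http.services.vault.loadbalancer.server.port=8081
# After - point at the port Vaultwarden actually listens on inside the container:
Label=traefik.http.services.vault.loadbalancer.server.port=80
```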

Intro

Hi everyone. I've been busy configuring my homelab and have run into issues with Traefik and Vaultwarden running within Podman. I've already successfully set up Home Assistant and Homepage, but for the life of me cannot get Vaultwarden working. I'm hoping a fresh pair of eyes will be able to spot something I missed or provide some advice. I've tried to provide all the information and logs relevant to the situation.

Expected Behavior:

  1. Requests for *.fenndev.network are sent to my Traefik server.
  2. Incoming HTTPS requests to vault.fenndev.network are forwarded to Vaultwarden
    • HTTP requests are upgraded to HTTPS
  3. Vaultwarden is accessible via https://vault.fenndev.network and utilizes the wildcard certificates generated by Traefik.

Quick Facts

Overview

  • I'm running Traefik and Vaultwarden in Podman, using Quadlet
  • Traefik and Vaultwarden, along with all of my other services, are part of the same fenndev_default network
  • Traefik is working correctly with Home Assistant, AdGuard Home, and Homepage, but returns a 502 Bad Gateway error with Vaultwarden
  • I've verified that port 8081 is open on my firewall and my service is reachable at {SERVER_IP}:8081.
  • 10.89.0.132 is the internal Podman IP address of the Vaultwarden container

Versions

Server: AlmaLinux 9.4

Podman: 4.9.4-rhel

Traefik: v3

Vaultwarden: alpine-latest (1.30.5-alpine I believe)

Error Logs

Traefik Log:

2024-05-11T22:09:53Z DBG github.com/traefik/traefik/v3/pkg/server/service/proxy.go:100 > 502 Bad Gateway error="dial tcp 10.89.0.132:8081: connect: connection refused"

cURL to URL:

[fenndev@bastion ~]$ curl -v https://vault.fenndev.network
*   Trying 192.168.1.169:443...
* Connected to vault.fenndev.network (192.168.1.169) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
*  CAfile: /etc/pki/tls/certs/ca-bundle.crt
* TLSv1.0 (OUT), TLS header, Certificate Status (22):

Config Files

vaultwarden.container file:

[Unit]
Description=Password Manager
After=network-online.target
[Service]
Restart=always
RestartSec=3

[Install]
# Start by default on boot
WantedBy=multi-user.target default.target

[Container]
Image=ghcr.io/dani-garcia/vaultwarden:latest-alpine
Exec=/start.sh
EnvironmentFile=%h/.config/vault/vault.env
ContainerName=vault
Network=fenndev_default

# Security Options
SecurityLabelType=container_runtime_t
NoNewPrivileges=true                                    
# Volumes
Volume=%h/.config/vault/data:/data:Z

# Ports
PublishPort=8081:80

# Labels
Label=traefik.enable=true
Label=traefik.http.routers.vault.entrypoints=web
Label=traefik.http.routers.vault-websecure.entrypoints=websecure
Label=traefik.http.routers.vault.rule=Host(`vault.fenndev.network`)
Label=traefik.http.routers.vault-websecure.rule=Host(`vault.fenndev.network`)
Label=traefik.http.routers.vault-websecure.tls=true
Label=traefik.http.routers.vault.service=vault
Label=traefik.http.routers.vault-websecure.service=vault

Label=traefik.http.services.vault.loadbalancer.server.port=8081

Label=homepage.group="Services"
Label=homepage.name="Vaultwarden"
Label=homepage.icon=vaultwarden.svg
Label=homepage.description="Password Manager"
Label=homepage.href=https://vault.fenndev.network

vault.env file:

LOG_LEVEL=debug
DOMAIN=https://vault.fenndev.network 
 

cross-posted from: https://leminal.space/post/6179210

I have a collection of about 110 4K Blu-ray movies that I've ripped, and I want to take the time to compress and store them for use on a future Jellyfin server.

I know the very basics of ffmpeg and general codec information, but I have a very specific set of goals in mind that I'm hoping someone could point me in the right direction with:

  1. Smaller file size (obviously)
  2. Image quality good enough that I cannot spot the difference, even on a high-end TV or projector
  3. Preserved audio
  4. Preserved HDR metadata

In a perfect world, I would love to convert the proprietary HDR into an open standard, and the Dolby Atmos audio into an open standard as well, but the above is a good compromise.

Assuming that I have the hardware necessary to do the initial encoding, and my server will be powerful enough for transcoding in that format, any tips or pointers?
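For the video side, a hedged starting point rather than a tuned recipe: 10-bit x265 at a conservative CRF, with audio and subtitle tracks copied untouched so the Atmos/TrueHD tracks survive as-is. Recent ffmpeg builds forward HDR10 mastering-display and content-light-level metadata to libx265 automatically; verify the output with ffprobe before batch-converting 110 discs:

```
ffmpeg -i input.mkv -map 0 \
  -c:v libx265 -preset slow -crf 18 \
  -pix_fmt yuv420p10le \
  -x265-params "hdr10-opt=1:repeat-headers=1" \
  -c:a copy -c:s copy \
  output.mkv
```

Note that Dolby Vision is another story: the DV enhancement layer generally doesn't survive a re-encode, which is one reason many people fall back to the HDR10 base layer.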

 

 

cross-posted from: https://leminal.space/post/4761745

Shortly before the recent removal of Yuzu and Citra from Github, attempts were made to back up and archive both Github repos; it's my understanding that these backups, forks, etc. are fairly incomplete, either lacking full Git history or lacking Pull Requests, issues, discussions, etc.

I'm wondering if folks here have information on how to perform thorough backups of public, hosted git repos (e.g. Github, Gitlab, Codeberg, etc.). I'd also like to automate this process if I can.

git clone --mirror is something I've looked into for a baseline, with backup-github-repo looking like a decent place to start for what isn't covered by git clone.

The issues I can foresee:

  • Each platform builds its own tooling atop Git, like Issues and Pull Requests from Github
  • Automating this process might be tricky
  • Not having direct access/contributor permissions for the Git repos might complicate things, not sure

I'd appreciate any help you could provide.
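The pieces above can be wired together fairly simply: `git clone --mirror` for the git data, plus the GitHub REST API for the platform-specific extras. A sketch under those assumptions (public repo; a `GITHUB_TOKEN` env var raises the rate limit but isn't required; function names are mine):

```python
# Mirror a GitHub repo's git data, then dump issues and PRs as JSON.
import json
import os
import subprocess
import urllib.request

API = "https://api.github.com"

def api_url(repo: str, resource: str, page: int = 1, per_page: int = 100) -> str:
    """Build a paginated REST API URL for e.g. 'issues' or 'pulls'."""
    return f"{API}/repos/{repo}/{resource}?state=all&per_page={per_page}&page={page}"

def fetch_pages(repo: str, resource: str) -> list:
    """Walk the paginated endpoint until an empty page comes back."""
    items, page = [], 1
    while True:
        req = urllib.request.Request(api_url(repo, resource, page))
        token = os.environ.get("GITHUB_TOKEN")
        if token:
            req.add_header("Authorization", f"Bearer {token}")
        with urllib.request.urlopen(req) as resp:
            batch = json.load(resp)
        if not batch:
            return items
        items.extend(batch)
        page += 1

def backup(repo: str, dest: str) -> None:
    """repo is 'owner/name'; writes a bare mirror plus JSON dumps into dest."""
    name = repo.split("/")[1]
    subprocess.run(["git", "clone", "--mirror",
                    f"https://github.com/{repo}.git", f"{dest}/{name}.git"],
                   check=True)
    for resource in ("issues", "pulls"):
        with open(f"{dest}/{name}-{resource}.json", "w") as f:
            json.dump(fetch_pages(repo, resource), f)
```

This still misses wiki content and discussion threads, and review comments live under separate endpoints, so it's a baseline rather than a complete archive.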

 

14
submitted 11 months ago* (last edited 11 months ago) by fenndev@leminal.space to c/opensource@lemmy.ml
 

Shortly before the recent removal of Yuzu and Citra from Github, attempts were made to back up and archive both Github repos; it's my understanding that these backups, forks, etc. are fairly incomplete, either lacking full Git history or lacking Pull Requests, issues, discussions, etc.

I'm wondering if folks here have information on how to perform thorough backups of public, hosted git repos (e.g. Github, Gitlab, Codeberg, etc.). I'd also like to automate this process if I can.

git clone --mirror is something I've looked into for a baseline, with backup-github-repo looking like a decent place to start for what isn't covered by git clone.

The issues I can foresee:

  • Each platform builds its own tooling atop Git, like Issues and Pull Requests from Github
  • Automating this process might be tricky
  • Not having direct access/contributor permissions for the Git repos might complicate things, not sure

I'd appreciate any help you could provide.
