

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
For Example
We welcome posts that include suggestions for good self-hosted alternatives to popular online services, how they are better, or how they give back control of your data. Also include hints and tips for less technical readers.
The first part of what I hope to be an ongoing series about repatriating and owning my own data and tech. In this post, I describe how I integrated my own self-
Show your interest in getting ActivityPub support for Taiga (issue tracker and agile project management tool)
Please describe the problem / need you are trying to solve. I want to be able to subscribe to changes and publish them (publicly and/or privately; it's server-dependent). Describe the feature or th...
A short story on why I still go through the effort of self-hosting servers, and some things it taught me recently.
How to track where a bag is in Home Assistant?
My son has a bag which he takes with him to Kindergarten every day. I'd like to throw in something like an Apple AirTag to be able to see where the bag is, but I have a couple of requirements:
Any ideas what would work?
Lemmy Webhook - add webhook support to your Lemmy instance
Hi there!
Since my last post, the LemmyWebhook package has gained quite a few new capabilities, so I've decided it's time for another one.
Quick intro to the package: it adds webhook support to Lemmy, meaning you can get notified of events and react to them automatically instead of having to poll for everything, often across multiple HTTP requests. Everything is done efficiently, hitting your database as little as possible, and when it does, it only runs queries on primary keys. You can also (optionally) make it available to other users, who can then run their bots on your instance with only the permissions you allow them: if you only grant access to post events, they don't also get access to new-user events.
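As a rough illustration of what reacting to such a webhook could look like, here is a minimal receiver sketch; the port and the payload keys used below are assumptions for illustration, not the package's actual schema.

```python
# Minimal webhook receiver sketch. LemmyWebhook is assumed to POST JSON event
# payloads to http://<this-host>:8000/ -- the "type" and "data" keys below are
# placeholders, not the package's documented schema.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length) or b"{}")
        # React to the event here instead of polling the Lemmy API.
        print("received event:", event.get("type"), event.get("data"))
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), HookHandler).serve_forever()
```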
So, what's new?
Freesat → MythTV would be useful. But with what hardware?
cross-posted from: https://infosec.pub/post/8864206
I bought a Silicondust HD Homerun back before they put their website on Cloudflare. I love the design of having a tuner with a cat5 port, so the tuner can work with laptops and is not dependent on being installed into a PC.
But now that Silicondust is part of Cloudflare, I will no longer buy their products. I do not patronize Cloudflare patrons.
I would love to have a satellite tuner in a separate external box that:
- tunes into free-to-air content
- has a cat5 connection
- is MythTV compatible
Any hardware suggestions other than #Silicondust?
Paperless-ngx behind nginx? I'm lost.
-post won't delete, so redacted instead-
Paperless Ngx Setup
I installed Paperless and really like it. I successfully got the consumption folder and email fetching working, but I have one small concern: documents from the consumption folder and email are processed into the storage path as soon as Paperless imports them, with all the "matching" tags and document types applied, so many documents get moved into that folder before I have reviewed or saved them.
Is there a way to put documents from the consumption folder and emails into a holding folder or queue until I manually review and save them?
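One possible workaround for the consumption-folder side (it doesn't cover email fetching) is to add a manual staging step in front of Paperless: files land in a folder you control first and only move into the consumption folder once you've looked at them. A minimal sketch, assuming hypothetical /data/staging and /data/paperless/consume paths:

```python
# Minimal "review queue" sketch: scans land in STAGING first, and approved
# files are moved into the folder Paperless-ngx actually consumes.
# Both paths are hypothetical placeholders.
import shutil
from pathlib import Path

STAGING = Path("/data/staging")             # where new scans/exports arrive
CONSUME = Path("/data/paperless/consume")   # Paperless-ngx consumption folder

def approve(filename: str) -> None:
    """Move a reviewed file from staging into the consumption folder."""
    shutil.move(str(STAGING / filename), str(CONSUME / filename))

if __name__ == "__main__":
    pending = sorted(p.name for p in STAGING.iterdir() if p.is_file())
    print("Awaiting review:", *pending, sep="\n  ")
```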
Establishing a secure external connection to my home media and services?
What I'm trying to do:
I've recently set up a home media server (Jellyfin, Radarr, Sonarr, etc) and would like to be able to give external access to the Jellyfin server to a few family members. Additionally, I'd like to establish an internally and externally accessible dashboard (probably using Homepage) that facilitates access to various services (e.g. Sonarr, Radarr, qBittorrent), as well as Frigate's dashboard, and allows access to a separate Home Assistant box's dashboard.
Ideal setup:
The dashboard would be accessible through https://dashboard.lemtrees.com/. Individual services would be accessible directly through https://<service>.lemtrees.com/ (e.g. https://sonarr.lemtrees.com/). Access to this dashboard should be safe and secure, and accessible from anywhere (i.e. not just my phone or a pre-approved device) if possible. Access to this dashboard would facilitate access to the Home Assistant box's HA dashboa
Moved and now Nginx Is giving internal error on SSL reverse proxy setup
Hi everyone, as the title says, I just moved houses and ISPs and now cannot access my server's services through Nginx. I've checked that they are up and running, since I have Tailscale set up on my phone and can access the services through it. In Nginx I can "set up" an HTTP reverse proxy (though clicking on it brings up a blank page), but when I go to set up SSL I get an internal server error. I've also double-checked that my domain is pointing to the right IP address and that all required ports are forwarded. Any thoughts as to why this would be occurring? I initially thought it was a config issue, but after removing the Nginx container and stack via Portainer and redeploying, I still get the exact same issues. Hopefully someone can help.
Edit 1: I've also tried different IPs (Docker and Tailscale internal IPs) when setting up the reverse proxy host in Nginx, to no avail. The Tailscale internal IP was working flawlessly before the move.
OS for Application-/Homeserver
Hey!
I am currently using YunoHost on my HP EliteDesk 600 G3, but I want to switch to a Docker-based system, which would make things a lot more flexible. The most important apps are Vaultwarden and Nextcloud. I don't have a lot of data, so 2TB is mostly enough (but it would be nice if I could extend that).
Disks: 2x2TB SSD and 1x1TB SSD
I am using a Synology as the backup target for my data (sending a backup every night via restic).
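(As a side note, a nightly restic run like that boils down to something along the lines of the sketch below; the repository URL, password file, and backed-up path are all placeholders.)

```python
# Hypothetical sketch of a nightly restic run to a Synology, wrapped in Python
# so it can be scheduled and logged from one place. Repository, password file,
# and the backed-up path are placeholders.
import subprocess

CMD = [
    "restic",
    "--repo", "sftp:backup@synology.local:/volume1/restic",
    "--password-file", "/root/.restic-password",
    "backup", "/srv/appdata",
]

subprocess.run(CMD, check=True)
```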
So my question is: What OS do I use for this?
Had a look at:
- OMV: nice and open source, but the Docker side has not been so nice since the new update.
- Unraid: tested it, very easy to handle, but it feels "too powerful" for my needs, and since I am only using SSDs, I've read you should use HDDs for the array.
- Debian + Portainer: the two options above are powerful and, I think, aimed more at "store a lot of data" systems. Debian + Portainer sounds like a minimalistic solution for what I want, but I don't know whether it means a lot of configuration and ongoing work. I am not very experienced wi
ELI5 - Process for backing up docker configs
Hello, I'm doing my darndest to learn Docker, but I'm a bit lost in the sauce when it comes to structuring the setup for backups and portability.
My current approach is to keep my Docker Compose YAML file in git (just one file for the time being), while all of my container configs live in another directory outside of this git repo. I understand that if I were to move systems, or my system were to fail, I'd need these config folders to set up my containers on a new machine.
As for backing the config folders up, I plan to relocate them to a shared folder on my NAS. Is this the right way to do it, or is there a better approach?
Hopefully the lingo is correct, still learning. Thanks a ton!
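For what it's worth, the "archive the compose file and config folders to a NAS share" idea can be as small as the sketch below; every path in it is a hypothetical placeholder, and stopping the containers (or at least any databases) before archiving avoids copying files mid-write.

```python
# Snapshot the compose file and bind-mounted config directories into a dated
# tar.gz on a NAS share. All paths are hypothetical placeholders.
import tarfile
import time
from pathlib import Path

SOURCES = [
    Path("~/docker/docker-compose.yml").expanduser(),
    Path("~/docker/configs").expanduser(),   # container config bind mounts
]
DEST = Path("/mnt/nas/docker-backups")       # NAS share mounted locally

def backup() -> Path:
    DEST.mkdir(parents=True, exist_ok=True)
    archive = DEST / f"docker-configs-{time.strftime('%Y%m%d-%H%M%S')}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        for src in SOURCES:
            tar.add(str(src), arcname=src.name)
    return archive

if __name__ == "__main__":
    print("wrote", backup())
```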
Sshwifty secure login question
I'm in the process of selecting a web-based SSH app so I can manage all my SSH servers in one place. I've tried Apache Guacamole and it's been working fine.
I'm also trying Sshwifty, but the thing is, Sshwifty doesn't have a login interface in front of the data, which isn't ideal. I've set up an install anyway and am asking whether this is the best option for my current setup.
I don't have Authelia or Authentik to put it behind a 2FA layer, and I don't plan to install one soon. But I installed Sshwifty on an OCI VM with a public IP of 123.123.123.123, and I only allowed port 8182 for that IP by adding 123.123.123.123/32 to the security list, so no one can access this app except localhost. I then installed a Cloudflare Tunnel on this VM, activated OTP by email, and allowed only my email.
So my question is, is this secure enough?
Alternative to Headscale?
Hi all,
I very briefly kicked the tires on Headscale, and whilst it certainly seemed very impressive, I did have a few concerns.
Primarily, that non-admin users don't seem to need to consent to having config changes applied to their devices. Whilst it's assumed admins are trustworthy (I'd like to think so!), it just struck me as not the way I'd want something to function when it comes to direct access between devices, routes etc. It also doesn't seem like it logs and tells users when something has changed, so shenanigans could occur, and the user would be unaware of it, especially if it got put back to its prior state of config.
It also seems to lack a self-service aspect: if a user got a new device, or had to reinstall their OS and had no backups, they'd need to ask me to be added back to the mesh. Ideally, a user would be able to add their own devices to their own group and allow interoperability between their own devices, but selectively open up access to specific dev
Firefly III data importer refusing to connect
Hey there!
I've been trying to set up Firefly III and its data importer on my server. They are running on Docker Compose, and I installed them according to the instructions found here.
While Firefly itself is running fine, its data import tool doesn't really want to work. I can connect to it just fine; however, after I put in my client ID, the next step refuses to connect. Firefly III is exposed externally on port 90 (internal port 8080) and the importer on port 91 (internal 8080).
Here are the logs for the data importer container:
2023-12-04T00:01:14.522957414Z [2023-12-04 01:01:14] local.INFO: The following configuration information was found:
2023-12-04T00:01:14.523048918Z [2023-12-04 01:01:14] local.INFO: Personal Access Token: "" (limited to 25 chars if present)
2023-12-04T00:01:14.523135810Z [2023-12-04 01:01:14] local.INFO: Client ID : "0"
2023-12-04T00:01:14.52323269
I'd rather kill myself than host SMTP again
It's been two months since I started running a mail server. I spent more than six months building a beautiful UI along the lines of SendGrid's and Mailgun's; I plan to start a transactional email service.
I bought an IP range and rented another /24 range because I didn't want a bad neighbour on the subnet. I even got my own ASN, because jerks like UCEProtect often put big ISPs on a blacklist at the ASN level.
Naturally, I've picked up decent experience along the way: I wrote my own SMTP server, email routing, and things such as bounce and suppression handling. In that sense, everything is fine: rDNS, DKIM, DMARC, and SPF are all in place.
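For anyone building something similar, a quick way to sanity-check the SPF, DMARC, and DKIM side of that list is to pull the relevant TXT records; the domain and DKIM selector below are placeholders, and the script assumes dnspython is installed.

```python
# Look up the SPF, DMARC, and DKIM TXT records for a sending domain.
# "example.com" and the selector "default" are placeholders.
# Requires dnspython (pip install dnspython).
import dns.resolver

DOMAIN = "example.com"
SELECTOR = "default"   # hypothetical DKIM selector

def txt(name: str) -> list[str]:
    try:
        return [r.to_text() for r in dns.resolver.resolve(name, "TXT")]
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []

print("SPF:  ", txt(DOMAIN))
print("DMARC:", txt(f"_dmarc.{DOMAIN}"))
print("DKIM: ", txt(f"{SELECTOR}._domainkey.{DOMAIN}"))
```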
I know that IPs need to warm up, so that's where I started. I paid for a few services to help with the warm-up, and it took about two months. Okay so far: email was delivered 100% of the time to Gmail, but not at all to Yahoo and Outlook. The delivery rate to those two started to get better around last week, though. Some IP addresses started getting a 100% de
Sanity check for Proxmox + Plex setup
Hey all,
Just purchased a little Lenovo M720q and an external 8TB HDD to start off.
Wanted to use Proxmox so I can play around with VMs in the future.
Here is my plan:
I am not sure which file systems to use for the Proxmox host and the external storage. I've been seeing people recommend ext4, NTFS, and ZFS.