Posts: 8 · Comments: 62 · Joined: 2 yr. ago
  • Heyha! I read about dd on MakeUseOf after seeing your post, to understand how it works.

    Restoring from an image seems to be exactly what I was looking for as a full backup restore.

    However, this kind of one-command backup isn't going to work on databases (MariaDB, MySQL...). How should I proceed with my home directory, where all my containers live and most of them have running databases?

    Does it work with logical volumes? Is it possible to copy everything except /home of the logical volumes?
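
    One workaround I've seen mentioned is to dump the databases to a plain file first, so the dump is what gets backed up; a minimal sketch, assuming a MariaDB/MySQL container named db (the name and paths are examples):

    # dump every database from inside the container to a file the image backup will catch
    docker exec db sh -c 'exec mysqldump --all-databases -uroot -p"$MYSQL_ROOT_PASSWORD"' > /backup/all-databases.sql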

  • This post was about browsers, but my feelings when I wrote it were a more general "conclusion". I only found out recently about some "hidden" privacy concerns with browsers (WebRTC leaking your real IP, font fingerprinting...). But when I found out about Android's default keyboard sending samples, IoT weaknesses, smart devices hoarding data... it really feels like a losing battle while being connected to the world...

  • > Do not overthink it, they want to know everything about you.

    That's true, they probably already have everything they need... It's not only about my personal data, and my example only points at web technology, but everywhere around us are data-hoarding devices that are used for targeted ads, campaigns, profiling, AI dataset feeding... whatever!

    It feels like we have already lost our right to privacy, given how personal data and telemetry are used in our society as a whole...

  • Privacy @lemmy.ml
    deepdive @lemmy.world

    Does it even make sense to care about privacy?

    Heyha!

    This is probably going to be a long take, and it's late here in Europe... So for those who bear with me and are ready to read through my broken English, thank you.

    I'm personally concerned about how my data and my identity are used against my will while surfing the web or using/hosting services. As a self-hoster and networking enthusiast, I have some entry/medium-level security infrastructure.

    Ranging from a self-hosted ad blocker, DNS, router, VLANs, containers, server, firewall, WireGuard, VPN... you name it! I was pretty happy to see all my traffic encrypted in Wireshark, and to have what I consider a solid homelab.

    I also have most undesired DNS requests/ads blocked with AdGuard, and Firefox with a custom configuration, blocking everything, plus some changed about:config options (a user.js sketch follows this list):

    • privacy.resistFingerprinting
    • privacy.trackingprotection.fingerprinting.enabled
    • ...
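
    For reference, a minimal user.js sketch persisting those prefs (the pref names are the ones listed above; user.js is just one way to set them):

    // user.js, loaded by Firefox at startup from the profile folder
    user_pref("privacy.resistFingerprinting", true);
    user_pref("privacy.trackingprotection.fingerprinting.enabled", true);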

    I thought I had pretty hardened security and a safe browsing experience, but oh my, was I wrong...

    From pixel tracking

  • Thank you for your insights and personal experiences :) I love Debian stable as a server, never had any issues on an old Asus laptop! I only have 2 years of "experience" and started with Ubuntu. A good introduction to Linux, but I switched to Debian (<3)

    That's why I'm asking around, I don't want too bad an experience with Debian as my main personal PC!

    Thank you for your personal blog post and the wiki link :) I will surely read through them before making my final choice!

  • RethinkDNS is probably your best bet! Right now it's missing an important feature, taking WireGuard's DNS configuration into account, which makes it unusable for those who have a private DNS in a local environment with an upstream DNS!

    Can't wait for version 0.5.6 😄

  • Thanks :) good to know I can switch between those two in KDE! I need to test Plasma and Xfce to see which fits my needs better and has better support for my system!

    Thanks for the clarification!!

  • Do you consider testing a better choice than sid for a desktop/gaming environment?

    I'm really not sure which one I should use. I only have experience with bare-bones Debian stable as a server, and I'm trying to find the best choice when switching from windaube to Debian :)

    Thanks for your insights and personal experiences!

  • Thank you!!

    I'm currently looking into Xfce vs KDE Plasma. Something I need to pay attention to is a DE with X11, because Nvidia hasn't fully supported Wayland yet?

    Am I right to see it that way? Or do both support Nvidia drivers?

    I'm sorry, I only use Debian bare-bones on my server, and I'm currently considering switching my main desktop from windaube to Linux; a lot of the information on the web seems contradictory or incomplete :/

  • You probably have your reasons to run Debian testing, but I read somewhere that testing is somehow a bad idea as a desktop environment!

    If something is stuck being updated in sid while bugs are still happening, you could be stuck for months without the correct update in testing.

    Sorry if it's not clear, but I read it somewhere in the official Debian documentation.
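
    A common mitigation I've seen (apt pinning, which is not from the Debian documentation mentioned above) is to keep unstable available at a low priority and pull a single fixed package from it; a minimal sketch:

    # /etc/apt/preferences.d/unstable -- unstable stays at low priority
    Package: *
    Pin: release a=unstable
    Pin-Priority: 90

    # then pull one fixed package explicitly (package name is a placeholder):
    # sudo apt install -t unstable some-package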

  • Are there any advantages of LMDE vs pure Debian?

    I mean, LMDE is just plain Debian with pre-installed packages and a GUI?

    If LMDE uses Debian Bookworm stable, is there no point in LMDE except for the ease of the installation process?

  • Strangely enough, TLS 1.3 still doesn't support signed Ed25519 certificates :| The NIST P‐256, P‐384 and P‐521 curves are suspected by some to be "backdoored" or to have deliberately chosen mathematical weaknesses. I'm not an expert, just a noob security/self-hosting enthusiast, but I don't want to depend on curves made by the NSA or other spy agencies!

    I'm also wondering if the EU isn't going to implement something similar with all their new spying laws currently being discussed...
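
    If you'd rather avoid the NIST curves in a lab CA, OpenSSL can generate Ed25519 keys directly; a minimal sketch (filenames and subject are examples):

    openssl genpkey -algorithm ed25519 -out root-ca.key
    openssl req -new -x509 -key root-ca.key -subj "/CN=Home Lab Root CA" -days 3650 -out root-ca.crt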

    > Certificate chain of trust: I assume you're talking about PKI infrastructure and using root CAs + derivative CAs? If yes, then I must note that I'm not planning to run derivative CAs because it's just for my lab and I don't need that much infrastructure.

    An intermediate CA could potentially be useful, but it isn't really needed with a self-signed CA. However, if you ever have to revoke your root CA, you have to replace that certificate on all your devices, which can become a lot of hassle if you share that trusted root CA with family/friends. By having an intermediate CA and hiding your root CA's private key somewhere offline, you can avoid that overhead: just revoke the intermediate CA, update the server certificate with the newly signed intermediate bundle, and serve that new certificate through the proxy. (Hope that makes sense? :|)

    > I do not know what X.509 extensions are and why I need them. Could you tell me more?

    This will probably give you a better explanation than I could :| I have everything written in a markdown file, and reading through my notes I remember I had to set some basicConstraints to TRUE in my certificates to make them work in my Android root store! Some extensions are necessary to make your root CA work properly (like CA:TRUE). Also, if you want SAN (multi-domain) certificates, you have to put the extra names in your X.509 extensions.
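
    For illustration, the relevant bits of an OpenSSL extensions config might look like this (file layout and domain names are examples):

    # extensions for the CA certificate
    basicConstraints = critical, CA:TRUE

    # extensions for a server certificate (SAN / multi-domain)
    basicConstraints = CA:FALSE
    subjectAltName = DNS:home.lab, DNS:*.home.lab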

    > I'm also considering client certificates as an alternative to SSO, am I right in considering them this way?

    Ohhh, I don't know... I haven't installed or used any SSO service, though I'm thinking of MFA/SSO with Authelia in the future! My guess would be that those are two different technologies and could work together? A self-signed CA with 2FA could possibly work in a homelab, but I have no idea how, because I haven't tested it out. One thing to consider if you want client certificates for your family/friends is to have an intermediate CA: in case of revocation, you don't have to replace the certificate in their root store every time you sign a new intermediate CA.

    > I'll mention that I plan to run an instance of HAProxy per podman pod so that I terminate my encrypted traffic inside the pod and exclusively route unencrypted traffic through localhost inside the pod.

    I have no idea about HAProxy and podman and how they work to encrypt traffic. All my traffic passes through a WireGuard tunnel to my docker containers/proxy, which I consider safe enough? Listening to all my traffic with Wireshark seemed to show exactly what I expected, but I'm not an expert :L So I cannot help you further on that topic. But I will keep your idea in my notes, to see if there could be further improvements to my setup with HAProxy and podman compared to docker and traefik through a WireGuard tunnel.

    > Of course, that means that every pod on my network (hosting an HAProxy instance) will be given a distinct subdomain, and I will be producing certificates for specific subdomains, instead of using a wildcard.

    OpenSSL SAN certificates are going to be a life/time saver in your setup! One certificate with multiple domains!


    I'm just a hobby homelabber/tinkerer, so take everything with caution and always double-check with other sources! :) Hope it helps!


    Edit

    Thinking of your use case, I would personally create a root CA and an intermediate CA + certificate bundle. Put the root CA in the trusted store on all your devices and serve the intermediate CA/certificate bundle with your proxy of choice, signing the certificate with a SAN X.509 extension for all your domains. Save your root CA's key somewhere offline to keep it safe!
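
    A rough sketch of that signing flow with plain OpenSSL (filenames are examples; the extensions file is the kind discussed above):

    # sign the intermediate CSR with the offline root CA
    openssl x509 -req -in intermediate.csr -CA root-ca.crt -CAkey root-ca.key \
      -CAcreateserial -extfile intermediate-ext.cnf -days 1825 -out intermediate.crt

    # serve the leaf certificate together with the intermediate as one bundle
    cat server.crt intermediate.crt > bundle.crt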

    The links I gave you are very useful, but every bit of information is a bit scattered and you have to combine it yourself. Still, it's a gold mine of information!

  • Linux @lemmy.ml
    deepdive @lemmy.world

    Removing/deep cleanup of an installed package doesn't work as expected (remove, purge, autoremove)

    Hi everyone :)

    After installing the emacs package and trying to remove it afterwards:

    sudo apt remove --purge --autoremove emacs

    It only removed that package and not the other dependencies installed with it (emacs-gtk, emacs-common...). I had to remove them manually, one by one.

    Isn't that command supposed to:

    • remove the package
    • remove its configuration files
    • automatically remove unused dependencies?

    What am I missing here?
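
    In case it matters: apt autoremove only removes packages that are marked as automatically installed, so checking the marks might explain the leftovers (a sketch; package names as in my case):

    apt-mark showmanual | grep emacs        # were emacs-gtk/emacs-common marked manual?
    sudo apt-mark auto emacs-gtk emacs-common
    sudo apt autoremove --purge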

    Also, after reading the "Stupid things you've done that broke your Linux installation" post, I read about a lot of people messing up their Debian system with the above command... So I assume that's not the correct way of doing things in Linux?

    Some insight from experienced users would be great :)

    Privacy @lemmy.ml
    deepdive @lemmy.world

    Selective cookie blocker? (GitHub)

    Hi everyone!

    Right now I use:

    • Firefox's full protection with everything blocked by default
    • AdGuard ad-blocker extension
    • AdGuard Home DNS blocker
    • ProtonVPN through WireGuard
    • Self-hosted SearXNG instance (meta-search engine aggregator)

    While this gives me a reasonable degree of protection/privacy, it blocks me from interacting with FOSS projects on GitHub, which kinda sucks!! I don't want to accept GitHub's long list of tracking and statistics cookies, but not being able to interact with and help FOSS projects thrive, improve, and get some visibility will, in the long term, hurt FOSS projects.

    I'm aware of GitHub's cookie management preferences, but I don't trust them to manage and choose what should be accepted or not!

    Firefox only al

    Linux @lemmy.ml
    deepdive @lemmy.world

    Backups on Linux seem overwhelmingly complicated...

    Yeah another post about backups, but hear me out.

    I read most of the other posts here on Lemmy and went through the documentation of different backup tools (rsync, Borg, Timeshift), but all those backup tools are for "static" files.

    I mean, I have a few docker containers with databases, Syncthing to sync files between server, Android, desktop and Mac, and a few samba shares between server, Mac and desktop.

    For example, from Borg's documentation (a snapshot sketch follows this list):

    • Avoid running any programs that might change the files.
    • Snapshot files, filesystems, container storage volumes, or logical volumes. LVM or ZFS might be useful here.
    • Dump databases or stop the database servers.
    • Shut down virtual machines before backing up their images.
    • Shut down containers before backing up their storage volumes.
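
    For what it's worth, a minimal sketch of that snapshot approach, assuming an LVM volume group vg0 with a logical volume home (all names and paths are examples):

    # freeze a point-in-time copy, back it up, then drop the snapshot
    lvcreate --size 5G --snapshot --name home-snap /dev/vg0/home
    mkdir -p /mnt/snap && mount -o ro /dev/vg0/home-snap /mnt/snap
    borg create /path/to/repo::home-{now} /mnt/snap
    umount /mnt/snap && lvremove -f /dev/vg0/home-snap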

    How am I supposed to make a complete automated backup of my system if my files are constantly changing? If I have to stop my containers and shut down Syncthing and my samba shares to make a full backup, that seems a bit too

    Linux @lemmy.ml
    deepdive @lemmy.world

    Debian sudoers and user best practice

    Hi everyone 🙂

    TLDR

    How do you work with Debian and su/sudo permissions, and what's the best way to do it for better security? (a sketch of the common approach follows this list)

    • Add a user to the sudoers file?
    • Give special permissions to a group? To a user?
    • Always connect with su - (default root)?
    • Add users to groups?
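
    For context, the usual Debian answer seems to be the sudo group; a minimal sketch, assuming a user named alice (the name is an example):

    # as root: Debian's default sudoers already grants the sudo group full rights
    usermod -aG sudo alice
    # log out and back in, then prefer per-command sudo over a permanent root shell
    sudo apt update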

    The story is unrelated to the question, but it's what prompted it

    This is a rookie question, even though I use Linux (Ubuntu and recently Debian) regularly and have a lot of self-hosted docker containers on an old spare laptop.

    While this is probably one of the basics you need to know right away when playing around with sudo or su, I wasn't aware of how badly you can f#ck everything up with a single command:

    chmod -R xxx /home/$USER

    chown -R ...

    Why would I do that? Because I'm stupid and sometimes have no idea what I'm doing! I was actually trying to change some permissions to create a samba share (that's another story xD).

    While trying to revert everything, a lot of my docker containers, certificates and special files were unreadable

    Selfhosted @lemmy.world
    deepdive @lemmy.world

    Traefik, yaml format and backticks...

    Hi everyone!

    I just learned the hard way how important it is to notice the difference between ` and ' ...

    I tried for one whole day to figure out why I got a stupid error in traefik with rule=Host(hostname).

    While the logs weren't clear about what was raising the error:

    error while parsing rule Host('vaultwarden.home.lab'): 1:6: illegal rune literal

    I tried a lot of different things, from changing the self-signed certs, wildcards, and AdGuard DNS rewrites to changing the network and adding some weird traefik rules... until I finally compared two different yaml files and had that "ahhh..." feeling...
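
    For reference, the working rule uses backticks, not single quotes, inside Host():

    # traefik dynamic configuration (yaml)
    rule: Host(`vaultwarden.home.lab`)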

    A bit depressing, but I'm finally happy to have my own self-signed local domain names.

    Selfhosted @lemmy.world
    deepdive @lemmy.world

    Linkding/searxng selfhosters: Linkding injector extension!

    For those self-hosting linkding and searxng (or using Google, DuckDuckGo, or Brave), there is a very cool and useful extension: linkding injector!

    It's documented in the linkding README, but it's worth mentioning for those who didn't know about it! It works great with self-hosted SearXNG instances and is very useful for searching through your bookmarks.

    linkding — selfhosted bookmark manager

    searxNG — selfhosted meta search engine

    Selfhosted @lemmy.world
    deepdive @lemmy.world

    Self-hosted or privacy-driven alternative to Google Calendar?

    Hi everybody!

    While I really like Google Calendar's simple and sleek web GUI and functionality, I'm more and more concerned about my data and privacy. Even if I have nothing to hide, I no longer agree to freely and consciously hand over my data to any GAFAM.

    Does anyone have an alternative to Google Calendar?

    • Free and, if possible, open source? It can have some discreet sponsors/ads, as long as they aren't too intrusive.
    • Todoist integration
    • Sync between devices
    • The GUI doesn't have to be PERFECT, but a bare minimum for my candy eyes!
    • Can be an API, web app... doesn't matter, as long as it syncs between devices (Android, Mac, Windows, Linux)

    I already searched the web but couldn't find any conclusive alternative; maybe someone knows of some hidden gem :)

    Thank you !


    EDIT: The solution and compromise: Nextcloud. It took me some time (2 days) to set it up correctly and make it work as intended.

    • Android calendar sync with DAVx5
    • Calendar notification on android's native