
With anti-libre software, we are not users, we are used.

youtube.com/watch?v=Gsr9s0fmJZs

Posts
23
Comments
712
Joined
2 yr. ago
Selfhosted @lemmy.world
Autonomous User @lemmy.world

How to install Open WebUI on Arch Linux (Windows guide coming soon)

cross-posted from: https://lemmy.world/post/28493612

Open WebUI lets you download and run large language models (LLMs) on your device using Ollama.

Install Ollama

See this guide: https://lemmy.world/post/27013201

Install Docker (recommended Open WebUI installation method)

  1. Open Console, type the following command and press return. It may ask for your password without showing what you type.

```
sudo pacman -S docker
```

  2. Enable the Docker service (on-device, runs in the background) so that it starts with your device, and start it now.

```
sudo systemctl enable --now docker
```

  3. Allow your current user to use Docker.

```
sudo usermod -aG docker $(whoami)
```

  4. Log out and log in again for the previous command to take effect.
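After logging back in, one way to confirm the group change took effect (a suggested check, not part of the original guide) is to list your groups and look for an exact `docker` entry:

```shell
# List the current user's groups, one per line, and match "docker" exactly.
id -nG "$(whoami)" | tr ' ' '\n' | grep -x docker \
  && echo "docker group active" \
  || echo "log out and back in first"
```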

Install Open WebUI on Docker

  1. Check whether your device has an NVIDIA GPU.
  2. Use only one of the following commands.

Your device has an NVIDIA GPU:

```
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
```
  
  • Can't see anyone replacing my on-device p2p libre apps with an app/service only they control. Try taking payment before making something no one asked for.

    Not only OP, lots keep trying it. Don't fall for this scam.

  • Clearly, I am not asking how it works or about 'open source'.

    Who controls the users' computing, its users? No. This is fixed by adding a libre software license text file.

    Want more? See this text or video.

  • Paying for libre software is good. The Google Play Store is not libre software.

    'Closed source' and 'open source' misses the point of libre software. Most things are trash. Making scams and malware competitive does not help us.

  • Not very peer-to-peer when I need to open your website every time or run a web server on my phone.

    We already have Syncthing.

    Normal people don't want to pay for a service or run a server.

    To spread privacy, we need more apps to replace them with Syncthing. Like an app for this: https://lemmy.world/post/28313324

    This fails to include a libre software license text file, like AGPL. We do not control it, anti-libre software. Very dangerous.


    They target an app we already control, Syncthing, to replace it with an app/service only they control.

    With buzzwords, technology, and 'open source', we are distracted and derailed away from this.

    Attacks like this will quickly drive your friends away from the private apps you worked so hard to recommend, if you fail to show them how to check for software freedom.

  • Privacy @lemmy.dbzer0.com
    Autonomous User @lemmy.world

    Contacts and calendars sync without a server (Syncthing + Radicale)

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    Contacts and calendars sync without a server (Syncthing + Radicale)


    Selfhosted @lemmy.world
    Autonomous User @lemmy.world

    Ollama not using AMD GPU on Arch Linux

    cross-posted from: https://lemmy.world/post/27088416

    This is an update to a previous post found at https://lemmy.world/post/27013201


    Ollama uses the AMD ROCm library which, by forcing an LLVM target, also works well with many AMD GPUs not listed as compatible.

    The original Ollama documentation is wrong: the following cannot be set for individual GPUs, only for all or none, as shown at github.com/ollama/ollama/issues/8473

    AMD GPU issue fix

    1. Check your GPU is not already listed as compatible at github.com/ollama/ollama/blob/main/docs/gpu.md#linux-support
    2. Edit the Ollama service file. This uses the text editor set in the $SYSTEMD_EDITOR environment variable.

```
sudo systemctl edit ollama.service
```

    3. Add the following, then save and exit. You can try different versions as shown at github.com/ollama/ollama/blob/main/docs/gpu.md#overrides-on-linux

```
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
```

    4. Restart the Ollama service.
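    The override value follows the GPU's LLVM gfx target: the digits of a numeric name like gfx1030 map to version 10.3.0. A small sketch of that mapping (the `gfx_to_version` helper is illustrative, not part of Ollama, and handles numeric targets only):

```shell
# Illustrative helper: convert a numeric LLVM gfx target name (as reported
# by rocminfo) into the HSA_OVERRIDE_GFX_VERSION form used above.
gfx_to_version() {
  local n="${1#gfx}"                  # gfx1031 -> 1031
  local major="${n:0:$((${#n}-2))}"   # all but the last two digits
  local minor="${n:$((${#n}-2)):1}"   # second-to-last digit
  local step="${n:$((${#n}-1)):1}"    # last digit
  echo "${major}.${minor}.${step}"
}

gfx_to_version gfx1030   # prints 10.3.0
gfx_to_version gfx900    # prints 9.0.0
```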
    LocalLLaMA @sh.itjust.works
    Autonomous User @lemmy.world

    Ollama not using AMD GPU on Arch Linux

    cross-posted from: https://lemmy.world/post/27088416

    Selfhosted @lemmy.world
    Autonomous User @lemmy.world

    How to install Ollama on Arch Linux

    cross-posted from: https://lemmy.world/post/27013201

    Ollama lets you download and run large language models (LLMs) on your device.

    Install Ollama on Arch Linux

    1. Check whether your device has an AMD GPU, NVIDIA GPU, or no GPU. A GPU is recommended but not required.
    2. Open Console, type only one of the following commands and press return. It may ask for your password without showing what you type.

```
sudo pacman -S ollama-rocm    # for AMD GPU
sudo pacman -S ollama-cuda    # for NVIDIA GPU
sudo pacman -S ollama         # for no GPU (CPU only)
```

    3. Enable the Ollama service (on-device, runs in the background) so that it starts with your device, and start it now.

```
sudo systemctl enable --now ollama
```

    Test Ollama alone

    1. Open localhost:11434 in a web browser and you should see "Ollama is running". This shows Ollama is installed and its service is running.
    2. Run `ollama run deepseek-r1` in a console; Ollama should download the model, then open a chat prompt.
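    The browser check above can also be done from the console. A sketch (the `check_ollama` helper is illustrative; its optional argument exists only so the logic can be exercised without a live server):

```shell
# Illustrative console check: the root endpoint at localhost:11434
# replies with the plain text "Ollama is running" when the service is up.
check_ollama() {
  local resp="${1-$(curl -s localhost:11434)}"  # pass a string to skip curl
  case "$resp" in
    *"Ollama is running"*) echo "ok" ;;
    *)                     echo "not running" ;;
  esac
}

check_ollama   # queries the local API, prints "ok" or "not running"
```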
    LocalLLaMA @sh.itjust.works
    Autonomous User @lemmy.world

    How to install Ollama on Arch Linux

    cross-posted from: https://lemmy.world/post/27013201

    LocalLLaMA @sh.itjust.works
    Autonomous User @lemmy.world

    Llama 3.1 Community License is not a free software license

    Lemmy.world Support @lemmy.world
    Autonomous User @lemmy.world

    Requesting c/llm

    The [email protected] community and its moderator @[email protected] appear inactive.

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    PSA Help others escape WhatsApp using Watomatic auto reply (and how to write it)

    watomatic.app

    Example

```
🤖 Automated Reply

💬 I reply faster on example.org

⁉️ WhatsApp is anti-libre software. We do NOT control it. It withholds a libre software license text file, like GPL.
```

    Explained

    I reply faster on

    Deleting the only way to reach someone online breaks your influence.

    example.org

    A link and only one link, so (1) they see it's an app, not some random word or typo, (2) they can download it without searching, and (3) they aren't given multiple choices, so they don't need to do any thinking or research. Remove everything stopping them.

    anti-libre software.

    Never say privacy, they've heard it all before (from you, no doubt). Say something different.

    We do NOT control it.

    Make it simple and direct. Think of the most removed person you know and break it down in a way they would understand. Think about every angle it could be misunderstood.

    It withholds

    Libre software is normal, default. Anti-libre software is cringe, weird, dangerous. Act

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    PSA Use your display name to help move others to privacy respecting replacements.

    For example, on WhatsApp, use the whole 25-character limit for your profile name. Examples:

```
Bob Moved To Signal.org

Alice MovedTo Signal.org

CharlieMovedTo Signal.org
```

    Say Signal.org, not Signal, because they won't know it's an app.

    Use your about section too. Same on Discord, Steam, Instagram, etc.
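    You can sanity-check that a candidate name fits the 25-character limit from a console (a suggested helper, not from the original post):

```shell
# Print the length of each candidate profile name;
# WhatsApp cuts off anything beyond 25 characters.
for name in "Bob Moved To Signal.org" \
            "Alice MovedTo Signal.org" \
            "CharlieMovedTo Signal.org"; do
  printf '%2d chars: %s\n' "${#name}" "$name"
done
```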

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    Calendar app with automatic import from file and automatic export to file, to use with Syncthing?

    Automatic calendar synchronisation with no server and no cloud?

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    How to carrier level block all calls and texts to a phone number, to only allow data, to force others to use Signal?

    Not just locally on the device! There must be an error when calling or texting the number.

    Data must work for internet for Signal, VoIP, etc.

    This is for a UK PAYG SIM.

    There should be no way for anyone to demand my number, or for me to leak it, or doing so should be irrelevant as it does not work.

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    Ring doorbell alternatives (only camera or plus mic and speakers) + Small Size?

    I need at least the camera, ideally also a microphone and speakers, but not the lock or bell.

    Libre Software (Obviously) + End-to-End Encryption

    Small and easy to hide, so the camera isn't stolen, attacked or bypassed.

    Best answer yet

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    Apple AirTag alternatives for theft prevention?

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    GrapheneOS: Google Pay (GPay) via Fitbit, second phone or other options?

    UK Banks: HSBC, Lloyds, Starling, Monzo, etc.

    Privacy @lemmy.ml
    Autonomous User @lemmy.world

    GrapheneOS, Pixel 8 Pro £709 or Pixel 9 Pro £1,099?

    Update I have come to a decision. Thank you to all who contributed suggestions. Please feel free to keep the discussion going to help others.

    Technology @lemmy.world
    Autonomous User @lemmy.world

    Framework looking for unpaid workers to be 'Linux Community Ambassadors'

    community.frame.work: Framework is looking for Linux Community Ambassadors!

    Framework is looking for Linux Community Ambassadors! We are looking for active members of the Linux community who frequently visit Linux and open-source events throughout the year to help us connect with the larger Linux community. Our volunteer ambassadors will attend local Linux and open-source...

    Linux @lemmy.ml
    Autonomous User @lemmy.world

    Framework looking for unpaid workers to be 'Linux Community Ambassadors'
