Posts 19 · Comments 72 · Joined 4 mo. ago
  • I mean, maybe that is a possibility. I just never thought of it. I would recommend checking the official documentation. Might help shed some light on this.

  • I see. I don't think there are many solutions on that front for Android. For PC there are a few, such as LM Studio.

  • Thanks for your comment. That for sure is something to look out for. It is really important to know what you're running and what possible limitations there could be. Not what the original comment said, though.

  • This is all very nuanced and there isn't a clear cut answer. It really depends on what you're running, for how long you're running it, your device specs, etc. The LLMs I mentioned in the post did just fine and did not cause any overheating if not used for extended periods of time. You absolutely can run a SMALL LLM and not fry your processor if you don't overdo it. Even then, I find it extremely unlikely that you're going to cause permanent damage to your hardware components.

    Of course that is something to be mindful of, but that's not what the person in the original comment said. It does run, but you need to be aware of the limitations and potential consequences. That goes without saying, though.

    Just don't overdo it. Or do, but the worst thing that will happen is your phone getting hella hot and shutting down.

  • For me the biggest benefits are:

    • Your queries don't ever leave your computer
    • You don't have to trust a third party with your data
    • You know exactly what you're running
    • You can tweak most models to your liking
    • You can upload sensitive information to it and not worry about it
    • It works entirely offline
    • You can run several models
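    As a concrete illustration of the first two points: a local setup never issues a request beyond the loopback interface. A minimal sketch, assuming an Ollama-style server listening on localhost:11434 (the endpoint path and model tag are assumptions on my part, not something this thread specifies):

    ```python
    import json
    from urllib import request

    def build_payload(model: str, prompt: str) -> dict:
        """JSON body for a local /api/generate call."""
        return {"model": model, "prompt": prompt, "stream": False}

    def ask_local(prompt: str, model: str = "llama3.2:1b") -> str:
        # The request only ever targets 127.0.0.1 -- nothing leaves the machine.
        body = json.dumps(build_payload(model, prompt)).encode()
        req = request.Request(
            "http://127.0.0.1:11434/api/generate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]
    ```

    With a local server running, `ask_local("Why is the sky blue?")` returns the completion without any third party ever seeing the prompt.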
  • I am not entirely sure. My best piece of advice here would be: check their in-depth documentation on GitLab. It might answer your questions.

  • That's great news! I'd love for it to be added to a wiki. Just make sure that whatever version of this post is added to the wiki is the most updated one.

  • No. That has to do with how the Tor network works. The bridge forwards the connection to a non-exit relay. You do not communicate with an exit relay whatsoever. The middle relay does, but the exit relay doesn't know who you are and you don't know who the exit relay is.

  • I am not entirely sure, to be completely honest. In my experience, it is very little but it varies too. It really depends on how many people connect, for how long they connect, etc. If you have limited upload speeds, maybe it wouldn't be a great idea to run it in your browser/phone. Maybe try running it directly on your computer using the -capacity flag?

    I haven't been able to find any specific numbers either, but I did find a post on the Tor Forum dated April 2023 from a user complaining about high bandwidth usage. This is not the norm in my experience, though.
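    If it helps, running the standalone proxy with a capped client count looks roughly like this (a sketch from memory of the Snowflake docs; verify the repo path and flags there before relying on it):

    ```shell
    # Build the standalone Snowflake proxy (requires Go)
    git clone https://gitlab.torproject.org/tpo/anti-censorship/pluggable-transports/snowflake.git
    cd snowflake/proxy
    go build

    # Serve at most 10 concurrent clients to keep bandwidth use predictable
    ./proxy -capacity 10
    ```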

  • There are a few. There's Private AI. It is free (as in beer) but it's not libre (or open source). The app is a bit sketchy too, so I would still recommend doing as the tutorial says.

    Out of curiosity, why do you not want to use a terminal for that?

  • I don't know that one. Is it FOSS?

  • You are completely right. That was worded poorly and a few users have thankfully pointed that out. The answer, for most people, is yes. But that depends entirely on your threat model.

    The traffic to your Snowflake proxy isn't necessarily from people in 'adversarial countries'. A Snowflake proxy is a type of bridge, so just about anyone can use it. You can use a Snowflake bridge yourself, if you want. However, it is strongly encouraged to leave bridges (including Snowflakes) to the people who actually need them.

    So, for most people, it is generally safe to run Snowflake proxies. Theoretically, your ISP will be able to see that connections are being made there, but, because Snowflake uses WebRTC technology, it will look to them like you're calling someone on, say, Zoom. They can't see the data, though, since everything is encrypted (check the Snowflake docs and Tor Browser's for further reference). You probably won't get in any trouble for that.

    Historically, as far as we know, there haven't been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have been a few cases of people running exit nodes and getting in trouble with law enforcement, but none of them have been arrested or prosecuted so far.

    If you know of any, let me know.

  • Thank you for pointing that out. That was worded pretty badly. I corrected it in the post.

    For further clarification:

    The person connecting to your Snowflake bridge does so over a p2p-like connection. So that person does know your IP address, and your ISP also knows the person's IP address – the one connecting to your bridge.

    However, to both of your ISPs, it will look like the two of you are using some kind of video conferencing software, such as Zoom, because Snowflake uses WebRTC technology. This makes your traffic inconspicuous and obfuscates from both ISPs what's actually going on.

    To most people, that is not something of concern. But, ultimately, that comes down to your threat model. Historically, there haven't been any cases of people running bridges or entry and middle relays and getting in trouble with law enforcement.

    So, will you get in any trouble for running a Snowflake bridge? The answer is quite probably no.

    For clarification, you're not acting as an exit node if you're running a Snowflake proxy. Please check Tor's documentation and Snowflake's documentation.

  • Not true. If you load a model that exceeds your phone's hardware capabilities, it simply won't open. Stop spreading FUD.

  • Privacy on Android @lemmy.ml
    llama @lemmy.dbzer0.com

    How to run LLaMA (and other LLMs) on Android.

    cross-posted from: https://lemmy.dbzer0.com/post/36841328

    Hello, everyone! I wanted to share my experience of successfully running LLaMA on an Android device. The model that performed the best for me was llama3.2:1b on a mid-range phone with around 8 GB of RAM. I was also able to get it up and running on a lower-end phone with 4 GB RAM. However, I also tested several other models that worked quite well, including qwen2.5:0.5b, qwen2.5:1.5b, qwen2.5:3b, smallthinker, tinyllama, deepseek-r1:1.5b, and gemma2:2b. I hope this helps anyone looking to experiment with these models on mobile devices!


    Step 1: Install Termux

    1. Download and install Termux from the Google Play Store or F-Droid

    Step 2: Set Up proot-distro and Install Debian

    1. Open Termux and update the package list:
       bash
       pkg update && pkg upgrade

    2. Install proot-distro:
       bash
       pkg install proot-distro
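    The preview cuts off here. For context, a setup like this typically continues along these lines (the proot-distro Debian image and the Ollama install script are assumptions on my part, not taken from the truncated post):

    ```shell
    # Install and enter a Debian environment inside Termux
    proot-distro install debian
    proot-distro login debian

    # Inside Debian: install Ollama and pull a small model
    curl -fsSL https://ollama.com/install.sh | sh
    ollama run llama3.2:1b
    ```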
    Privacy @sopuli.xyz
    llama @lemmy.dbzer0.com

    Help people trying to circumvent censorship by running a Snowflake proxy!

    cross-posted from: https://lemmy.dbzer0.com/post/36880616

    Help Combat Internet Censorship by Running a Snowflake Proxy (Browser or Android)

    Internet censorship remains a critical threat to free expression and access to information worldwide. In regions like Iran, Russia, and Belarus, journalists, activists, and ordinary citizens face severe restrictions when trying to communicate or access uncensored news. You can support their efforts by operating a Snowflake proxy—a simple, low-impact way to contribute to a freer internet. No technical expertise is required. Here’s how it works:


    What Is Snowflake?

    Snowflake is a privacy tool integrated with the Tor network. By running a Snowflake proxy, you temporarily route internet traffic for users in censored regions, allowing them to bypass government or institutional blocks. Unlike traditional Tor relays, Snowflake requires minimal bandwidth, no configuration, and no ongoing maintenance. Your device acts as a


  • > Though apparently I didn't need step 6 as it started running after I downloaded it

    Hahahha. It really is a little redundant, now that you mention it. I'll remove it from the post. Thank you!

    > Good fun. Got me interested in running local LLM for the first time.

    I'm very happy to hear my post motivated you to run an LLM locally for the first time! Did you manage to run any other models? How was your experience? Let us know!

    > What type of performance increase should I expect when I spin this up on my 3070 ti?

    That really depends on the model, to be completely honest. Make sure to check the model requirements. For llama3.2:2b you can expect a significant performance increase, at least.
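    On "check the model requirements": a rough back-of-the-envelope for whether a quantized model's weights fit in VRAM (pure arithmetic, not a benchmark; the 4-bit default is my assumption about typical quantization):

    ```python
    def est_model_gb(params_billions: float, bits_per_weight: int = 4) -> float:
        """Approximate size of the weights alone: params * bits / 8 bytes, in GB."""
        return params_billions * bits_per_weight / 8

    # A 3B model at 4-bit needs roughly 1.5 GB for weights alone,
    # which fits comfortably in a 3070 Ti's 8 GB of VRAM.
    print(est_model_gb(3))
    ```

    Note this counts weights only; the KV cache and activations add overhead on top.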

  • Of course! I run several snowflake proxies across my devices and their browsers.


    Ask Lemmy @lemmy.world
    llama @lemmy.dbzer0.com

    How do you feel about your content getting scraped by AI models?

    I created this account two days ago, but one of my posts ended up in the (metaphorical) hands of an AI powered search engine that has scraping capabilities. What do you guys think about this? How do you feel about your posts/content getting scraped off of the web and potentially being used by AI models and/or AI powered tools? Curious to hear your experiences and thoughts on this.


    Prompt Update

    The prompt was something like, "What do you know about the user [email protected] on Lemmy? What can you tell me about his interests?" Initially, it generated a lot of fabricated information, but it would still include one or two accurate details. When I ran the test again, the response was much more accurate compared to the first attempt. It seems that as my account became more established, it became easier for the crawlers


    No Stupid Questions @lemmy.world
    llama @lemmy.dbzer0.com

    Is Threads fully integrated with the Fediverse?

    I use both Threads and Mastodon. However, I realized that sometimes (public) profiles on Threads don't show up on Mastodon and vice versa. I also realized that most comments made on Threads posts don't show up on Mastodon – that is, if the posts appear on Mastodon at all. The same is true the other way around. Why does this happen?

    Ask Lemmy @lemmy.world
    llama @lemmy.dbzer0.com

    Are there any mental health communities here on Lemmy?

    I've been using Lemmy since the Reddit exodus. I haven't looked back since, but I miss a lot of mental health communities that I haven't been able to find replacements for here on Lemmy. Does anyone know any cool mental health communities that are somewhat active?