Posts: 10 · Comments: 251 · Joined: 3 wk. ago
  • Deepseek-r1:1.5b

    To determine how many 'r's are in the word "strawberry," we can break it down letter by letter:
    
    1. **S** - No
    2. **T** - No
    3. **R** (count 1)
    4. **A** - No
    5. **W** - No
    6. **B** - No
    7. **E** - No
    8. **R** (count 2)
    9. **R** (count 3)
    10. **Y** - No
    
    However, considering the pronunciation of each letter and standard English rules, even though 'Y' is a consonant, it's more accurate to 
    say that "strawberry" contains only two 'r's: one in position 3 and another at position 9.
    
    Final Answer: There are two 'r's in the word "strawberry."
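
    For what it's worth, a trivial Python check (a minimal sketch, standard library only) agrees with the model's letter-by-letter tally of three, not its final answer of two:

    ```python
    # Count the occurrences of 'r' in "strawberry" directly.
    word = "strawberry"
    print(word.count("r"))  # -> 3 (positions 3, 8 and 9, 1-indexed)
    ```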
    
    
      
  • There seems to be a big separation between "Big 'D'" and "little 'd'" in the deaf community, e.g. Deaf vs deaf, with the latter choosing to use hearing aids and the former not. This gene therapy adds a whole new layer, and it'll be interesting to see how it's utilised within the deaf communities.

  • RetroGaming @lemmy.world
    catty @lemmy.world

    What are the games you played in your youth that you still play today?

    Are they the 'epics' of their time, or some things that are less well known?

    Community Promo @lemmy.ca
    catty @lemmy.world

    I've just created c/Ollama - Free LLMs on your computer - for everyone!

    I've just re-discovered ollama, and it's come a long way: it has reduced the very difficult task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a deb! It also works on Windows and Mac, so it can help everyone.

    I'd like to see Lemmy become useful for specific technical sub-branches, instead of people trying to find the best existing community, which can be subjective and makes information difficult to find. So I created [email protected] for everyone to discuss, ask questions, and help each other out with ollama!

    So please join, subscribe, and feel free to post, ask questions, share tips and projects, and help out where you can!

    Thanks!
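
    Once it's installed and the server is running, ollama also exposes a local HTTP API on port 11434. A minimal Python sketch (the model name "llama3.2" is just an example; use whatever you've pulled with `ollama pull`):

    ```python
    # Query a locally running ollama server over its HTTP API.
    # Assumes ollama is serving on the default port (11434) and that
    # a model has been pulled ("llama3.2" here is only an example).
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3.2",
        "prompt": "Why self-host an LLM?",
        "stream": False,  # return a single JSON object, not a stream
    }).encode()

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["response"])
    ```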

    New Communities @lemmy.world
    catty @lemmy.world

    Ollama - LLMs for everyone!

    I've just re-discovered ollama, and it's come a long way: it has reduced the very difficult task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a deb! It also works on Windows and Mac, so it can help everyone.

    I'd like to see Lemmy become useful for specific technical sub-branches, instead of people trying to find the best existing community, which can be subjective and makes information difficult to find. So I created [email protected] for everyone to discuss, ask questions, and help each other out with ollama!

    So please join, subscribe, and feel free to post, ask questions, share tips and projects, and help out where you can!

    Thanks!

    Selfhosted @lemmy.world
    catty @lemmy.world

    I've just created c/Ollama!

    I've just re-discovered ollama, and it's come a long way: it has reduced the very difficult task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a deb! It also works on Windows and Mac, so it can help everyone.

    I'd like to see Lemmy become useful for specific technical sub-branches, instead of people trying to find the best existing community, which can be subjective and makes information difficult to find. So I created [email protected] for everyone to discuss, ask questions, and help each other out with ollama!

    So please join, subscribe, and feel free to post, ask questions, share tips and projects, and help out where you can!

    Thanks!

    Funny @sh.itjust.works
    catty @lemmy.world

    Housos Australian TV Series

    I'd never heard of this, but I've just come across it, and within two minutes I couldn't stop laughing and knew I wanted to binge-watch it! Is it as good as it starts?

    Selfhosted @lemmy.world
    catty @lemmy.world

    What is a self-hosted small LLM actually good for (<= 3B)?

    I've tried coding, and every model I've tried fails at anything beyond really, really basic small functions, the kind you write as a newbie; compare that to, say, 4o-mini, which can spit out more sensible stuff that actually works.

    I've tried asking for explanations, and they just regurgitate sentences that can be irrelevant or wrong, or they get stuck in a loop.

    So, what can I actually use a small LLM for? Which ones? I ask because I have an old laptop whose GPU can't really handle anything above 4B in a timely manner; 8B runs at about 1 t/s!
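
    To put that rate in perspective, a quick back-of-the-envelope estimate (the ~200-token answer length below is just an assumption):

    ```python
    # Rough wait-time estimate for a slow local model.
    tokens_per_second = 1.0  # observed rate for an 8B model on this laptop
    answer_tokens = 200      # assumed typical answer length

    wait_minutes = answer_tokens / tokens_per_second / 60
    print(f"~{wait_minutes:.1f} minutes per answer")  # ~3.3 minutes
    ```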

    Selfhosted @lemmy.world
    catty @lemmy.world

    What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?

    I was looking back at some old Lemmy posts and came across GPT4All. I didn't get much sleep last night because it's awesome, even on my old (10-year-old) laptop with a Compute 5.0 NVidia card.

    Still, I'm after more. I'd like image creation that I can view in the conversation, and, if the model generates Python code, the ability to run it (I'm using Debian and have a default Python env set up). Local file analysis would also be useful. CUDA Compute 5.0 / Vulkan compatibility is needed too, with the option to use some of the smaller models (1-3B, for example). A local API would also be nice for my own Python experiments.

    Is there anything that ticks the boxes, even if I have to scoot across models for some of the features? I'd prefer a desktop client application to a docker container running in the background.
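
    For the local-API part at least, GPT4All ships a Python binding; a minimal sketch (the model filename below is an example; substitute any small .gguf you've downloaded):

    ```python
    # Minimal GPT4All Python-binding example (pip install gpt4all).
    # The model filename is an example; any downloaded .gguf should work.
    from gpt4all import GPT4All

    model = GPT4All("Llama-3.2-1B-Instruct-Q4_0.gguf")  # ~1B suits older GPUs
    with model.chat_session():
        print(model.generate("Explain CUDA compute capability.", max_tokens=200))
    ```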