The adverts were just as cool.
It's what I miss the most
Mechanical keyboard communities on Lemmy may help here. There are loads of really old, forgotten keyboards from the early or pre-rubber-dome era where the quality is vastly superior.
Deepseek-r1:1.5b
To determine how many 'r's are in the word "strawberry," we can break it down letter by letter:

1. **S** - No
2. **T** - No
3. **R** (count 1)
4. **A** - No
5. **W** - No
6. **B** - No
7. **E** - No
8. **R** (count 2)
9. **R** (count 3)
10. **Y** - No

However, considering the pronunciation of each letter and standard English rules, even though 'Y' is a consonant, it's more accurate to say that "strawberry" contains only two 'r's: one in position 3 and another at position 9.

Final Answer: There are two 'r's in the word "strawberry."
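For comparison with the model's final (incorrect) answer, a direct count in Python agrees with the model's own letter-by-letter breakdown:

```python
# Count occurrences of 'r' in "strawberry" directly
word = "strawberry"
r_count = word.count("r")
print(r_count)  # → 3
```

The model counted three 'r's correctly and then talked itself out of the right answer, which is the punchline with a 1.5B reasoning model.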
There seems to be a big separation between "big 'D'" and "little 'd'" in the deaf community, e.g. Deaf vs deaf, with the latter choosing to use hearing aids and the former not. This gene therapy adds a whole new layer, and it'll be interesting to see how it's utilised within the deaf communities.
Reaction time in what sense? I read some of it, and it states that slower reactions are part of the normal ageing process as well!
Downloaded it. Turns out it just plays Lunar Lander. jk!
The wolves of the seas. Ten thousand years from now, everyone will have a pet killer whale.
I think it could actually be more like the "horse's head in the bed" warning from The Godfather.
Would this mean that cultured-cell meat would be unhealthy too?
The sun has entered the chatroom
Well, clearly that ain't true because God wouldn't have made pork pies taste so nice otherwise.
For sure. e.g. Block the road, you block emergency vehicles / assault on emergency workers = terrorism.
Iran should have a national twitter for these kinds of burns
There are so many bad shows in there that are overproduced American trash, and not even 'good trash'.
- "Are you in, yet?"
- "You don't have to, if you don't want to"
- "Have you cum?"
- "What's 3 across, 'the tallest mountain in Europe?'"
NO! NOT THE BIG RED BUTTON!
Time for them to set sail on the wild seas again!
and assuming that improvement doesn't plateau, ever,
Because so much money has been thrown at it (at startups, at power generation, by investors) that this is little more than marketing aimed at getting retail investors to buy in.

What are the games you played in your youth that you still play today?
Are they the 'epics' of their time, or some things that are less well known?

I've just created c/Ollama - free LLMs on your computer, for everyone!
I've just re-discovered Ollama, and it's come a long way. It's reduced the very difficult task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a deb! It also works on Windows and Mac, so it can help everyone.
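To give a flavour of how simple it is, here's a sketch of getting started on Linux, assuming the official install script at ollama.com (on Windows and Mac there are installers instead, and the exact model tag is just an example):

```shell
# Install Ollama via the official install script (Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a small model straight from the terminal
ollama run llama3.2:1b
```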
I'd like to see Lemmy become useful for specific technical sub-branches, instead of everyone hunting for the best existing community (which can be subjective, and makes information difficult to find), so I created [email protected] for everyone to discuss, ask questions, and help each other out with Ollama!
So please join, subscribe, and feel free to post questions, tips, and projects, and help out where you can!
Thanks!

Housos Australian TV Series
I'd never heard of this, but I've just come across it, and within two minutes I haven't stopped laughing and already know I want to binge-watch it! Is it as good as it starts?

What is a self-hosted small LLM (<= 3B) actually good for?
I've tried coding, and every one I've tried fails unless it's a really, really basic small function, like what you write as a newbie, compared to, say, 4o-mini, which can spit out more sensible stuff that works.
I've tried explanations and they just regurgitate sentences that can be irrelevant, wrong, or get stuck in a loop.
So, what can I actually use a small LLM for? And which ones? I ask because I have an old laptop whose GPU can't really handle anything above 4B in a timely manner; 8B runs at about 1 t/s!

What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?
I was looking back at some old Lemmy posts and came across GPT4All. I didn't get much sleep last night because it's awesome, even on my old (10-year-old) laptop with a Compute 5.0 Nvidia card.
Still, I'm after more. I'd like to get image generation and view the results in the conversation, and if it generates Python code, to be able to run it (I'm using Debian and have a default Python env set up). Local file analysis would also be useful. CUDA Compute 5.0 / Vulkan compatibility is needed too, with the option to use some of the smaller models (1-3B, for example). A local API would also be nice for my own Python experiments.
Is there anything that can tick those boxes, even if I have to scoot across models for some of the features? I'd prefer a desktop client application rather than a docker container running in the background.
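On the local-API point: one common backend option, Ollama, exposes a simple REST API on http://localhost:11434. A minimal sketch using only the standard library, assuming an Ollama server is running locally and a small model such as llama3.2:1b has already been pulled:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON object instead of streamed chunks
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(OLLAMA_URL, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("llama3.2:1b", "Why is the sky blue? Answer in one sentence."))
```

GPT4All also ships an optional OpenAI-compatible local server, so a similar request shape works there if you enable it in the settings.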