

LibreWolf or Brave. I use Brave personally, it’s okay
what a beautiful green car
what do you mean “also”?
nothing gives me more joy than intentionally writing “should of” to annoy pedants like you
I don’t know, feddit.nl is pretty chill. I always see everything and barely anything objectionable
should of named it Mulf of Gexico
You May Live To See Crab-Made Horrors Beyond Your Comprehension
scientists need to get their act together. chop chop, fix my stupid brain already
the solution is a cabin in the woods
True, but the newest mistral model is already pretty great
fossify gallery is pretty nice
That’s why I avoid wrestlers
Suspicious lack of Qubes. Who do you work for??? the CIA? China? The Rwandan National Intelligence and Security Agency?
there was a big spike in Roman ticks at that time
The only mod let’s play I ever watched was the one where the channel immediately deleted the series after getting to that part
let’s just rebrand instances to superleddits and communities to subleddits 🤣
“Reddit but you can block the part that annoys you”
I don’t understand how anyone can buy a Tesla. The lack of a dashboard + the only interface being a tablet alone are a deal breaker for me.
You’re being sold massive cost-cutting masquerading as a luxury feature, at a premium, with 100x worse usability.
In my experience, anything similar to qwen-2.5:32B comes closest to gpt-4o. I think it should run on your setup. The 14b model is alright too, but definitely inferior. Mistral Small 3 also seems really good. Anything smaller is usually really dumb, and I doubt it would work for you.
You could probably run some larger 70b models at a snail’s pace too.
Try the DeepSeek R1 Qwen 32b distill, something like deepseek-r1:32b-qwen-distill-q4_K_M (name on ollama), or some fine-tune of it. It’ll be by far the smartest model you can run.
There are various fine-tunes that remove some of the censorship (ablated/abliterated) or are optimized for RP, which might do better for your use case. But I personally haven’t used them, so I can’t promise anything.
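For reference, grabbing and running that distill with the ollama CLI looks roughly like this (using the model tag mentioned above; the exact quantization tag available may vary, so check the ollama model library first):

```shell
# Pull the 32b Qwen distill of DeepSeek R1 at q4_K_M quantization
# (roughly a 20 GB download, so it takes a while)
ollama pull deepseek-r1:32b-qwen-distill-q4_K_M

# Start an interactive chat with it
ollama run deepseek-r1:32b-qwen-distill-q4_K_M
```

This needs a running ollama install and enough VRAM/RAM to hold the ~20 GB of weights plus context.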