• 8 Posts
  • 818 Comments
Joined 2 years ago
Cake day: August 11th, 2023

  • I really am not sure that’s true. I think that has more to do with socialization than with one sex actually being more competitive. I’m supposedly a guy and I rarely feel that competitive over stuff like gaming, despite having had anger issues for a good portion of my life.

    As someone else said, misogyny tends to push women away from these communities as well.



  • You know, I’m honestly not so sure. I’ve seen people who definitely aren’t healthy, and probably aren’t emotionally secure, who get and sometimes keep relationships. It’s a lot more complex than you’d think. Part of this is that people with similar issues often want to be together, but I think things like physical attractiveness also play a role. It’s also the case that being a nice person and being emotionally stable aren’t actually the same thing, and often don’t go together. In fact, to me it seems like people who have struggled are actually less judgemental. Some of the worst people are those who have never struggled with anything.

    It’s like how people have this concept that they either are or aren’t worthy of love. I don’t think that’s even a valid idea to begin with as there is no universal standard for what people want in a partner. Someone either wants you or they don’t, worthiness just isn’t a large factor.








  • I’ve tried making this argument before and people never seem to agree. I think Google claims their Kubernetes setup is actually more secure than traditional VMs, but how true that really is I have no idea. Unfortunately, though, there are already things we depend on for security that are probably less secure than most container platforms, like ordinary unix permissions or technologies like AppArmor and SELinux.
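    To make the “ordinary unix permissions” point concrete, here’s a minimal stdlib-only Python sketch of how coarse the traditional model is: nine bits covering read/write/execute for owner, group, and everyone else, with nothing like the per-syscall policies AppArmor or SELinux layer on top.

    ```python
    import os
    import stat
    import tempfile

    # The entire traditional unix permission model for a file fits in a few
    # octal digits: rwx for owner, group, and others. That's it.
    fd, path = tempfile.mkstemp()
    os.close(fd)

    os.chmod(path, 0o640)  # owner: read+write, group: read, others: nothing
    mode = stat.S_IMODE(os.stat(path).st_mode)
    print(oct(mode))  # 0o640

    os.remove(path)
    ```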






  • That’s not true though. The models themselves are hella intensive to train. We already have open source programs to run LLMs at home, but they are limited to smaller open-weights models. Having a full ChatGPT model that can be run by any service provider or home server enthusiast would be a boon. It would certainly make my research more effective.
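    For a rough sense of why home setups are limited to smaller open-weights models, here’s a back-of-the-envelope sketch (my own arithmetic, weights only; it ignores KV cache, activations, and runtime overhead entirely):

    ```python
    def weight_gb(params_billions, bits_per_weight):
        # Memory needed just to hold the weights:
        # (params_billions * 1e9 params) * bits / (8 bits per byte)
        # = params_billions * bits / 8, in GB.
        return params_billions * bits_per_weight / 8

    print(weight_gb(7, 4))    # 3.5 GB: a small 4-bit open-weights model
    print(weight_gb(70, 16))  # 140.0 GB: well beyond typical home hardware
    ```

    A 7B model quantized to 4 bits fits on a laptop; a ChatGPT-class model at full precision is out of reach for most home servers, which is roughly the gap being described.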


  • There is a lot that can be discussed in a philosophical debate. However, any 8-year-old would be able to count how many letters are in a word. LLMs can’t reliably do that by virtue of how they work. This suggests to me that it’s not just a model/training difference. Also, evolution over millions of years improved the “hardware” and the genetic material. Neither of these compares to the computing power or the amount of data used to train LLMs.
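    The letter-counting failure follows from tokenization: models operate on subword tokens, not characters. A toy Python illustration (the token split below is made up, just for the sake of the example):

    ```python
    # Counting letters is trivial at the character level...
    word = "strawberry"
    print(word.count("r"))  # 3

    # ...but an LLM never sees characters. It sees subword tokens, something
    # like the hypothetical split below, so "how many r's?" has to be
    # inferred from learned associations rather than read off the input.
    tokens = ["str", "aw", "berry"]  # made-up BPE-style split, for illustration
    print(sum(t.count("r") for t in tokens))  # 3, but spread across tokens
    ```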

    Actually, humans have more computing power than is required to run an LLM. You have this backwards. LLMs are a lot more efficient, given how little computing power they need to run by comparison. Human brains as a piece of hardware are insanely high-performance and energy-efficient. I mean, they include their own internal combustion engines and maintenance and security crew, for fuck’s sake. Give me a human-built computer that has that.

    Anyway, time will tell. Personally I think it’s possible to reach a general AI eventually, I simply don’t think the LLM approach is the one leading there.

    I agree here. I do think, though, that LLMs are closer than you think. They do in fact have both attention and working memory, which is a large step forward. The fact that they can only process one medium (text) is a serious limitation, though. Presumably a general-purpose AI would ideally be able to process visual input, auditory input, text, and other things like various sensor types. There are other model types, though, some of which take in multi-modal input to make decisions, the way a self-driving car does.
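    Since “attention” is doing a lot of work in that sentence, here’s a toy sketch of single-query scaled dot-product attention in plain Python, with made-up vectors, just to show the mechanism: score each key against the query, normalize with softmax, and take the weighted sum of the values.

    ```python
    import math

    def softmax(scores):
        # Numerically stable softmax over a list of floats.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        return [e / total for e in exps]

    def attend(query, keys, values):
        # Toy single-query scaled dot-product attention: dot each key with the
        # query (scaled by sqrt of dimension), softmax the scores, then return
        # the weighted sum of the value vectors.
        d = len(query)
        scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
                  for key in keys]
        weights = softmax(scores)
        return [sum(w * v[i] for w, v in zip(weights, values))
                for i in range(len(values[0]))]

    # A query that matches the first key pulls the output toward the first value.
    print(attend([10.0, 0.0], [[10.0, 0.0], [0.0, 10.0]],
                 [[1.0, 0.0], [0.0, 1.0]]))
    ```

    The “working memory” half is roughly the context window plus the KV cache that real implementations keep around, but the scoring-and-mixing step above is the core of it.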

    I think a lot of people romanticize what humans are capable of while dismissing what machines can do, especially given the processing power and efficiency limitations that come with the simple silicon-based processors current machines are made from.