• 0 Posts
  • 55 Comments
Joined 1 year ago
Cake day: March 3rd, 2024



  • ignirtoq@fedia.io to Technology@beehaw.org · The rise of Whatever
    13 upvotes · 1 downvote · 4 days ago

    The thing is, it’s been like that forever. Good products made by small- to medium-sized businesses have always attracted buyouts, where the new owner basically converts the good reputation of the original into money through cutting corners, laying off critical workers, and other strategies that slowly (or quickly) make the product worse. Eventually the formerly good product gets bad enough that there’s space in the market for an entrepreneur to introduce a new good product, and the cycle repeats.

    I think what’s different now is, since this has gone on unabated for 70+ years, economic inequality means the people with good ideas for products can’t afford to become entrepreneurs anymore. The market openings are there, but the people that made everything so bad now have all the money. So the cycle is broken not by good products staying good, but by bad products having no replacements.

  • It’s not disingenuous. There are multiple definitions of “offline” being used here, and just because some people aren’t using yours doesn’t mean they’re ignorant or arguing in bad faith.

    Your definition of “offline” encompasses just the executable code. So under that definition, sure, it’s offline. But I wouldn’t call an application “offline” if it requires an internet connection for any core feature, and I call saving my document a core feature of a word processor. Since I wouldn’t call it “offline,” I’m not sure what I would call it, but something closer to “local” or “native” to distinguish it from a cloud-based application with a browser or other frontend.


  • I think you’ve already answered your own question. Trump’s first presidency was very different from his second. And the key difference is his advisors. No one knew how to deal with Trump in his first presidency, and the overarching pattern above the chaos was his close advisors constantly working against him to protect the system.

    In the break between his terms, he found people who would follow any of his directives, no matter how stupid or damaging. Or I should say, they found him. Trump is now being manipulated himself by a group of “loyalists” who stand to gain from exactly the chaos Trump was thwarted from creating the first time around. These people know enough about how the systems work to be actually dangerous. And now that his advisors are philosophically aligned with him, he’s actually doing what he says he’s going to do. I would argue the four years of a Trump reprieve may have been the worst of all possibilities, because it gave Trump and his new in-group time to find each other and prepare.

    But most people weren’t paying attention. Many saw Trump in 2024 just as they saw him in 2016: the counterweight, spoiler, outsider set to upset Washington and get real changes happening. They thought Democrats weren’t helping them enough, and so wanted to upset the apple cart and get someone different in the high seat. They weren’t paying close enough attention to see that this time was radically different from all elections before, and I saw many, many people dismissing the warnings with “oh, everyone claims their opponents are fascists, or are going to destroy the country.”

    So it’s a combination of people not paying attention, as usual, and Trump actually changing how he’s doing things this time in the worst way possible.


  • Ah, I think I misread your statement of “followers by nature” as “followers of nature.” I’m not really willing to ascribe personality traits like “follower” or “leader” or “independent” or “critical thinker” to humanity as a whole based on the discussion I’ve laid out here. Again, the possibility space of cognition is bounded, but unimaginatively large. What we can think may be limited to a reflection of nature, but the possible permutations that can be made of that reflection are more than we could explore in the lifetime of the universe. I wouldn’t really use this as justification for or against any particular moral framework.


  • I think that’s overly reductionist, but ultimately yes. The human brain is amazingly complex, and evolution isn’t directed but keeps going with whatever works well enough, so there’s going to be incredible breadth in human experience and cognition across everyone in the world and throughout history. You’ll never get two people thinking exactly the same way because of the sheer size of that possibility space, despite the more than 100 billion people who have lived throughout history and today.

    That being said, “what works” does set constraints on what is possible with the brain, and evolution went with the brain because it solves a bunch of practical problems that enhanced the survivability of the creatures that possessed it. So there are bounds to cognition, and there are common patterns and structures that shape cognition because of the aforementioned problems they solved.

    Thoughts that initially reflect reality, but that can be expanded in unrealistic ways to explore the space of possibilities an individual can effect in the world around them, have clear survival benefits. Thoughts that spring from nothing and relate in no way to anything real strike me as not useful at best, and at worst disruptive to whatever else the brain is doing. Pursuing that perspective further, given the powerful levels of pattern recognition in the brain, I wonder if the creation of “100% original thoughts” would result in something like schizophrenia, where the brain’s pattern-recognition systems reinterpret (and misinterpret) internal signals as sensory signals of external stimuli.
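    A quick back-of-envelope sketch (my own illustrative numbers, using the common ~100 billion demographic estimate) shows how fast that possibility space outruns the population:

```python
import math

EVER_LIVED = 1.1e11  # rough common estimate of humans who have ever lived

# Even a crude model where minds differ along independent yes/no traits
# needs very few traits before the combinations outnumber every human
# who has ever existed.
n_traits = math.ceil(math.log2(EVER_LIVED))

print(n_traits)                    # 37 binary traits already suffice
print(2 ** n_traits > EVER_LIVED)  # True
```

    And real cognitive variation is graded, not binary, so the actual space dwarfs anything a population of any size could exhaust.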


  • The problem with that reasoning is it’s assuming a clear boundary to what a “thought” is. Just like there wasn’t a “first” human (because genetics are constantly changing), there wasn’t a “first” thought.

    Ancient animals had nervous systems that could not come close to producing anything we would consider a thought, and through gradual, incremental changes we get to humanity, which is capable of thought. Where do you draw the line? Any specific moment in that evolution would be arbitrary, so we have to accept a continuum of neurological phenomena that span from “not thoughts” to “thoughts.” And again we get back to thoughts being reflections of a shared environment, so they build on a shared context, and none are original.

    If you do want to draw an arbitrary line at what a thought is, then that first thought was an evolution of non-/proto-thought neurological phenomena, and itself wasn’t 100% “original” under the definition you’re using here.


  • From your responses to others’ comments, you’re looking for a “thought” that has absolutely zero relationship with any existing concepts or ideas. If there is overlap with anything that anyone has ever written about or expressed in any way before, then it’s not “100% original,” and so either it’s impossible or it’s useless.

    I would argue it’s impossible because the very way human cognition is structured is based on prediction, pattern recognition, and error correction. The various layers of processing in the brain are built around modeling the world around us in a way to generate a prediction; then higher layers compare the predictions with the actual sensory input to identify mismatches, and the layers above that reconcile the mismatches and adjust the prediction layers. That’s a long-winded way to say our thoughts are inspired by the world around us, and so are a reflection of the world around us. We all share our part of this world with at least one other person, so we’re all going to share commonalities in our thoughts with others.

    But for the sake of argument, assume that’s all wrong, and someone out there does have a truly original, 100% no overlap with anything that has come before, thought. How could they possibly express that thought to someone else? Communication between people relies on some kind of shared context, but any shared context for this thought means it’s dependent on another idea, or “prior art,” so it couldn’t be 100% original. If you can’t share the thought with anyone, nor express it in any way to record it (because that again is communication), it dies with you. And you can’t even prove it without communicating, so how would someone with such an original thought convince you they’ve had it?
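    The predict → compare → reconcile loop described above can be caricatured in a few lines of code. This is purely illustrative (the function name and the single scalar “layer” are my invention, not a neuroscience model), but it shows the basic error-correction dynamic:

```python
# Toy sketch of the predict -> compare -> adjust loop: a "prediction layer"
# holds an internal estimate, the mismatch against sensory input plays the
# role of the comparison layer, and the update step reconciles the mismatch
# by nudging the estimate toward the input.

def predictive_loop(sensory_input, steps=50, learning_rate=0.2):
    prediction = 0.0  # initial internal model of the world
    errors = []
    for _ in range(steps):
        error = sensory_input - prediction   # compare prediction with input
        errors.append(abs(error))
        prediction += learning_rate * error  # reconcile: adjust the model
    return prediction, errors

prediction, errors = predictive_loop(sensory_input=1.0)
# The mismatch shrinks as the internal model converges on the input,
# i.e. the "thought" ends up shaped by the world it is predicting.
```

    The point of the toy: the internal state only ever moves toward what the senses deliver, which is the sense in which thoughts are reflections of the world rather than creations from nothing.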


  • Math, physics, and to a lesser extent, software engineering.

    I got degrees in math and physics in college. I love talking about counterintuitive concepts in math and things that are just way outside everyday life, like transfinite numbers and very large dimensional vector spaces.

    My favorite parts of physics to talk about are general relativity and the weirder parts of quantum mechanics.

    My day job is software engineering, so I can also help people get started learning to program, and then with the next level of building a solid, maintainable software project. It’s more “productive” in the traditional sense, so it’s satisfying to help people be more productive, but when it’s just free time to shoot the shit, talking about math and science is way more fun.



  • I’m sorry, I mostly agree with the sentiment of the article in a feel-good kind of way, but it reads like the claim that bullies will get their comeuppance later in life, when in reality you look them up and they have high-paying jobs and wonderful families. There’s no substance here, just a rant.

    The author hints at analogous cases in the past of companies firing all of their engineers and then having to scramble to hire them back, but doesn’t actually get into any specifics. Be specific! Talk through those details. Prove to me the historical cases are sufficiently similar to what we’re starting to see now that justifies the claims of the rest of the article.


  • Think of bad sleep or insufficient sleep like an injury. In ideal conditions your body heals it at a certain rate. You can make it take longer, or you can even make the injury worse, by not taking care of it, but you can’t make it heal faster. And at some point, if you’re consistently not taking care of it, you’ll make part of your injury permanent.

    Similarly with sleep, it’s not a bank balance, it’s damage to your body and brain that you need to repair. And you can only repair the damage with good sleep. You have to get good sleep until you feel better, and then you’ll know you have recovered.

    And if you consistently get bad sleep for too long (a week or more), your brain and body will be permanently changed. Like a permanent injury, you’ll never fully recover some of the damage. It’s hard to overemphasize how important good sleep is to your short- and long-term health.