• 1 Post
  • 83 Comments
Joined 2 years ago
Cake day: July 6th, 2023

  • I can give you my experience so far, seeing as the common criticisms of Linux usually boil down to unwillingness to try it, kernel-level anticheat, and Adobe products, and I… honestly don’t miss any of them. But I’m mostly a dev and a single-player games enjoyer, so not much to miss, really.

    The speakers on my Razer blade laptop (running EndeavourOS, btw) stopped working randomly, but I’m not convinced it wasn’t my fault since I did have to work on the laptop internals for unrelated reasons and might have screwed something up.

    My webcam on my desktop, a Logitech Brio, has been acting up for the past couple of weeks on Bazzite: the microphone keeps kind of dying, and I have to unplug and re-plug the webcam to get a working mic again. The audio on my Sony XM5s also keeps dropping to shitty quality, mostly when I do the webcam re-plugging, but it’s happened at random times before too. I have to change the codec in the audio settings every now and then because of it.

    Monitor brightness can sometimes behave weirdly, not going back to a brighter setting after auto-dimming.

    Games with kernel anticheat don’t let me play online.

    That’s mostly been it, to be honest. There’s a microscopic learning curve for Bazzite since it’s immutable, so I use flatpaks for most stuff and “figure it out” for anything else, but other than that, it’s just better than Windows ever was. If you run into an issue, you’re most likely going to solve it with a quick online search or by consulting the eldritch hallucinations of OpenAI, or of whichever vendor you prefer.



  • At least learn a little bit about the technology you’re criticizing, such as the difference between fission (aka not fusion) and fusion (aka… fusion), before going on a rant saying it’ll never work.

    None of the reactors are being built with output capture in mind at the moment, because output capture is trivial compared to actually having an output at all, let alone an output that’s greater than the input and that can be sustained. As you’ve clearly learned in this thread, we’re already past having an output; we’re still testing ways to get an output greater than the input, with at least one experiment having done so; and we still need to tackle sustained output, which you can watch progressing in real time. Getting the energy out is the same as it’s always been: putting steam through a turbine.
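To put a number on “output greater than the input”: the usual figure of merit is the gain Q, the ratio of fusion energy out to energy in, with Q > 1 meaning net gain. A quick sketch using the widely reported figures from the 2022 NIF shot (roughly 3.15 MJ of fusion yield for 2.05 MJ of laser energy delivered; these numbers come from news coverage, so double-check them):

```python
# Fusion gain Q = energy out / energy in; Q > 1 means net gain at the target.
# Figures are the widely reported 2022 NIF ignition shot.
energy_in_mj = 2.05   # laser energy delivered to the target, in MJ
energy_out_mj = 3.15  # fusion energy released, in MJ

q = energy_out_mj / energy_in_mj
print(f"Q = {q:.2f}")  # roughly 1.54: more energy out of the target than in
```

Note this is gain at the target; the facility’s lasers drew far more power from the wall, which is part of why “sustained, usable output” is still the open problem.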

    Fission is what today’s nuclear reactors do. It has been used all over the world, it’s being phased out by plenty of countries due to public ignorance of the technology and fearmongering from parties with a vested interest in seeing nuclear fail, it’s still among the safest energy generation methods we have, and alongside renewables it would realistically solve our short-term issues while we figure out fusion… but as I said, stupid, ignorant people keep talking shit about it and getting it shut down… remind you of anyone?


  • My dude, I understand your unwillingness, but Docker is just a fancy new way of saying “install apps without it being a major PITA”. You find the app you want on Docker Hub or some other registry, pull the image, run it, et voilà, you have a container. No worrying about Python suddenly breaking, or about running five commands in a row to spin up an app (I used to do all of this, including the whole Python rain dance, to run Home Assistant. I feel stupid now).

    Decluttarr actually has a section on setting up its container:

    https://github.com/ManiMatter/decluttarr#method-1-docker

    It’s step by step: all you have to do is get Docker installed on your machine, copy-paste that text into a file, and run the docker command mentioned there from the same directory as the file.
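For a sense of the shape of that file, here’s a hypothetical docker-compose.yml; the image name and settings are from memory, so treat the linked README as the source of truth:

```yaml
# Hypothetical compose file -- check the Decluttarr README for the
# real image name and the environment variables it actually requires.
services:
  decluttarr:
    image: ghcr.io/manimatter/decluttarr:latest
    restart: unless-stopped
    environment:
      - TZ=America/Tegucigalpa  # placeholder timezone
```

Save it as `docker-compose.yml` and run `docker compose up -d` in the same directory, and the container comes up in the background.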

    Trust me, you want to learn this, because after the first 15 minutes of confusion you suddenly have the holy grail of self-hosting right at your fingertips. It takes me all of 5 minutes to add a new service to my homelab because it’s so easy with Docker. And it’s so ubiquitous and popular! TrueNAS SCALE uses Docker for all its apps, and the idea of containers essentially reshaped the Linux desktop into what it is today, flatpaks and all.





  • JGrffn@lemmy.world to Fediverse@lemmy.world · *Permanently Deleted* · edited · 1 year ago

    Usually a React dev; I’ve done some other stuff, but generally yeah, websites. Anything from resort chain websites to complex internal applications. Unit tests were optional at best at most jobs I’ve been at. I’ve heard of jobs where they’re pulled off, but from what I’ve seen, those are the exception, not the rule.

    Edit: given the downvotes on my other comment, I should add that this is both anecdotal and unopinionated on my part. My opinion on unit testing is “meh”: if I’m asked to write tests, I will; if not, I won’t. I wouldn’t go out of my way to implement them on, say, a personal project or most work projects, but if I were tasked with leading a project that could clearly benefit from them (e.g. fintech, data security, high-availability operation-critical tools), I wouldn’t think twice about it. Most of what I’ve worked on, however, has not been that operation-critical. The few things that were critical in my work experience would sometimes be the only code being unit tested.
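For anyone wondering what “could clearly benefit” looks like in practice, here is a minimal, hypothetical sketch in Python; the function and values are invented for illustration, but they show why money-handling logic is cheap to pin down with a test:

```python
# A tiny fintech-ish function: split an amount in cents across n
# parties without losing or inventing a cent to rounding.
def split_cents(total: int, parts: int) -> list[int]:
    base, remainder = divmod(total, parts)
    # The first `remainder` shares absorb the leftover cents.
    return [base + 1 if i < remainder else base for i in range(parts)]

# The unit test: every split must add back up to the original total.
def test_split_preserves_total():
    for total in (100, 101, 999):
        for parts in (1, 2, 3, 7):
            assert sum(split_cents(total, parts)) == total

test_split_preserves_total()
```

One short invariant like this catches the off-by-a-cent bugs that are miserable to find in production, which is the whole argument for testing the operation-critical stuff first.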




  • JGrffn@lemmy.world to Fediverse@lemmy.world · *Permanently Deleted* · 1 year ago

    To be fair, I’ve yet to have a job that actually pulls off unit testing. Most either don’t bother or just do the bare-minimum grunt work to force tests to pass. Most friends in my field have had pretty much the same experience. Unit tests can be just a chore with little to no real benefit. Maybe an open-source project that actually cares about its code can pull it off, but I wouldn’t bat an eye if they never get to it.



  • JGrffn@lemmy.world to linuxmemes@lemmy.world · :wq! · edited · 1 year ago

    Serious question: why? No, for real, why? Why are these hard-to-understand editors still the default on most distros and flavors? Why haven’t they reinvented themselves with easier-to-understand shortcuts?

    I get the feeling my comment will attract heat, but I’m a web dev, studied comp sci for years, have worked for nearly a decade, and have spent over half my 30-year-old life using computers of all sorts. I’m by no means a genius, and I by no means know enough about this or most tech subjects, but in a recent vim encounter the only reason I knew how to close it with and without saving changes was a meme I saw in this community a few days prior, and I had already forgotten the commands by the time I saw this post. Nothing about vim and its alternatives feels intuitive or easy to use. You may say it’s a matter of sitting down and learning, and you can argue that, but you can’t argue this isn’t a bit of a gatekeeper for people trying to dip their toes into anything that could eventually rely on opening vim to do something.

    I won’t try to deny its place in computer history, or its usefulness to many, or even that some prefer it, but when every other piece of software with keyboard shortcuts agrees on certain easy-to-remember standards, I don’t quite understand how software that goes against all of that hasn’t been replaced or hasn’t reinvented itself in newer versions.

    Then again, I have no idea what the differences between vi, vim, emacs, and nano are, so roast away!
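For anyone who got here the same way, the commands in question (standard vi/vim, and they haven’t changed in decades; the catch is that vim is modal, so you type text in insert mode and give commands in normal mode):

```
Esc         leave insert mode, back to normal mode
i           enter insert mode (so you can actually type)
:w          write (save) the file
:q          quit (refuses if there are unsaved changes)
:q!         quit, discarding changes
:wq or ZZ   write and quit (:wq! forces the write, per the post title)
```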



  • The article makes no mention of the molecules only working on cancer cells. The molecules, according to the article, attach to cell membranes, and then they’re jiggled to blow the cells apart. Nothing in that process suggests an ability to differentiate between cancer and non-cancer cells. The technique was tried on a culture growth, where a hammer would have had the same result. It was also tried on mice, where half were left cancer-free, but little is said about the process, the specifics of the results, or what happened to the other half of the mice.

    We all get the goal of cancer research; OP is just doubtful that this achieves it, as am I, as is anyone who’s read “good news” about eradicating cancer over the past few decades. Most are duds or go nowhere even when initially promising, so…


  • The problem is that the content being uploaded to these platforms serves a real and meaningful public, community purpose. Reddit has always been a knowledge base for a plethora of subjects; YouTube has all sorts of content of historical importance to the internet, as well as a trove of educational content unparalleled in size and quality.

    I take issue with that, because it’s not the company’s content; it’s just their platform. The content is vastly more important than the platform, but the companies act as if it’s theirs. Everything they do rides on what the community has built on their platforms; that’s their true essence and what actually attracts people.

    In the specific case of YouTube, I’d say that content is irreplaceable and indispensable. While it’s true that it is a privately owned platform and we don’t have much of a say in its direction, I truly believe the content is so important that the only viable path forward to prevent its loss is to take the platform out of private hands. I don’t believe it’ll ever happen, but it is what should happen, as it’s practically impossible to back up YouTube, just as it’s currently impossible to compete with YouTube.


  • I don’t trust shipping internationally.

    It’s the only way I can get anything, and it can and will go badly, but it is what it is. I’m currently dealing with returning the wrong version of a Pixel 8 Pro via Amazon, all the way from Honduras. Amazon’s outsourced CS doesn’t help one bit; those guys don’t read for shit. We have one advantage, though: there’s some legal grey-area thing going on in Honduras, and we can import and pay simply based on weight or volume, like $0.80 per pound if shipped by sea. No import fees, even though we should be paying them (and last I checked, they’re high). Gotta love third-world countries, amirite?

    SSDs are only 2 times more expensive […] and that makes it worth it for me given all the advantages they offer.

    Speaking as someone from Honduras, 2-3x the price for the same capacity, especially in a NAS, doesn’t cut it for me. HDDs are reliable and cheap enough that, if you have the physical space, they make the most sense; if you need extra speed, you throw a couple of SSDs in RAID 1 for caching. Maybe if you’re building a smaller NAS, and especially if you’re going to do things like video editing off it, SSDs make sense. For my needs, which are mostly data hoarding, photo editing, and content serving through Plex or Jellyfin, I want the most space and can accept gigabit speeds (although an SSD cache would alleviate speed constraints if I wanted more than gigabit).

    Of course, if you’re not aiming to build or maintain a NAS, absolutely don’t go for an HDD in 2023. Though that’s probably the same advice I’d have given anyone who asked at any point in the past 5 to 8 years.


  • I don’t understand why more people don’t do this: go to pcpartpicker.com, start a build, go to storage, and sort by price per GB, and you’ll get all the info you need. I’ve purchased Seagate Exos X20 20TB drives for under $350 this year. I buy off Amazon US and ship to my country, Honduras. I believe eBay has them at $319 or so.

    For reference, that’s around $0.016/GB, with some smaller drives going as low as $0.011/GB (you can get a 6TB Seagate enterprise drive for $64), whereas the cheapest SSD will still cost you two to three times as much: the cheapest on PCPartPicker is $0.037/GB, a 2TB SSD for $75 (Crucial P3 Plus). Amazing value for an SSD, but it still has a hard time competing with HDDs.
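The arithmetic behind those figures, for anyone who wants to plug in their own prices (the prices are the ones quoted above and will drift):

```python
# Price per GB for the drives quoted above, using the 1 TB = 1000 GB
# convention that drive makers and PCPartPicker use.
drives = {
    "Seagate Exos X20 20TB (eBay)": (319, 20_000),
    "Seagate enterprise 6TB":       (64,   6_000),
    "Crucial P3 Plus 2TB SSD":      (75,   2_000),
}

for name, (usd, gb) in drives.items():
    print(f"{name}: ${usd / gb:.3f}/GB")
```

Running it reproduces the roughly $0.016, $0.011, and $0.037 per GB cited above.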