

That kind of data sanitization is just standard practice. You need some level of confidence on your data’s accuracy, and for anything normally distributed, throwing out obvious outliers is a safe assumption.
That’s quite a strong table, holding 11 people.
Maybe my taste buds are just broken, but for me, candy has always been either very sour for a very short time, or slightly sour all the way through. I’ve never had anything be very sour all the way through.
Love Platinum, they’re the only fountain pens I can even consider using, as they’re the only ones I found that can handle my low writing volume without drying out between uses.
Eh, I have audio interfaces and MIDI controllers on 10ft cables because shorter ones just don’t reach my PC, and they work perfectly fine. Longer than that is a gamble, but as far as I know 10ft (~3m) is about the practical limit for USB 3.0, so you should be totally fine unless you have especially shitty cables.
A “server” is just a remote computer “serving” you stuff, after all. That said, if you have stuff you’d have trouble setting up again from scratch, I’d recommend looking into making at least those parts of your setup repeatable, be it something fancy à la Ansible, or even just a couple of bash scripts that install the correct packages and back up your configs.
Once you’re in this mindset and take this approach by default, changing machines becomes a lot less daunting in general. A new personal machine takes me about an hour to set up, preparing the USB included.
If it’s stuff you don’t care about losing, ignore everything I just said. But if you do care about it, I’d start slowly, going from the most to the least critical parts. There’s no better time to do it than when things are working well haha!
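To make that concrete, here’s a minimal sketch of the “couple of bash scripts” route. The package names, the dotfiles layout, and the file list are all placeholders for illustration, not anyone’s actual setup:

```shell
#!/bin/sh
# Minimal sketch of a repeatable-setup script. Adapt the package list
# and config file names to your own machine.
set -eu

# Where your dotfiles checkout lives (placeholder path).
DOTFILES="${DOTFILES:-$HOME/dotfiles}"

# 1) Install the packages you actually depend on.
#    Distro-specific, so commented out here:
# sudo apt-get install -y git vim tmux

# 2) Restore configs from the dotfiles checkout, skipping any that
#    don't exist in the repo.
for f in .vimrc .tmux.conf .gitconfig; do
  if [ -f "$DOTFILES/$f" ]; then
    cp "$DOTFILES/$f" "$HOME/$f"
  fi
done
```

Run it once on a fresh machine after cloning the dotfiles repo; because it only copies files that exist, it’s safe to re-run whenever the configs change.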
58% goes to fundraising, administrative, and technological costs. Of the rest, some goes towards, but isn’t limited to, other programs.
Only thing I can find in their financials that would maybe qualify as “random outreach” would be “awards and grants”, at 26mil last year out of 185mil revenue, or 14%.
https://meta.m.wikimedia.org/wiki/Grants:Programs/Wikimedia_Community_Fund
As far as I can tell, it’s not particularly random.
Maybe I’m missing something?
Tramp is more full-featured, but if all one cares about is editing remote files with a local editor, vim can edit remote files over scp too: scp://user@server[:port]//remote/file.txt
I tried tramp-mode at some point, but I seem to remember some gotchas with LSP and pretty bleh latency, which didn’t make it all that useful to me… But I admittedly didn’t spend much time in emacs land.
Eh, I’m about the same age as OP; I don’t have to get to 50 to know that I’d take my parents’ economic context over the two crashes. The rest… for many reasons, even if medicine does some miraculous leap forward by then, maybe I’ll still wish I had a lot more left to go.
Really bigger updates obviously require a major version bump to signify to users that there is potential stability or breakage issues expected.
If your software follows semver, not necessarily. A major version bump is only required when a change breaks backwards compatibility. You can have very big minor releases and tiny major releases.
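As a toy illustration of that rule (my own sketch, not text from the semver spec): under MAJOR.MINOR.PATCH, it’s the *kind* of change, not its size, that decides which component moves.

```python
# Toy sketch of the semver bump rule: a breaking change forces a major
# bump, backward-compatible features only bump the minor, and fixes
# only bump the patch -- regardless of how large the release is.
def next_version(version, breaking=False, new_features=False):
    major, minor, patch = version
    if breaking:
        return (major + 1, 0, 0)      # any compat break => major bump
    if new_features:
        return (major, minor + 1, 0)  # big but compatible => minor bump
    return (major, minor, patch + 1)  # fixes only => patch bump

# A huge but backward-compatible release is still only a minor bump:
assert next_version((1, 4, 2), new_features=True) == (1, 5, 0)
# A one-line breaking change still forces a major bump:
assert next_version((1, 5, 0), breaking=True) == (2, 0, 0)
```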
there was more time for people to run pre-release versions if they are adventurous and thus there is better testing
Again, from experience, this assumes a lot.
From experience shipping releases, “bigger updates” and “more tested” are more or less antithetical. The testing surface tends to grow exponentially with the number of features in a given release, to the point that I see small, regular releases as a better sign of stability.
I’d love to share your optimism, especially regarding that last sentence. As long as Google controls the most popular web browser out there, I don’t see the arms race ever stopping, they’ll just come up with something else. It wouldn’t be the first time they push towards something nobody asked for that only benefits themselves.
Your first hint that this is a naive take is that you’re pinning a societal issue on a single, external factor.
I do connect to VMs and containers all the time, I just don’t see a reason not to speed myself up on my own machines because of it. To me, the downside of occasionally typing an alias on a machine that doesn’t have it is much smaller than having to type everything out, or search my shell history for longer commands, every single time. My shell configs live in a dotfiles repo I can clone to new personal/work machines easily, and I have an alias to rsync some key parts to VMs if needed. On containers, I just assume I have access to nothing but builtins. I guess if you don’t do the majority of your work in a local shell, it may indeed not be worth it.
I’d rather optimize for the 99% case, which is me getting shit done on my machine, than refuse convenient stuff for the sake of maybe not forgetting a command I can perfectly well look up if I do happen to forget it. If I’m on a remote, I already don’t have access to all my usual software anyway, so what’s a couple more aliases? To me this sounds like deliberately slowing yourself down by cutting paper with a knife all the time because you may not have access to scissors when you sit at someone else’s desk.
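For flavor, here’s a sketch of what that setup can look like. The alias names, the file list, and the VM host are invented for illustration, not the commenter’s actual dotfiles:

```shell
# Everyday conveniences, defined in the dotfiles repo's alias file.
alias ll='ls -lah'
alias gs='git status'

# Push a minimal subset of the shell setup to a remote VM when needed
# (host and file list are placeholders):
alias shellsync='rsync -av ~/.bashrc ~/.bash_aliases user@vm:~/'
```

On a machine without the aliases, the full commands still work as usual; the aliases are purely additive.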
Once the monopoly is in place, what’s protecting said “reasonable cost for the consumer”, exactly?
Music (and other art forms) happen to trigger our brains to shoot the same happy/sad/etc chemicals other less abstract physical experiences do, for reasons we don’t completely understand. I’m utterly confused why being aware of them, or having the curiosity of wanting to learn more about it, is “what’s going wrong with society”. If anything, curiosity is one of the main things that kickstarted us as a species, and brushing it off to some abstract “deeper layers of human existence” like it was some sorcery we shouldn’t dare try to understand would be way more concerning about our state as a society. As for the completeness of this particular theory… I mean, we are on /c/showerthoughts after all.
Jazz has patterns and repetition, like any interesting music genre. If it didn’t, it’d be called noise. They just aren’t as in your face and predictable as the ones employed by pop genres.
We do get what you mean (an extremely condescending and reductive take, if you ask me). I was thinking rigidly along the lines of data engineering because this is, well, a data engineering problem… There just aren’t 30% of people doing this on Google captchas, and that isn’t a “take”, just a reality of the scale and number of people interacting with Google products. Have fun all you want; if you do this, your data most likely gets thrown out, that’s all.
We’re still talking about image recognition, aren’t we? This feels like general commentary on how Big Tech sees its customer base, which I don’t disagree with, but to my mind that’s another discussion entirely…