Makes it easy to dismiss my argument without bothering to think about it, you mean. Just take abortion, then. Or “tax is theft”, or right to bear arms, or any of a thousand other beliefs you probably don’t agree with.
So like, if you were in a restaurant and ordered food, but it never came because a couple of the servers were blocking food from being served because the company wasn’t taking a strong stance against abortion, you’d think “these good people are taking a moral stand, good for them! The company better not take any action against them to make sure I get my food!”
Or for that matter, if Google stopped all cooperation with the IDF, the company’s Jewish employees could (in fact should) disrupt business because Google was supporting terrorism?
It seems to me that you can only support forms of protest you’d be willing to accept when the other side uses them against you. Basically the golden rule.
A while back, one of the image generation AIs (Midjourney?) caught flak because the majority of the images it generated only contained white people. Like…over 90% of all images. And worse, if you asked for a “pretty girl” it generated uniformly white girls, but if you asked for an “ugly girl” you got a more racially-diverse sample. Wince.
But their reaction was to literally tack “…but diverse!” onto the end of prompts, or something like it: they just inserted extra text into the prompt itself. That solved the immediate problem, and the resulting images were definitely more diverse…but it led straight to the sort of problems Google is running into now.
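The hack described above can be sketched in a few lines. To be clear, this is a hypothetical illustration of the approach, not any vendor’s actual pipeline; the function name, keyword list, and suffix are all made up:

```python
# Naive "diversity" patch: blindly append qualifiers to the user's prompt
# before it reaches the image model. Hypothetical sketch only.

DIVERSITY_SUFFIX = ", diverse group of people, varied ethnicities"

def patch_prompt(prompt: str) -> str:
    """Append diversity qualifiers whenever the prompt mentions people."""
    people_words = ("person", "people", "girl", "boy", "man", "woman")
    if any(word in prompt.lower() for word in people_words):
        return prompt + DIVERSITY_SUFFIX
    return prompt

print(patch_prompt("a pretty girl"))
print(patch_prompt("a mountain landscape"))
```

The flaw is obvious once it’s written out: the suffix fires regardless of whether the user already specified demographics, or whether the subject is historically constrained, which is exactly the failure mode Google got burned by.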
I’ve installed Linux on at least 20 laptops & desktops in the past decade, many for first-time users. I generally go with Mint or ElementaryOS for newbies. I can’t remember ever having a compatibility issue. I’m sure they still happen sometimes, but when people talk about it they act like it’s still 2005.
They’ve never released proper open-source drivers for Linux, or helped external developers make any, or made it easy to use their closed driver with Linux. They’re just hostile to open source, basically. That used to be pretty common in the old days, but most companies have given up and joined in, which is why installing Linux is usually a smooth experience these days.
If you’re using Linux: get an AMD card. They just work out of the box, no failures to boot to GUI or anything. It just works…like everything else. Which, having spent 20 years fighting with graphics drivers on Linux, is sheer bliss to me.
Oh, but the de facto standard for anything AI-related is NVidia. So if you ever wanna mess with LLMs, object detection, speech recognition, etc…you’re likely stuck with NVidia, and the old routine: Got a problem? Of course you do. Try reinstalling the drivers three times, then uninstall some random other packages, then burn some incense, say 10 Hail Marys, and make an offering to the GPU gods before restarting the computer. Didn’t work? Well, repeat all those steps in a different order. Fifth time’s the charm!
That was true in 2000. The situation had improved a lot by, like, 2005, but it was still pretty rough. You were still likely to have to drop to a console at some point even in 2010.
These days there are 20 distributions that are easier to install, use, and maintain than Windows, and you don’t even have to know `ls` to use them.
You can label your filesystems. When formatting, do `mkfs.ext4 -L my-descriptive-name /dev/whatever` (note the capital `-L`; lowercase `-l` means something else entirely). Then refer to the device exclusively as `/dev/disk/by-label/my-descriptive-name`. It’s much harder to mix up `home` and `swap` than `sdc2` and `sdc3` (or, for that matter, two UUIDs).
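As a concrete sketch (the device names and labels here are made up; point the commands at your own partitions, and mind that `mkfs` destroys whatever is on them):

```shell
# Create filesystems with labels. Capital -L sets the label;
# lowercase -l tells mkfs.ext4 to read a bad-blocks list instead.
mkfs.ext4 -L home /dev/sdc2
mkswap    -L swap /dev/sdc3

# udev exposes a stable symlink for each labeled filesystem:
ls -l /dev/disk/by-label/

# So /etc/fstab can use labels instead of device names or UUIDs:
#   LABEL=home   /home   ext4   defaults   0 2
#   LABEL=swap   none    swap   sw         0 0
```

For an existing ext4 filesystem you don’t have to reformat: `e2label /dev/whatever my-descriptive-name` sets the label in place.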
I mean it seems outrageously greedy, but stop and think about it: if they’d paid for a pizza party, the banner would’ve had to read “Thanks for driving sales and beating plan by $5,999,727!!” And that’s just ugly.
They’ve given me too many headaches…
I.e. you did use them, but learned the hard way why you shouldn’t.
Very likely OP is a student, or entry-level programmer, and is avoiding them because they were told to, and just haven’t done enough refactoring & debugging or worked on large enough code bases to ‘get’ it yet.
Or mutable constants…
A big part of it is that people are so unbelievably cynical now. They’ll rush over one another to point out and then circlejerk over the most negative aspects of every new development, while ignoring every positive.
The old internet would have flipped out over ChatGPT, much less Midjourney, and generated thousands of hilarious stories and images and websites that made ridiculous random comic books or fake government websites for absurd departments or whatever. They would have been delighted with it…and as an afterthought it may have occurred to them that there might be downsides.
Today, people get furious about the fact that AI exists, that it was trained on existing material, that it might affect people’s lives. Long articles are written on the terrible effects AI is going to have on politics or media. Post an AI-generated image in anything other than an AI-art forum, and you’ll be absolutely lambasted. Suggest that there may just be a few updates and watch the downvotes and angry replies flood in.
Part of that is just experience. We’ve lived through a few ‘revolutions’ for which the net effect was…arguably not so great. Part of it is that the age of the average Internet-savvy user is like 35-40 now, not 22, so they’re bringing a level of fear and skepticism that wasn’t there before.
And partly there just seems to be a sort of social malaise and negativity that wasn’t there before. People in 2005 were happy and excited for the future. Now everybody just seems fearful, angry, and burned out.
“No, wait, it’s not what you think! There’s a continuous integration system, a commit would’ve triggered a new build! It might have paged the oncall! Babe! The test suite has been flaky lately!”
So fewer downloads, which is a very different thing.
For real? You’re really suggesting that anybody who says they’re for free speech is actually a fascist pedophile?
First, why is every post on this forum -1? Somebody must be holding a grudge.
Second: it doesn’t matter. ECC just detects and corrects bit flips in RAM; once data leaves a system, it’s irrelevant whether it had ECC or not.
I’ve been running servers of various kinds for decades. There is a difference between running servers on hardware with ECC vs none, but it’s not a big deal. Unless you’re running, like, banking software or something where accuracy or uptime is critical…I wouldn’t sweat it. You may just have to reboot cuz of a kernel panic once or twice a year.
Dude I was at this concert, but there was another guy there who was on his phone doing something weird…the whole concert was ruined!
IOW you’ve turned it into a thought-terminating cliche.
I mean we don’t really know how hard they did try. Maybe they finished both those lines and finally said fuck it.
Nice, you avoided having to think on a self-imposed technicality. Real intellectual rigor there.