
I think you're thinking of SerenityOS (although it isn't actually a Linux).

Beehaw's been holding back because they want to switch to Sublinks, and going past 0.18.4 would've made migration harder. But Sublinks has been slow to become ready, and it's getting untenable to wait much longer, so they'll be upgrading to the latest Lemmy sometime. I'll be happy to finally switch to the actual app version of Voyager when they do; for now I've been self-hosting the last version of the web app that worked.

Yeah, I only know it from Hitchhiker's Guide, where Oolon Colluphid follows up his famous proof of the non-existence of God with a proof that black is white and gets run down at a zebra crossing. I was a kid when I first read it so I took it literally, haha. It does kinda fit with the other absurd bits of that series.

It's the same rule, "fair use". Copyright isn't absolute; it needs to strike a balance between "give creators control of their thing" and "people deserve to participate in our collective culture."
Making a one-off drawing of a character and not trying to make money off of it likely checks the fair use boxes (it's an explicitly fuzzy system, so a trial would be needed to say for sure if it's fair use or not). Whether the training set for a generative AI system is fair use or not is still an open question, but many feel that it can't be, as it's operating on a massive scale (basically every image ever created by humanity) and has the potential to eliminate the entire industry of humans selling the art they create, which copyright is supposed to protect. Ghibli isn't going to be harmed by someone drawing a picture of their characters for a meme. It could be harmed by another company making money off of mass production of knockoffs of their style which were created with thousands of unauthorized copies of their direct artwork.

Even if the right move was "give up and do what the Republicans want," they still did a terrible job. House Democrats held the line and stuck their necks out, only to get blindsided, and Schumer shouldn't have signaled that there'd be a fight right before he caved. The left hand doesn't know what the right hand is doing, and only a handful of people in the party seem to even be trying to do anything.

Yeah, they're probably talking about nulls. In Java, object references (simplified pointers, really) can be `null`, pointing nowhere and throwing an exception if you try to access them. That's fine when you genuinely don't have a value for that reference (for example, you asked for a thing that doesn't exist, or you haven't made the thing yet), but it means that every time you interact with an object, if it turns out to have been null, a `NullPointerException` gets thrown and likely crashes your program. You can check first if you think a value might be null, but if you miss one, it explodes.
Kotlin has nulls too, but the type system helps track where they could be. If a variable can be null, it'll have a type like `String?`, and if not, the type is `String`. With that distinction, a function can explicitly say "I need a non-null value here," and if your value could be null, the type system will make you check first before you can use it.
Kotlin also has some nice quality-of-life improvements over Java: it's less verbose (not a hard task), it doesn't force everything to belong to a class, and it supports data classes, which make simple immutable values easy to define and behave more like primitive values than objects, among other things.
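To make that concrete, here's a tiny sketch (the names `greet` and `User` are just made up for illustration) showing how Kotlin's compiler forces you to deal with a `String?` before handing it to something that needs a `String`:

```kotlin
// A function that requires a non-null String.
fun greet(name: String): String = "Hello, $name!"

// A data class: equals/hashCode/toString come for free,
// and with `val` the field is read-only. The `?` marks it nullable.
data class User(val name: String?)

fun main() {
    val user = User(null)

    // greet(user.name)  // won't compile: String? isn't String

    // Option 1: an explicit null check; the compiler "smart casts"
    // n to String inside the branch.
    val n: String? = user.name
    if (n != null) {
        println(greet(n))
    }

    // Option 2: the Elvis operator supplies a fallback value.
    println(greet(user.name ?: "stranger"))  // prints "Hello, stranger!"
}
```

The nice part is that the unsafe call is a compile error, not a runtime crash: the possibility of null is visible right in the type.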

They could organize protests, they could help workers unionize, they could stick their necks out and disrupt things, they could do anything besides stand by and say "oh no, this is so bad." They have a gigantic megaphone and the ears of almost half the country; their power isn't limited to the votes they have or don't have. I want them to be making plans that are bold, plans where they feel a need to account for "how do we make sure this doesn't turn into an outright riot though," the things you'd do if you actually believed the rhetoric about Trump being a threat to democracy.

Fellow former popes, you mean, by reading this message you hereby excommunicate yourself.

I want my tax dollars to be used for something useful, not buying special little numbers

I'm bitterly clinging to my iPhone 13 mini, because I suspect it's the last phone I'll ever actively enjoy. I went along with bigger phones when that became the trend and decided I didn't like them, and the mini line was such a relief to go back to. Once it's no longer tenable, I'll probably just buy a series of "the least bad used phone I can find" because I know I'll be mildly frustrated every time I use it.

I'm still using an iPhone mini, and I haven't experienced any bad layouts, broken websites, or any difficulty like that. It has the same resolution as the biggest iPhone I've ever had (the iPhone X), so things are smaller, which would make it a poor fit for someone with poor vision, but for me it's an absolutely perfect phone. It's frustrating to know that the perfect phone for me could easily exist, and yet Apple refuses to make it. I'll be stuck with phones I don't like for the rest of my life, it seems.

It's the last one; the "wait a day" option and the "pay $20" option aren't equivalent. If it's still a day away from viability, it isn't viable yet, but if it's $20 away, it is. You may be of the opinion that waiting a day isn't a big deal, or is only $20 worth of hardship, but that's not your choice to make for others.
You'd think ending a doomed pregnancy would be a simple matter even for pro-lifers, yes. They often don't consider the issue, or assume that it'll always be clear-cut and obvious in every circumstance, or worry that any exception will be used as a loophole.

I can't believe this word doesn't seem to have made it into any part of this thread, but I think you're looking for viability: the point where a fetus can live outside of the womb. This isn't a hard line, of course, and technology can and has changed where that line can be drawn. Before that point, the fetus is entirely dependent on one specific person's body, and after that point, there are other options for caring for it. That is typically where pro-choice folks will draw the line for abortion as well; before that point, an abortion ban is forced pregnancy and unacceptable, after that point there can be some negotiation and debate (though that late into a pregnancy, if an abortion is being discussed it's almost certainly a health crisis, not a change of heart, so imposing restrictions just means more complications for an already difficult and dangerous situation).

I empathize with that frustration. The process of thinking you're right, learning you're wrong, and figuring out why is fundamentally what coding is. You're taking an idea in one form (the thing you want to happen, in your mind) and encoding it into another, very different form: a series of instructions to be executed by a computer, and your first try is almost always slightly wrong. Humans aren't naturally well-adapted to this task because we're optimized for instructing other humans, who will usually do what they think you mean rather than what you actually said, can gloss over or correct small mistakes or inconsistencies, and will act in their own self-interest when it makes sense. A computer won't behave that way; it requires you to bend completely to how it works. It probably makes me a weirdo, but I actually like that process. It's a puzzle-solving game for me, even when it's frustrating.
I do think asking an AI for help with something is a useful way to use it, and it really isn't all that different from checking a forum (in fact, those forums are probably what it's drawing from in the first place). Hallucinations aren't too damaging there, because you'll be checking the AI's answer when you try what it says and see if it works. It's the blind acceptance of the code it produces that I think is harmful (and it sounds like you aren't doing that). In an IDE it's really easy to quickly make pages of code without engaging the brain, and it works well enough to be very tempting, but not, as I'm sure you know, well enough to do the whole thing.

Yeah, totally fair. I'll note that you're kind of describing the typical software development process of a customer talking to the developer and developing requirements collaboratively with them, then the developer coming back with a demo, the customer refining by going "oh, that won't work, it needs to do it this way" or "that reminds me, it also needs to do this", and so on. But you're closer to playing the role of the customer in this scenario, and acting like more of an editor or manager on the development side. The organizers of a game jam could make a reasonable argument that doing it this way is akin to signing up for the game jam, coming up with an idea, then having your friend who isn't signed up for the game jam implement it for you, when the point is to do it all in person, quickly, in a fun and energetic environment. The people doing a game jam like coding, that's the fun part for them, so someone signing up and skipping all that stuff does have a little bit of a "why are you even here then" aspect to it. Of course it depends on the degree the AI is being used, how much editorial control or tweaking you're doing, it's a legitimate debate and I don't think you're wrong to want to participate.

I'll acknowledge that there's definitely an element of "well I had to do it the hard way, you should too" at work with some people, and I don't want to make that argument. Code is also not nearly as bad as something like image generation, where it's literally just typing a thing and getting a not-very-good image back that's ready to go; I'm sure if you're making playable games, you're putting in more work than that because it's just not possible to type some words and get a game out of it. You'll have to use your brain to get it right. And if you're happy with the results you get and the work you're doing, I'm definitely not going to tell you you're doing it wrong.
(If you're trying to make a career of software engineering or have a desire to understand it at a deeper level, I'd argue that relying heavily on AI might be more of a hindrance to those goals than you know, but if those aren't your goals, who cares? Have fun with it.)
What I'm talking about is a bigger picture thing than you and your games; it's the industry as a whole. Much like algorithmic timelines have had the effect of turning the internet from something you actively explored into something you passively let wash over you, I'm worried that AI is creating a "do the thinking for me" button that's going to be too tempting for people to use responsibly, and will result in too much code becoming a bunch of half-baked AI slop cobbled together by people who don't understand what they're really doing. There's already enough cargo culting around software, and AI will just make it more opaque and mysterious if overused and over-relied on. But that's a bigger picture thing; just like I'm not above laying back and letting TikTok wash over me sometimes, I'm glad you're doing things you like with the assistance you get. I just don't want that to become the only way things happen either.

> The irony is that most programmers were just googling and getting answers from stackoverflow, now they don't even need to Google.
That's the thing, though, doing that still requires you to read the answer, understand it, and apply it to the thing you're doing, because the answer probably isn't tailored to your exact task. Doing this work is how you develop an understanding of what's going on in your language, your libraries, and your own code. An experienced developer has built up those mental muscles, and can probably get away with letting an AI do the tedious stuff, but more novice developers will be depriving themselves of learning what they're actually doing if they let the AI handle the easy things, and they'll be helpless to figure out the things that the AI can't do.
Going from assembly to C does put the programmer at some distance from the reality of the computer, and I'd argue that if you haven't at least dipped into some assembly and understood the basics of what's actually going on down there, your computer science education is incomplete. But once you have that understanding, it's okay to let the computer handle the tedium for you and only dip down to that level if necessary. Same thing with learning sorting algorithms versus just calling your standard library's `sort()` function. AI falls into that category too, I'd argue, but it's so attractive that I worry it's treating important learning as tedium and helping people skip it.
I'm all for making programming simpler, for lowering barriers and increasing accessibility, but there's a risk there too. Obviously wheelchairs are good things, but using one simply because it's easier, and not because you need to, will cause your legs to atrophy or never develop strength in the first place, and I'm worried there's a similar thing going on with AI in programming. "I don't want to have to think about this" isn't a healthy attitude to have; a program is basically a collection of crystallized thoughts and ideas, and thinking it through is a critical part of the process.

The entire country shifted red. They would've had to implement this system in all 50 states, even ones that didn't matter, and across 50 different voting systems, many of which are entirely paper-based, and not leave a single scrap of evidence. Individual ballots are secret, but lots of other records are not, including who did and did not vote in each precinct, and how many ballots were cast for each candidate, so if they were just injecting lots of fake ballots, the numbers wouldn't add up. The simple fact is, the 2020 election wasn't stolen, and neither was the 2024 one.

You know, they have pre-wrapped sausages, but they don't have pre-wrapped bacon.

I see this as an accessibility problem, computers have incredible power but taking advantage of it requires a very specific way of thinking and the drive to push through adversity (the computer constantly and correctly telling you "you're doing it wrong") that a lot of people can't or don't want to do. I don't think they're wrong or lazy to feel that way, and it's a barrier to entry just like a set of stairs is to a wheelchair user.
The question is what to do about it, and there's so much we as an industry should be doing before we even start to think about getting "normies" writing code or automating their phones. Using a computer sucks ass in so many ways for regular people: you buy something cheap and it's slow as hell, it's crapped up with adware and spyware out of the box, scammers are everywhere ready to cheat you out of your money... anyone here is likely immune to all that or knows how to navigate it, but most people are just muddling by.
If we got past all that, I think it'd be a question of meeting users where they are. I have a car, but I couldn't replace the brakes, nor do I want to learn to, and that's okay. My car is as accessible as I want it to be, and for the parts that aren't accessible, I go another route (bring it to a mechanic who can do the things I can't). We can do this with computers too: make things easy for regular people, but don't try to make them all master programmers or tell them they aren't "really" using a computer unless they're coding. Bring the barrier down as low as it can go, but don't expect everyone to be trying to jump over it all the time, because they likely care about other things more.