Posts: 5 · Comments: 82 · Joined: 2 yr. ago
  • Removed for being either chatbot slop or a pointless imitation of same.

  • However speaking as someone with success on informatics olympiads

    The rare nerd who can shove themselves into a locker in O(log n) time

  • I once saw the stage adaptation of A Clockwork Orange, and the scientist who conditioned Alexander against sex and violence said almost the same thing when they discovered that he'd also conditioned him against music.

  • Highlights from the comments: @wjpmitchell3 writes,

    Actual psychology researcher: the problem with IQ is A) We don't really know what it's measuring, B) We don't really know how it's useful, C) We don't really know how context-specific it is, D) When people make arguments about IQ, it's often couched around prejudiced ulterior motives. No one actually cares about IQ; they care about what it's a proxy measure of, and we don't have good evidence yet to say "This is a reliable and broadly-encompassing representation of intelligence" or whatever else, so if you are trying to use IQ differences to say that there are race differences in intelligence, you have no grounds. The best you can say is there are race differences in this proxy measure that we're still trying to understand. It's dangerous to use an unreliable and possibly inaccurate representation of a phenomenon to make policy changes or inform decisions around race. The evidence threshold has to be extremely high because we're entering sensitive ethical spaces, which is something that rationalists don't do well in because their utilitarian calculus has difficulty capturing the intangibles.

    @arnoldkotlyarevsky383 says,

    Nothing wrong with being self educated but she comes across as being not as far along as you would want someone to be in their self-education before being given a platform.

    @User123456767 observes,

    You can kind of tell she grew up as a Calvinist because she still seems to think she's part of the elect; she's just replaced an actual big G God with some sort of AI God.

    @jaredsarnie3712 begins,

    I feel like so much of what she says boils down to finding bizarre hypothetical situations where child sexual abuse is morally acceptable.

    And from @Fruuuuuuuuuck:

    Doomscroll gooner arc

  • "DS" in the Retraction Watch comments makes a good observation:

    What scientific book only has 46 references?

    A question for future work: This book is part of a "Transactions on Computer Systems and Networks" series. How many of the others in that series are also slop?

  • Oh, and looking back at the comments on titotal’s post… his detailed elaboration of some pretty egregious errors in AI 2027 didn’t really change anyone’s mind; at most it pushed their timelines back a year, to 2028.

    Huh, what's this I have open in another browser tab:

    The Great Disappointment in the Millerite movement was the reaction that followed Baptist preacher William Miller's proclamation that Jesus Christ would return to the Earth by 1844, which he called the Second Advent. His study of the Daniel 8 prophecy during the Second Great Awakening led him to conclude that Daniel's "cleansing of the sanctuary" was cleansing the world from sin when Christ would come, and he and many others prepared. When Jesus did not appear by October 22, 1844, Miller and his followers were disappointed.

  • It's a bird! It's a plane! It's... Evangelion Unit 1 with a Superman logo and a Diabolik mask.

  • Thomas Claburn writes in The Register:

    IT consultancy Gartner predicts that more than 40 percent of agentic AI projects will be cancelled by the end of 2027 due to rising costs, unclear business value, or insufficient risk controls.

    That implies something like 60 percent of agentic AI projects would be retained, which is actually remarkable given that the rate of successful task completion for AI agents, as measured by researchers at Carnegie Mellon University (CMU) and at Salesforce, is only about 30 to 35 percent for multi-step tasks.
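
    A back-of-envelope restatement of the arithmetic in those quoted figures, for whatever it's worth (the 40 percent, 60 percent, and 30–35 percent numbers come from the article; the comparison itself is just taking a complement):

    ```python
    # Illustrative arithmetic only, using the percentages quoted above.
    cancelled = 0.40                  # Gartner: agentic AI projects cancelled by end of 2027
    retained = 1.0 - cancelled        # complement: roughly the share of projects kept
    task_success_range = (0.30, 0.35) # CMU / Salesforce multi-step task completion rates

    print(f"projects retained: ~{retained:.0%}")
    print(f"multi-step task success: {task_success_range[0]:.0%}-{task_success_range[1]:.0%}")
    ```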

  • It's like when Scott Aaronson got me to sympathize with a cop. A sneersmas miracle.

  • I poked around the search results being pointed to, saw a Ray Kurzweil book and realized that none of these people are worth taking seriously. My condolences to anyone who tries to explain the problems with the "improved" sources on offer.

  • Rather than trying to participate in the "article for deletion" dispute with the most pedantic nerds on Earth (complimentary) and the most pedantic nerds on Earth (derogatory), I will content myself with pointing and laughing at the citation to Scientific Reports, aka "we have Nature at home"

  • Wow, this is shit: https://en.wikipedia.org/wiki/Inner_alignment

    Edit: I have been informed that the correct statement in line with Wikipedia's policies is WP:WOWTHISISSHIT

  • You know, just this once, I am willing to see the "Dead Dove: Do Not Eat" label and be content to leave the bag closed.

  • Or was it a consequence of the fact that capital-R Rationalists just don't shut up?

  • I suppose you could explain that on the talk page, so long as you expressed it in acronyms for the benefit of the most pedantic nerds on the planet.

  • There might be enough point-and-laugh material to merit a post (also this came in at the tail end of the week's Stubsack).

  • TechTakes @awful.systems
    blakestacey @awful.systems

    Credulous coverage of AI slop on Wikipedia

    Everybody loves Wikipedia, the surprisingly serious encyclopedia and the last gasp of Old Internet idealism!

    (90 seconds later)

    We regret to inform you that people write credulous shit about "AI" on Wikipedia as if that is morally OK.

    Both of these are somewhat less bad than they were when I first noticed them, but they're still pretty bad. I am puzzled at how the latter even exists. I had thought that there were rules against just making a whole page about a neologism, but either I'm wrong about that or the "rules" aren't enforced very strongly.

    TechTakes @awful.systems
    blakestacey @awful.systems

    Elsevier: Proudly charging you money so its AI can make your articles worse

    Retraction Watch reports:

    All but one member of the editorial board of the Journal of Human Evolution (JHE), an Elsevier title, have resigned, saying the “sustained actions of Elsevier are fundamentally incompatible with the ethos of the journal and preclude maintaining the quality and integrity fundamental to JHE’s success.”

    The resignation statement reads in part,

    In fall of 2023, for example, without consulting or informing the editors, Elsevier initiated the use of AI during production, creating article proofs devoid of capitalization of all proper nouns (e.g., formally recognized epochs, site names, countries, cities, genera, etc.) as well as italics for genera and species. These AI changes reversed the accepted versions of papers that had already been properly formatted by the handling editors.

    (Via [Pharyngula](https://freethoughtblogs.com/pharyngula/2024/12/

    TechTakes @awful.systems
    blakestacey @awful.systems

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 6 October 2024

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

    Like, there was one dude a while back who insist

    TechTakes @awful.systems
    blakestacey @awful.systems

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

    Like, there was one dude a while back who insist

    TechTakes @awful.systems
    blakestacey @awful.systems

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 7 July 2024

    Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh facts of Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

    Like, there was one dude a while back who insi