• 3 Posts
  • 52 Comments
Joined 6 months ago
Cake day: August 27th, 2025

  • I’m not sure they still teach the FANBOYS system - at least not as I learned it: a “use this, not that” prescription for tightening sentence structure.

    A quick DuckDuckGo search suggests they are now, and perhaps always have been, used in conjunction with commas. Which, frankly, makes my skin crawl.

    “She was tired, and she needed to eat.”

    “It was the best of times, and it was the worst of times.”

    Evil. Great Evil.

    Perhaps I’m caviling against flabby sentences rather than flawed punctuation, but I maintain that the construction reliably signals the former.

  • That’s the thing. It’s not that the LLMs can’t solve the problem…it’s the way they’re optimized.

    To give a crude analogy: if most LLMs are set up for the equivalent of typing BOOBS on a calculator (the big players are happy to keep it that way: more engagement, smoother vibes, etc.), a constraints-first approach is what happens when you use a calculator to do actual maths.

    2+2=4 (always, unless shrooms are in play).

    I said this before, so pardon me for being gauche and quoting myself:

    Every reasoning system needs premises - you, me, a 4-year-old. You cannot deduce conclusions from nothing. Demanding that a reasoner perform without premises (note: constraints) isn’t a test of reasoning; it’s a demand for magic. Premise-dependence isn’t a bug; it’s the definition.
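    To illustrate premise-dependence with a toy (the function and facts below are my own invented example, not anything pulled from an actual LLM): a forward-chaining deducer produces nothing from an empty premise set, and conclusions only appear once premises go in.

```python
# Toy forward-chaining deducer: conclusions only ever come from premises plus rules.
def deduce(premises, rules):
    """Apply rules of the form (antecedent_set, consequent) until nothing new follows."""
    known = set(premises)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent in rules:
            # A rule fires only when all of its antecedents are already known.
            if antecedent <= known and consequent not in known:
                known.add(consequent)
                changed = True
    return known

# Hypothetical rules: rain makes the ground wet; wet ground is slippery.
rules = [(frozenset({"rain"}), "wet_ground"),
         (frozenset({"wet_ground"}), "slippery")]

print(deduce([], rules))        # no premises, no conclusions: set()
print(deduce(["rain"], rules))  # premises in, conclusions out
```

    Asking `deduce([], rules)` to produce "slippery" isn’t a test of the deducer; it’s a demand for magic.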

    People see things like Le-Chat fall over and go “Ha ha. Auto-complete go brrr”. That’s lazy framing. A calculator is “just” voltage differentials on silicon. That description is true and also tells you nothing useful about whether it’s doing arithmetic.

    My argument is this: the question of whether something is or isn’t reasoning IS NOT answered by describing what it runs on; it’s answered by looking at whether it exhibits the structural properties of reasoning. I think LLMs can do that…they’re just borked (…intentionally?). Case in point - see my top post.

    I literally “Tony Stanked” my way to it. Now imagine if someone with resources and a budget did it.

  • Still…1 in 3. Woof.

    A “charitable” read might be:

    • Misunderstood the question
    • Assumed priors (e.g., you’re the King of Londinium and people come to wash your car from the nearby gas station?)
    • Schizoid embolism
    • Trolollolo

    At the same time, I think it’s fair that, if we’re willing to do that for people, we extend a soupçon of it to the clankers. At least a bit. Like I said, I think there’s some interesting stuff going on under the hood.

    Having been accused of being a clanker myself (as recently as yesterday), I’m aware that having anything positive to say about AI (even a bespoke, free-range, home-cooked LLM) is “stunning and brave”. But hey, sometimes you just have to tilt at windmills.