• MalReynolds@slrpnk.net
    2 days ago

Cheap fuckers cheaping out, shocker (context is (V)RAM). AI speedrunning enshittification, who’d’ve thunk.

    • pixxelkick@lemmy.world
      2 days ago

      Uh… no, it’s just the free models being free. They’re intentionally lower cost to provide free options for people who don’t wanna pay subscription fees.

      “(context is (V)RAM)”

      Eh, sort of. It’s more about operating costs: the larger the context size, the more expensive the model is to run, literally in terms of power consumption.

      Keep in mind we are on the scale of fractions of cents here, but multiply that by millions of users and it adds up fast.

      But the end result is that the agent will fuck stuff up, and will even quickly /forget/ that it fucked up if you don’t catch it ASAP.

      A lot of them have a context window that can be wiped out within, like, two minutes of steady busywork…
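      The forgetting described above can be sketched as: a fixed token budget means only the most recent messages are fed back to the model, and everything older silently falls out. A minimal sketch, assuming a toy one-token-per-word tokenizer and a hypothetical 50-token budget (real models use subword tokenizers and budgets in the thousands):

      ```python
      def fit_context(messages, budget=50):
          """Keep only the most recent messages that fit in the token budget."""
          kept = []
          used = 0
          for msg in reversed(messages):      # walk newest-first
              cost = len(msg.split())         # toy token count: one per word
              if used + cost > budget:
                  break                       # everything older is dropped
              kept.append(msg)
              used += cost
          return list(reversed(kept))         # restore chronological order

      # 20 instructions of ~12 toy tokens each; only the last few fit.
      history = [f"instruction {i}: " + "word " * 10 for i in range(20)]
      window = fit_context(history)
      print(len(window), "of", len(history), "messages survive")
      ```

      The model never “sees” the dropped instructions again, which is exactly why it can confidently contradict something you told it a minute ago.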

      • Log in | Sign up@lemmy.world
        1 day ago

        I love how your response to the catastrophic results of stupidly trusting AI is “pay more money to AI companies”.

        Sane person’s response: don’t trust LLMs.

        • pixxelkick@lemmy.world
          1 day ago

          What are you talking about?

          No? I never said that.

          I just explained /why/ it happened. Literally nowhere in my post did I say, or imply, that someone should pay for more expensive models. What are you smoking?

          You just have to be aware that when you’re using a cheap model it has a very short memory, and assume anything you wrote a minute ago has already left that memory. That’s why they produce pretty dumb output if you try to depend on it… so don’t depend on it.

          • Log in | Sign up@lemmy.world
            23 hours ago

            Everyone else who has any sense: LLMs are shit and you shouldn’t trust them with executive power.

            You: just the cheap ones.

            Me: no, all of them. What kind of lunatic trusts control of anything important to a fundamentally stochastic process?

            • pixxelkick@lemmy.world
              18 hours ago

              You: just the cheap ones

              I never said that. I just said that the cheap ones are especially shitty.

              People on this site really lack reading comprehension, it seems.