• 0 Posts
  • 4 Comments
Joined 3 years ago
Cake day: June 30th, 2023

  • Eh. I’ve seen enough 300+ HP cars with 10+ year old bald tires and paper-thin brake discs to believe otherwise. I personally know two people whose cars have broken wipers that simply don’t work. They don’t care. I know one guy whose car’s passenger door can only be opened by sticking the designated door-opening pliers, which are stored under the seat, into the door panel through the hole of that door-lock indicator peg thing and then fishing around for some lever or whatever. You’re simply not gonna be opening that door in an emergency. One dude at my office has an old manual BMW with a shifter knob that just sits loosely on its lever and can easily come off if you’re not careful. You gotta blindly maneuver the knob back onto its spot underneath the leather cover when that happens. He drives it like that daily. No shortage of hideously dirty diesel engines. No shortage of badly misaligned headlights, nonfunctional brake lights, overly loud engines, etc.

    In short: not only do I think state inspections are a good idea, I think they should be stricter.



  • LLMs can’t learn. It’s an inherent property: once a model is trained, you can’t teach it new things. You can train a new model, or fine-tune an existing one to adjust its behavior a little, but that’s it. That creates an extremely expensive cycle where you have to spend insane amounts of energy training better models over and over and over again. And the wall of diminishing returns on that has already been smashed into. That, plus the fact that they simply don’t have concepts like logic, reasoning, and knowing, puts a rather hard limit on their potential. It’s gonna take several sizeable breakthroughs to make LLMs noticeably better than they are now.

    There might be another kind of AI that solves those problems inherent to LLMs, but at present that is pure sci-fi.
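    The "can’t learn" point above can be sketched in code. This is a toy illustration, not a real LLM: the class, names, and the bigram-lookup "model" are all invented for the sketch. The idea it demonstrates is just that a trained model is a frozen function from context to next token, so using it at inference time writes nothing back into its parameters.

    ```python
    # Toy sketch (hypothetical names, not a real LLM): once "training" is done,
    # inference is a pure read-only lookup, so the model cannot pick up
    # anything new from its conversations.

    class FrozenToyModel:
        def __init__(self, weights):
            # "weights" are fixed at construction (i.e. training) time;
            # here they are just a dict mapping a token to the next token.
            self._weights = dict(weights)

        def next_token(self, token):
            # Pure lookup: no state is written during inference.
            return self._weights.get(token, "<unk>")

    model = FrozenToyModel({"the": "cat", "cat": "sat"})

    before = dict(model._weights)
    for _ in range(1000):          # "talk" to the model a thousand times...
        model.next_token("the")
    after = dict(model._weights)

    assert before == after         # ...and its parameters are unchanged
    ```

    Real systems differ in scale, not in kind, on this point: a deployed transformer’s weights are read-only at inference, and making it "know" something new means another training run (or fine-tune), which is the expensive cycle described above.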