Nope, not top, but apparently not awful (seems to be her husband coding). There are some interesting things happening right now with memory systems for local AI, which seem to genuinely enhance (make smarter) smaller thinking models, letting you work with more, and more coherent, context.
As with most of this stuff, I'm waiting for a bit more maturity before looking deeply at it, but there's definitely some excitement at the moment, and it has the potential to make models that fit on, say, a 16GB video card capable of many more use cases than before.
Not her husband.
Just repeating what I read elsewhere; the reporting has been somewhat muddy.