Hybrid Attention

by JohannaAlmeida 9 comments 40 points

[−] empath75 38d ago
Is this just for autocomplete? Because you're not going to get anything very useful out of a code-only training set.
[−] JohannaAlmeida 38d ago
Yeah, autocomplete is an amazing use case. I needed a small model that used transformers and could fit on my weak consumer GPU.

So I needed to make fundamental architecture changes and do some KV cache tricks.

And then prove the new architecture was faster with benchmarks, and that perplexity was acceptable.
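As a quick aside on the perplexity check mentioned above: perplexity is just the exponential of the mean per-token cross-entropy, so it falls out of the training loss directly. A minimal sketch (the loss values here are made up, not from the post):

```python
import math

# Hypothetical per-token negative log-likelihoods (in nats) from an eval run.
token_nll = [2.1, 1.8, 2.4, 2.0]

# Perplexity = exp(mean cross-entropy over the eval tokens).
ppl = math.exp(sum(token_nll) / len(token_nll))
print(f"perplexity ~ {ppl:.2f}")
```

Comparing this number between the full-attention baseline and the modified architecture on the same eval set is what "perplexity was acceptable" amounts to in practice.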

[−] altruios 38d ago
I think it's more a proof of concept: locally trained. It would take lots of resources/time to train something non-trivial.
[−] bigbadfeline 38d ago
Well, coding is a kind of extended autocomplete - I prefer that way of working because I don't like the mess created by LLMs when you let them work on their own. Smaller models, specialized on a single language, make a lot of sense.
[−] bigbadfeline 38d ago
I've been interested in faster attention and smaller models for some time but haven't had the time to do serious research so I can't answer your questions.

However, everything you do sounds very interesting, useful and well thought out, please keep doing it, I'd encourage others to work in the same direction too.

I hope more of us can find the time for more than best wishes in the near future.

[−] JohannaAlmeida 37d ago
Thank you so much. The next thing I want to tackle is the training bottleneck we have right now.

That will probably be another HN post when i figure it out.

[−] JohannaAlmeida 38d ago
Full attention O(n²): 17.96s / 5.6 tok/s

HybridAttention O(n·W + n·D): 0.35s / 286.6 tok/s
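For context, the gap between O(n²) and O(n·W + n·D) can be sanity-checked with a back-of-envelope score count; the window size W and state size D below are illustrative guesses, not values from the post:

```python
# Rough count of attention scores computed per layer.
# Assumed hypothetical sizes: context n=2048, local window W=128, state D=64.
n, W, D = 2048, 128, 64

full_ops = n * n            # every token scores against every token
hybrid_ops = n * W + n * D  # each token: W local slots + D recurrent-state slots

speedup = full_ops / hybrid_ops
print(f"full: {full_ops}, hybrid: {hybrid_ops}, ~{speedup:.1f}x fewer scores")
```

Note this counts score computations only; the measured ~50x wall-clock gap above also reflects cache behavior and the KV-cache tricks, not just the asymptotic term.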

[−] woodson 38d ago
Look into RWKV.
[−] JohannaAlmeida 38d ago
Yeah, RWKV is definitely related in spirit (recurrent state for long context). Here I'm combining local windowed attention with a gated recurrent path plus KV cache compression, so it's more a hybrid than a full replacement of attention.
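A toy single-head sketch of the idea described above, under my own assumptions (fixed scalar decay gate, one extra attention slot for the recurrent summary; the real gating and compression are surely more elaborate):

```python
import numpy as np

def hybrid_attention(q, k, v, window=4, gate=0.9):
    """Each query attends to the last `window` keys plus one extra slot
    holding a gated running summary of keys/values evicted from the window."""
    n, d = q.shape
    state_k = np.zeros(d)  # compressed summary of evicted keys
    state_v = np.zeros(d)  # compressed summary of evicted values
    out = np.zeros_like(v)
    for t in range(n):
        lo = max(0, t - window + 1)
        # Local keys/values, prepended with the recurrent-state slot.
        ks = np.vstack([state_k[None, :], k[lo:t + 1]])
        vs = np.vstack([state_v[None, :], v[lo:t + 1]])
        scores = ks @ q[t] / np.sqrt(d)
        w = np.exp(scores - scores.max())
        w /= w.sum()
        out[t] = w @ vs
        # The entry sliding out of the window gets folded into the state.
        evict = t - window + 1
        if evict >= 0:
            state_k = gate * state_k + (1 - gate) * k[evict]
            state_v = gate * state_v + (1 - gate) * v[evict]
    return out
```

Per query this scores against at most `window + 1` slots instead of all `t` previous tokens, which is where the O(n·W + n·D) shape comes from.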