
Darn, I was hoping the RWKV people had finally obtained reportable results. This is still interesting, though. Maybe we will see more alternatives to transformers soon.


RWKV had a paper accepted at EMNLP and released models which match the performance of equivalent transformers.

What else are you looking for?


They do. The latest RWKV v5 matches Mamba at the 3B scale, and from the benchmarks I've seen, it's similar to Hyena.


There are RWKV numbers in there, so you sort of got your wish sideways. :)

Has RWKV not released anything until now? I thought it was an open project that was in use, if only by a hipster 1%.



RWKV v5 7B is out and is close to Hyena in quality. It gets 1400 tokens/s on a 3090, while LLaMA 7B at 8-bit on an H100 gets 1200 tokens/s.
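For context on how tokens-per-second figures like these are typically measured, here is a minimal sketch using PyTorch and Hugging Face transformers. The model id, the FP16 single-prompt greedy-decode setup, and the use of trust_remote_code are assumptions for illustration, not what the commenter actually ran.

```python
# Rough tokens-per-second measurement for a causal LM on a single GPU.
# The model repo id below is illustrative, not a confirmed checkpoint name.
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "RWKV/rwkv-5-world-7b"  # hypothetical repo id for illustration

tok = AutoTokenizer.from_pretrained(MODEL, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL, torch_dtype=torch.float16, trust_remote_code=True
).to("cuda")

prompt = "The quick brown fox"
inputs = tok(prompt, return_tensors="pt").to("cuda")

new_tokens = 256
torch.cuda.synchronize()
start = time.time()
out = model.generate(**inputs, max_new_tokens=new_tokens, do_sample=False)
torch.cuda.synchronize()
elapsed = time.time() - start

# Count only the newly generated tokens, not the prompt.
generated = out.shape[-1] - inputs["input_ids"].shape[-1]
print(f"{generated / elapsed:.0f} tokens/s")
```

Note that batch size, sequence length, quantization, and the inference framework all shift these numbers a lot, so single-stream figures like the ones above are only loosely comparable across hardware.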



