Besides that, a bug's flight seems like an amazing task to me in terms of processing, especially if you compare its power consumption to something like a car's autopilot. Flying is part of a bug's survival, which in my opinion is closer to general intelligence than memorizing tokens.
Comparing "bug flight"/"bug survival" to "memorizing tokens" is disingenuous. They're not in the same category of task at all: you're essentially comparing the output of one system to the input of another.
This kind of thing makes me think LLMs are quite far from AGI.