The rate of change in the AI space is super fast. What are some of the best ways to stay on top of what's going on in AI (let's say on a weekly or bi-weekly basis) without having to be glued to social media or manually compile a bunch of different sources from across the Internet?
Say, for example, that I'm a founder in the space and want to stay abreast of the major new developments each week.
I'm thinking, at the least:
* Press releases from relevant companies
* Relevant new research papers
* News articles
* Blogs
* Open-source library updates
* Videos
* Tweets
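To make "compile a bunch of different sources manually" concrete, the DIY route I'd rather not maintain by hand looks roughly like the sketch below. It assumes the feedparser package, and the feed URLs are just examples I'd have to keep current myself:

```python
# Rough sketch of hand-rolled aggregation: pull a few RSS feeds and print a digest.
# Assumes the feedparser package; the feed URLs are examples and may change.
import feedparser

FEEDS = [
    "https://export.arxiv.org/rss/cs.CL",    # new NLP papers
    "https://export.arxiv.org/rss/cs.CV",    # new computer-vision papers
    "https://huggingface.co/blog/feed.xml",  # example vendor/blog feed
]

def weekly_digest(feeds, per_feed=10):
    """Collect (title, link) pairs from each feed, newest entries first."""
    items = []
    for url in feeds:
        parsed = feedparser.parse(url)
        for entry in parsed.entries[:per_feed]:
            items.append((entry.get("title", ""), entry.get("link", "")))
    return items

if __name__ == "__main__":
    for title, link in weekly_digest(FEEDS):
        print(f"- {title}\n  {link}")
```

Maintaining that feed list (and the inevitable breakage) is exactly the overhead I'm hoping to avoid.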
Precisely because "the rate of change in the AI space is super fast", there really isn't much point in keeping up, even if you're an academic researcher (even then you only need to keep up with your own piece of the puzzle, and you likely already know everyone in that space).
For example, I was out of the NLP space for a few years. I kept an eye on what was getting mentioned in various circles but basically ignored everything until I had a problem to solve. I work with LLMs every day now, and honestly, even though I do understand them pretty deeply, that depth isn't really necessary. Prior to the rise of LLMs I spent some time building LSTMs because I felt I needed to understand them better. Lots of fun projects, but if I had skipped all that it wouldn't really have mattered.
Even more dramatically, I was never particularly specialized in computer vision, but I currently build things (for fun) with Stable Diffusion every night. I've spent a fair bit of time really getting to understand the underlying model (I still have much to learn), but because the space moves so fast, it's not a big deal that I didn't also spend nights building GANs. Even though Stable Diffusion is also perpetually changing, the community has largely stuck with 1.5 and is very focused on squeezing as much juice out of it as they can.
Most important: the fundamentals haven't changed. GPT-4 still projects information into a highly non-linear latent space and samples from it. Diffusion models are probably the most novel thing happening now, but more so for their engineering (they're essentially three models trained together with differentiable programming). If you really understand the fundamentals, catching up when you need to is fairly easy.
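As a concrete illustration of the "essentially three models" point, here's a minimal sketch using the diffusers library; the Hub model id is an assumption (it may have moved), but the pipeline exposes its components as attributes:

```python
# Minimal sketch: Stable Diffusion as three cooperating networks, via diffusers.
# The Hub model id below is an assumption and may have moved or been renamed.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# The pipeline bundles three separate models:
print(type(pipe.text_encoder).__name__)  # CLIP text encoder: prompt -> embedding
print(type(pipe.unet).__name__)          # U-Net: iteratively denoises the latent, conditioned on the prompt embedding
print(type(pipe.vae).__name__)           # VAE: decodes the final latent back into pixels

image = pipe("a watercolor fox", num_inference_steps=25).images[0]
image.save("fox.png")
```

Once you can see the pieces like that, keeping up with a new sampler or fine-tune is mostly plumbing, which is what I mean by the fundamentals carrying you.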