
Smart people chuck the encoded video at the GPU and let it deal with it: e.g. https://docs.nvidia.com/video-technologies/video-codec-sdk/n... ; very important on low-end systems where the CPU genuinely can't decode in real time, Raspberry Pi and so on.
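
A minimal sketch of what that looks like in practice, assuming an ffmpeg build with NVDEC/CUDA support and a hypothetical input.mp4 (both are assumptions, not something from the linked SDK docs):

    # Hand the encoded stream to the GPU's decoder instead of decoding on the CPU.
    # Assumes an ffmpeg build with NVDEC/CUDA support and a hypothetical "input.mp4";
    # check `ffmpeg -hwaccels` for what your build actually offers.
    import subprocess

    def gpu_decode_benchmark(path: str) -> None:
        # -hwaccel cuda routes decoding to the GPU's hardware decoder;
        # -f null - discards the frames, so this just measures how fast
        # the hardware chews through the bitstream.
        subprocess.run(
            ["ffmpeg", "-hwaccel", "cuda", "-i", path, "-f", "null", "-"],
            check=True,
        )

    if __name__ == "__main__":
        gpu_decode_benchmark("input.mp4")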

> 16.6ms

That's sixteen million nanoseconds; you should be able to issue over thirty million instructions in that time on an ordinary 2GHz CPU. A GPU will give you several billion. Just don't waste them.
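
Rough sketch of that arithmetic, assuming a 60fps frame budget and at least one instruction issued per cycle:

    # 60 fps frame budget and what a 2 GHz core can do inside it,
    # assuming at least one instruction issued per cycle.
    frame_budget_s = 1 / 60                  # ~16.6 ms per frame
    frame_budget_ns = frame_budget_s * 1e9   # ~16.6 million ns
    cpu_hz = 2e9                             # ordinary 2 GHz core
    instructions = frame_budget_s * cpu_hz   # ~33 million

    print(f"{frame_budget_ns:.2e} ns per frame, ~{instructions:.2e} instructions")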

Agreed. GPU hardware decoders support a wide range of codecs (and you're probably using something common like H.264 anyway), so it doesn't make sense to waste time decoding on the CPU and then piping the much larger decoded frames out to the GPU.
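
Back-of-the-envelope numbers for why that pipe is the expensive part, assuming 1080p60 with 8-bit YUV 4:2:0 frames and an illustrative 8 Mbit/s H.264 stream (both figures are assumptions):

    # Decoded frames vs. encoded bitstream for 1080p60, assumed numbers.
    width, height, fps = 1920, 1080, 60
    bytes_per_pixel = 1.5                          # 8-bit YUV 4:2:0
    decoded_mb_per_s = width * height * bytes_per_pixel * fps / 1e6
    encoded_mb_per_s = 8e6 / 8 / 1e6               # illustrative 8 Mbit/s stream

    print(f"decoded frames: ~{decoded_mb_per_s:.0f} MB/s")  # ~187 MB/s
    print(f"encoded stream: ~{encoded_mb_per_s:.0f} MB/s")  # ~1 MB/s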



