
Yeah, like so many other things in the GPU world, main RAM texture storage is more of a hint to the graphics card driver — "this buffer isn't going away and won't change until I explicitly tell you otherwise".
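
For the curious, on the Mac this showed up as the GL_APPLE_client_storage and GL_APPLE_texture_range OpenGL extensions. Here's a minimal sketch of the pattern, assuming those extensions; the function name and pixel format are illustrative, not from any particular codebase:

    #include <OpenGL/gl.h>
    #include <OpenGL/glext.h>

    /* Create a texture whose texels stay in the caller's main-RAM buffer.
       GL_UNPACK_CLIENT_STORAGE_APPLE asks the driver not to make its own
       copy of `pixels`; GL_STORAGE_SHARED_APPLE hints that the GPU should
       read the buffer from system memory (e.g. over AGP/PCIe DMA). The
       caller promises `pixels` stays valid and unmodified until the
       texture is re-specified or deleted. */
    GLuint client_storage_texture(const void *pixels, GLsizei w, GLsizei h)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
        glPixelStorei(GL_UNPACK_CLIENT_STORAGE_APPLE, GL_TRUE);
        glTexParameteri(GL_TEXTURE_RECTANGLE_ARB,
                        GL_TEXTURE_STORAGE_HINT_APPLE,
                        GL_STORAGE_SHARED_APPLE);
        glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA, w, h, 0,
                     GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);
        return tex;
    }

Note that even here it's only a hint: the driver is still free to keep a VRAM copy if it decides that's faster, which is exactly the "promise, not a command" semantics described above.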

It definitely used to be that GPUs did real DMA texture reads, though, at least in the early days of dedicated GPUs with fairly little local RAM. I'm thinking back to when the Mac OS X accelerated window compositor (Quartz Extreme) was introduced: the graphics RAM of the day simply wouldn't have been enough to hold more than a handful of window buffers.
