
In terms of PC usage, JPEG and most other image codecs are decoded in software, not hardware. AFAIK even AVIF decoding is done in software in browsers.

Hardware acceleration for lossless makes more sense for JPEG XL because it is currently very slow. As the author of HALIC posted in some results below, JPEG XL lossless is about 20 - 50x slower than HALIC while requiring lots of memory even after memory optimisation, and about 10 - 20x slower than other lossless codecs. JPEG XL is already used by cameras and stored as DNG, but the encoding resources it requires are limiting its reach. Hence a hardware encoder would be great.

For lossy JPEG XL, not so much. Just like with video codecs, hardware encoders tend to focus on speed, and it takes multiple iterations, or 5 - 10 years, before they catch up on quality. JPEG XL is relatively new, with so many tools and usage optimisations that even the current software encoder is far from reaching the codec's potential. And I don't want a crappy-quality JPEG XL hardware encoder, hence I much prefer an upgradeable software encoder for lossy JPEG XL and a hardware encoder for lossless JPEG XL.




Lossless JPEG XL encoding is already fast in software and scales very well with the number of cores. With a few cores, it can easily compress 100 megapixels per second or more. (The times you see in the comment with the DPReview samples are single-threaded, and for a total of about 400 MP, since each image is 101.8 MP.)
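
If you want to sanity-check that scaling yourself, here is a minimal sketch that times lossless cjxl encodes at different thread counts. It assumes the reference cjxl tool is on PATH; the input filename, thread counts, and the exact flag spellings are placeholders/assumptions, so check cjxl --help on your build.

    import pathlib
    import subprocess
    import time

    SRC = pathlib.Path("sample_101mp.png")   # hypothetical ~100 MP source image
    OUT = pathlib.Path("sample_101mp.jxl")

    def encode_lossless(threads: int) -> float:
        """Time one lossless cjxl encode with a fixed worker-thread count."""
        start = time.perf_counter()
        subprocess.run(
            ["cjxl", str(SRC), str(OUT),
             "-d", "0",                       # distance 0 = lossless
             "-e", "1",                       # low effort, the speed-oriented setting
             f"--num_threads={threads}"],     # thread-count flag as listed in cjxl's help
            check=True,
        )
        return time.perf_counter() - start

    for n in (1, 4, 8):
        print(f"{n} thread(s): {encode_lossless(n):.2f} s")

Dividing the source megapixels by the measured time gives an MP/s figure you can compare against the single-threaded numbers in the comment below.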


HALIC achieves almost the same degree of compression tens of times faster. And interestingly, it consumes almost no memory at all. Unfortunately, that is simply how things stand.


HALIC being faster still doesn’t mean that lossless JXL is so slow as to warrant hardware acceleration.


Yes, I also think that HALIC should be destroyed ;)



