Hacker News

It might be worth using the lightness estimate from OKLab, OKLrab[1], or CIE Lab instead of the weighted RGB luminance, as it should produce a more perceptually accurate result.
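
Roughly, the conversion looks like this (a quick Python sketch; the matrices are the ones published in Björn Ottosson's OKLab write-up, and the L component is the lightness estimate I mean):

    def srgb_to_linear(c):
        # Undo the sRGB transfer function; c is in 0..1.
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def srgb_to_oklab(r8, g8, b8):
        # 8-bit sRGB in, (L, a, b) in OKLab out.
        r, g, b = (srgb_to_linear(v / 255.0) for v in (r8, g8, b8))
        # Linear sRGB -> LMS cone responses.
        l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
        m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
        s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
        # Cube-root nonlinearity, then LMS -> OKLab.
        l_, m_, s_ = l ** (1 / 3), m ** (1 / 3), s ** (1 / 3)
        L = 0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_
        a = 1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_
        b = 0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_
        return L, a, b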

The other issue with your code right now is that it uses Euclidean distance in RGB space to choose the nearest color. It would probably be more accurate to use a perceptual color difference metric; a very simple choice is Euclidean distance on OKLab colors.
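
Swapping the metric is then a small change; a rough sketch of the lookup, reusing srgb_to_oklab from above (in real code you would convert the palette to OKLab once up front rather than per pixel):

    def nearest_palette_color(pixel, palette):
        # pixel and palette entries are (r, g, b) 8-bit sRGB triples.
        # Pick the entry with the smallest squared Euclidean distance in OKLab.
        Lt, at, bt = srgb_to_oklab(*pixel)

        def dist2(entry):
            L, a, b = srgb_to_oklab(*entry)
            return (L - Lt) ** 2 + (a - at) ** 2 + (b - bt) ** 2

        return min(palette, key=dist2)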

I think dithering is a pretty interesting area of exploration, especially as a lot of the popular dithering algorithms are quite old and were optimized for the compute constraints of their time. It would be nice to see dithering that doesn't quantize errors to 8 bits, is based on perceptual accuracy, and perhaps uses something like a neural net to diffuse the error in the best way possible.

[1] https://bottosson.github.io/posts/colorpicker/




If you are interested in color dithering with different color difference metrics [1], I've implemented just that [2]. You can find an example comparing metrics in my docs [3].

[1]: https://juliagraphics.github.io/Colors.jl/stable/colordiffer...

[2]: https://github.com/JuliaImages/DitherPunk.jl

[3]: https://juliaimages.org/DitherPunk.jl/stable/#Dithering-with...


If you want to support truly arbitrary palettes, you also need to project the unbounded Oklab space onto the convex hull of the palette points. This is a tricky thing to get right, but I've found that the Oklab author's published gamut clamping for sRGB also translates well to arbitrary convex hulls.

If anyone's curious, I've implemented this here: https://github.com/DDoS/Cadre/blob/main/encre/core/src/dithe... I use it to map images from their source colour space to the lower-gamut palettes of E Ink colour displays.
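
To give a flavour of the projection step (this is not the implementation linked above, just a much-simplified Python sketch): test whether the Oklab point already lies inside the convex hull of the palette, and if not, binary-search along the segment toward an interior anchor until you hit the boundary. Here the anchor is just the palette centroid; a lightness-preserving, chroma-reducing anchor in the spirit of Ottosson's sRGB clamping looks much better in practice. It also assumes the palette spans 3D (at least four non-coplanar colours), otherwise the triangulation fails.

    import numpy as np
    from scipy.spatial import Delaunay

    def clamp_to_palette_hull(point, palette_oklab, steps=20):
        # point: (3,) Oklab colour; palette_oklab: (N, 3) array of palette colours.
        point = np.asarray(point, dtype=float)
        palette_oklab = np.asarray(palette_oklab, dtype=float)
        tri = Delaunay(palette_oklab)        # triangulation doubles as an inside test
        if tri.find_simplex(point) >= 0:     # already inside the hull
            return point
        anchor = palette_oklab.mean(axis=0)  # centroid is always inside the hull
        lo, hi = 0.0, 1.0                    # 0 = original point, 1 = anchor
        for _ in range(steps):               # binary search for the hull boundary
            mid = 0.5 * (lo + hi)
            if tri.find_simplex((1.0 - mid) * point + mid * anchor) >= 0:
                hi = mid                     # inside: back off toward the original
            else:
                lo = mid                     # outside: move further toward the anchor
        return (1.0 - hi) * point + hi * anchor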


I moved my canvas library's reduce-palette filter over to OKLAB calculations a while back. The calculations are more computationally intensive, but worth the effort.

https://scrawl-v8.rikweb.org.uk/demo/filters-027.html


I quite like the look of the blue noise dithering on this. Are you using just a texture as a mask, or something else?


It's an array of pre-calculated values that I extracted from an image Christoph Peters donated to the public domain (the link is an interesting read about blue noise - recommended!): http://momentsingraphics.de/BlueNoise.html

No textures or masks, just brute computing on the CPU.
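
For anyone wondering what that looks like, it's essentially ordered dithering with the pre-computed blue-noise array as the threshold map. A minimal Python sketch of the idea (not my library's actual code), assuming a greyscale image and a threshold array both as floats in 0..1:

    import numpy as np

    def blue_noise_dither(gray, threshold):
        # gray: 2D float array in 0..1; threshold: 2D blue-noise array in 0..1.
        # Tile the threshold array across the image and binarise per pixel.
        h, w = gray.shape
        th, tw = threshold.shape
        tiled = np.tile(threshold, (h // th + 1, w // tw + 1))[:h, :w]
        return (gray > tiled).astype(np.uint8)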


I am weighting each of the channels according to the formula in my post.

I’ll try OKLab and compare, thanks for the comment :)



