They may not want to imply that didder's linearized rabbit is wrong, but I'm comfortable saying so. It's not just a little dark, it's way dark, to the point of hiding detail.
The linearized RGB palette is similarly awful. It clobbers a whole swath of colors, rendering them as nearly black. Purples are particularly brutalized. Yellows disappeared and became white.
On my phone, the middle palette doesn't appear too bright to my eyes, either.
I agree. I think the problem is a banal missing color transformation somewhere in the pipeline, like converting the palette and image to linear colorspace, doing the dithering there and mistakenly writing the linear color values instead of sRGB color values into the image.
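To make that concrete, here's a minimal sketch of the sRGB transfer functions in Python (my own illustration, not the article's actual pipeline), showing how writing decoded linear values back out as if they were sRGB darkens the mid-tones:

```python
def srgb_to_linear(c):
    # sRGB decode (EOTF): code value in [0, 1] -> linear light
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # sRGB encode: the step that's easy to forget at the end of a pipeline
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

mid = srgb_to_linear(0.5)   # ~0.214: sRGB mid-grey is quite dark in linear light
back = linear_to_srgb(mid)  # 0.5 again; skip this and the whole image comes out dark
```

If the dithered linear values get written to the file without that final re-encode, every mid-tone drops by roughly this much, which would explain the systematic darkening.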
Others suggest that the error is using the wrong metric for choosing the closest color, but I disagree. That wouldn't cause such drastic, systematic darkening, as the palette is presumably still fairly dense in the RGB cube.
Where the linearisation really matters is the arithmetic for the error diffusion: you definitely want to diffuse the error in a linear colorspace. You are free to choose a good perceptual space for picking the closest color at each pixel, but calculate the error in a linear space.
Visual perception is weird. But when you squint your eyes to blur the image, you are definitely mixing in a linear colorspace, as that's physical mixing of light intensities before the light even reaches your retina. So you have to match that when diffusing the error.
edit:
It also doesn't help that most (all?) browsers do color mixing wrong when images are scaled, so if you don't view the dithered images at 100% without DPI scaling, then you might get significantly distorted colors due to that too.
You really need to open the image in a viewer where each image pixel is exactly one device pixel large, otherwise the color arithmetic used for scaling by viewers is of variable quality (often very poor).
Thanks for your comment! I'm glad you're seeing the same thing :)
I re-implemented the linearised dithering in Python and got similar results.
I checked and rechecked the colour profiles in GIMP, nothing...
At this point I can only hope for an expert to appear and tell me what exactly I am doing wrong.
I got better results just dithering the RGB channels separately (so effectively an 8-colour palette: black, white, red, green, blue, cyan, magenta, yellow). In p5js:
    var img
    var pixel
    var threshold
    var error = [0, 0, 0]
    var a0

    function preload() {
      img = loadImage("https://upload.wikimedia.org/wikipedia/commons/thumb/4/44/Albrecht_D%C3%BCrer_-_Hare%2C_1502_-_Google_Art_Project.jpg/1920px-Albrecht_D%C3%BCrer_-_Hare%2C_1502_-_Google_Art_Project.jpg")
    }

    function setup() {
      // I'm just using a low discrepancy sequence for a quasirandom
      // dither and diffusing the error to the right, because it's
      // trivial to implement
      a0 = 1/sqrt(5)
      pixelDensity(2)
      createCanvas(400, 400)
      image(img, 0, 0, 400, 400)
      loadPixels()
      pixel = 0
      threshold = 0
    }

    function draw() {
      if (pixel > 400*400*16) {
        return
      }
      for (var i = 0; i < 2000; i++) {
        threshold = (threshold + a0) % 1
        for (var j = 0; j < 3; j++) {
          var c = pixels[pixel + j]
          pixels[pixel + j] = c + error[j] > threshold * 255 ? 255 : 0
          error[j] += c - pixels[pixel + j]
        }
        pixel += 4
      }
      updatePixels()
    }
Of course this isn't trying to pick the closest colour in the palette as you're doing - it's just trying to end up with the same intensity of RGB as the original image. It does make me wonder if you should be using the Manhattan distance instead of Euclidean, to get the errors to add correctly.
It looks like the images on your blog might have gone through a non-gamma-corrected scaler. The linear images produced by the program look correct: when scaled in 32-bit float linear scRGB in Krita, they do overall match the original image.
> We have just committed a mortal sin of image processing. I didn’t notice it, you might not have noticed either, but colour-space enthusiasts will be knocking on your door shortly.
For perceptual color difference, there are much better metrics than “distance in linear RGB”. CIE has some implementations of a metric called ΔE*, for instance.
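The simplest of those, CIE76, is just Euclidean distance in CIELAB. A rough sketch (assuming sRGB primaries and the D65 white point; the constants come from the standard sRGB and CIELAB definitions):

```python
def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_to_lab(rgb):
    # sRGB (0..1) -> XYZ (D65) -> CIELAB
    r, g, b = (srgb_to_linear(c) for c in rgb)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(c1, c2):
    # CIE76: plain Euclidean distance in Lab coordinates
    return sum((a - b) ** 2 for a, b in zip(srgb_to_lab(c1), srgb_to_lab(c2))) ** 0.5
```

Later refinements (ΔE94, ΔE2000) fix CIE76's tendency to overweight chroma differences in saturated regions, at the cost of a considerably messier formula.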
I don't know if they actually do well in dithering, though. My experience with dithering is that it actually works better in gamma space than trying to linearize anything, since the quantization is fundamentally after gamma.
Yeah, every time I see an article about the importance of linear color space for gradients, the images in that article show me the opposite of what the text claims: gradients in the sRGB color space look better.
I suspect that might be because I usually buy designer-targeted wide-gamut IPS displays. I also run them at low brightness; right now I'm looking at a BenQ PD2700U at brightness 10/100 and contrast 50/100. The sRGB color space, however, was developed decades ago for CRT displays.
It can be better than sRGB for the color part of gradients but is awful for the brightness axis. The reason why linear color is so important for operations like blending light, antialiasing, and dithering is also why it is bad where perceptual uniformity is desired. sRGB isn't as good as Oklab or CIELAB perceptually, but a grayscale ramp rendered in linear color is so distorted towards white it's useless. Image formats also encode non-linear color for good reason. "Use linear color everywhere" is overly simplistic and bad advice.
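The distortion is easy to see numerically: re-encoding an evenly spaced linear-light grey ramp for display pushes its midpoint most of the way to white (a small sketch, assuming standard sRGB encoding):

```python
def linear_to_srgb(c):
    # sRGB encode: linear light -> display code value
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# An evenly spaced *linear-light* grey ramp, re-encoded for display:
# the lower half of the physical range squeezes into the bottom ~74%
# of code values, so the ramp looks heavily biased toward white.
ramp = [linear_to_srgb(i / 10) for i in range(11)]
mid_code = linear_to_srgb(0.5)  # ~0.735: linear mid-grey is already a light grey
```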
Your monitor and your browser 100% affect the appearance. After calibrating your monitor, try opening the image in full resolution and take a few steps back.
For me, viewing the images on my phone makes them look off.
Even the linearized gradient looks worse.
Maybe linear is not best for perceptual accuracy.