
This is actually a very common problem with 3-D stuff and transparency in textures. This isn't an issue with the colors of the pixels themselves, it's an issue with texture filtering. nVidia has a pretty good explanation as it applies to games and 3d graphics: https://developer.nvidia.com/content/alpha-blending-pre-or-n...

Say you have two adjacent pixels in a texture with floating-point RGBA values of (0,0,0,0) and (1,1,1,1), and you apply that texture to a 3-D shape. Because of how rasterization samples the texture, you will be sampling weighted averages of the two pixels: either you're scaling up and need to interpolate, or you're scaling down and need to average.

The average of (0,0,0,0) (fully transparent black) and (1,1,1,1) (opaque white) is (0.5,0.5,0.5,0.5), a half-transparent gray. But you'd intuitively expect (1,1,1,0.5), half-transparent white. This is the essence of the problem. The fix is to make sure that your transparent pixel is (1,1,1,0) and not (0,0,0,0).
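Here's a rough sketch of the arithmetic in Python (just illustrative, not what the GPU actually runs) showing why the hidden RGB of the transparent pixel matters when the filter averages:

    # Averaging two RGBA texels the way a bilinear filter would,
    # using straight (non-premultiplied) alpha.
    def average(a, b):
        return tuple((x + y) / 2 for x, y in zip(a, b))

    transparent_black = (0.0, 0.0, 0.0, 0.0)  # fully transparent, hidden RGB = black
    transparent_white = (1.0, 1.0, 1.0, 0.0)  # fully transparent, hidden RGB = white
    opaque_white      = (1.0, 1.0, 1.0, 1.0)

    print(average(transparent_black, opaque_white))  # (0.5, 0.5, 0.5, 0.5): half-transparent gray
    print(average(transparent_white, opaque_white))  # (1.0, 1.0, 1.0, 0.5): half-transparent white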




Surely the answer, if you want this, is to weight the final RGB by the transparency. E.g. the final red channel would be (R1xT1 + R2xT2)/(T1+T2).


You're mostly right. The industry-wide accepted answer is to multiply the opacity into the colors before interpolating, so the formula would be (R1xT1 + R2xT2)/2 for the average, and then to do later transparency blending as if the opacity term was already multiplied in.
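Roughly like this, in Python (helper names are mine, just to sketch the idea): you premultiply the texels, filter with a plain average, and composite with the premultiplied "over" operator:

    # Sketch of premultiplied alpha: multiply opacity into the colors up front,
    # filter with a plain average, blend with the premultiplied "over" operator.
    def premultiply(rgba):
        r, g, b, a = rgba
        return (r * a, g * a, b * a, a)

    def average(a, b):
        return tuple((x + y) / 2 for x, y in zip(a, b))

    def over(src, dst):
        # src and dst are both premultiplied RGBA
        return tuple(s + (1.0 - src[3]) * d for s, d in zip(src, dst))

    a = premultiply((1.0, 1.0, 1.0, 0.0))  # fully transparent white
    b = premultiply((1.0, 1.0, 1.0, 1.0))  # opaque white
    print(average(a, b))  # (0.5, 0.5, 0.5, 0.5) premultiplied == white at 50% opacity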


Which means that your output pixel can't be both white and low transparency. I guess it's a typical graphics "close enough and better performance" outcome (where mine is marginally more difficult to calculate and needs some extra logic to avoid dividing by 0).


> Which means that your output pixel can't be both white and low transparency.

Did you mean low opacity?

If so, that's not quite right. (0.1,0.1,0.1,0.1) premultiplied is the same color as (1,1,1,0.1) "normal." They're both white and low opacity, just in different representations. You don't actually lose much granularity because the graphics card has to multiply the color channels by the opacity value sooner or later.
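A quick way to see they're the same color: composite each over a black background with its matching blend equation and you get identical results (tiny Python sketch, helper names are my own):

    # Straight-alpha "over":      out = a*src + (1-a)*dst
    # Premultiplied-alpha "over": out =   src + (1-a)*dst
    def over_straight(src, dst_rgb):
        r, g, b, a = src
        return tuple(a * s + (1 - a) * d for s, d in zip((r, g, b), dst_rgb))

    def over_premultiplied(src, dst_rgb):
        r, g, b, a = src
        return tuple(s + (1 - a) * d for s, d in zip((r, g, b), dst_rgb))

    black = (0.0, 0.0, 0.0)
    print(over_straight((1.0, 1.0, 1.0, 0.1), black))       # (0.1, 0.1, 0.1)
    print(over_premultiplied((0.1, 0.1, 0.1, 0.1), black))  # (0.1, 0.1, 0.1)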

Separately, your formula doesn't work for interpolation. It works for averaging, but in order to do texture sampling, you need interpolation, so your formula can't actually be used unless you can adjust it to deal with interpolation gracefully.


Thanks for the answer - I did not know about this premultiplication. So this makes it effectively the same, or very close? Assuming the output transparency/alpha is (T1+T2)/2, dividing by this gives the difference.

I don't quite get your point on interpolation, but I'll look it up when I have the chance.


A better interpolation would do the premultiply for you automatically to get the proper result. Since that requires a couple of extra multiplies and a divide, it gets skipped most of the time.
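Something like this, if I understand it right (Python sketch, with a guard for the all-transparent case):

    # "Correct" straight-alpha interpolation: premultiply, lerp, then divide
    # the interpolated alpha back out (guarding against divide-by-zero).
    def lerp(a, b, t):
        return a + (b - a) * t

    def interp_straight(c0, c1, t):
        r0, g0, b0, a0 = c0
        r1, g1, b1, a1 = c1
        pr = lerp(r0 * a0, r1 * a1, t)
        pg = lerp(g0 * a0, g1 * a1, t)
        pb = lerp(b0 * a0, b1 * a1, t)
        a  = lerp(a0, a1, t)
        if a == 0.0:
            return (0.0, 0.0, 0.0, 0.0)  # fully transparent; RGB is arbitrary
        return (pr / a, pg / a, pb / a, a)

    print(interp_straight((0.0, 0.0, 0.0, 0.0), (1.0, 1.0, 1.0, 1.0), 0.5))
    # (1.0, 1.0, 1.0, 0.5): half-transparent white, even though the transparent texel's RGB was black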



