LEDs tend to change emission wavelength when the drive current changes. This is quite an issue if you want to combine them in an RGB display, because the human eye is extremely sensitive to relative color variation.
I'd love a source for this claim (not sarcastic, I really would). I've done a lot of testing of LEDs for scientific uses, and my experience and what I've read show that temperature is what affects the center wavelength of LEDs, not current/voltage. The reason for this is that in monochromatic LEDs (so not white LEDs, which have a phosphor coating) the emission wavelength is defined primarily by the bandgap of the semiconductor material. This bandgap is the difference in energy between the valence band and the conduction band (and this band "gap" is the reason for the "semi" in semiconductors).
This bandgap corresponds to the photon energy of the emitted light: the applied voltage excites electrons into the conduction band, and when they relax back down across the gap they give off light.
The bandgap energy changes as a function of temperature. The primary reason for this is that the lattice constant increases as temperature increases. This causes the bandgap to decrease, meaning the photons have less energy, giving a longer wavelength.
The opposite effect is also true: cooling an LED will lead to a shorter wavelength. Here is a cool video showing the effect![1]
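If anyone wants to play with the numbers, here's a rough Python sketch of that temperature dependence using the empirical Varshni relation. The parameters are textbook values for GaAs, used purely as an illustration; visible red/green/blue emitters (AlGaInP, InGaN, etc.) have their own coefficients, but the trend is the same.

    # Rough sketch of the bandgap's temperature dependence using the
    # empirical Varshni relation:  E_g(T) = E_g(0) - alpha*T^2 / (T + beta)
    # Parameters below are textbook values for GaAs, used purely as an
    # illustration; visible LEDs use other materials (AlGaInP, InGaN, ...).

    HC_EV_NM = 1239.84  # h*c in eV*nm

    def bandgap_eV(T, Eg0=1.519, alpha=5.405e-4, beta=204.0):
        """Varshni approximation of the bandgap (GaAs parameters) at T kelvin."""
        return Eg0 - alpha * T**2 / (T + beta)

    def peak_wavelength_nm(T):
        """Approximate emission wavelength, assuming photon energy ~ bandgap."""
        return HC_EV_NM / bandgap_eV(T)

    for T in (250, 300, 350, 400):  # kelvin
        print(f"{T} K: E_g = {bandgap_eV(T):.4f} eV, lambda ~ {peak_wavelength_nm(T):.1f} nm")

For GaAs that works out to roughly a few tenths of a nanometer per kelvin, with the wavelength getting longer as the die warms up.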
Increasing the current through the LED may raise the temperature a little, but you need large temperature changes to see any shift.
Temperature has a much greater impact on the intensity of light emitted by the LED. I have seen a typical 1% decrease in intensity per degree C for the LEDs I have tested. This is the effect that matters most when using RGB LEDs: if the red LED gets dimmer than the green or blue because it is hotter, it will be seen as a color change, even though the center wavelength of its emission is unchanged.
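To make that concrete, here's a toy calculation assuming the ~1% per degree C derating mentioned above and an arbitrary, made-up temperature difference between the dies; real coefficients and die temperatures come from the datasheet and the thermal design.

    # Toy illustration: each channel loses roughly 1% of its output per degree C
    # (the coefficient and the temperature rises below are assumed, illustrative
    # numbers; real derating curves come from the datasheet).

    def derate(intensity, delta_T_c, coeff_per_degC=0.01):
        """Scale an intensity down by ~1% per degree C of temperature rise."""
        return intensity * (1.0 - coeff_per_degC * delta_T_c)

    # Nominal "white" mix (arbitrary linear units).
    r, g, b = 1.0, 1.0, 1.0

    # Suppose the red die runs 30 C hotter while green and blue rise only 10 C.
    r_hot, g_hot, b_hot = derate(r, 30), derate(g, 10), derate(b, 10)

    print(f"hot mix R:G:B = {r_hot:.2f} : {g_hot:.2f} : {b_hot:.2f}")
    # -> 0.70 : 0.90 : 0.90, i.e. the white point drifts toward cyan even though
    #    every peak wavelength is unchanged.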
I mostly just wanted to share things I have learned about LEDs over the past year or two and your comment gave me a good opportunity!
While not a source, I would wager that the LEDs are already driven close to the bandgap (for efficiency's sake), and meaningfully lower voltages would cease to produce any light output, necessitating PWM to control brightness.
I could plausibly see some color shift close to the bandgap voltage if the bandgap isn't perfectly uniform across a diode; inconsistent or even just Gaussian-distributed doping would result in some holes being preferentially excited if there isn't a sufficient surplus voltage?
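To put rough numbers on the "driven close to the bandgap" point: the photon energy E = hc/lambda sets an approximate floor on the forward voltage, and typical datasheet V_f values sit at or somewhat above it. The wavelengths and V_f ranges below are generic ballpark figures, not from any particular part.

    # Back-of-the-envelope check on "driven close to the bandgap": the photon
    # energy E = hc/lambda sets an approximate floor on the forward voltage.
    # Wavelengths and V_f ranges below are generic ballpark figures, not from
    # any particular part.

    HC_EV_NM = 1239.84  # h*c in eV*nm

    typical_leds = {
        "red":   (625, "1.8-2.2 V"),
        "green": (525, "2.8-3.3 V"),
        "blue":  (470, "2.9-3.4 V"),
    }

    for name, (wavelength_nm, vf_range) in typical_leds.items():
        photon_eV = HC_EV_NM / wavelength_nm
        print(f"{name:5s} {wavelength_nm} nm: photon energy {photon_eV:.2f} eV, typical V_f {vf_range}")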