What range of wavelengths are in the original images? Do you produce multiple RGB images for looking at different things? c'mon, what does that entail? ;-)
The filters used for this range from near infrared to near UV. We used 4 different filters in all for this image (the telescope has more available). In general, yes: to fully appreciate all the color information as a human, we need to generate different color combos so our eyes can pick up different contrasts.
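Conceptually it's something like this toy numpy sketch, with made-up filter names and weights (not our actual recipe):

```python
import numpy as np

# Toy stand-ins for four calibrated, aligned filter frames (near-UV through
# near-IR); the real data would come from the reduced telescope exposures.
rng = np.random.default_rng(0)
filters = {name: rng.random((256, 256)) for name in ("nuv", "blue", "red", "nir")}

def make_composite(weights):
    """Blend the filter frames into R, G, B output channels.

    `weights` maps each output channel to {filter_name: weight}; swapping in a
    different weight table gives a different "color combo" that emphasizes
    different contrasts in the same data.
    """
    rgb = np.zeros((256, 256, 3))
    for i, channel in enumerate(("r", "g", "b")):
        for name, w in weights[channel].items():
            rgb[..., i] += w * filters[name]
    return rgb / rgb.max()  # normalize to 0..1 for display

# One possible mapping: push near-IR toward red and near-UV toward blue.
composite = make_composite({
    "r": {"nir": 0.7, "red": 0.3},
    "g": {"red": 0.5, "blue": 0.5},
    "b": {"blue": 0.6, "nuv": 0.4},
})
```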
However, what we strive for is being accurate to "if your eyes COULD see like this, it would look like this", to the best of our ability of course. We did a lot of research into human perception to create this and tried to map the color and intensity information in a similar way to how your brain constructs that information into an image.
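A very rough illustration of the "treat intensity and color as separate pieces of information" idea (not our actual pipeline, and it assumes scikit-image is installed): stretch the lightness channel in CIELAB while leaving hue and chroma alone.

```python
import numpy as np
from skimage import color  # scikit-image, just for sRGB <-> CIELAB conversions

def compress_lightness(rgb, gamma=0.4):
    """Remap brightness on its own while leaving hue/chroma mostly alone.

    Works in CIELAB so the lightness (L*) channel can be compressed separately,
    a crude stand-in for treating "how bright" and "what color" as two
    different pieces of information, the way the visual system does.
    """
    lab = color.rgb2lab(rgb)
    L = lab[..., 0] / 100.0              # lightness scaled to 0..1
    lab[..., 0] = 100.0 * L ** gamma     # non-linear stretch of brightness only
    return np.clip(color.lab2rgb(lab), 0.0, 1.0)

# Example: apply it to the composite from the previous snippet.
# perceptual = compress_lightness(composite)
```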
Let me tell you, I did not appreciate how deep a topic this was before starting, or how limited our file formats and electronic reproduction capabilities are for this. The data has such a range of information (in color and intensity) that it is hard to encode into existing formats most people are able to display. I really want to spend some time doing this in modern HDR (true HDR, not tone-mapping), where brightness can actually be encoded separately from just the RGB values. The documentation on these (several competing) formats is a bit all over the place though.
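For anyone wondering what "brightness encoded separately" looks like in practice, here's a toy numpy sketch of the old Radiance RGBE trick. It's an older HDR pixel format rather than the modern display formats I mean, but it shows the idea: each pixel carries a shared exponent byte for overall brightness.

```python
import numpy as np

def float_to_rgbe(rgb):
    """Pack linear float RGB into Radiance-style RGBE pixels.

    Each pixel keeps 8-bit R/G/B mantissas plus a shared 8-bit exponent, so the
    overall brightness rides in the exponent byte instead of being squeezed
    into the RGB values themselves.
    """
    rgb = np.asarray(rgb, dtype=np.float64)
    brightest = rgb.max(axis=-1)
    mantissa, exponent = np.frexp(brightest)   # brightest == mantissa * 2**exponent
    safe = np.maximum(brightest, 1e-32)
    scale = np.where(brightest > 1e-32, mantissa * 256.0 / safe, 0.0)
    rgbe = np.zeros(rgb.shape[:-1] + (4,), dtype=np.uint8)
    rgbe[..., :3] = np.clip(rgb * scale[..., None], 0, 255).astype(np.uint8)
    rgbe[..., 3] = np.where(brightest > 1e-32, exponent + 128, 0).astype(np.uint8)
    return rgbe
```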
Edit:
I wanted to edit to add: if anyone reading this is an expert in HDR formats and/or processing, I'd love to pick your brain a bit!
I'm impressed so much thought went into how to colorize the image! Sometimes it seems like space photos are just colorized thoughtlessly, or to increase the "wow" factor, so it's great to hear how careful and thoughtful you guys were in mapping this data to color-space.