

Re: Jupiter 01/08/2014 from DEC


 

Hey Stan,

I agree with what you said. But the monitors we use and the images we process in Photoshop (or whichever program) are set up to display pictures weighted for human visual response, roughly 60% green, 30% red and 10% blue. So even if you do narrowband (and I do), the "narrow" nm spectrum is mapped onto the wider visual spectrum: SII to red, H-alpha to green, OIII to blue (the Hubble palette), and displayed in those percentages.
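To make that concrete, here is a minimal Python/numpy sketch of the idea (not from the original post; the weights are just the common BT.601 luminance approximation and the array names are illustrative):

import numpy as np

def hubble_palette(sii, ha, oiii):
    # Stack three narrowband frames (2-D float arrays scaled 0..1) into RGB:
    # SII -> red, H-alpha -> green, OIII -> blue.
    rgb = np.stack([sii, ha, oiii], axis=-1)
    return np.clip(rgb, 0.0, 1.0)

def display_luminance(rgb):
    # Rough perceptual weighting a monitor-oriented RGB workflow assumes
    # (about 30% red, 59% green, 11% blue, per ITU-R BT.601).
    return rgb @ np.array([0.299, 0.587, 0.114])

# Hypothetical tiny frames, just to show the shapes involved.
sii  = np.random.rand(4, 4)
ha   = np.random.rand(4, 4)
oiii = np.random.rand(4, 4)
img = hubble_palette(sii, ha, oiii)
print(img.shape, display_luminance(img).shape)   # (4, 4, 3) (4, 4)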

Deep sky imagers "try" to get the color correct by matching the color balance to a G2V star, like our Sun. Hey, I agree, it is still not very accurate, and I goose stuff all the time, hence why my buddy calls me "Captain Crunch". However, I do believe that most people who do planetary and deep sky RGB and Bayer images are locked into the human visual presentation by the filters that are used (400-700 nm, human vision) and the programs that process them (transmissive color, RGB for human viewing). You can swap channels, insert narrowband, add infrared, but essentially you are making a false color image that is mapped into the normal visual spectrum percentages.
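As a rough illustration of the G2V idea (again just a sketch, not anyone's actual pipeline; the scale factors below are placeholders, since real ones come from photometry of an unsaturated G2V star through your own filters):

import numpy as np

def g2v_balance(r, g, b, weights=(1.12, 1.00, 1.31)):
    # Scale the R, G and B frames so a G2V star's measured fluxes come out
    # equal. The weights here are made-up placeholders for illustration.
    wr, wg, wb = weights
    rgb = np.stack([r * wr, g * wg, b * wb], axis=-1)
    return np.clip(rgb, 0.0, 1.0)

r = np.random.rand(8, 8)
g = np.random.rand(8, 8)
b = np.random.rand(8, 8)
balanced = g2v_balance(r, g, b)
print(balanced.shape)   # (8, 8, 3)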

So for general color correctness, a red filter that captures photons in the red zone of the electromagnetic spectrum, matching the human visual cone response, is considered visually correct for a transmissive color device (your monitor). Same for green and blue. When you add an IR pass to the red channel, or substitute IR for the red filter, you have created a false color image, and it must be labeled in such a manner that the viewer knows he or she is seeing data outside the visual range.
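The IR-for-red swap is the same kind of channel substitution; a hypothetical sketch:

import numpy as np

def ir_false_color(ir, g, b):
    # Put an infrared-pass frame where red would normally go; the result is
    # a false-color composite and should be labeled as such.
    return np.clip(np.stack([ir, g, b], axis=-1), 0.0, 1.0)

ir = np.random.rand(8, 8)   # hypothetical IR-pass frame
g  = np.random.rand(8, 8)
b  = np.random.rand(8, 8)
print(ir_false_color(ir, g, b).shape)   # (8, 8, 3)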

Dan L
