Hello, I'd like to make a suggestion related to the fact that the nebulae seen in Space Engine are not as close to reality as they could be.

A key element in interpreting what we actually see is color vision. A nebula is composed of various gases, and that composition determines the colors it emits. But while these colors are explicitly visible in Space Engine, that is not the case in reality: photographs are taken at different wavelengths corresponding to various elements, such as oxygen or hydrogen, whereas to the naked eye we can only observe shades of gray. This should be taken into account and implemented as an option.

So, what do you think about it?
First and foremost, it is important to note that SE generally simulates a camera, not the human eye (and a good thing too, as human vision is not sensitive enough to see many of the things it's possible to view in SE).
The human eye can only perceive shades of gray (if anything at all) in nebulae because human photopic vision (color vision) is only sensitive to relatively bright light, while human scotopic vision (grayscale vision) is sensitive to dimmer light. This is why dim stars appear gray to the eye, while bright stars are often visibly red or blue. The gray appearance has nothing to do with the intrinsic emission or reflection spectra of the nebulae; it is purely a limitation of the human eye. Cameras do not have this limitation.
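To make the photopic/scotopic point concrete, here's a rough sketch of how one could desaturate a color toward gray as surface brightness drops. The luminance thresholds and the log-scale blend are illustrative assumptions on my part, not anything SE actually uses:

```python
import math

# Assumed thresholds (cd/m^2) for illustration only: full color vision
# above ~3, pure grayscale below ~0.001, a blend in between (the
# "mesopic" range). Real eye behavior is far more complex.
PHOTOPIC_MIN = 3.0
SCOTOPIC_MAX = 0.001

def perceived_color(rgb, luminance):
    """Blend an RGB color toward its gray value as luminance falls."""
    r, g, b = rgb
    # Rec. 709 luma weights for the grayscale value
    gray = 0.2126 * r + 0.7152 * g + 0.0722 * b
    if luminance >= PHOTOPIC_MIN:
        t = 1.0          # bright enough: full color
    elif luminance <= SCOTOPIC_MAX:
        t = 0.0          # too dim: pure grayscale
    else:
        # interpolate on a log-brightness scale across the mesopic range
        t = (math.log10(luminance) - math.log10(SCOTOPIC_MAX)) / (
            math.log10(PHOTOPIC_MIN) - math.log10(SCOTOPIC_MAX))
    return tuple(gray + t * (c - gray) for c in (r, g, b))

# A bright red source keeps its color; the same color at nebula-like
# surface brightness collapses to gray:
print(perceived_color((1.0, 0.2, 0.2), 10.0))
print(perceived_color((1.0, 0.2, 0.2), 1e-4))
```

This is exactly why an "eye-vision" mode, as suggested above, would mostly show gray: typical nebula surface brightness sits well below the color-vision threshold.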
And yes, while some commonly-circulated images of nebulae have color channels assigned to specific narrow wavelengths, there are plenty of other images, especially from amateur astrophotographers, which show the true color of nebulae via RGB or LRGB images. Nebulae in SE are usually based on these colors.
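To illustrate how the same narrowband data yields very different images depending on channel assignment, here's a sketch contrasting the common "Hubble palette" (SII to red, H-alpha to green, OIII to blue) with an approximate true-color rendering where each emission line contributes the color of its actual wavelength. The per-line sRGB values are rough assumptions, not calibrated numbers:

```python
# Rough sRGB approximations of each emission line at its real wavelength
# (assumed values for illustration):
LINE_RGB = {
    "SII":     (0.45, 0.0, 0.0),   # ~672 nm, deep red
    "H-alpha": (1.0,  0.0, 0.0),   # ~656 nm, red
    "OIII":    (0.0,  1.0, 0.7),   # ~501 nm, blue-green
}

def false_color(sii, h_alpha, oiii):
    """Hubble-palette mapping: one narrowband filter per display channel."""
    return (sii, h_alpha, oiii)

def approx_true_color(sii, h_alpha, oiii):
    """Sum each line's intensity times its real-wavelength color."""
    r = g = b = 0.0
    for inten, name in ((sii, "SII"), (h_alpha, "H-alpha"), (oiii, "OIII")):
        lr, lg, lb = LINE_RGB[name]
        r += inten * lr
        g += inten * lg
        b += inten * lb
    return (min(r, 1.0), min(g, 1.0), min(b, 1.0))

# A hydrogen-dominated emission nebula comes out greenish in the Hubble
# palette but reddish in (approximate) true color:
print(false_color(0.1, 0.9, 0.2))
print(approx_true_color(0.1, 0.9, 0.2))
```

This is why a hydrogen-rich nebula that looks green or gold in a famous Hubble image is actually red in true-color RGB photographs, and why SE basing its colors on RGB/LRGB data gives a more realistic result.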
As an illustration, here's a false color image (top) and true color image (bottom) of the Trifid Nebula: