Pictures from space are fascinating, but what am I looking at?

Images of objects in space are often coloured to help astronomers work out what's going on. This post looks at what those colours can tell us.

Over the last 30 years, we have been treated to a revolution in the quality of the images we see of objects in space. Most of this came from the deployment of the Hubble Space Telescope (HST) and from the digital revolution in amateur astronomy. Guests at Star Safari often ask us whether the colours in these images are real and what they mean. These are great questions. To answer them, we first need to understand how the eye works, so we know the difference between what we can see and what the instruments on powerful telescopes can see. Once we know how the eye and other instruments work, we need a little of the physics of what's happening in space. Then we can interpret the images we see.

The human eye

The human eye is an amazing sensor, and our brain is an extremely powerful processor. Put the two together, and it's the equivalent of a super-powerful camera. We have a flexible lens that can adjust focus from very close to very distant. Our eyes are extremely sensitive to light; we can detect as few as a hundred individual photons hitting our retina, yet also cope with the enormous flood of photons entering our eyes on a sunny day. On top of this, our brain processes all of that into an image with incredible dynamic range. And we have two eyes, giving us binocular vision: our brain seamlessly combines the two slightly offset images to give us depth perception.

Making the eye better

Despite our eyes' amazing capabilities, there are some limitations. They wear out as we age, and we're not great at seeing at night, even though our eyes are sensitive. Because we evolved as a daytime species, we never selected for the mutations that would have helped us much at night. This also limits our ability to do astronomy. So, to appreciate the night sky and the richness of the universe, we had to find a way to get more photons into our eyes (or at least one eye). Fortunately for us, Galileo was the smart chap who turned the newly invented telescope to the sky. This changed how we viewed the universe forever.

A diagram of the human eye seen from the side, with the lens and iris on the left and the retina on the right, with oversized rod and cone cells visible (image credit: the author, made in Blender).

The telescope gave us a huge boost in seeing distant objects such as star clusters, galaxies and gas clouds. If you've ever looked through a telescope, you'll know that what we see is nothing like the amazing pictures from HST. What we see with our eyes are grey streaks and blobs. This is because at night and in other low-light situations we use our rod cells, which don't pick up colour. We perceive light at wavelengths from about 380 to 750 nanometres. To see these objects in colour, we need to use a camera and take our eyes out of the loop.

A Little Bit About Light

Light is a bit weird; it travels in little packets of energy called photons, yet it behaves like a wave. The energy of a photon relates to its wavelength: a short wavelength carries more energy than a longer one. Blue light has a wavelength of around 470 nanometres and red light about 660 nanometres, so blue light carries more energy than red light. At wavelengths longer than 750 nanometres, we get into the infrared. You can feel the heat from a bar heater when you stand in front of it; infrared photons transmit the electromagnetic energy from the heater to you. These photons are at wavelengths longer than 750 nanometres, so we can't see them.
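If you like to put numbers on that, here's a quick back-of-the-envelope sketch in Python. It uses the standard relation E = hc/λ with textbook constants; nothing here is specific to this post.

```python
# Photon energy from wavelength: E = h * c / wavelength.
h = 6.626e-34   # Planck's constant, joule-seconds
c = 2.998e8     # speed of light, metres per second

def photon_energy(wavelength_nm):
    """Energy in joules of a photon with the given wavelength in nanometres."""
    return h * c / (wavelength_nm * 1e-9)

blue = photon_energy(470)   # ~4.2e-19 J
red = photon_energy(660)    # ~3.0e-19 J
print(f"blue photon: {blue:.2e} J, red photon: {red:.2e} J")
print(f"blue carries {blue / red:.2f}x the energy of red")  # ~1.40x
```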

If we could see in infrared, the bar heater would glow brightly. So there's light we can see, and there's light we can't see. The light we see is only a tiny fraction of the light travelling around the universe; collectively, it is all called electromagnetic radiation. We can learn a great deal about the universe from the light we can't see. The James Webb Space Telescope, for example, sees only in infrared. Astronomers add colour to its pictures to highlight interesting aspects, by assigning a colour palette to the image. Otherwise, we'd get a black-and-white image.
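As a rough illustration of what assigning a colour palette means in practice, here's a small Python sketch using matplotlib. The brightness map is made up; a real pipeline would load calibrated telescope data instead.

```python
import numpy as np
import matplotlib.pyplot as plt

# A fake "infrared" brightness map: a bright blob on a dim background.
y, x = np.mgrid[-2:2:200j, -2:2:200j]
intensity = np.exp(-(x**2 + y**2))

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.imshow(intensity, cmap="gray")      # the raw data: black and white
ax2.imshow(intensity, cmap="inferno")   # the same data with a colour palette
ax1.set_title("Raw (monochrome)")
ax2.set_title("False colour")
plt.show()
```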

Where does light come from?

The universe has a lot of light, and most of it comes from stars. Deep in the core of stars, nuclear fusion generates enormous numbers of photons, and these illuminate much of the universe we see. The other main light source is when an electron jumps to a lower energy state in an atom. The energy of the emitted photon depends on the starting and ending energy states, and thanks to quantum mechanics, we can calculate it precisely.
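Here's what that calculation looks like for hydrogen, as a small Python sketch using the standard Rydberg formula; this is textbook physics rather than anything specific to this post.

```python
# Rydberg formula: 1/wavelength = R * (1/n_low^2 - 1/n_high^2)
RYDBERG = 1.0968e7  # Rydberg constant for hydrogen, per metre

def transition_wavelength_nm(n_high, n_low):
    """Wavelength (nm) of the photon emitted when an electron in a
    hydrogen atom drops from level n_high down to level n_low."""
    inv_wavelength = RYDBERG * (1 / n_low**2 - 1 / n_high**2)
    return 1e9 / inv_wavelength

# The 3 -> 2 jump produces the red hydrogen-alpha line we'll meet below.
print(f"H-alpha: {transition_wavelength_nm(3, 2):.0f} nm")  # ~656 nm
```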

Getting back to cool space pictures

This is an image of the Trifid Nebula (M20, NGC 6514) (image credit: the author).

The Trifid Nebula in the image above cannot easily be seen with the human eye. You can see it, but it is grey and not very bright. If we used a black-and-white camera, we would not see the differences between the blue and red parts of the image. By using colour, we get a much better understanding of what is going on. For this image, I used a colour camera, which places a mosaic of tiny filters over the sensor so that each group of four pixels has two green, one blue, and one red filter. More powerful setups use a monochrome camera with separate colour filters and combine the resulting images.
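For the curious, here's a tiny Python sketch of how that mosaic of colour filters (a Bayer pattern) is read out in software. The RGGB layout and the toy numbers are assumptions for illustration; real cameras vary in filter order, and real processing interpolates the missing colours at each pixel.

```python
import numpy as np

# A fake 4x4 sensor readout: one brightness value per pixel.
sensor = np.arange(16, dtype=float).reshape(4, 4)

# Each repeating 2x2 block is assumed to be R G / G B (RGGB).
red    = sensor[0::2, 0::2]     # top-left pixel of each block
green1 = sensor[0::2, 1::2]     # top-right
green2 = sensor[1::2, 0::2]     # bottom-left
blue   = sensor[1::2, 1::2]     # bottom-right
green  = (green1 + green2) / 2  # two green samples per block

print("red samples:\n", red)
print("averaged green samples:\n", green)
```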

What do the colours tell us?

The colour in the above image tells us quite a bit about what is happening. The red area is the emission nebula, showing the presence of hot gases. The blue area is the reflection nebula, showing the presence of dust. The red colour comes from hydrogen atoms emitting photons, which happens when an electron moves to a lower energy level in the atom. The wavelength is specific, at about 656 nanometres for this particular energy-level change, called hydrogen-alpha and part of the Balmer series. Something has to raise the electron to the higher energy level first.

There are a few ways this can happen. The atom can absorb high-energy photons, which raises the electron to a higher energy level. Or the atom may have been ionised, meaning its electron was stripped away entirely, and it later captures a free electron; this is called recombination. That electron then cascades down the energy levels, releasing a photon at each step.
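To put rough numbers on that cascade, here's a small Python sketch using the textbook hydrogen level energies, E_n = -13.6 eV / n²; again, standard physics rather than anything from this post.

```python
def level_energy_eV(n):
    """Energy of hydrogen level n, in electronvolts (textbook value)."""
    return -13.6 / n**2

# A captured electron landing on level 3 can cascade 3 -> 2 -> 1,
# emitting one photon per step.
for n_high, n_low in [(3, 2), (2, 1)]:
    photon_eV = level_energy_eV(n_high) - level_energy_eV(n_low)
    print(f"{n_high} -> {n_low}: photon of {photon_eV:.2f} eV")

# 3 -> 2 releases ~1.89 eV (the red hydrogen-alpha photon);
# 2 -> 1 releases ~10.2 eV (an ultraviolet Lyman-alpha photon).
```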

Electrons

Cast your mind back to school, and you probably learned the Rutherford/Bohr model of the atom. This puts the electron(s) in a nice orbit around the atom's nucleus. Since that model was developed, we've learned that things are much more complex. Electrons are not like planets orbiting the Sun; they are more like clouds around the nucleus, and the exact location of an electron is a probability somewhere in that cloud. An electron can absorb energy, and if it absorbs enough, it can be ejected from the atom altogether. That energy can come from photons or from collisions with other atoms and free electrons.

The red gas cloud contains hydrogen atoms, some with electrons and some without. The atoms are energetic, meaning they are moving around quickly. Something is driving this; there has to be a source of energy to speed up the atoms and knock electrons away. In the Trifid Nebula, the hot new stars are the source of this energy. These stars blast out powerful stellar winds and high-energy photons that carry enough energy to strip electrons from the surrounding hydrogen cloud. The electrons recombine with the atoms, and as they drop down the energy levels, they emit the photons that give that characteristic red colour.
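How much energy does a photon need to strip an electron from hydrogen? Here's a quick Python estimate using hydrogen's textbook ionisation energy of 13.6 eV, a standard physics value rather than a figure from this post.

```python
h = 6.626e-34    # Planck's constant, joule-seconds
c = 2.998e8      # speed of light, metres per second
eV = 1.602e-19   # joules per electronvolt

ionisation_energy = 13.6 * eV  # energy needed to free hydrogen's electron
threshold_nm = h * c / ionisation_energy * 1e9
print(f"Only photons shorter than ~{threshold_nm:.0f} nm can ionise hydrogen")
# ~91 nm: far-ultraviolet light, the kind hot young stars pour out.
```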

Why is the dust blue?

Most of the matter in space is hydrogen, followed by helium (together they make up about 98% of the ordinary matter in the Universe). There are also tiny particles of dust, made from silicates and carbon-based molecules. These particles are only tens of nanometres across, meaning they are much smaller than the wavelengths of the light emitted by the nearby stars. Some of the longer wavelengths get absorbed and heat the dust, while others pass straight through. The blue colour comes from the scattering of the shorter wavelengths. This is called Rayleigh scattering, and it's the same reason our sky appears blue.
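To see how strongly scattering favours the shorter wavelengths, here's a quick Python sketch using the standard Rayleigh 1/λ⁴ relation:

```python
def relative_scattering(wavelength_nm, reference_nm=660):
    """How strongly light of a given wavelength Rayleigh-scatters,
    relative to red light at 660 nm (scattering goes as 1/wavelength^4)."""
    return (reference_nm / wavelength_nm) ** 4

print(f"470 nm blue scatters {relative_scattering(470):.1f}x more than red")
# ~3.9x: blue light bounces off the dust grains toward us,
# while most of the red light passes straight through.
```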

This is the same image as above with the saturation decreased to remove the colour (image credit: the author).

With only a black-and-white image, we would not be able to tell the emission and reflection parts of the nebula apart. This is one of the reasons astronomers apply colours to the data captured by the large space telescopes.
