How to improve detail in your nebula astrophotos

Combine RGB and narrowband data to give deep, rich images.


In 2022 my image of NGC 6888, the Crescent Nebula, was shortlisted for the Astronomy Photographer of the Year competition.

In this article I hope to give you an idea of how the image was created.

The first step with any astrophoto is to collect the data.

This stage is crucial because the quality of the starting data determines how good the finished image will be; 45 hours of very high-quality data was collected for this image.

Once collected, I moved onto processing, and the first step here was image calibration and integration.

These steps removed artefacts from the raw data and averaged together a large dataset to remove random noise from the final image.

This was absolutely necessary because the signal in any single capture is very faint.

We had to average together many images to improve it, and this had to be done for each filter used.
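To make those two steps concrete, here is a minimal sketch of calibration and integration in Python, assuming the frames have already been loaded from FITS files and aligned. The filenames and the simple mean stack are illustrative only; dedicated stacking software adds refinements such as outlier rejection and registration.

```python
# A rough sketch of calibration and integration, not the exact pipeline used
# for this image. Filenames are placeholders; frames are assumed pre-aligned.
import numpy as np
from astropy.io import fits

def load_frames(paths):
    """Load a list of FITS files into a single (N, height, width) array."""
    return np.stack([fits.getdata(p).astype(np.float64) for p in paths])

# Build master calibration frames (median combining rejects outliers)
master_dark = np.median(load_frames(["dark_01.fits", "dark_02.fits"]), axis=0)
master_flat = np.median(load_frames(["flat_01.fits", "flat_02.fits"]), axis=0)
master_flat /= master_flat.mean()              # normalise the flat field

# Calibrate every light frame, then average the stack to suppress random noise
lights = load_frames(["ha_light_01.fits", "ha_light_02.fits"])
calibrated = (lights - master_dark) / master_flat
integrated_ha = calibrated.mean(axis=0)        # noise drops roughly as 1/sqrt(N)
```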

In the case of this image, there were five different filters, shown in the image below.

Bray began with 45 hours of data, captured using RGB (left), and Ha and OIII narrowband filters (right)

Next, the five filtered images had to be combined to generate a colour photograph, but this process was not so straightforward because of the two narrowband images in hydrogen-alpha (Ha) and oxygen (OIII).

These were captured with very narrow-spectrum filters, so they couldn’t be reconciled easily with the more standard red, green and blue (RGB) filtered images.

The five images were composed into two, which were then carefully blended.

The first of these was a natural-light RGB image; the second was a ‘HOO’ image.

Here the Ha data occupied the red channel, while the OIII data occupied green and blue.

This created relatively natural tones for this object, as shown in the image below.

The natural-light RGB images combined together (left) and the combined Ha and OIII (HOO) image (right), where Ha data occupies the red channel and OIII data occupies both the green and blue channels

Here you can see the RGB image on the left and the HOO image on the right.

While the colours of the HOO image are not strictly true, you can see they lined up relatively well with the true colours of the RGB image.

This is because hydrogen-alpha emission is truly red, while OIII emission is truly teal.

So, the data from one occupied red and the other occupied green and blue to form teal.

In this way I was able to mimic the natural colours with narrowband filters, which gave the final image much more contrast.
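As a simple illustration of that channel mapping (a sketch, not the exact tool chain used here), composing a HOO image is just an assignment of the two narrowband frames to RGB channels:

```python
# Sketch of the HOO palette: Ha drives red, OIII drives both green and blue.
# ha and oiii are assumed to be 2D arrays already scaled to the 0-1 range.
import numpy as np

def compose_hoo(ha, oiii):
    return np.dstack([ha, oiii, oiii])   # R = Ha, G = OIII, B = OIII

# One way to 'flip' the palette, as suggested in the tips at the end:
def compose_ooh(ha, oiii):
    return np.dstack([oiii, oiii, ha])   # R = OIII, G = OIII, B = Ha
```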

Now, while these images were colour-combined, they didn’t look like this right after capture.

They were ‘linear images’, meaning the data within them was still exactly as the camera recorded it.

I had to brighten the images to reveal what was contained within, but in reality they looked like they do in the image below.

Comparison between the raw unprocessed image (left) – note it’s normal for it to look all black – and the processed data (right) after performing a histogram stretch

The left is the raw image and its histogram. As you can see, it is mostly black.

The histogram shows all the image details are buried within the shadows of the image.

I needed to do a ‘histogram stretch’, which is a maths function that transforms small numbers into big ones.

This took the shadows and made them into midtones, using the ‘Midtones transfer function’, usually configured with three sliders on the histogram in image-processing programs.

By moving these sliders, I expanded and stretched the raw photo to reveal the detail.
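For those who like the maths, the midtones transfer function found in many stretching tools can be written as a simple rational function. The sketch below assumes pixel values normalised to the 0–1 range and uses an illustrative midtones balance of 0.1; it is a generic formulation, not necessarily the exact one used for this image.

```python
# Sketch of a midtones transfer function: x is a pixel value in 0-1,
# m is the midtones balance (values below 0.5 brighten the shadows).
import numpy as np

def mtf(x, m):
    # A pixel equal to m maps to 0.5; 0 stays 0 and 1 stays 1
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

# Example: stretch a faint linear frame (stand-in data used here)
linear = np.random.default_rng(0).random((100, 100)) * 0.02
stretched = mtf(linear, 0.1)
```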

Once this process was completed for both RGB and HOO photographs, I was ready to post-process them for aesthetics.

The principal issue at this stage was combining the RGB and HOO photos together.

The RGB image had desirable natural star colours, while the HOO image had high-contrast nebula details.

To take the best features from each, I needed to get rid of the stars in the HOO image to make room for the RGB stars, as seen in the image below.

The starless HOO image after boosting contrast, prior to reintroducing the RGB stars

This was my favourite part of editing, because the stars obscure a lot of the detail present in the image.

Without them, the nebula is portrayed in its full detail.

With this stage done, I performed some colour and contrast changes to make the image more striking.

I also extracted the stars from the RGB image (where the stars retained good colours) and blended them back in using the ‘Screen’ layer-blending mode in Photoshop.

This gave me the best of both worlds: the high-contrast details and the natural star colours, all in one finished image, which you can see at the very top of this article.
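For anyone curious what the ‘Screen’ blend actually does to the pixel values, it is a simple formula. Here is a minimal sketch, assuming the starless image and the star-only image are aligned RGB arrays in the 0–1 range; the variable names are hypothetical.

```python
# Sketch of the 'Screen' blend mode: bright pixels (the stars) add light,
# while dark pixels leave the underlying image unchanged.
import numpy as np

def screen_blend(base, overlay):
    return 1.0 - (1.0 - base) * (1.0 - overlay)

# e.g. final = screen_blend(starless_hoo, rgb_stars_only)
```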

3 quick tips

  1. If you skimp on image calibration, you will pay for it in post-processing.
  2. Your raw images must be stretched to see the data within; it is normal if they look all black.
  3. Narrowband colours are not ‘true’ and HOO isn’t the only colour palette. You can get creative and try flipping it to OOH!

Are you an astrophotographer? Don't forget to send us your images.

This guide originally appeared in the July 2023 issue of BBC Sky at Night Magazine.
