[A longer (!) version is available in Italian]
This is a bit of a chronicle of a long night spent struggling with my equipment and fighting with a sky that, although very clear, gave me a hard time due to the usual local light pollution and, for once, also to a wonderfully bright Moon.
First, a premise: this very long post is aimed at those who are completely unfamiliar with astrophotography and curious to learn what goes on behind the scenes of these images. If you have already tried your hand at this type of photography, the long rant I am about to write is likely to bore you, as you will take its content for granted.
M101, or the "Pinwheel", is a galaxy 21 million light years away from us, and the simple idea of being able to catch an object so far away in space (and time) with my camera strikes me as somehow dizzying.
It is the same kind of vertigo and wonder I felt the first time I managed to photograph Saturn and its rings with a single shot of my Canon, when I could hardly believe my eyes. It was not so much the idea of being able to see a planet without a telescope, having it almost at hand - after all, we are used to observing Venus and Mars in the sky with the naked eye, and Saturn is much farther away, but all in all why not - it was the fact of being able to distinguish even the rings that struck me as completely unbelievable and unnatural (and it still does today), a kind of magic.
Saturn taken with a DSLR equipped with a 500mm lens
I felt that same wonder even more strongly a second time, when I succeeded in taking a shot of Andromeda: not an object belonging to our solar system, or even to our galaxy, but something infinitely farther away in deep space, about two and a half million light years from us.
Two and a half million light years! What an abyss!
From my natural perspective, it is one thing to find these kinds of images in books or while browsing the internet, knowing they were taken with powerful telescopes, or by sensors aboard satellites sent into space: I take that for granted, it is part of my ordinary experience and knowledge. Maybe I do not fully grasp its deep meaning, but it falls within my ordinary way of perceiving the universe around me: I rely on technology alien to my daily life and accept its mysteries and mechanisms with wonder.
Quite a different story is my DSLR being able to bring the universe into my home - the same camera I have used so far to take pictures of landscapes and of my kids playing in the snow; at most the Moon, which after all is within three days of human travel and shines in the sky at your fingertips.
M101, the Pinwheel galaxy, stands at a distance roughly eight times greater than Andromeda, and my camera's sensor, while catching it, actually makes a journey back in time to the Neogene period, an era in which the continents were not even where they are now.
Just thinking about that drives me crazy.
M101 is one of those objects that often appear in beautiful astrophotos and is coveted by everyone who engages in this activity, not least because it is the typical galaxy of our fantasies: spiral-shaped and seen face-on, perpendicular to the direction of observation, so that we can perfectly admire the arms winding away from its center with their characteristic circular shape.
On the other hand, the huge distance and the apparent low brightness of M101 (which is totally invisible to the naked eye and extremely "small" even when framed by very powerful telephoto lenses) mean it is not at all an object suitable for a normal camera without at least a good amateur telescope. Catching it without the traditional tools astrophotographers use to hunt for deep space objects - finders, "go-to" systems and autoguiders connected to a PC - relying instead only on references observable with the naked eye and some free software, is a challenge that requires some experience and a lot of study, patience and luck, especially if the sky you are dealing with is not perfectly clear and dark.
In short, an excellent test to push the possibilities offered by traditional photographic equipment to the limit and venture into a fascinating journey into space.
Telling how I got my picture of M101 is also a bit of a pretext to describe all the work behind the scenes of images like this, or rather what happens before you start taking the countless photos to be processed on the computer for the final result. It is easy to find dozens of tutorials on the web explaining how to process images of galaxies or nebulae on your PC, but far fewer resources describing the upstream troubles you need to deal with; most of the pages I came across are basically reviews of the equipment used.
The setup for a photo session like this takes, if you are lucky, at least a couple of hours, after which, depending on the time available for shooting, it is good practice to check how things are going every two hours during the night and, if needed (well, it is always needed), adjust the framing - a step which can again mean further time spent in the dark and cold (the best season for astrophotography is winter) arguing with technology.
These steps are usually much faster if you own an expensive motorized equatorial mount and a go-to system for locating targets in the sky, since their purpose is precisely to automate most of the complex preliminary operations that must instead be faced step by step, with patience, by those who, like me, only have a cheaper star tracker available - an object I wrote about in my previous post and which is the bare minimum for anyone approaching this type of photography.
In summary, the setup steps include finding the region of the sky where the DSO to be shot is located (M101 in this case), balancing the photographic equipment and adjusting the polar alignment of the star tracker. All these steps depend on one another and must be repeated several times until the best overall setup is found.
Searching for M101 in the deep space
Of course, as a first step I need to understand where in the sky M101 is located: don't forget that we are looking for something actually invisible both to the naked eye and to the camera sensor, even with ISO pushed to the highest values.
In other words, I need to identify which reference points in the sky - that is, visible stars - I can rely on to delimit the area where to point my lens. An operation that in the clear night of a desert, or from a lodge high in the mountains, would be quick and easy can turn out to be a nightmare in the bright atmosphere of any metropolitan area, even on an evening of perfectly clear skies.
From the terrace of my home, on exceptionally clear nights, I can optimistically count perhaps a hundred stars visible to the naked eye in the midst of a sky that turns a yellowish color tending to dark gray. Most of these stars are located around the Orion constellation, the easiest reference to find; usually I can also distinguish Cassiopeia, the Big Dipper, the pentagon-shaped Auriga constellation and maybe a few more objects.
Polaris, which I need to find for the star tracker alignment, appears as a tiny dot whose very faint light comes and goes: I can find it mostly because I know where to look, especially in the sunset light, and mainly because it sits alone in that area of the sky. With the aid of the little scope supplied with the tracker it is quite easy to locate, provided you point the scope in the right direction.
To understand where M101 is located and turn the camera lens accordingly, I rely on three tools: Stellarium, perhaps the most widespread free software for browsing the sky, distributed as open source for all platforms; SkySafari Pro, a paid app available for both Android and iOS - not particularly cheap, but quite indispensable for orienting yourself by sight in the sky when in the field, at least to find objects like M101; and finally Astrometry.net, a website that lets users upload photographs of the stars and locates and tags them with extreme precision.
To delimit the region of the sky where to look for M101, I prepare my photo session in Stellarium by selecting the galaxy from its database and checking the route it will follow in the sky during the night, so that I can understand where to look and the best time to shoot. In fact, I have to consider that below 35° of elevation the light pollution is too high and tends to "burn" the pictures: therefore, I take note of the night hours when M101 is high enough in the sky that the region where it is located is affected as little as possible by light pollution.
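For the curious, the elevation check that Stellarium performs under the hood can be sketched by hand with the standard altitude formula. This is a minimal Python sketch (not part of my actual workflow), assuming M101's declination of about +54.35° and a hypothetical observer at latitude 45°N:

```python
import math

def altitude_deg(dec_deg, lat_deg, hour_angle_deg):
    """Altitude of an object given its declination, the observer's
    latitude, and the local hour angle (0° = crossing the meridian)."""
    dec, lat, ha = map(math.radians, (dec_deg, lat_deg, hour_angle_deg))
    sin_alt = (math.sin(lat) * math.sin(dec) +
               math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_alt))

# Scan hour angles to see when M101 climbs above 35° of elevation.
for ha in range(0, 181, 30):
    alt = altitude_deg(54.35, 45.0, ha)
    marker = "OK" if alt >= 35 else "too low"
    print(f"HA {ha:3d}°: altitude {alt:5.1f}°  {marker}")
```

At this latitude M101 culminates at about 80° and never drops below roughly 9°, so there is a long window each night during which it sits comfortably above the 35° threshold.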
Stellarium accurately reports the route in the sky of any star based on the observer's GPS coordinates, for any date and time, returning a simulated image of the sky complete (if requested) with light pollution and, for instance, the Moon's halo. Light pollution data are taken from an open source database and used to adapt the rendering accordingly.
In practice, Stellarium can either simulate what an observer should see with the naked eye given their location and light pollution, or show the sky under perfectly clear conditions. Moreover, you can set your camera's sensor size and the focal length used for shooting, allowing the software to simulate the framing and draw it on the sky, so that you have a sharp idea of what to expect on the camera screen: an excellent feature for perfectly centering your photo.
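As a side note, the framing rectangle Stellarium draws follows from simple trigonometry on the sensor size and focal length. A hedged sketch, assuming a typical Canon APS-C sensor of 22.3 × 14.9 mm (check your own camera's specifications):

```python
import math

def field_of_view_deg(sensor_mm: float, focal_mm: float) -> float:
    """Angular field of view along one sensor dimension at a given focal length."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

# Hypothetical APS-C sensor: 22.3 x 14.9 mm.
for f in (150, 600):
    h = field_of_view_deg(22.3, f)
    v = field_of_view_deg(14.9, f)
    print(f"{f}mm: {h:.2f} x {v:.2f} degrees")
```

At 600mm the frame is barely two degrees wide, which is why a pointing error of even half a degree is enough to lose the target entirely.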
In summary, thanks to Stellarium I can get an idea of what I should see on the camera screen when I point my lens at the sky region where M101 is located at any given time.
I start by simulating the largest field of view available with my lens: since I use a 150-600mm zoom lens, I begin at 150mm, hoping that enough naked-eye stars fall within the frame to allow a rough orientation, then I gradually tighten up to 600mm.
Although not useful for orienting with the naked eye, I also study what I should see at 600mm under perfect sky conditions, in order to have a sharp idea of the local star distribution and the most characteristic shapes the stars form in that small region of the sky, since I will need this information later to refine the final shot with another search method.
Stellarium rendering of the framing at 150mm centered on M101 with light pollution
Stellarium rendering of the framing at 600mm centered on M101 with light pollution
Stellarium rendering of the framing at 600mm centered on M101 with clear sky
The first figure shows what I should see on the camera screen when framing at 150mm, that is what in theory I should see with the naked eye while aiming my lens at the region where M101 is located, and it is essential for finding a first orientation in the sky.
The second figure shows what I should see when taking a picture with the lens at 600mm, that is what the camera sensor should be able to catch at the largest magnification, light pollution included, useful as a reference for the final shot.
The third figure is the most optimistic, that is what I should see in a picture taken at 600mm if the sky were perfectly clear: it is useful for a complete knowledge of the context I am going to shoot, since for many reasons significant differences can exist between simulation and reality in what I do and do not see, and I need to know everything potentially visible in the target area.
Notice that in the second and third figures Stellarium shows M101 as if it were actually visible even though, as mentioned, it is not. This is because you can simulate its position in the frame to get an idea of how the final photo is expected to look.
Once I have a sharp idea of the target scenario, as soon as the sun has set and before it gets dark I move outside to set up the equipment.
Polar alignment and balancing
When the object to be photographed sets in the early hours of the evening, the equipment setup and the framing phase are likely to take much longer than the actual shooting session. Luckily M101 is a so-called "circumpolar" DSO, meaning it revolves around Polaris without ever setting, so you have many hours during the night for shooting.
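The circumpolar condition boils down to a one-line comparison: an object never sets when its declination exceeds 90° minus the observer's latitude. A quick sketch, assuming M101's declination of roughly +54.35°:

```python
def is_circumpolar(declination_deg: float, latitude_deg: float) -> bool:
    """An object never sets if its declination is greater than
    90 degrees minus the observer's latitude (northern hemisphere)."""
    return declination_deg > 90.0 - latitude_deg

# M101 sits at declination ~ +54.35 degrees.
print(is_circumpolar(54.35, 45.0))  # circumpolar at latitude 45°N
print(is_circumpolar(54.35, 30.0))  # not circumpolar farther south
```

From mid-northern latitudes (above roughly 36°N) M101 is therefore available all night long, every night of the year.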
Let's start by taking a look at the inventory: a tripod, as sturdy and stable as possible; a heavy star tracker that must be oriented so as to perfectly center Polaris within a targeting reticle visible through a small telescope; a rotating bracket mounted on the star tracker, at one end of which the attached camera must be turned towards the galaxy which, I remind you, moves across the sky; a pair of counterweights mounted at the other end of the bracket, whose purpose is to balance the camera so that the whole system rotates without any friction and without straining the star tracker motor; a secondary bracket to hook the mobile phone to the camera, for easy access to the needed apps; cables and more cables, power banks, remote controls, etc.
Star tracker, revolving bracket and counterweight
I already briefly covered the use of the star tracker in my previous post, so I will not return to it here.
In winter, I also wrap a heating band around the camera lens to prevent condensation and ice, which could badly affect the photos. Furthermore, since a photo session can last several hours, rather than relying on batteries I prefer to connect the star tracker, the camera, the heating band and any other accessories to a 50,000 mAh power bank, to avoid running dry.
There is no ideal sequence of setup steps leading to perfect balancing, polar alignment and framing, and every astrophotographer has his or her own recipe depending on personal experience and the equipment they must deal with. What is certain is that every time you get one piece of equipment set up right, you have just changed the whole arrangement previously achieved, in a continuous loop that can be extremely frustrating, especially when framing a very challenging DSO, as is the case with the M101 galaxy.
In a scenario where your goal is to frame a tiny object invisible to the naked eye with a heavy 600mm telephoto lens, with several kilos of equipment that must be perfectly balanced on a tripod carrying a very delicate motor, the slightest movement of the camera, even a few millimeters, changes the framing, the polar alignment and the balance of the structure. A nightmare, in short.
Generally I start the equipment setup before sunset, for a few reasons - first of all, there is still light and everything is easier.
The second reason is that the polar alignment of the star tracker, which synchronizes its motion with the Earth's rotation, is much easier at dusk, when Polaris is among the first stars to appear in the sky and easily identifiable. If the sky is clear, as darkness advances it gets harder and harder to distinguish it from nearby stars through the small telescope of the star tracker, in whose reticle it must be precisely positioned.
Polar alignment is essential for the correct functioning of the instrument and is one of the key factors behind a beautiful astrophoto. It must be done for each photo session using the small telescope inserted in the instrument, and any subsequent operation involving the tripod - moving the camera, trying shots, adjusting the balance - modifies that precious alignment. Therefore, choosing whether to carry out polar alignment before dealing with camera balancing and orientation, or after, depends on many factors, on experience, and also on your mood.
Usually I do a rough alignment even before Polaris is visible, relying on an app able to locate the stars in the sky and on the fact that Polaris sits in an almost fixed position. So I just need to check the approximate direction to look in from my location and turn the tripod so that the star tracker's telescope points there. Then I can settle the whole structure in place and take care of the other operations, trying to be as gentle as possible when adjusting the remaining components.
Doing so, any subsequent correction of the polar alignment does not require the tripod to be moved. In any case, regardless of when and how many times I check the alignment during the setup operations, I always do one last check before starting the star tracker.
As a following step, I take care of a first camera balancing on the star tracker, extending the lens to its maximum 600mm length and pointing it at the expected location of M101 at the time I will start the photo session, based on Stellarium's calculation, so as to work with a positioning and weight distribution as similar as possible to the final one.
I use the counterweights to balance the camera on the star tracker bracket, allowing it to rotate without forcing the motor. The balancing is critical and must be done as carefully as possible: the whole system must rotate freely without "falling" in any direction, so that the star tracker motor does not jerk while moving the load and, at worst, risk being damaged.
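The underlying physics is the simple lever rule: the torques on the two sides of the bracket pivot must cancel out. A toy calculation with hypothetical figures (my real weights and arm lengths differ):

```python
def counterweight_distance_cm(camera_kg: float, camera_arm_cm: float,
                              counterweight_kg: float) -> float:
    """Distance from the pivot at which the counterweights balance the
    camera, from the torque balance m1 * d1 = m2 * d2."""
    return camera_kg * camera_arm_cm / counterweight_kg

# Hypothetical: 3.2 kg of camera + lens at 20 cm from the pivot,
# balanced by 2 kg of counterweights.
print(counterweight_distance_cm(3.2, 20, 2.0))  # -> 32.0 cm
```

This also makes clear why zooming or shifting the camera body ruins the balance: moving the center of mass changes d1, so the counterweights must slide to a new d2.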
Looking at the following photos, it is clear how much the balance setup can change depending on the camera position and the zoom lens length: the slightest movement of the camera can compromise the balancing, and consequently the counterweights may need to be repositioned. Also, the balance must be checked and adjusted after the final framing of the target has been reached, which will happen later in the evening.
In any case, I prefer to immediately look for an arrangement close to the definitive one, in order to keep the subsequent final tunings as limited as possible.
Basically, my goal is to work step by step with a progressive approach, adjusting each component several times with incremental fine tunings.
Camera, star tracker and counterweights: a complex balance to be solved
As soon as Polaris is visible I adjust the polar alignment of the star tracker, which basically consists in turning the instrument so that Polaris is positioned at a specific point of the reference reticle displayed inside the polar telescope. The position of Polaris in the reticle depends on the date, time and local GPS coordinates, and can be calculated using a mobile app released by the star tracker manufacturer.
Polar alignment is carried out by acting on small screws that adjust the orientation of the tracker along the horizontal and vertical axes, ensuring that the rotation of the tool and the camera is perfectly synchronized with the Earth's rotation, so that the target framing is kept throughout the whole shooting session.
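For the curious, the number the manufacturer's app computes is essentially the local hour angle of Polaris, which fixes where the star must sit on the rotating reticle. Below is a rough Python sketch using a low-precision sidereal time formula (the one in Meeus's Astronomical Algorithms); the mapping of the hour angle onto the reticle dial varies by manufacturer, so treat this purely as an illustration:

```python
import math
from datetime import datetime, timezone

POLARIS_RA_DEG = 37.95  # J2000 right ascension of Polaris (~2h31m49s)

def polaris_clock_position(utc: datetime, east_longitude_deg: float) -> float:
    """Local hour angle of Polaris expressed in hours (0-24), which is
    what polar-scope apps translate into a position on the reticle dial."""
    # Days since the J2000.0 epoch (2000-01-01 12:00 UTC).
    d = (utc - datetime(2000, 1, 1, 12, tzinfo=timezone.utc)).total_seconds() / 86400.0
    gmst_deg = (280.46061837 + 360.98564736629 * d) % 360.0  # Greenwich sidereal time
    lst_deg = (gmst_deg + east_longitude_deg) % 360.0        # local sidereal time
    ha_deg = (lst_deg - POLARIS_RA_DEG) % 360.0              # hour angle of Polaris
    return ha_deg / 15.0

# Hypothetical session: 2023-03-15 21:00 UTC from longitude 9°E.
print(polaris_clock_position(datetime(2023, 3, 15, 21, tzinfo=timezone.utc), 9.0))
```

Note that Polaris is about 0.7° off the true celestial pole, which is exactly why it must be placed at this offset position on the reticle rather than dead center.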
In the following figure you can see the reticle displayed within the polar scope and the corresponding point where I must position Polaris, represented as a yellow cross, as it was at the time I took the app screenshot.
Polar alignment takes time the first few times you deal with it, but you can learn to do it pretty quickly.
Polar scope grid compared with the app reference grid showing the right Polaris positioning
Looking for the framing
As soon as the sky is dark enough and the stars appear, I take care of fine-tuning the framing, starting with the minimum focal length available on the lens, 150mm, which corresponds to the widest possible field of view.
To turn the camera in the right direction I rely on SkySafari Pro, an app available for both iOS and Android that locates the stars simply by pointing the phone's camera towards the sky. To make sure I aim the camera in the same direction the phone is pointing, I hook the phone to the camera body with a special bracket and align the phone's camera parallel to the Canon lens, so that both frame the same region of the sky.
I then select M101 from the SkySafari database and locate the galaxy in the sky by following the indications provided by the app, orienting the phone - and the attached camera - in the right direction.
As in Stellarium, you can set your camera and lens parameters in SkySafari too, allowing the app to simulate the framing in the sky for the focal length in use. I therefore turn the camera until SkySafari shows M101 perfectly centered within a 150mm frame.
In the following figure you can see a SkySafari screenshot simulating the framing for the 150 and 600mm focal lengths, both centered on M101: at this point the camera and the mobile phone are pointing perfectly at the galaxy, at least in theory (for convenience the mobile phone is held horizontally).
Pointing at M101 with SkySafari Pro simulating 150 and 600mm framing
The aiming accuracy is only theoretical, since the alignment of the instruments is never perfect: at best, M101 is probably somewhere inside the frame, but in general it could be merely nearby. Above all, even if M101 does fall within the frame, unless the centering is absolutely precise, bringing the focal length to 600mm will surely push it out, since the field of view shrinks a lot.
This is where the preliminary study done with Stellarium comes to my rescue: I need to understand exactly what the camera is framing and compare it with my expectations, considering that in theory the shot taken following SkySafari's indications should match the Stellarium simulation, i.e. I should see exactly the same stars in the same positions.
I can make a direct evaluation by comparing the stars visible on the camera screen with my Stellarium simulation, but if the region I am targeting is particularly dark or affected by light pollution, I may not find enough visible references (which is the most frequent situation). In this case I need to take a first photo at ISO high enough to catch as many stars as possible.
Usually a shot of a few seconds between ISO800 and ISO1600 is enough, but I must consider the motion of the stars: either I shoot at higher ISO while keeping the exposure time as short as possible, accepting that the photo might be burned by light pollution, or I shoot at lower ISO with a longer exposure and the star tracker activated.
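How short is "as short as possible"? A common rule of thumb, the so-called "500 rule", estimates the longest untracked exposure before stars visibly trail as 500 divided by the effective focal length. A quick sketch, assuming a crop factor of 1.6 for an APS-C Canon body (an assumption on my part, not a figure from the post):

```python
def max_untracked_exposure_s(focal_mm: float, crop_factor: float = 1.0) -> float:
    """Rule-of-thumb ("500 rule") longest exposure, in seconds, before
    stars start trailing in an untracked shot."""
    return 500.0 / (focal_mm * crop_factor)

# Hypothetical APS-C body (crop factor 1.6) with the zoom at both ends:
print(round(max_untracked_exposure_s(150, 1.6), 1))  # -> 2.1 s
print(round(max_untracked_exposure_s(600, 1.6), 1))  # -> 0.5 s
```

At 600mm you get barely half a second without the tracker, which is why any exposure long enough to reveal faint stars requires the tracker to be running.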
By the way, this is a good time to check the polar alignment before switching on the star tracker and fine tuning the framing.
From now on I keep the star tracker running, although moving the camera while the instrument is active may not be a great idea, due to the risk of interfering with the motor.
On the other hand, finding the right framing step by step takes time, many shots and comparisons, while the stars move across the sky and change position even within a few minutes: it is quite challenging to correct the framing based on moving reference points whose positions change while you are still figuring out where the hell you are and where you are expected to move next.
By keeping the star tracker running, the camera keeps pointing at the same stars regardless of the time passing, and every time you come back to the screen you will find the framing unchanged, making the target search much easier.
By comparing the photo just taken with the Stellarium simulation I get an immediate idea of "where I am", or at least I can tell whether and by how much I am centering M101, or whether I am lost elsewhere in space. The truth is that in the absence of clear references, as is the case in the M101 region, it takes a lot of luck to succeed on the first shot.
So, if I am elsewhere, where am I, and how do I get to where I should be? What I know is that thanks to SkySafari I should be close enough to M101: maybe the frame is tilted a few degrees compared to the Stellarium simulation and I cannot recognize the stars; maybe the stars shown by Stellarium are not actually visible, and vice versa the ones I can see in my photo were not taken into account by Stellarium; maybe the GPS position used by SkySafari is not accurate enough. In any case, it happens quite often that localizing the photo just taken fails, even simply because there are more stars than expected and I struggle to find any correspondence between the theoretical model and what I see in the field.
The answer - that is, the precise location of my photo - is given by Astrometry.net, a site that lets you upload photos of the starry sky, maps them with great precision, labels the stars within the photograph, and finally returns the exact localization of the photo in the sky.
Since downloading the photo from the camera and uploading it to the website via PC would take too long, the quicker (and working!) solution is to take a rough shot of the camera's preview screen with my iPhone and upload it directly from the iOS browser. Astrometry.net is powerful enough to categorize the stars even in a horrible "photo of a photo of a screen" taken in the dark with this method. Great!
I take a photo of the camera screen with my iPhone and upload it on Astrometry.net to check where I'm pointing my lens
Photo localization made by Astrometry.net
From now on I proceed step by step: at each step I take a shot, Astrometry.net shows me where I am, and I compare it with the scenario drawn by Stellarium, also working out how my shot is tilted. Then I gently move the camera one step towards the target and take a new shot, moving from star to star, until the stars on my screen gradually line up within the framing and coincide exactly with my expectations.
Once I get the framing perfectly centered at 150mm, I bring the lens to 600mm and repeat the procedure: I take a photo, upload it to Astrometry.net, check the result against Stellarium, and correct the framing, until M101 (which, I repeat for the umpteenth time, is not visible) is perfectly centered based on all the information I have.
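The comparison between where Astrometry.net says I am pointing and where M101 actually sits is just a great-circle distance on the sky. A minimal sketch with a hypothetical plate-solved frame center (M101's coordinates are roughly RA 210.80°, Dec +54.35°; the solved center below is made up for illustration):

```python
import math

def angular_separation_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation between two sky positions (degrees in, degrees out)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(dec1) * math.sin(dec2) +
               math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    # Clamp to guard against floating-point values slightly outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

# Hypothetical frame center returned by the plate solver:
center_ra, center_dec = 211.50, 54.00
offset = angular_separation_deg(center_ra, center_dec, 210.80, 54.35)
print(f"Frame center is {offset:.2f} degrees from M101")  # about 0.54 degrees
```

An offset of half a degree sounds tiny, but against a 600mm frame only about two degrees wide it means the galaxy is dangerously close to the edge, which is exactly the kind of drift that ended my session early.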
Whether I really centered it or not will remain unknown at least until the next day, at the end of the photo session, after the hundreds of photos I took have been processed to bring out the final image.
The following photo was obtained with about four and a half hours of shooting, after which I had to interrupt the session since I had lost the initial framing and M101 was almost out of the frame.
M101, the "Pinwheel" galaxy
[A reader comment, translated from Italian: "A really interesting article. I would like to know how the photos are processed to arrive at a comprehensive final result."]