Sunday, March 19, 2017

Sun Photos

Testing my new AVX mount. With a level bubble, a compass, and GPS coordinates, I was able to get it aligned accurately enough to capture the sun and keep it in the field of view on the order of minutes.

The setup: a 127mm Maksutov-Cass on a Celestron AVX mount, with a Baader solar filter. The camera is a Canon Rebel T3 DSLR.
When I took the image, I was a little disappointed: there were some defects (which at first I thought were sunspots :-( ). These turned out to be dust motes.
The sun. You can see the defects in the image.
Unfortunately, I didn't take a flat field. However, I did have a few images of the sun that were shifted, since my tracking wasn't perfect. So I masked out the dust motes:

Masking defects

I then chose two images decently shifted apart and re-shifted them. There are sophisticated methods out there for this, but for this application I decided just to shift manually and look at the absolute difference between the first image and the re-shifted second image. When they align, you should see a uniformly noisy sun. (Otherwise, you'll see bright regions where the images fail to overlap.)
Top left and top right are two images where the sun moved in the field of view. If we shift them back by the correct amount and take the difference, we get the nice noisy image in the lower left figure.
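To make this concrete, here is a minimal sketch of that alignment check (assuming Grey1 and Grey2 are the two greyscale frames already loaded as numpy arrays; the trial shift values are just the ones I found for my images):

import numpy as np
import matplotlib.pyplot as plt

# trial shift (rows, columns); tweak by hand until the
# difference image looks like a uniform noisy sun
dy, dx = -28, -51

# shift the second frame back and compare with the first
shifted = np.roll(np.roll(Grey2, dy, axis=0), dx, axis=1)
diff = np.abs(Grey1.astype(float) - shifted.astype(float))

plt.figure()
plt.imshow(diff, cmap="Greys_r")
plt.show()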
After creating a mask and knowing the shift, you can then write some quick code to average the two together, as below:
import numpy as np

def combinephotos(imgs, shifts, mask):
    ''' Shift each masked image back into place and average them. '''
    imgresults = np.zeros_like(imgs[0], dtype=float)
    imgcts = np.zeros_like(imgs[0], dtype=float)

    for i in range(len(imgs)):
        # shift the masked image back into alignment (rows, then columns)
        imgresults += np.roll(np.roll(imgs[i]*mask, shifts[i][0], axis=0),
                              shifts[i][1], axis=1)
        # count how many images contribute to each pixel
        imgcts += np.roll(np.roll(mask, shifts[i][0], axis=0),
                          shifts[i][1], axis=1)

    # average; safe here since every pixel is covered by at least one image
    imgresults /= imgcts

    # parts that average more than one image, add poisson noise
    #w = np.where(imgcts > 1)
    #imgresults[w] = np.random.poisson(imgresults[w])
    return imgresults, imgcts

shifts_array = [
        [0, 0],
        [-28, -51],
        ]

img_final, imgcts = combinephotos([Grey1, Grey2], shifts_array, mask)

There are much more efficient methods, of course, but this approach gives you more control over your data and a better understanding of exactly what you want. For example, I don't like inpainting; this method requires no inpainting or blurring whatsoever. This is the raw data. The only potential issue is a loss of pixel resolution from the error in estimating the shifts. There are ways to get subpixel resolution, but I'll ignore them here because this image is already quite large!
The resultant number of images averaged per pixel. White represents two, black represents one. There are no zero regions.

The final image after combining.

Image saved to a JPEG.

Getting Hands Dirty in Python

I learned Python about a year and a half ago and it's been a huge time saver. I've moved from a language where you had to write a library for everything to a language where a library for everything already exists. Here's something I wrote quickly this morning. It's very easy, and you can do it too. If you're interested in better understanding your astronomy data for educational purposes, this post will help get you started.
Before we begin, you'll need to have installed these libraries:
Python
pylab
rawkit
Pillow
LibRaw (without this library, we couldn't do any of this; thanks to the developers for all their hard work!)
I'm happy to provide more instructions if need be.
I finally bought a mount (Celestron AVX) and I'm beginning to take my own images.



Here is some quick code to do this. Note that some steps are not very 'pythonic'. I'm a strong believer in breaking conventions to better understand data, and worrying about conforming to them when packaging code.
This allows for much quicker analysis and a deeper understanding of the data, which is *much* more important than the code!!! (Coding is easy; making sense of data is the challenge...)

In this code, you'll learn to:
1. Extract the raw data from your image
2. Extract each color from your image (if needed), and make a grey scale image
3. Rescale your image in different ways to try to better emphasize features without bringing out the noise

This code should be fairly straightforward and is meant just as a motivational exercise. There are better ways to do these things, and nice statistics you can use to better understand your data (like histograms). I hope this motivates you in the same way it excites me to have so much control over my data!
''' 
    Astronomy grey image analysis
'''
# This imports functions into global namespace
# useful for quick hacking, NOT recommended for writing programs
from pylab import *
# makes plots appear if in interactive mode (comment out if not)
ion()

# used for saving images
from PIL import Image
# used for reading Raw images
from rawkit.raw import Raw



# choose your filename here
import os.path
filename = os.path.expanduser("~/pictures/17-03-18-custer-whirlpool/IMG_3787.CR2")


#  Grab the data, really easy
im = Raw(filename)
data = np.array(im.raw_image())


# The data is in a Bayer Color Matrix here for Canon, so let's
# just quickly extract each color by choosing the right pixels
# quick way to get from Bayer matrix. Note there are way more
# sophisticated algorithms out there than this method... But this
# yields the rawest untouched data possible
#
# On Rebel T3, pixels are:
# [ R G ]
# [ G B ]
# so index every other row/column from the right start (0 or 1)
# start:end:skip means start at start, end to end and skip every other row
# if "end" not specified, then just go to end of array
# so 0::2 means start at 0, go to the end, and skip every other row/column
# data is indexed as data[row, column] (row goes up/down in image, column goes
# left/right)
R = data[0::2,0::2]
# for green, you average the two together
G = (data[0::2,1::2]+data[1::2,0::2])*.5
B = data[1::2,1::2]

# sum colors to get greyscale (note this isn't 100% correct, you need
# to weight colors appropriately, but this is just messing around first)
Grey = R + G + B
# take logarithm, sometimes this could look better
LGrey = np.log10(Grey)
# try other combinations
Grey_squared = Grey**2

# so now, let's plot the image, plot it first without "vmin" and "vmax"
# and mouse over the plot to see what values you see
# when happy, choose a good "vmin" and "vmax", this worked for this image


set_cmap("Greys_r")
figure(4);clf();
imshow(LGrey[200:900,600:1300],vmin=3.8,vmax=3.9)

figure(5);clf();
imshow(Grey[200:900,600:1300],vmin=10**3.8,vmax=10**3.9)

figure(6);clf();
imshow(Grey_squared[200:900,600:1300], vmin=4e7, vmax=6e7)

# now, if we want to save image, it needs to have its pixel values go from
# 0 to 2**8-1 (255). You don't want a larger range since your eye can't really tell 
# the difference beyond that, roughly

# normalizing function:
def normalize(img, mn, mx):
    ''' normalize to a uint8
        This is also known as byte scaling.
    '''
    dynamic_range = 2**8-1
    # work in float so the division doesn't truncate integer data
    img = (img - mn)/float(mx - mn)
    img *= dynamic_range
    img = np.minimum(np.maximum(img, 0), dynamic_range)
    img = img.astype(np.uint8)
    return img


img_grey = normalize(Grey[200:900, 600:1300], 10**3.8, 10**3.9)
limg_grey = normalize(LGrey[200:900, 600:1300], 3.8, 3.9)
img_grey_squared = normalize(Grey_squared[200:900, 600:1300], 4e7, 6e7)

# Image comes from the PIL library (imported at top)
img = Image.fromarray(img_grey)
limg = Image.fromarray(limg_grey)
img_grey_squared = Image.fromarray(img_grey_squared)
# saving
img.save("../storage/01-whirlpool-oneimage.png")
limg.save("../storage/01-whirlpool-log-oneimage.png")
img_grey_squared.save("../storage/01-whirlpool-squared-oneimage.png")
Finally, you can run the code in ipython (interactive mode):
$ ipython
In [1]: %run mycode.py
or with plain python:
$ python mycode.py
The Whirlpool galaxy. 5 min exposure taken with a Canon Rebel T3 on an Orion Maksutov-Cass 127mm on a Celestron AVX mount, autoguided.
The Whirlpool galaxy, logarithm of the image. 5 min exposure taken with a Canon Rebel T3 on an Orion Maksutov-Cass 127mm on a Celestron AVX mount, autoguided.

The Whirlpool galaxy, square of the image. 5 min exposure taken with a Canon Rebel T3 on an Orion Maksutov-Cass 127mm on a Celestron AVX mount, autoguided.


Right now, all the images look about the same, but there will be cases where different transformations help (this is similar to playing with the linearization curve). We'll mess around with colors later.





Saturday, December 5, 2015

Quick optics 101

Here's a quick fun thing to do with a telescope on a rainy day.

Try this:
- point your telescope at something bright
- remove the eyepiece and place a piece of cardboard where it was. Move the cardboard back and forth and you'll resolve an image!



Scroll over to 30 seconds or so to see the end result.

How does it work? We know from simple optics that if you have an object a distance d0 away, its image will appear a distance d1 away, where:

1/d0 + 1/d1 = 1/f

where f is the focal length of your telescope. Here is a schematic of the setup. The object is labeled "O" and the image "I".

Analyzing my setup, we see that d0 = 2.5 m and f = 700 mm. This means, with some algebra, that we should expect an image at d1 ≈ 0.97 m.
The object is at S1 and the image at S2+S3
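You can sanity-check the algebra with a couple of lines of Python (a quick sketch using the numbers above):

# thin lens equation: 1/d0 + 1/d1 = 1/f
f = 0.7    # focal length in meters (700 mm)
d0 = 2.5   # object distance in meters

d1 = 1.0/(1.0/f - 1.0/d0)
print(d1)  # ~0.97 m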

What happens with images from the sky?


Objects in the sky are so far away that d0 is very large, and as a consequence the 1/d0 term vanishes. This means that:
1/d1 = 1/f, or simply d1 = f

So if you did the same thing with the moon, you could use it to find the focal length of your mirror!



Saturday, November 21, 2015

Using a webcam to image the moon

I'll try to make this as brief as possible, in hopes the images describe the process. I basically did this in one evening (20 min to make the focuser, then 30 min to get the imaging set up; clouds came in, so I had to postpone the rest for now).


The goal of this project is to revive an old webcam and an old telescope to take some images of the night sky. I have many ideas in mind that will stem from this.

Step 1: Find your image


Simple optics will tell you that an image coming from very, very far away will come to focus in a plane at the focal length of the optics assembly.

The location of the image plane is important as when imaging, this is where you want to place your 2D detector (i.e. CCD chip from a webcam).


Here is an example of the process. I removed the eyepiece from my Tasco Newtonian (700mm focal length), pointed it at the moon and stuck a piece of cardboard near where I knew the image plane to be located. At the image plane, you will see a very sharp image of the moon. This is where your CCD chip should be.



The bright light on the piece of cardboard is an image of the moon resolved in plane at the focal length.


Some small detail: you'll also want to double-check that the image coming out of the focuser is well centered. If it's not, adjust your secondary mirror. Note: if you're not sure, don't touch the secondary mirror. Often this piece does not need to be aligned, and if you bring it out of alignment, it can be frustrating to bring back (it has 3 degrees of freedom, and adjusting the screws that control its orientation is a delicate effort).

Step 2: Make the focuser


I haven't received my focuser yet, so I decided to concoct something out of regular household materials. The picture will say it all...


The makeshift focuser. I cut one tube and narrowed its diameter so it can slide inside the other, then added tape for extra friction. It works quite nicely.

Step 3: Mount the webcam, point and measure!





The scope mounted and connected to a PC. I use Cheese (free, open source) for the image capture.




The end result

Since the moon shows just brightness and darkness (grayscale), not color, we can sum the colors together. Here is the image again in grayscale. Unfortunately, it's saturated; I'll have to attenuate the image next time.



Grayscale version of the end result.
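Summing the colors is a one-liner once the frame is loaded into numpy. A minimal sketch (moon.png is a hypothetical filename for the captured frame):

import numpy as np
from PIL import Image

# load the color frame and sum the R, G and B channels
frame = np.array(Image.open("moon.png"), dtype=float)
grey = frame[..., 0] + frame[..., 1] + frame[..., 2]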

After thoughts



Why is the moon pink?


When I use the webcam with its lens, I don't see pinkish colors. When I use it without the lens (for the telescope), I do. The conclusion must be that the lens is also filtering out red light. This also suggests that the webcam's red channel is more sensitive than it should be. I will check this out later.


If you look at the photo of the lens below, you can see that it reflects red.



The webcam on a toilet paper tube.

Time lapse evening log

Finally, here's proof of the non-astro friendly weather tonight :-( :-(
(Cleardarksky.com was much more optimistic than the end result this time)



Time lapse at the Custer Institute. We waited 2 hours; no signs of clearing.


Sunday, July 19, 2015

Weather Data - Part II : The reader

So how do we extract cloud data? One method is simply observation. But why observe when we have satellites to help us?

The data

The data can be found here, although I currently see that the server is down (hopefully this will change soon).

The data format

The data format is sort of summarized in this document here.

The reader

The reader, written in C, can be found here. Beware, however: this code only works on a big-endian machine. For a little-endian machine, make this substitution:
data.Preptr->npix becomes ntohl(data.Preptr->npix)
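The same gotcha applies if you parse the files yourself in Python. A minimal sketch of the idea (the 4-byte integer at the start of a record is a hypothetical example for illustration, not the actual file layout):

import struct

# '>i' forces big-endian (network byte order) interpretation,
# regardless of the endianness of the host machine
with open("ISCCP.DX.0.GOE-7.1991.01.01.0000", "rb") as f:
    (npix,) = struct.unpack(">i", f.read(4))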


The reader (Python version)

At a recent hackathon in Montreal, I took the liberty of rewriting the library in Python, partly to help with the visibility of this data and, well, partly to learn Python.
Here is the source.

I would like to point out that there is also a Java version written by Andre Mas here. We wrote the libraries in parallel just to challenge ourselves and better understand this data at a lower level.

Anyway, so what's the result?
Here is a quick result on the infrared radiance.

Data set ISCCP.DX.0.GOE-7.1991.01.01.0000: the measured infrared radiance.
Why did I choose the radiance? Because infrared is visible during the day or night. (For those who are not familiar with infrared, click this wiki link. It is like light, but at a longer wavelength, radiation we can't see with our eyes, though your camera can see it.) The satellite data also contains data showing what the sky would look like to our eyes (the visible spectrum), but then I would be limited to data taken during the local daytime of each place measured.


So now we have a database that can tell us the cloudiness of a certain region. Will we be able to do anything more with this? We'll see.

Saturday, July 18, 2015

Weather Data - Part I

(This will be a very transparent post, but it will be useful for the next post, which involves extracting the aforementioned probability of success.)

As amateur astronomers, we all share the problem of unwanted "nebulosities" in the sky; yes, I'm talking about clouds. Well, actually, if you live in a truly light-pollution-free environment, then maybe it's an unwanted Bok globule.

Even worse, if you want to introduce and share your passion with others, you will find them quickly dissuaded by your failed attempts to show them the night sky, thanks to a combination of these unwanted objects and Murphy's law.

This is where statistics comes in. First let's introduce the problem and see how it is related to a simple probability distribution.

Binomial Distribution
Let's say you have observed over a few years that in the month of January, it is cloudy 2/3rds of the time. If you organize an astronomical event, you have a 1/3rd (33%) chance of success. 

Rain Date
That's difficult, because most likely you'll fail. How do you improve the odds? Let's say you choose a rain date. Your chance of success is now:
(chance of success 1st day) + (chance of failure 1st day) × (chance of success 2nd day) = 1/3 + (2/3)(1/3) ≈ 56%

You have almost doubled your chances, but not quite. What about a 2nd rain date?
Chance of success = 1/3 + (2/3)(1/3) + (2/3)^2(1/3) ≈ 70%. Your chances increase, but with diminishing returns.


The important point here is that by doubling your dates, you haven't doubled your chances. It's obvious, but we need to formalize it for the next step.
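You can check this arithmetic quickly in Python; the chance of at least one clear day out of n has the closed form 1 - (1-p)^n:

# p: chance of clear skies on a single day
# n: total number of scheduled days (first date plus rain dates)
def chance_success(p, n):
    return 1 - (1 - p)**n

print(chance_success(1/3., 2))  # ~0.56, one rain date
print(chance_success(1/3., 3))  # ~0.70, two rain dates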
Probability of success for n-1 rain dates, where the percentage success for one day is 10% (black), 33% (red), 50% (green), 75% (blue) and 80% (magenta). The horizontal black line is the 99% chance of success.


So let's calculate this for multiple rain dates. The calculation above is done for n-1 rain dates with a single-day probability of success of 1/3 (red curve). The flat black line near the top of the curves marks a 99% chance of success. We see that in a region with a 33% chance of clear skies, we would have to schedule 9 rain dates (a total of 10 days) to beat Murphy's law with roughly a 99% chance of success! Even with 50% clear-sky odds, you need 6 rain dates to achieve this confidence. And you can forget about it if your chance of clear skies is 10%.


What does this all mean? Basically, if you're an amateur astronomy club in a region where the chance of clear skies is 50% or lower, you'll have to think a little carefully before planning an event.

side note
I would like to note that this is an extreme simplification of the problem, and that two things are important:
1. The cloudiness on the dates chosen is assumed independent from date to date. If the dates are dependent, this just worsens your chances.
2. The probability of success p will probably vary month to month. The easiest simplification is to assume the change is not large and base your calculations on the worst of the chances.

The solution
Basically, the solution to this problem (a depressing one) as an astronomy club organizer is to make sure you hold at least the number of events throughout the year that would guarantee one event with a 99% chance of success... Holding 10 rain dates (in Montreal, where we believed the success probability to be at worst 33% for any given month) isn't really popular for any club. It's best just to hold 10 separate events throughout the year.


If you're doing this for your friends, well at least now you have a chance of warning them what they're getting into.

I'll look next at how to estimate this value of the chance of success, which I'll just call P1.

Feel free to look at the code below just to see how easy Yorick is (a nice alternative to Matlab). However, the user base is quite small, so I would recommend using Python if you're just starting. (I use Python for long code I share with others, and Yorick when I want to quickly program something.)


Yorick code:
func binom(p,n){
    /* DOCUMENT Compute the binomial prob for each
     *    element n with prob p
     */
    res = array(0., numberof(n));
    for(j = 1; j <= numberof(n); j++){
        for(i = 1; i <= n(j); i++){
            res(j) += p*(1-p)^(i-1);
        }
    }
    return res;
}
n = indgen(20);
window,0; fma;
plg, binom(.10,n);
plg, binom(.33,n), color="red";
plg, binom(.5,n), color="green";
plg, binom(.75,n), color="blue";
plg, binom(.8,n), color="magenta";
pldj, n(1), .99, n(0), .99;
lbl,, "n", "P(n)";
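Since I recommend Python to beginners, here is a sketch of the same plot in Python with matplotlib (it uses the closed form 1 - (1-p)^n, which is what the geometric series in the Yorick loop sums to):

import numpy as np
import matplotlib.pyplot as plt

# probability of at least one success in n days, each with prob p
def binom(p, n):
    return 1 - (1 - p)**n

n = np.arange(1, 21)
for p, color in [(.10, "k"), (.33, "r"), (.5, "g"), (.75, "b"), (.8, "m")]:
    plt.plot(n, binom(p, n), color)

plt.axhline(.99, color="k")  # the 99% success line
plt.xlabel("n")
plt.ylabel("P(n)")
plt.show()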

Saturday, July 12, 2014

M82 - Image Stacking

Now that I have a working routine to read the images, I can process them. I have written some home image-stacking routines to show how relatively simply this can be done by an amateur.

This post will be long. If you want to see the end result, just scroll to the bottom.
Please click the images to see the actual photo in full quality.

The problem
Astronomers are always starved for light. We want to capture as much light (signal) as possible to help reduce the noise. The first easy solution to capturing as much light as possible is just to leave the camera exposed to the image for hours on end. However, this does not work, for two reasons:
  1. Tracking systems are imperfect, and the image will shift during the course of the measurement. This ends up producing the well-known star-trail effects. I will explain this soon.
  2. Your detector saturates at a maximum value. You can think of each pixel in your camera as a mini detector; now imagine it as a bucket for light. It can only capture so many photons until the bucket starts to overflow. After this point, it makes no sense to fill your bucket further, because you won't be able to count the extra photons you've caught.
Now let's visit each of these caveats before we move on to the solution.

Tracking systems aren't perfect
The first thing all amateur astronomers learn is that tracking systems aren't perfect. If you want pristine tracking, it takes a lot of work (even with the more modern GOTO mounts). Time and resources are not a luxury for hobbyists, and this is a known problem.
Take a look at M82 here. The second image was taken about 1-2 min after the first. I tried my best at perfecting the tracking, and I used a Celestron C8 AVX, which is an equatorial GOTO mount. Still no luck; there's always a shift... You can see it below.

M82 - one image

M82 - another image. The shift relative to the previous image is marked by the arrow.
So we see there is a shift and we'd like to correct for it. It turns out this shift is not that big, approx 1/5th of a pixel per second. Thus, if I expose the camera for 10 seconds, the image will have moved at most 2 pixels or so. This can actually be corrected for quite easily. Before explaining the correction, let's discuss the second problem: image saturation.

Image Saturation
A good example of image saturation (that isn't a star) is one of our planets: Saturn. The planets are some of the brightest objects in the sky. No wonder the word comes from the Greek astēr planētēs, which means "wandering star". They're some of the brightest "stars" in the sky but oh... such rebels!

So what happens if I overexpose the image? See for yourself (units are arbitrary):
 
Overexposed Saturn Image

A cross section of the overexposed image.
Take a look at the image. Near the center of the object, all the values look to be the same color. Even worse, by looking at a cross section, you see that they are the same! This is the result of overexposure. The little "buckets" of light in your camera can only gather so much energy before they spill over. Each little bucket here could only hold so much before it could not report any more signal. So how do you correct for this issue? Easy: read out multiple images before these buckets "spill over," as described next.

How do you correct for this?
Now how would one go about correcting for this? The correction is actually a two-piece process:
  1. Take multiple images of shorter exposure before the light buckets spill over, and before the image shifts appreciably.
  2. Re-shift the images back in position and then average them.
The first piece is simple. I have taken 10-second exposures here. This is enough to avoid both saturation and star trails. The second piece is explained in two steps:
1. The shifting process
2. The averaging process

First Step: The image shifting process
One method is simple: manually shift each image until it overlays on top of the other. But how do you know when you've shifted by just the right amount? Easy: just take the absolute value of the difference of the two images. Here is the absolute difference of the two M82 images shown earlier, zoomed in on just one star:




M82 difference of two images, zoom in on a star.

The shift is now pretty clear. It looks to be 5 pixels in x and 23 pixels in y. Now, let's shift the second image by this amount and take the difference again:


Difference of two images of M82, where the second is shifted by 5 pixels in x and 23 pixels in y.
We can see the image now looks almost zero everywhere (although not perfect). Now, a sensible solution would be to have a program compute the absolute difference of the images for a range of trial shifts and check where there is a minimum. The shift where you find your minimum is the amount by which the image has most likely shifted.

It turns out there is a slightly better and quicker trick using Fourier transforms, but I have run out of time; I will post it later with code and examples. For now, we have a way of determining the shift of the images numerically.
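In the meantime, here is a minimal sketch of the idea (not the final code): the cross-correlation of two images, computed with FFTs, peaks at the relative shift.

import numpy as np

def find_shift(img1, img2):
    ''' Estimate the (row, col) shift to apply to img2 (with np.roll)
        to align it with img1, via FFT cross-correlation. '''
    xcorr = np.fft.ifft2(np.fft.fft2(img1)*np.conj(np.fft.fft2(img2)))
    peak = np.unravel_index(np.argmax(np.abs(xcorr)), xcorr.shape)
    # peaks past the halfway point correspond to negative shifts
    return [p - n if p > n//2 else p for p, n in zip(peak, xcorr.shape)]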


Second step: average them!
So now you have a sequence of images, and you also have a method of shifting them back into place. In other words, you have a sequence of the same image. So what next?
Images will always contain noise, like it or not, which amounts to graininess in your images. There is an easy way to reduce this graininess. It is well known in statistics that to reduce the noise in any measurement, all you need to do is make the same measurement again and again and average the results together. Say you averaged 10 measurements together. If each measurement was truly independent, then your noise (graininess) will be reduced by a factor of 1/sqrt(10), where "sqrt" means "the square root of". If you have N images, then your noise is reduced by 1/sqrt(N). Remember, this only holds for independent measurements.
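You can convince yourself of this with a quick simulation (a toy example, not the M82 data):

import numpy as np

signal = np.ones(1000)                  # a flat, noiseless 'cross section'
N = 20
# N independent noisy measurements of the same signal
measurements = signal + np.random.randn(N, 1000)

print(np.std(measurements[0] - signal))            # ~1 (single frame)
print(np.std(measurements.mean(axis=0) - signal))  # ~1/sqrt(20) ~ 0.22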

Here is an example. I take a cross section of M82 and I plot it below. I then take the average of 20 independent measurements of M82 and plot the cross section in red. The graininess (noise) appears to have been reduced by 1/sqrt(20) or approximately 1/5th. So it works!

 



The Final Result
So there you have it. Now let's see what happens when we apply this to M82 as below:

M82 - One image 800ISO 10 s exposure with Nikon D90, converted to grayscale, byte-scaled between 60 and 100 ADU. Click to see the actual photo!

 And shift+stack it:

M82 - Stacked images at 800ISO 10 s exposure with Nikon D90, converted to grayscale, byte-scaled between 60 and 100 ADU. Click to see the actual photo!
Voila! A much nicer image. Note that this was done with only 20 images, in grayscale, and in an extremely light-polluted area (the McGill Observatory). It would be interesting to compare the same measurements (on the same day, with the same moon phase) both in the heart of light-polluted Montreal and under its darker rural skies.