Tuesday, December 30, 2014

Tantalus 2


So I'm claiming that sliced images are the best thing since sliced bread.  Well, if I recall correctly, the image I used as an example in yesterday's post actually has an asteroid streak in it.  (In fact, it has the only decent streak I have yet acquired, but we'll get to that later.)

So -- what effect did this slicing process have on that streak?

Let's zoom in.



The streak is there, but the central part of it is too bright for this image, and has been blacked out.

That's good news!  The brightness of Tantalus when I took that image -- the 'apparent magnitude' -- was 17.62.  I want to be able to see streaks that are much dimmer than that, and this result suggests that dimmer streaks might work very well.

And there's a way we can estimate how dim we might have been able to go with this telescope, on that night.  We can simulate our own streak, in this very image.


The Streak Simulator

I haven't told you about how I do this software yet, and -- I probably won't.  It's not terrifically interesting, compared to pictures of rocks in the sky. 

OK, I'll make it quick.  I write my image processing software in C, on a Fedora 20 system with a GCC compiler.  I write it all from the ground up, using no ancillary image processing libraries.  I like it that way.  Programming the bare metal.

I use the tool-building philosophy of all intelligent programmers: make simple tools that do a single job well, and combine them together with a scripting language.  Bash, actually.  (Until recently I used csh, which means that I got started doing this stuff when the universe was quite a bit less red-shifted than it is now.)

So -- the streak simulator is a program I wrote recently.  You tell it how much total energy you want it to deposit on the image, where the x,y start point is, what the direction of travel is, how far it should go, and how many seconds that should take. 

The program then moves its idea of where the asteroid is in tenth-second increments, at every moment doling out its increment of energy in a randomly-chosen direction, and at a random (normally-distributed) distance from the asteroid's 'true' position.
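For anyone curious what that looks like in code, here is a stripped-down sketch of the idea.  This is not my actual program -- the Image16 struct and the function names are invented just for illustration -- but it shows the tenth-second loop and the normally-distributed scatter:

/* Hypothetical sketch of a streak simulator: spread a total "energy"
 * along a line in tenth-second steps, scattering each step's share at a
 * normally distributed radius from the true position. */
#include <math.h>
#include <stdlib.h>
#include <stdint.h>

typedef struct {
    int w, h;
    uint16_t *pix;          /* 16-bit grayvalues, row-major */
} Image16;

/* one draw from a normal distribution (Box-Muller) */
static double norm_draw(double sigma)
{
    double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);
    double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
    return sigma * sqrt(-2.0 * log(u1)) * cos(2.0 * M_PI * u2);
}

/* total_energy : grayvalues to spread over the whole streak
 * x0, y0       : start point (pixels)
 * dir          : direction of travel (radians)
 * length       : distance travelled (pixels)
 * seconds      : exposure time covered by the streak
 * sigma        : blur radius (pixels), standing in for the seeing      */
void simulate_streak(Image16 *img, double total_energy,
                     double x0, double y0, double dir,
                     double length, double seconds, double sigma)
{
    int steps = (int)(seconds * 10.0);        /* tenth-second increments */
    if (steps < 1) steps = 1;
    double per_step = total_energy / steps;

    for (int i = 0; i < steps; i++) {
        /* 'true' position of the asteroid at this instant */
        double t  = (i + 0.5) / steps;
        double cx = x0 + t * length * cos(dir);
        double cy = y0 + t * length * sin(dir);

        /* dole out this step's energy: random direction, normal radius */
        double ang = 2.0 * M_PI * rand() / (RAND_MAX + 1.0);
        double r   = fabs(norm_draw(sigma));
        int px = (int)(cx + r * cos(ang));
        int py = (int)(cy + r * sin(ang));

        if (px >= 0 && px < img->w && py >= 0 && py < img->h) {
            uint32_t v = img->pix[py * img->w + px] + (uint32_t)per_step;
            img->pix[py * img->w + px] = v > 65535 ? 65535 : (uint16_t)v;
        }
    }
}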

It's not perfect, but it's pretty close.  Here is what it did when I used it to try emulating the real streak that Tantalus made in my image.  (The simulated streak is just to the right of the real streak, and I made the simulated one perfectly vertical.)




That looks pretty good!  Except I had to put 45,000 grayvalues of brightness into it, when the real streak only used 31,000.  Hmm.   And it's still a little scrawnier-looking than the real one.   Hmm.  

So, I don't know how good a model this really is, but just in case it is predictive, here's what it predicts:



If that is really what a mag 19 streak looks like, I think I can detect that like falling off a log.  And this is with a 20" telescope!  Through iTelescope.org, I have access to a 27" that collects nearly twice as much light (light grasp goes as the square of the aperture, and (27/20)^2 is about 1.8).  With that, I might hope for mag 20, or better!

But!  Simulation is one thing.  Ground-truthed (so to speak) data is another.

What I really need now is more images of known rocks, at known brightnesses.


Monday, December 29, 2014

Out of the Wilderness

I've wandered far since my last post, lost in the dark places of the southern sky and lost in the myriad possibilities of the things that can be done, trying to distinguish them from the things that should be done.

I think I understand better now, and hopefully even well enough to explain.



A hunter does not simply load his rifle and go running into the brush waving it about.  A hunter makes a plan before even choosing the weapon, and the first part of the plan must be: What am I looking for?


Well, what I am looking for is very faint streaks -- streaks that are just barely above the background noise.

OK, good.  Now we're getting somewhere.  This clear statement of purpose immediately suggests a question.  What is the background noise?  What does it mean to be 'just barely above it'?


To answer that, let's look at my raw image again.





Let's see.  How will we determine what the background of this image is?  Using my years of training in machine vision, together with an innate talent for noticing facts that are glaringly obvious, I soon determine -- that this image is all background.

All we need to do is look at a histogram of the image, select the most popular value, and we will have found the mean of the background.
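In code, finding that most popular value is about as simple as it sounds.  Here is a hypothetical sketch, reusing the made-up Image16 struct from the simulator sketch above: build a histogram over all 65,536 possible grayvalues and return the bin that holds the most pixels.

#include <string.h>

/* Sketch only: returns the most popular grayvalue in the image,
 * which we take as the mean of the background. */
int background_mode(const Image16 *img)
{
    static long hist[65536];
    memset(hist, 0, sizeof hist);

    long n = (long)img->w * img->h;
    for (long i = 0; i < n; i++)
        hist[img->pix[i]]++;

    int mode = 0;
    for (int v = 1; v < 65536; v++)
        if (hist[v] > hist[mode])
            mode = v;

    return mode;   /* the peak of the background distribution */
}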

So, let's get the histogram.





The red parts are the plotted points, showing how many pixels were found in the image at a given gray value.  That spike to the far left is so large that it makes the rest of the plot look flat.  That's what I mean by an image that is "all background".


Let's zoom in on the part of the histogram that only shows the darker values that are in the background.

Here is a graph of just the darkest 4% or so of possible grayvalues.





Now you can see some detail.  The background, that simply looks black in the image, actually makes what looks like a nice, normal distribution whose mean is just below grayvalue 1000.  Practically all of the pixels in the image are represented by the spike we are seeing here.  This is the distribution of the background.

Let's look a little closer yet.




That's about as nice a normal distribution as you will ever see.  Its mean looks like it's at about 940, and the points at which it falls to half its peak height look to be about 900 on the left and 1000 on the right, which means that the "Full Width at Half Maximum" of this distribution is 100, which in turn means that the standard deviation is about 42.    (The FWHM of a normal curve is always about 2.35 sigma, so sigma is 100 / 2.35.)

But what does this tell us about how we can view the pixels we care about?



What it says is that we can do something beautifully simple.  Make an eight-bit image with its paltry 256 gray values simply centered on the highest point of that histogram.  That will show us pixels out to about 3 standard deviations above and below the mean (128 / 42 is just over 3), which should be plenty.

I call this a 'slice' image, because I am slicing out 256 gray values from the 65,536 possible from the 16-bit original.

Here's how we do it.  Look at the peak of that histogram at 940.  Go 128 below that, to grayvalue 812, and make that our zero point.

Now make an 8-bit image as large as the original.  Go through and subtract 812 from every value in the original.  Any pixel that would be below zero, we just set to 0 in our new image.  And any pixel that would be above 255, we also set to 0.

Because we don't care about those pixels.  They are not close enough to the background to be part of the really faint streaks we want to find.
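Here is what that slicing operation might look like in C -- again a sketch with invented names, building on the made-up Image16 struct from the earlier sketches:

/* Sketch of the slice: subtract the zero point (peak - 128) from every
 * 16-bit pixel and keep only values that land in 0..255; anything
 * outside that window becomes 0 in the 8-bit slice image. */
typedef struct {
    int w, h;
    uint8_t *pix;
} Image8;

void slice_image(const Image16 *in, Image8 *out, int peak)
{
    int zero = peak - 128;              /* e.g. 940 - 128 = 812 */
    long n = (long)in->w * in->h;

    for (long i = 0; i < n; i++) {
        int v = (int)in->pix[i] - zero;
        /* too dark or too bright: not background, set to 0 */
        out->pix[i] = (v < 0 || v > 255) ? 0 : (uint8_t)v;
    }
}

Notice that this is a single pass over the image, with nothing fancier than a subtraction and a range check per pixel -- which is why it costs so little time.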

So what does the result look like?
Let's see.




We just got rid of all the stars.  They have turned black because they are too bright to be in our new 8-bit image.  What you are seeing now is a really nice, smooth picture of just the background, against which we will be able to find our faint streaks much more easily.

Since May, I have been wondering how in the heck I could make the stars go away, and here it is in a simple thresholding operation that took all of 134 milliseconds on a simple processor on my laptop -- and that was for the full-sized original image, which is 3056x3056.  That is a dirt-cheap operation.

This job just got a lot easier.