Monday, May 2, 2016

The Take-Away with Jacob Moul

May 1, 2016

Over the past couple of weeks I have concluded all of my testing and am left with a fair amount of data. 

What Does It All Mean??

Running all the algorithm combinations resulted in lots of control networks (files that contain all the match points between the two images). After looking through all these networks, I have determined that one feature detector algorithm finds a lot of points (in the 5000s), while the other two detectors both find significantly fewer (in the 200s). In addition, each detector-extractor combination produced distinct clusters of points in different areas of the Moon's surface. I will have to do more analysis to see whether each combination's effectiveness depended on the match conditions (geometry, lighting, overlap), but at the moment there is no clear pattern.
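To give a flavor of how that "clustering" could be checked with numbers instead of eyeballs, here is a toy sketch (my own, not the USGS tools) using the nearest-neighbor ratio: compare the average distance from each match point to its closest neighbor against what you'd expect if the same number of points were scattered uniformly over the image. Ratios well below 1 suggest clustering. All the names and numbers below are made up for illustration.

```python
import numpy as np

def clustering_ratio(points, width, height):
    """points: (N, 2) array of (x, y) match-point locations on an image."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # Distance from every point to every other point.
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)          # ignore distance-to-self
    observed = dists.min(axis=1).mean()      # mean nearest-neighbor distance
    # Expected nearest-neighbor distance for a uniform random scatter.
    expected = 0.5 / np.sqrt(n / (width * height))
    return observed / expected

rng = np.random.default_rng(0)
spread = rng.uniform(0, 1000, size=(200, 2))   # evenly spread points
clumped = rng.normal(500, 20, size=(200, 2))   # one tight cluster
print(clustering_ratio(spread, 1000, 1000))    # near 1.0
print(clustering_ratio(clumped, 1000, 1000))   # well below 1.0
```

Running a number like this over each control network would turn "the points look clustered" into something you can compare across all the algorithm combinations.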

What these results tell us is that more work needs to be done, both in developing the code and in testing the code to determine the problems. My advisor told me something the other day that I did not realize: all of the algorithms we've been using have been developed for use on images of the Earth's surface, and have just been used on extraterrestrial images out of convenience. Due to the sometimes extreme difference between terrestrial and extraterrestrial surfaces, the algorithms will occasionally fail.

Fortunately, my project has been able to identify some of the issues that these algorithms run into (namely clustering and small numbers of match points). The purpose of my project was to help the USGS improve their image matching software. Because there was no concrete goal attached to that purpose, I would consider my project moderately successful. Even though I don't have the technical knowledge to address the problems myself, just identifying them provides a sort of roadmap for developers at the USGS to follow.

What Did I Learn?

Computers are frustrating. Debugging took up a good third of my project time, and that was a pain in the butt, especially because I only have a basic understanding of programming, which made it extremely difficult to identify and fix the problems. Luckily, my advisor did most of the heavy lifting in this respect...

I also learned that space is super cool. I already knew this to a certain extent, but staring at pictures of the Moon all day really sparked my interest in space. During my project I started reading a book about cosmology (the study of the universe), dark matter, and things like that, and I think it's extremely interesting. Right now I'm declared as a mechanical engineering major, but who knows, maybe I'll end up studying astrophysics instead. OR maybe I'll double major. We'll see what happens.

In all seriousness though, the very basic programming skills I learned during the course of my project will provide a good base to build off of in the future. I got a lot out of my project and I think it definitely has shaped the academic path I want to pursue in college.

To all who read my blog, thank you for keeping up with my project and for being interested!

Jacob Moul

Friday, April 15, 2016

Week 10: Holy Guacamole!

April 15, 2016

Catching Up From Last Week

After my last post, I started choosing manual match points and comparing them to automated match points. It turns out that this task is perhaps the most tedious task known to mankind. I'll spare you all the boring details, but the bottom line is that in the image pairs that actually gave us matches, the computer consistently performed as well as (or better than) a human (me). Here are some animations showing the shifts that occur between images for a few match points.

Computer

Computer
Me
Me

Sometimes the computer performed worse than I did, but that's alright. The "breathing" that you see is due to geometry changes from different camera angles and things like that.


This Week! (holy guacamole)

I am excited! Remember back in Week 3 when I said we got to put it all together? I 100% lied. This week we really put it all together. Let me explain.

The first half of my project (the part that was completed before capstones ended) was all about learning how to program. And I did that decently well. I learned the basics of the Python programming language and learned some cool things about file manipulation and the like.

The second half of my project has been all about helping out the USGS, and I have also done that to a certain degree. I learned how to use some of their software in order to analyze the effectiveness of some of their image matching algorithms while learning some cool and useful skills along the way.

Now when I say we put it all together, I mean we put both halves of my project together. Last week I talked about the problems we were having with the code, especially when it came to communicating between my Mac and the virtual Linux machine running on it. We remedied those problems by abandoning the Mac side altogether and doing everything on the Linux side.

See, before, we were matching images on the Mac side and sending the control networks that were generated to the Linux side through some computer black magic. (A control network, more or less, is just a file that contains all the matched points and where they fall on each image.) Anyway, since moving over to the Linux side, it has been much easier to manipulate the algorithms ourselves, and we have discovered there are about 32 total combinations of feature detectors, extractors, and matchers (for each image pair), sort of like the combination problems where Jenny has to choose between 5 blouses, 3 skirts, and 2 pairs of shoes.

Here's the cool part: (wait for it.........) because there are wayyyy too many combinations to test by hand in a truly organized manner, my advisor helped me WRITE A SCRIPT that RUNS ALL OF THE POSSIBLE ALGORITHM combinations sequentially! Isn't that amazing??????????? I think it's flippin' sweet! If you couldn't tell, I'm pretty excited. Running all of the combinations will definitely take a while, perhaps up to a week or longer, because a single combination can take 30 minutes or more to run. It may take even longer because we keep running into issues with the script we wrote, so we'll see what happens!
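The core idea of a script like that is simple enough to sketch in a few lines of Python. This is a toy version, not the actual script we wrote: the algorithm names are placeholders, and since the post doesn't say exactly how the 32 combinations break down, the 4 × 4 × 2 split below is made up purely for illustration.

```python
import itertools

# Hypothetical algorithm names -- not the real detectors/extractors/matchers.
detectors  = ["detector_a", "detector_b", "detector_c", "detector_d"]
extractors = ["extractor_a", "extractor_b", "extractor_c", "extractor_d"]
matchers   = ["matcher_a", "matcher_b"]

# Every possible (detector, extractor, matcher) triple -- Jenny's outfits.
combos = list(itertools.product(detectors, extractors, matchers))
print(len(combos))  # 4 * 4 * 2 = 32 combinations

for det, ext, match in combos:
    # In the real script, this is where the matcher would be run on an
    # image pair and the resulting control network saved to disk.
    print(f"running {det} + {ext} + {match}")
```

`itertools.product` is the "choose one from each list" trick from the blouses/skirts/shoes problem, which is why the counts just multiply.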

So that is what's going on now and will be what I talk about in more detail in my final presentation/paper. With that said, the next time you'll hear from/see me is the day of my final presentation (May 2), so adiós amigos!

Friday, April 8, 2016

Week 9: Aaaaaaand We're Back!

April 9, 2016


My advisor is back! That means that my project is back to business as usual. Unfortunately, business as usual means a lot of debugging. As my advisor says, that's the problem with being on the "bleeding edge" of technology like this: you're going to run into implementation problems no matter what you do.

The plan for the foreseeable future is the following:
  1. Get all the code I'm using up to date with the code that's being written by developers at the USGS.
  2. Figure out how to use the Linux programs in tandem with the software that I was using before.
  3. Analyze the success of our matches.
Those are the main parts of my project going forward. Step 3 is going to take the most time and will be the most involved, although, based on the developments that have been made this week (namely bugs), Step 2 might take the longest.

Step 3 will itself involve several steps. First, I'll perform an image match. After the match has been completed, I'll look closely at the points that were created and do a blink comparison. This means that I'll pick a couple of points, probably one pair corresponding to each of the images, and then flash each image on the screen back and forth with the match point as a reference to see if there is any shift. A shift in this case would be the whole image moving a couple of pixels to the right or left, or something similar. If a shift occurs, I'll try to determine how many pixels I need to move one of the points, and record what kind of manipulations I had to do.

This process was definitely not what my advisor told me verbatim, but it's close enough to understand the gist of what I'll be doing.
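For the curious, here's a rough numpy sketch of what "checking for a shift" can look like in code. This is my own toy version, not the USGS blink tool: it slides one image over the other within a small window and reports the (dx, dy) offset that lines them up best, by sum of squared differences.

```python
import numpy as np

def find_shift(ref, moved, max_shift=5):
    """Brute-force search for the pixel shift that best aligns two images."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Undo a candidate shift of (dx, dy) and measure the mismatch.
            shifted = np.roll(np.roll(moved, -dy, axis=0), -dx, axis=1)
            m = max_shift
            # Compare only the interior so the wrapped border doesn't count.
            err = ((ref[m:-m, m:-m] - shifted[m:-m, m:-m]) ** 2).sum()
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

rng = np.random.default_rng(1)
ref = rng.uniform(size=(64, 64))                       # fake image
moved = np.roll(np.roll(ref, 2, axis=0), -3, axis=1)   # shift down 2, left 3
print(find_shift(ref, moved))  # recovers (dx, dy) = (-3, 2)
```

In the real workflow the "shift" is judged by eye while blinking between images, but the idea is the same: find how many pixels one image must move to sit on top of the other.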

On a side note, I ran into an interesting problem this week where all the programs I was running taxed my computer completely and killed it; the memory filled completely and everything just stopped working altogether. To fix this, I had to get an external drive to store my virtual Linux on, because I hadn't realized that the virtual Linux is really its own computer with its own memory and all that stuff (silly me!). So basically I had an entire computer on my computer (computer-ception), and that's super cool! I don't know how that escaped me.

Thursday, March 31, 2016

Week 8: Nothing to See Here, Please Disperse

March 31, 2016


Hello! I am sorry to report that absolutely no progress has been made this past week. Fortunately, I am 95% confident that this will be the last lacking week, because my advisor will return soon and everything will pick back up! Fingers crossed for lots of progress.

Friday, March 25, 2016

Week 7: What We Have Here is a Lack of Communication

March 25, 2016

At a Standstill

After my last post, in which little to no progress was made, I am happy to report that little to some progress has been made! The root of all this un-productivity is the fact that my advisor has been out of town for the past two weeks, and I am more or less incapable of doing my project without his advice. This is because I am not one with the ways of the computer, so I really need help figuring things out. In addition, the programs I'm using to carry out my tests and tasks are in no way intuitive, so a teacher who knows the programs is necessary.

Some Forward Motion

I mentioned above that some progress has been made. This progress comes in the form of learning how to use the Virtual Linux. Within the Virtual Linux, I have to use the Terminal application that I talked about in Week 1 in order to do many things.

One of the programs I'm using on the Linux is called qview. Don't ask me what the q stands for... Within this program I can manipulate different pairs of images (if qview displays more than two images, it moves unbearably slowly). Here's an example:


I'm supposed to be able to pick match points between the images by hand using this program, but for now, all I've encountered is frustration. As I said before, it's not exactly intuitive. Until my advisor returns from his travels and is able to rescue me from the land of research project limbo, this will remain the state of my project.

Until next time!

Thursday, March 10, 2016

Week 6: Limbo

March 10, 2016

Hola, amigos! Unfortunately, this week was very unproductive so I don't have much to report. The lack of progress is due to some logistical issues.

Like I talked about last week, I began working on the manual selection of match points this week, but I didn't get very far. This is because in order to do manual matching, I have to use a program that was developed by the USGS on Linux computers and still only works on Linux computers. To get around this, I need to use a "virtual machine" which emulates another computer's operating system on my computer.

Using a Scientific Linux on my Mac (How cool!)

The problem is that the good folks at the USGS are working on a way of accessing outside drives through this virtual box, and because they haven't yet come up with a way to do this, I'm sitting here in limbo until they get that all figured out. In addition, my external advisor has been swamped with meetings and deadlines and such all week so we haven't had a lot of opportunities to meet and discuss the project. Fingers crossed that everything is resolved soon!

Thursday, March 3, 2016

Week 5: More Tests, New Algorithm

March 3, 2016

Woohoo! More testing! If you've been keeping up with my blog, you'll know that last week my advisor told me that we would be using a new method to find match points that would hopefully be more accurate. This method uses the "Fundamental Matrix," which should improve accuracy and make it easier to find errors. I still don't entirely (or at all) understand how the Fundamental Matrix works (thanks, Saxon math...), but that's okay because I just need to understand how to use it in the program.

The Results

Here are some of the results we got when using the Fundamental Matrix method compared to the homography method. As you can see, in a couple different image situations, the fundamental matrix gave more match points. If you look at the corresponding images, you'll notice that in addition to more match points, the fundamental matrix gives a slightly more evenly distributed set of points than the homography.

Image Set                        | Number of Points
---------------------------------|-----------------
Control - Homography             | 624
Control - Fundamental Matrix     | 827
5º Rotation - Homography         | 482
5º Rotation - Fundamental Matrix | 574

Control - Homography

Control - Fundamental Matrix

Rotation - Homography

Rotation - Fundamental Matrix
Based on these results, I think we will begin replacing the homography with the fundamental matrix and epipolar lines.
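For anyone wondering what the fundamental matrix actually does: every correct match (x, x') between the two images should satisfy the epipolar constraint x'ᵀ F x = 0, so F can be estimated from eight or more matches, and matches that badly violate the constraint can be flagged as errors. Below is a from-scratch toy sketch of the classic eight-point algorithm on synthetic data. It's my own illustration, not the code my advisor uses.

```python
import numpy as np

def eight_point(pts1, pts2):
    """Estimate F from matched pixel coordinates; pts1, pts2: (N, 2), N >= 8."""
    def to_homog(p):
        return np.hstack([p, np.ones((len(p), 1))])
    x1, x2 = to_homog(pts1), to_homog(pts2)
    # Each match (x1, x2) contributes one row of the linear system A f = 0,
    # where f is F flattened row-major.
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1)),
    ])
    # The smallest singular vector of A is the least-squares solution for f.
    _, _, vt = np.linalg.svd(A)
    F = vt[-1].reshape(3, 3)
    # Enforce rank 2, which every true fundamental matrix has.
    u, s, vt = np.linalg.svd(F)
    s[2] = 0
    return u @ np.diag(s) @ vt

# Synthetic matches: project random 3D points into two translated views.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(20, 3)) + [0, 0, 5]   # points in front of camera
pts1 = X[:, :2] / X[:, 2:]                         # view 1
X2 = X + [0.5, 0.1, 0]                             # shift scene for view 2
pts2 = X2[:, :2] / X2[:, 2:]

F = eight_point(pts1, pts2)
x1 = np.hstack([pts1, np.ones((20, 1))])
x2 = np.hstack([pts2, np.ones((20, 1))])
residuals = np.abs(np.sum((x2 @ F) * x1, axis=1))  # |x'^T F x| per match
print(residuals.max())  # tiny (numerically ~0) for noise-free matches
```

With real, noisy matches the residuals won't be zero, but a match whose residual is far larger than the rest is a good candidate for a bad point, which is exactly why this method makes errors easier to find.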

Looking Ahead

Next week, the plan is for my advisor to teach me how to use the OpenCV software to choose manual match points. Once I get that down, we will compare the quality of automatically chosen match points to the quality of our manually chosen ones. We'll see how it goes!