Code to Semi Global Matching

I’ve received a few emails asking me for the code to my implementation of Semi-Global Matching. So here it is, in the state that I last touched it. This is my icky dev code and I have no plans for maintaining it. In another month I’ll probably forget how it works. Before you look at the code, take to heart that this is not a full implementation of what Hirschmüller described in his papers: I didn’t implement the 16-path integration or the mutual information cost metric.

TestSemiGlobalMatching is my main function. I developed inside a GTest framework while I was writing this code. [source]

The core of the algorithm is inside the following header and source file links, titled SemiGlobalMatching. As you can see, it is all in VW-ese (NASA’s Vision Workbench). Most of the math looks Eigen-like, and there’s a lot of STL in action, but I hope you can still read along. [header] [source]
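For readers who just want the gist of the algorithm rather than the VW-flavored code above, here is a minimal, hypothetical sketch of SGM’s per-path cost aggregation (the dynamic-programming recursion from Hirschmüller’s paper) along a single left-to-right scanline. The function name and the penalty parameters P1 and P2 are my own illustration, not the API of the linked code; a full implementation would run this along multiple path directions and sum the results.

```cpp
#include <algorithm>
#include <vector>

// Sketch of SGM cost aggregation along one left-to-right scanline.
// cost[x][d] is the raw matching cost at pixel x, disparity d.
// P1 penalizes a disparity change of 1; P2 penalizes larger jumps.
std::vector<std::vector<int>> aggregate_left_to_right(
    const std::vector<std::vector<int>>& cost, int P1, int P2) {
  const int W = static_cast<int>(cost.size());
  const int D = static_cast<int>(cost[0].size());
  std::vector<std::vector<int>> L = cost;  // L[0] starts as the raw cost
  for (int x = 1; x < W; ++x) {
    // Minimum aggregated cost over all disparities at the previous pixel.
    int prev_min = *std::min_element(L[x - 1].begin(), L[x - 1].end());
    for (int d = 0; d < D; ++d) {
      int best = L[x - 1][d];  // same disparity: no penalty
      if (d > 0)     best = std::min(best, L[x - 1][d - 1] + P1);
      if (d < D - 1) best = std::min(best, L[x - 1][d + 1] + P1);
      best = std::min(best, prev_min + P2);  // any larger jump
      // Subtracting prev_min keeps L from growing without bound.
      L[x][d] = cost[x][d] + best - prev_min;
    }
  }
  return L;
}
```

In the full algorithm you would aggregate along several such paths (8 in the basic version, 16 in the variant I skipped), sum the per-path results, and take the winner-take-all disparity at each pixel.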

Also, I haven’t forgotten that I promised to write another article on what I thought was a cool correlator idea. I’m still working on that code base so that I can show off cool urban 3D reconstruction using the satellite imagery I have of NASA Ames Research Center and Mountain View. (I think I have Google’s HQ in the shot.)

Update 3/3/2014

Someone emailed me asking what the MOC imagery is that the code keeps referring to. In this case, MOC stands for the Mars Orbiter Camera on board the 1996 Mars Global Surveyor mission. The stereo pair consists of epipolar-rectified images of Hrad Vallis that I use regularly; it represents a classic test case for me when performing satellite stereo correlation. A copy of the input imagery is now available here: [epi-L.tif] [epi-R.tif].

Planetary Data Workshop

Two weeks ago I had the pleasure of going to Flagstaff, AZ for the Planetary Data Workshop. It was a conference for scientists to express their needs and for engineering types to discuss their solutions. As can be guessed from the name, our topics were the dissemination, processing, and tooling for scientific imagery of the planets in our solar system (primarily from NASA’s robotic missions).

I was at the conference to discuss my software product, the Ames Stereo Pipeline (ASP). I gave two talks, and eventually they’ll be available on YouTube. I also gave an hour-long tutorial that is now online. It’s the video above. I’m not sure how interesting it is to watch, but it was a lot of fun performing the tutorial. I now realize how much of a nerd I sound like. I ended up throwing away my prepared HiRISE and LRO-NAC imagery. The crowd that attended seemed more interested in the mass processing of CTX imagery. I, unfortunately, did not have any CTX data on my laptop. Instead, I asked Fred Calef from JPL for a CTX stereo pair that he wanted processed. To my benefit, ASP v2 processed it autonomously without hassle during the demo! My laptop managed to churn through the data in 15 minutes. I spent most of the tutorial just talking about the ancillary files and what users can look into to see if their output will turn out all right.

Shooting from the hip for a tutorial could have bitten me pretty badly. But I think ASP really has improved a lot and is ready for mass-production environments. I’m trying to push for one such environment for Earth polar imagery, but there are many more datasets that could get this same treatment. I think that idea became clear to the 30 guests who attended my tutorial. We’ve had an uptick in downloads, and I hope that means I’ll be seeing some cool 3D models in the future.

Sidenote: I found out that Jay Laura from Penn State has a blog called Spatially Unadjusted. He’s a GIS guy who also uses ASP (aww yeah). He presented a poster on his experience using ASP v1.0.5 with LRO-NAC imagery.