Second Thought on LRO-NAC NoProj

Mark Rosiek from USGS Astrogeology expressed some doubt about my noproj recipe for LRO-NAC. This is completely reasonable, because if everything were perfect there would be no offset between the LE and RE CCDs after the noproj step. He requested 2 DEM samples of Tsiolkovsky Crater so he could compare them against USGS work. I decided I'd try the same during free time throughout the day. Unfortunately I couldn't find a trusted reference DEM to compare against. I can't find ASU's result for this area on their RDR webpage, and it is impossible to use the LMMP Portal. Can you even download the actual elevation values from LMMP? No, I don't want your color render or grayscale!

The next best idea is to just difference the two DEMs I created and compare them. I didn't bundle adjust these cameras at all, so their placement relative to each other is off by some ~150 meters. After ICPing them together and then differencing them, I can produce an error map. The gradient, or shape, of the error helps clue me in to how badly my fiddling with the ideal camera went. Below is the result between the stereo pairs M143778723-M143785506 and M167363261-M167370048.
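For the record, here is roughly what the differencing step looks like once the two DEMs have been ICP-aligned and resampled onto the same grid. This is only a minimal sketch using the GDAL Python bindings; the file names are placeholders and the alignment itself isn't shown.

```python
# Sketch: difference two aligned, co-registered DEMs and report error statistics.
# Assumes both DEMs already share the same grid and projection after ICP;
# the file names below are placeholders.
import numpy as np
from osgeo import gdal

dem_a = gdal.Open("pair_M143778723-M143785506-DEM.tif")
dem_b = gdal.Open("pair_M167363261-M167370048-DEM.tif")

a = dem_a.GetRasterBand(1).ReadAsArray().astype(np.float64)
b = dem_b.GetRasterBand(1).ReadAsArray().astype(np.float64)

nodata = dem_a.GetRasterBand(1).GetNoDataValue()
valid = np.ones(a.shape, dtype=bool)
if nodata is not None:
    valid &= (a != nodata) & (b != nodata)

diff = a - b
print("mean error: %.2f m" % diff[valid].mean())
print("peak error: %.2f m" % np.abs(diff[valid]).max())

# Write the error map so the CCD-boundary signature is easy to see.
driver = gdal.GetDriverByName("GTiff")
out = driver.Create("error_map.tif", dem_a.RasterXSize, dem_a.RasterYSize,
                    1, gdal.GDT_Float32)
out.SetGeoTransform(dem_a.GetGeoTransform())
out.SetProjection(dem_a.GetProjection())
out.GetRasterBand(1).WriteArray(diff.astype(np.float32))
out = None  # flush to disk
```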

You can definitely see the CCD boundary in this error map, and you can see it in the hillshade as well. It's about a 1 meter jump when looking at a single DEM. The error map seems to agree with this and shows a peak error of 4 meters at the CCD boundary location.

So what is the cause of these errors? Well, there are two sources: (1) the projection into the ideal camera model and (2) bad spacecraft ephemeris. I'd argue that most of the difference between these two DEMs comes from the spacecraft ephemeris, and that's top priority in my book to correct. However, when I look at the disparity map for these stereo pairs, there is a definite jump at the CCD boundary. The cause of that would be an improper model of the angle between the LE and RE cameras in ISIS. This should be expected, since the change in camera angles with respect to temperature is currently not modeled. Also, when looking at the vertical disparity, it seems one side is always brighter than the other. This suggests that I didn't quite have the correct input values for handmos.

Trying to fix all of this now might be a mistake on my part. I know that ASU will eventually produce new SPICE data that contains corrections from LOLA and a USGS study. I also imagine that the work of E. J. Speyerer et al. will eventually be implemented in ISIS.

I pretend to be a part of HET SPHERES

I do other things besides develop Ames Stereo Pipeline. I actually have to this month, because my projects' budgets are being used to pay for other developers. This is a good thing, because it gets dug-in developers like me out of the way for a while so that new ideas can come in. One of the projects I occasionally get to help out on is HET SPHERES.

This is a picture of that robot. The orange thingy is the SPHERE robot designed by MIT. The blue puck is an air carriage so we can do frictionless testing in 1G. There have been 3 SPHERES robots onboard the ISS for quite some time now, and they've been hugely successful. However, we wanted an upgrade of the processing power available on the SPHERES. We also wanted better wireless networking, cameras, additional sensors, and a display for interacting with the astronauts. While our manager, lord, and savior Mark listed off all these requirements, we attentively played Angry Birds. That's when it suddenly became clear that everything we ever wanted was already available in our palms. We'll use cellphones! So, though crude, we glued a cellphone to the SPHERE and called it a day.

Actually, a lot more work happened than that, and you can hear about it in Mark's Google Tech Talk. I also wasn't involved in any of that work; I tend to do other stuff that is SPHERES-development related. But I spent all of last week essentially auditing the console-side code and the internal SPHERE GSP code. I remembered why I don't like Java and Eclipse. (I have to type slower so Eclipse will autocomplete. :/) This all culminated in the following video of a test of having the SPHERE fly around a stuffed robot. We ran out of CO2 and our PD gains for orientation control are still out of whack, but it worked!

500 hp in a Miata


My friend has a cooler car than I do. This Miata has been fitted with an LS3 V8 engine, a transmission from a Viper, a CTS-V's differential, custom suspension, and the required reinforcement of the car's frame. He can go faster than you.

OpenJPEG might be usable

You know about JPEG 2000, right? Wavelet compression, top-notch work of the '90s, an image compression format that promises better results than JPEG and can be lossless for some pixel types. Well, it totally exists, and commercial software uses it quite a bit. It has taken quite a hold of the satellite imaging sector, as it allows image compression to 1/6th the size of a traditional TIFF. Unfortunately there don't seem to be any good open-source libraries available for everyone else. There's Jasper, OpenJPEG, and CQJ2K, but they have always been an order of magnitude or more slower than the commercial product Kakadu.

OpenJPEG had an official 2.0.0 release on December 1st of last year, and it is actually worth a glance. Unfortunately the current release of GDAL, version 1.9.2, doesn't support this new release; it was designed against a prototype of OpenJPEG found at revision 2230 of OpenJPEG's SVN repo. If you are willing, though, the new OpenJPEG v2 release contains the executables opj_decompress and opj_compress for converting JP2 files to and from TIFF, PNG, and JPEG formats. Another alternative is downloading the current development version of GDAL 1.10, which has support for the new OpenJPEG v2 library and can leverage it to read things like NTF. I performed some rough, unscientific tests of these configurations this weekend, and my notes are below. My conclusion is that OpenJPEG 2.0.0 is decently nice, and I can't wait for the next release of GDAL so that I can roll it into Ames Stereo Pipeline.

Conversion times for 400 MB JP2 to 2 GB tiled TIFF

Command                                                   | Time                | Peak Memory
OJP r2230's j2k_to_image                                  | greater than 2 days | ~1 MB
GDAL 1.9.2's gdal_translate w/ OJP r2230                  | greater than 2 days | ~10 MB
OJP v2's opj_decompress                                   | 4 min               | ~2 GB
GDAL 1.10's gdal_translate w/ OJP v2, GDAL_CACHEMAX=512   | 5 min               | ~600 MB
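For anyone who would rather drive the GDAL 1.10 conversion from Python instead of the gdal_translate command line, here is a minimal sketch using the GDAL Python bindings. It assumes a GDAL build compiled against the OpenJPEG v2 library, and the file names are placeholders.

```python
# Sketch: the Python-bindings equivalent of the gdal_translate test above.
# Assumes GDAL was built with the OpenJPEG v2 driver; file names are placeholders.
from osgeo import gdal

gdal.SetCacheMax(512 * 1024 * 1024)  # roughly equivalent to GDAL_CACHEMAX=512

src = gdal.Open("input.jp2")
driver = gdal.GetDriverByName("GTiff")
# Write a tiled TIFF; allow BigTIFF in case the decompressed image gets large.
dst = driver.CreateCopy("output.tif", src,
                        options=["TILED=YES", "BIGTIFF=IF_SAFER"])
dst = None  # flush and close
```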

Computer Vision and Cellphone

I’m applying to graduate school again. To help advertise myself, I decided to take a stab at reproducing a student’s work from one of the labs I want to join. Here’s a short video demonstrating the work:

The original author's thesis was about the application of a new style of Kalman filter. The computer vision measurements were just inputs to get his research going. Thus, this fiducial work is extremely simple and fun to implement. The only novel bit was that everything ran on a cellphone and I didn't use any Matlab.

This entire project works by simply segmenting the circles from the image via a threshold operation. Then two passes of connected-blob searches are applied, one for white and one for black. The code looks for blobs with aligned centers; if there are 4 in the scene, it calls it a success. Yes, there are many ways to break these assumptions.
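Here is a rough sketch of that detection step in Python with OpenCV, rather than the Android/Java code I actually wrote. The function names, thresholds, and blob-matching tolerance are my own hypothetical choices, not the original code's.

```python
# Sketch of the fiducial detection step: threshold, find white and black blobs,
# and keep pairs whose centers coincide (concentric circles).
import cv2
import numpy as np

def blob_centers(binary):
    """Centroids and areas of connected blobs in a binary image."""
    # [-2] picks the contour list regardless of OpenCV's 2.x/3.x/4.x return order.
    contours = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    blobs = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] < 50:  # ignore tiny specks
            continue
        blobs.append(((m["m10"] / m["m00"], m["m01"] / m["m00"]), m["m00"]))
    return blobs

def find_fiducials(gray, tol=3.0):
    _, white = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY)
    black = cv2.bitwise_not(white)
    matches = []
    for (wc, wa) in blob_centers(white):
        for (bc, ba) in blob_centers(black):
            if np.hypot(wc[0] - bc[0], wc[1] - bc[1]) < tol:
                matches.append((wc, ba / wa))  # center, black/white area ratio
    return matches  # call it a success if exactly 4 fiducials come back

img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
fiducials = find_fiducials(img)
print("found %d candidate fiducials" % len(fiducials))
```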

After locating the 4 circles, the code identifies them by their ratio of black to white area; the boldest circle is defined as the origin. The identified image locations of these circles, their measured locations on the paper, and the focal length of the camera are then fed to an implementation of Haralick's iterative solution for exterior orientation. That algorithm solves for the translation and rotation of the camera by reducing projection error through the camera matrix.
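If you don't want to write out Haralick's iterative solution yourself, OpenCV's solvePnP gets you something similar in spirit, since its default method also iteratively minimizes reprojection error. Below is a sketch of that stand-in; the fiducial layout, pixel measurements, and focal length are made-up example values, not numbers from my project.

```python
# Sketch of the pose step: given the 4 identified fiducial centers, their known
# positions on the paper, and the camera focal length, recover the camera
# rotation and translation. OpenCV's solvePnP is used here as a stand-in for
# the Haralick exterior-orientation solver described in the post.
import cv2
import numpy as np

# Known fiducial positions on the paper, in meters (hypothetical layout).
object_points = np.array([[0.00, 0.00, 0.0],
                          [0.20, 0.00, 0.0],
                          [0.20, 0.15, 0.0],
                          [0.00, 0.15, 0.0]], dtype=np.float64)

# Measured pixel centers of the same fiducials, ordered to match object_points
# (example values).
image_points = np.array([[312.4, 240.1],
                         [498.7, 251.3],
                         [489.2, 398.8],
                         [305.0, 391.5]], dtype=np.float64)

# Pinhole intrinsics: focal length in pixels, principal point at image center.
f, cx, cy = 700.0, 320.0, 240.0
K = np.array([[f, 0, cx],
              [0, f, cy],
              [0, 0, 1]], dtype=np.float64)

# Default solvePnP method iteratively minimizes reprojection error.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
R, _ = cv2.Rodrigues(rvec)  # rotation of the target expressed in the camera frame
print("camera translation:\n", tvec.ravel())
```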

Code for the blob detection and exterior-orientation solving is available here. The Java user interface is simply a straight rip from the demos that come with Android OpenCV, so it shouldn't take long to reproduce my work.

To see what inspired me, please investigate the following papers:

[1] B. E. Tweddle, “Relative Computer Vision Based Navigation for Small Inspection Spacecraft,” presented at the AIAA Guidance, Navigation and Control Conference and Exhibition, 2011.
[2] B. E. Tweddle, “Computer Vision Based Proximity Operations for Spacecraft Relative Navigation,” Master of Science Thesis, Massachusetts Institute of Technology, 2010.