I pretend to be a part of HET SPHERES

I do other things besides develop Ames Stereo Pipeline. I actually have to this month because my projects' budgets are being used to pay for other developers. This is a good thing because it gets dug-in developers like me out of the way for a while so that new ideas can come in. One of the projects I occasionally get to help out on is HET SPHERES.

This is a picture of that robot. The orange thingy is the SPHERE robot designed by MIT. The blue puck is an air carriage so we can do frictionless testing in 1G. There have been 3 SPHERES robots onboard the ISS for quite some time now, and they've been hugely successful. However, we wanted to upgrade the processing power available on the SPHERES. We also wanted better wireless networking, cameras, additional sensors, and a display to interact with the astronauts. While our manager, lord, and savior Mark listed off all these requirements, we attentively played Angry Birds. That's when it suddenly became clear that all we ever wanted was already available in our palms. We'll use cellphones! So, though crude, we glued a cellphone to the SPHERE and called it a day.

Actually, a lot more work happened than that, and you can hear about it in Mark's Google Tech Talk. I also wasn't involved in any of that work; I tend to do other SPHERES-related development. But I spent all last week essentially auditing the console-side code and the internal SPHERE GSP code. I remembered why I don't like Java and Eclipse. (I have to type slower so Eclipse will autocomplete. :/) This all culminated in the following video of a test of having the SPHERE fly around a stuffed robot. We ran out of CO2 and our PD gains for orientation control are still out of whack, but it worked!

Computer Vision and Cellphone

I’m applying to graduate school again. To help advertise myself, I decided to take a stab at reproducing a student’s work from one of the labs I want to join. Here’s a short video demonstrating the work:

The original author's thesis was about the application of a new style of Kalman filter; the computer vision measurements were just inputs to get his research going. Thus, this fiducial work is extremely simple and fun to implement. The only novel bit was that everything was working on a cellphone and I didn't use any MATLAB.

This entire project works by simply segmenting the circles from the image via a threshold operation. Then two passes of connected-blob searches are applied, one for white and one for black. The code looks for blobs with aligned centers. If there are 4 in the scene, it calls it a success. Yes, there are many ways to break these assumptions.
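For the curious, here is a minimal sketch of that segmentation step in C++ against the OpenCV 2.x API. This is not the exact code from my app; the threshold, the blob-size cutoff, and the pixel tolerance are placeholder values I made up.

#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <vector>

// Return the centroids of blobs in a binary mask via contours + moments.
static std::vector<cv::Point2f> blob_centers(const cv::Mat& mask) {
  std::vector<std::vector<cv::Point> > contours;
  cv::findContours(mask.clone(), contours, CV_RETR_EXTERNAL,
                   CV_CHAIN_APPROX_SIMPLE);
  std::vector<cv::Point2f> centers;
  for (size_t i = 0; i < contours.size(); i++) {
    cv::Moments m = cv::moments(contours[i]);
    if (m.m00 > 50)  // drop tiny speckle blobs (placeholder cutoff)
      centers.push_back(cv::Point2f(float(m.m10 / m.m00),
                                    float(m.m01 / m.m00)));
  }
  return centers;
}

int main(int argc, char** argv) {
  cv::Mat gray = cv::imread(argv[1], 0);  // load as grayscale

  // One threshold pass for white blobs, one inverted pass for black.
  cv::Mat white_mask, black_mask;
  cv::threshold(gray, white_mask, 128, 255, CV_THRESH_BINARY);
  cv::threshold(gray, black_mask, 128, 255, CV_THRESH_BINARY_INV);
  std::vector<cv::Point2f> white = blob_centers(white_mask);
  std::vector<cv::Point2f> black = blob_centers(black_mask);

  // A fiducial candidate is a black blob whose center lines up with a
  // white blob's center (concentric rings share a centroid).
  std::vector<cv::Point2f> fiducials;
  for (size_t i = 0; i < black.size(); i++)
    for (size_t j = 0; j < white.size(); j++) {
      float dx = black[i].x - white[j].x, dy = black[i].y - white[j].y;
      if (dx * dx + dy * dy < 4.0f)  // centers within ~2 px
        fiducials.push_back(black[i]);
    }

  // Call it a success only when exactly 4 targets are found.
  return fiducials.size() == 4 ? 0 : 1;
}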

After locating the 4 circles, it identifies them by their ratio of black to white area. The boldest circle is defined as the origin. The identified locations of these circles, their measured locations on the paper, and the focal length of the camera are then fed to an implementation of Haralick's iterative solution for exterior orientation. That algorithm solves for the translation and rotation of the camera by reducing projection error through the camera matrix.
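Haralick's method itself isn't something you'll find in OpenCV, but cv::solvePnP in its default iterative mode solves the same exterior-orientation problem by minimizing reprojection error, so a rough stand-in for this step looks like the sketch below. The fiducial coordinates, focal length, and principal point are all placeholder numbers.

#include <opencv2/core/core.hpp>
#include <opencv2/calib3d/calib3d.hpp>
#include <vector>

int main() {
  // Measured fiducial positions on the paper, in meters, with the
  // boldest circle as the origin (placeholder values).
  std::vector<cv::Point3f> object_pts;
  object_pts.push_back(cv::Point3f(0.0f, 0.0f, 0.0f));
  object_pts.push_back(cv::Point3f(0.1f, 0.0f, 0.0f));
  object_pts.push_back(cv::Point3f(0.1f, 0.1f, 0.0f));
  object_pts.push_back(cv::Point3f(0.0f, 0.1f, 0.0f));

  // Blob centers found in the image, in pixels, in the same order.
  std::vector<cv::Point2f> image_pts;
  image_pts.push_back(cv::Point2f(320.0f, 240.0f));
  image_pts.push_back(cv::Point2f(410.0f, 238.0f));
  image_pts.push_back(cv::Point2f(412.0f, 330.0f));
  image_pts.push_back(cv::Point2f(322.0f, 332.0f));

  // Camera matrix built from the phone camera's focal length
  // (in pixels) and principal point; distortion ignored for brevity.
  cv::Mat K = (cv::Mat_<double>(3, 3) << 700, 0, 320,
                                           0, 700, 240,
                                           0,   0,   1);

  // Iteratively solve for the rotation and translation that minimize
  // reprojection error through the camera matrix.
  cv::Mat rvec, tvec;
  cv::solvePnP(object_pts, image_pts, K, cv::Mat(), rvec, tvec);
  return 0;
}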

Code for the blob detection and exterior solving is available here. The Java user interface is simply a straight rip from the demos that come with Android OpenCV, so it shouldn't take long to reproduce my work.

To see what inspired me, please investigate the following papers:

[1] B. E. Tweddle, “Relative Computer Vision Based Navigation for Small Inspection Spacecraft,” presented at the AIAA Guidance, Navigation and Control Conference and Exhibition, 2011.
[2] B. E. Tweddle, “Computer Vision Based Proximity Operations for Spacecraft Relative Navigation,” Master of Science Thesis, Massachusetts Institute of Technology, 2010.

Winging a DEM for a mission using World View 1

The group I work for at NASA has a big robot that likes to drive in a quarry at speed. Doing this is risky, as we could easily put the robot in a position to hurt itself or hurt others. One of the things we do to mitigate the risk is to have a prior DEM of the test area. The path-planning software can then use the DEM to determine where the robot is and what terrain is too difficult to cross.

Since ASP recently gained the ability to process Digital Globe and GeoEye imagery (more about that in a later post), I was asked to make a DEM from some World View 1 imagery they purchased. The location was Basalt Hills, a quarry at the south end of the San Luis Reservoir. To process this imagery with any speed, the imagery needs to be map-projected onto some prior DEM. My choices were SRTM or NED, and in my runs both DEMs had problems. SRTM has holes in the middle of it that needed to be filled for ASP to work correctly. NED had linear jumps in it that ASP couldn't entirely reverse in its math.

I ended up using SRTM as a seed to create my final DEM of the quarry. If you haven't seen this before, the process looks like the commands below in ASP 2.0+. What's happening is that ASP uses an RPC map projection to overlay the imagery on SRTM. When it comes time for ASP to triangulate, it reverses the math it used to map-project, and then, in the case of Digital Globe, it triangulates using the full camera model. Another thing worth noting is that ASP needs control over how the interpolation is performed when doing the RPC map projection. This forces us to avoid the GDAL utilities during this step and instead use our own custom utility.

parallel rpc_mapproject --tr 0.5 \
      --t_srs'"+proj=utm +zone=10 +datum=WGS84 +units=m +no_defs"' \
      filled_srtm_dem.tif {} {.}.XML {.}.srtm.crop.tif ::: left.TIF right.TIF
stereo left.srtm.crop.tif right.srtm.crop.tif left.XML right.XML \
      r1c1_srtm_crop/r1c1_srtm_crop filled_srtm_dem.tif

Afterwards we got a pretty spiffy result that definitely shows more detail than the prior DEM sources. Unfortunately, the result was shifted from the NED DEM source that my crew had previously been using. Ideally this would be fixed by bundle-adjusting the World View camera locations, and it was clearly needed, as most of our projected rays only came within 3 meters of each other. Unfortunately, ASP doesn't have that implemented.

EDIT: If I had paid closer attention to my data, I would have noticed that a large part of the difference I was seeing between my DEM and USGS's NED was because the NED data uses a vertical datum. My ASP DEMs are referenced against the WGS84 ellipsoid, while the NED data is referenced against WGS84 plus the NAVD88 vertical datum. This would account for a large part of the 30-meter error I was seeing. (11.19.12)

My "I'm-single-with-nothing-going-on-tonight" solution was the Point Cloud Library. It has promising iterative closest point (ICP) implementations inside it and will eventually have the normal distributions transform algorithm as well. It also has the benefit of having its libraries designed with some forethought, compared to the hideous symbol mess that is OpenCV.

PCL's pcd_viewer looking at the quarry.

I achieved ICP with PCL by converting my PC (point cloud) file from ASP into a PCL PCD file [1]. I also converted the NED DEM into a PCD file [2]. I then subsampled my ASP point cloud to something more manageable for PCL's all-in-memory tactics [3]. Then I performed ICP to solve for the translation offset between the two clouds [4]. My offset ended up being about a 40-meter shift in the north and vertical directions. I then applied this translation back to the ASP PC file [5] so that the DEM and DRG could be re-rendered together using point2dem as normal.

I wrote this code in the middle of the night using a lot of C++ because I'm that guy. Here's the code I used, just for reference, in the event that it might help someone; a condensed sketch of steps [3] and [4] also follows the file list. Likely some of the stuff I performed could have been done in Python using GDAL.

1. convert_pc_to_pcd.cc
2. convert_dem_to_pcd.cc
3. pcl_random_subsample.cc
4. pcl_icp_align.cc
5. apply_pc_offset.cc
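For reference, the heart of steps [3] and [4] with PCL boils down to something like the following sketch. The file names and sample count are placeholders, and older PCL releases spell setInputSource as setInputCloud.

#include <pcl/point_types.h>
#include <pcl/io/pcd_io.h>
#include <pcl/filters/random_sample.h>
#include <pcl/registration/icp.h>
#include <iostream>

int main() {
  typedef pcl::PointXYZ P;
  pcl::PointCloud<P>::Ptr asp(new pcl::PointCloud<P>);
  pcl::PointCloud<P>::Ptr ned(new pcl::PointCloud<P>);
  pcl::PointCloud<P>::Ptr sub(new pcl::PointCloud<P>);
  pcl::io::loadPCDFile("asp_pc.pcd", *asp);   // output of step [1]
  pcl::io::loadPCDFile("ned_dem.pcd", *ned);  // output of step [2]

  // Step [3]: random subsample so the cloud fits PCL's
  // all-in-memory tactics.
  pcl::RandomSample<P> sampler;
  sampler.setInputCloud(asp);
  sampler.setSample(100000);
  sampler.filter(*sub);

  // Step [4]: ICP aligns the subsampled ASP cloud to the NED cloud;
  // the final transform is essentially the translation offset.
  pcl::IterativeClosestPoint<P, P> icp;
  icp.setInputSource(sub);
  icp.setInputTarget(ned);
  pcl::PointCloud<P> aligned;
  icp.align(aligned);
  std::cout << icp.getFinalTransformation() << std::endl;
  return 0;
}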

After rendering a new DEM from the shifted point cloud, I used MDenoise to clean up the DEM a bit. This tool is well documented on its own site (http://personalpages.manchester.ac.uk/staff/neil.mitchell/mdenoise/).

I've also been learning some QGIS. Here are some screenshots where you can see the improved difference map between NED and my result after ICP. Generally, this whole process was very easy. It leads me to believe that, with some polish, this could make a nice automated way to build DEMs and register them against a trusted source. Ideally bundle adjustment would be performed, but I have a hunch that the satellite positioning for Earth targets is so good that very little shape distortion has happened in our DEM triangulations. I hope this has been of interest to some of you out there!

Difference map between the USGS NED map and ASP's WV01 result.