PC Align

Last month we had a new release of Ames Stereo Pipeline, version 2.3! We’ve done a lot of bug fixing and implemented new features. But my two prized new features in ASP are pc_align and lronac2mosaic. Today I’d like to introduce only pc_align, a utility for registering DEMs, LIDAR points, and ASP point clouds to each other. All you have to do is specify an input, a reference, and a rough estimate of how large you think the misalignment is.

Does that sound like magic? Under the hood, pc_align is an implementation of the iterative closest point algorithm (ICP); specifically, it uses the libpointmatcher library from ETH internally. ICP iteratively attempts to match every point of the input with its nearest neighbor in the reference point cloud, then solves for a transform that globally reduces the distances between the current set of matches. It then performs a new round of matching input to reference and solves for another global step. Repeat, repeat, repeat until the sum of distances between matches stops improving.

This means pc_align fails when the reference set is too coarse to describe the features seen in the input. An example would be a HiRISE DEM paired with only a single MOLA orbit that intersects it. A single line of MOLA does nothing to constrain the DEM about the axis of the shot line. The 300 meter post spacing might also not be detailed enough to constrain a DEM that is looking at small features such as dunes on a mostly flat plain. What is required is a large feature that both the DEM and the LIDAR source can resolve.

CTX to HRSC example

Enough about how it works. Examples! I’m going to be uncreative and just process CTX imagery of Gale Crater, because it’s fast to process and easy to find. I’ve processed the CTX images P21_009149_1752_XI_04S222W, P21_009294_1752_XI_04S222W, P22_009650_1772_XI_02S222W, and P22_009716_1773_XI_02S223W using the stereo options “--alignment affineepipolar --subpixel-mode 1”. This means correlation happens in 15 minutes and then triangulation takes an hour, because the ISIS subroutines are not thread safe. I’ve plotted the two resulting DEMs on top of a DLR HRSC product, H1927_0000_DT4.IMG. It is important to note that this version of the DLR DEM is referenced against the Mars ellipsoid and not the areoid. If your data is referenced against the areoid, you’ll need to use dem_geoid to temporarily remove it before processing with pc_align, which only understands ellipsoidal datums.
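
A minimal sketch of that conversion is below, with hypothetical file names; the exact option names can vary between ASP releases, so check dem_geoid --help.

# Strip the areoid/geoid adjustment so heights are relative to the ellipsoid
# again; the result should land in areoid_dem_ell-adj.tif.
dem_geoid --reverse-adjustment areoid_dem.tif -o areoid_dem_ell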

In the above picture, the two ASP-created DEMs stick out like sore thumbs and are misplaced by some 200 meters. This is because the pointing information for MRO, and consequently CTX, is imperfect (how much can you ask for anyway?). You could run jigsaw, which is the gold-standard solution, but that takes a lot of manual effort. So instead let’s use pc_align with the commands shown next.

> pc_align --max-displacement 200 H1927_0000_DT4.tif P22-PC.tif \
        --save-transformed-source-points -o P22_align/P22_align
> point2dem --t_srs "+proj=sinu +lon_0=138 +x_0=0 +y_0=0 \
        +a=3396000 +b=3396000 +units=m +no_defs" \
        --nodata -32767 P22_align/P22_align-trans_source.tif

There are three important observations to make about the command line above. (1) The max displacement option sets the upper bound on how far off we think we are: 200 meters. (2) I’m feeding in the ASP PC file as the source instead of ASP’s DEM, because (3) with --save-transformed-source-points we’ll be writing out another PC file. pc_align can only export PC files, so we always have to perform another round of point2dem. It is possible to run stereo -> point2dem -> pc_align -> point2dem, but that means unnecessarily resampling your data once. Using the PC file directly from stereo saves us from potential aliasing and removes a point2dem call.

Here’s the final result, with both CTX DEMs plotted on top of the HRSC DEM. Everything looks really good except for that left edge. This might be because that DEM was rendered with slightly incorrect camera geometry, leaving the output subtly warped away from a perfect solution.

Another cool feature that pc_align author Oleg Alexandrov added was recording the beginning and ending matching errors in CSV files. They’re found under the names <prefix>-{beg,end}_errors.csv. You can load those up in QGIS and plot their errors to verify that pc_align uniformly reduced the matching error across the map. (Thanks Ross for showing me how to do this!)
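
If you just want numbers rather than a map, a quick check is to average the error column in each file. This sketch assumes the per-point error is the last comma-separated field and that header lines start with a '#', so verify against the CSVs your ASP version actually writes.

# Mean matching error before and after alignment (column layout assumed).
for f in P22_align/P22_align-beg_errors.csv P22_align/P22_align-end_errors.csv; do
  awk -F, '!/^#/ {sum += $NF; n++} END {printf "%s: %.2f m mean error over %d points\n", FILENAME, sum/n, n}' "$f"
done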

CTX to MOLA

Quite a few MOLA shots can be found inside the CTX footprints. Above is a plot of all MOLA PEDR data for Gale Crater. I was able to download this information in CSV format from the MOLA PEDR Query tool from Washington University in St. Louis. Conveniently, pc_align can read CSV files, just not in the format provided by this tool. pc_align expects the data in the format long, lat, elevation above the ellipsoid. What is provided is lat, long, elevation above the areoid, and then radius. So I had to manually edit the CSV in Excel to put the columns in the correct order and create my elevation values by subtracting 3396190 (this number was wrong in the first draft) from the radius column. The other bit of information needed with CSV files is the datum to use; if you don’t define one, pc_align will assume you’re using WGS84.
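
If you’d rather skip the Excel step, an awk one-liner can do the same reordering and subtraction. The column positions, the single header line, and the input name pedr_gale.csv are assumptions about what the query tool hands back, so adjust them to your file.

# Reorder lat, long, areoid elevation, radius into the long, lat,
# height-above-ellipsoid layout pc_align expects (3396190 m = D_MARS radius).
awk -F, 'NR > 1 {printf "%s,%s,%.3f\n", $2, $1, $4 - 3396190}' pedr_gale.csv > mola.csv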

> pc_align --max-displacement 200 P22-PC.tif mola.csv \
        -o P22_mola/P22_mola --datum D_MARS \
        --save-inv-transformed-reference-points
> point2dem --t_srs "+proj=sinu +lon_0=138 +x_0=0 +y_0=0 \
        +a=3396000 +b=3396000 +units=m +no_defs" \
        --nodata -32767 P22_mola/P22_mola-trans_reference.tif

Two things to notice in these commands: the inputs are reversed from before, and I’m saving the inverse-transformed reference points. You could keep things in the same order as when aligning to HRSC; it would just run very slowly. For performance reasons, the denser cloud should be given as the reference, and you then ask for the reference to be transformed into the source’s frame. You’ll likely use this inverse form routinely with LIDAR sources.

In the end I was able to reduce the alignment error for my CTX DEMs from over 50 meters to less than 15 meters against MOLA, and from over 100 meters to 40 meters against HRSC. That’s a result I’m quite happy with for a single night of processing at home. You can see my final composited, MOLA-registered CTX DEMs on the left. The ASP team will have more information about pc_align in an LPSC abstract next year. We also hope that you try out pc_align and find it worth regular use in your research.

Update:

I goofed in the MOLA example! Using D_MARS implies a spherical datum with a radius of 3396190 meters, and I originally subtracted the wrong number (3396000) from MOLA’s radius measurements. That probably had some effect on the registration result shown in the pictures, but the mistake is smaller than MOLA’s shot spacing. That means the horizontal registration is fine, but my output DTMs ended up 190 meters higher than they should be. FYI, D_MOON implies a spherical datum with a radius of 1737400 meters.
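
If you have rendered DTMs sitting around from the bad run and don’t want to re-align, one way to patch them (my suggestion, not an official ASP recipe) is to pull the 190 meter bias back out with gdal_calc.py. The file names below are placeholders, and the nodata value should match whatever point2dem wrote.

# Shift an already-rendered DTM down by the 190 m bias.
gdal_calc.py -A P22_mola-DEM.tif --outfile=P22_mola-DEM-fixed.tif \
      --calc="A-190" --NoDataValue=-32767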

Processing Antarctica

I’ve been sick all last week. That hasn’t stopped me from trying to process WorldView imagery in bulk on NASA’s Pleiades supercomputer. Right now I’m just trying to characterize how big a challenge it is to process this large satellite data on a limited-memory system for an upcoming proposal. I’m not pulling out all the tricks we have to ensure that every part of the image correlates. Still, that hasn’t stopped ASP from producing this interesting elevation model of a section of Antarctica’s coastline, just off Ross Island. Supposedly Marble Point Heliport is in this picture (QGIS told me it was the blue dot at the bottom of the coastline).

I’m using homography alignment, the automatic search range, parabola subpixel, and no hole filling. The output DEMs were rasterized at 5 meters per pixel. The crosses, or fiducials, in the image are posted 5 km apart. This is a composite of 10 pairs of WV01 stereo imagery from 2009 to 2011; no bundle adjustment or registration has been applied. The image itself is just a render in QGIS, with a hillshade of the same DEM overlaid on the colorized DEM at 75% transparency.
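
For reference, those settings boil down to roughly the following per-pair run. The file names are placeholders, I’m writing the alignment option as --alignment-method (the CTX example above spelled it --alignment), and leaving --corr-search off lets ASP guess the search range on its own.

# Roughly the settings described above, for one WV01 stereo pair.
stereo --alignment-method homography --subpixel-mode 1 \
      left.tif right.tif left.xml right.xml run/run
# Rasterize the triangulated point cloud at 5 m per pixel.
point2dem --dem-spacing 5 run/run-PC.tif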

I haven’t investigated why more of the mountains didn’t come out. When it looks like a whole elevation contour has been dropped, that’s likely because the automatic search range didn’t guess correctly. When it looks like one side of a mountain didn’t resolve, that’s likely because of shadow or highlight saturation in the image. It could also be that ASP couldn’t correlate correctly on such steep slopes.

Winging a DEM for a mission using WorldView-1

The group I work for at NASA has a big robot that likes to drive around a quarry at speed. Doing this is risky, as we could easily put the robot in a position to hurt itself or others. One of the things we do to mitigate the risk is keep a prior DEM of the test area. The path-planning software can then use the DEM to determine where the robot is and which terrain is too difficult to cross.

Since ASP recently gained the ability to process DigitalGlobe and GeoEye imagery (more about that in a later post), I was asked to make a DEM from some WorldView-1 imagery the group had purchased. The location was Basalt Hills, a quarry at the south end of the San Luis Reservoir. To process this imagery with any speed, you have to map-project it onto some prior DEM. My choices were SRTM or NED, and in my runs both have problems: SRTM has holes in the middle of it that needed to be filled before ASP would work correctly, and NED has linear jumps in it that ASP couldn’t entirely undo in its math.
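
One straightforward way to patch voids like that is GDAL’s gdal_fillnodata.py, which interpolates inward from the edges of each hole; the input name below is a placeholder, and -md caps how far (in pixels) it will reach into a hole.

# Fill SRTM voids by interpolation so ASP has a continuous seed DEM.
gdal_fillnodata.py -md 50 srtm_basalt_hills.tif filled_srtm_dem.tif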

I ended up using SRTM as the seed to create my final DEM of the quarry. If you haven’t seen this before, the process looks like the commands below in ASP 2.0+. What’s happening is that ASP uses an RPC map projection to drape the imagery over SRTM. When it comes time to triangulate, ASP reverses the math it used to map-project, and in the case of DigitalGlobe it triangulates using the full camera model. One other thing worth noting is that ASP needs control over how interpolation is performed during RPC map projection, which forces us to use our own custom utility for this step instead of the GDAL utilities.

parallel rpc_mapproject --tr 0.5 \
      --t_srs'"+proj=utm +zone=10 +datum=WGS84 +units=m +no_defs"' \
      filled_srtm_dem.tif {} {.}.XML {.}.srtm.crop.tif ::: left.TIF right.TIF
stereo left.srtm.crop.tif right.srtm.crop.tif left.XML right.XML \
      r1c1_srtm_crop/r1c1_srtm_crop filled_srtm_dem.tif

Afterwards we got a pretty spiffy result that definitely shows more detail than the prior DEM sources. Unfortunately, the result was shifted from the NED DEM that my crew had previously been using. Ideally this would be fixed by bundle-adjusting the WorldView camera positions, and it was clearly needed: most of our projected rays only came within 3 meters of each other. Unfortunately, ASP doesn’t have that implemented.

EDIT: If I had paid closer attention to my data, I would have noticed that a large part of the difference I was seeing between my DEM and USGS’s NED was due to a vertical datum. My ASP DEMs are referenced against the WGS84 ellipsoid, while NED is referenced against WGS84 plus the NAVD88 geoid. This accounts for a large part of the 30 meter error I was seeing. (11.19.12)

My “I’m-single-with-nothing-going-on-tonight” solution was the Point Cloud Library (PCL). It has promising iterative closest point (ICP) implementations inside it and will eventually have the normal distributions transform (NDT) algorithm as well. It also has the benefit of libraries designed with some forethought, compared to the hideous symbol mess that is OpenCV.

PCL's pcd_viewer looking at the quarry.

I achieved ICP with PCL by converting my PC (point cloud) file from ASP into a PCL PCD file [1]. I also converted the NED DEM into a PCD file [2]. I then subsampled my ASP point cloud to something more manageable for PCL’s all-in-memory tactics [3], and performed ICP to solve for the translation offset between the two clouds [4]. My offset ended up being about a 40 meter shift in the north and vertical directions. I then applied this translation back to the ASP PC file [5] so that the DEM and DRG could be re-rendered together using point2dem as normal.

I wrote this code in the middle of the night using a lot of C++ because I’m that guy. Here’s the code I used just for reference in the event that it might help someone. Likely some of the stuff I performed could have been done in Python using GDAL.

1. convert_pc_to_pcd.cc
2. convert_dem_to_pcd.cc
3. pcl_random_subsample.cc
4. pcl_icp_align.cc
5. apply_pc_offset.cc

After rendering a new DEM of the shifted point cloud, I used MDenoise to clean up the DEM a bit. This tool is well documented at its own site (http://personalpages.manchester.ac.uk/staff/neil.mitchell/mdenoise/).
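
For the curious, my reading of that documentation suggests an invocation something like the sketch below. The file names are placeholders, and the -t (feature-preserving threshold) and -n (normal-updating iterations) values are guesses, so check the MDenoise docs before trusting them.

# MDenoise reads formats like ESRI ASCII grid, so convert, denoise, convert back.
gdal_translate -of AAIGrid quarry-DEM.tif quarry-DEM.asc
mdenoise -i quarry-DEM.asc -t 0.95 -n 5 -o quarry-DEM-smooth.asc
gdal_translate quarry-DEM-smooth.asc quarry-DEM-smooth.tif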

I’ve also been learning some QGIS. Here are some screenshots where you can see the improved difference map between NED and my result after ICP. Generally this whole process was very easy. It leads me to believe that, with some polish, this could make a nice automated way to build DEMs and register them against a trusted source. Ideally bundle adjustment would be performed, but I have a hunch that satellite positioning for Earth targets is so good that very little shape distortion has happened in our triangulated DEMs. I hope this has been of interest to some of you out there!

Difference map between the USGS NED map and ASP's WV01 result.