LatLon Notebooks

I finally got around to checking my mail at work and found some LatLon notebooks! They're all island-themed, with topo maps on their covers. The one pictured is my favorite since it features Mars, and its elevation data came from CTX imagery processed with ASP. Yay!

Thank you Aitor Garcia. I’m now the engineer with the best stationery at NASA ARC.

Mass CTX Processing

A few weeks back, Ross Beyer presented my blog posts on autonomous HiRISE DEM processing at the HiRISE team meeting in Tucson, AZ. This raised the question of whether the same could be done for CTX. Of course! ASP can be applied to bulk processing for any of the missions it supports: Earth, or any place ISIS and CSpice have a defined coordinate system for. Just build some run-time safeties into the processing scripts, because ASP still occasionally goes mad and eats a whole bunch of processing time while producing no output. (We're working on it!)

Processing CTX stereo pairs, however, is a little more difficult than the HiRISE processing I was doing before. HiRISE lists all of its stereo pairs on its website and on Dr. Shane Byrne's website. There's no equivalent for CTX. Luckily for me, some folks at UofA wrote PairendipityCTX (Chris Schaller?). They provided Ross and me a detailed report of overlapping files and other statistics. I cut out everything but the filenames for my own use, and you can get a copy of the list with its 1,542 stereo pairs here.

Another difference was how these two missions store their data on PDS. I can look at a HiRISE filename and work out its download path in PDS with no trouble. CTX, on the other hand, seems to have arbitrary volume breaks, so the download URL is not predictable from the filename. My solution is a bad solution, but a quick solution. I wrote a Python script that scraped PDS's servers and recorded every CTX image it found along with its URL. The processing scripts then just 'grep' that list for the URL they need. Here's the resulting text file that lists all 50,708 CTX images that existed at the time of my scraping. This is a mean trick because my script can make HTTP requests much faster than a human can; in a sense, I'm DoS'ing the PDS servers. So please copy my results rather than my methods.
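To give a sense of the approach, here is a minimal sketch of that kind of scraper. The base URL, the "mrox_" volume naming, and the directory layout are assumptions for illustration only; the real script and the PDS layout may differ.

```python
# Sketch of crawling PDS directory listings for CTX .IMG files.
# BASE and the "mrox_" volume pattern are assumptions, not the real script.
# Please copy the published list instead of hammering the servers yourself.
import re
import urllib.request

BASE = "https://pds-imaging.jpl.nasa.gov/data/mro/mars_reconnaissance_orbiter/ctx/"

def list_links(url):
    """Return the href targets found in a PDS directory listing page."""
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    return re.findall(r'href="([^"]+)"', html)

def scrape_ctx_urls():
    """Walk the volume directories and yield full URLs of CTX .IMG files."""
    for volume in list_links(BASE):
        if not volume.lower().startswith("mrox_"):
            continue
        data_dir = BASE + volume + "data/"
        for name in list_links(data_dir):
            if name.upper().endswith(".IMG"):
                yield data_dir + name

if __name__ == "__main__":
    for url in scrape_ctx_urls():
        print(url)  # redirect to a text file, then 'grep' it for an image ID
```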

Processing scripts

Previously, with my autonomous HiRISE processing efforts, I just wrote a Bash script and then spawned it multiple times on my server using GNU Parallel. That's great, but looking around my office I saw a lot of unused computers that I'd like to have do my bidding. What I wanted was a job management system like PBS, which is what they use on the supercomputer. But PBS is a little too heavy and sometimes costs money. So I instead found two alternative projects that seemed to fit the bill: Gearman and Celery. Gearman is the minimalist option; Celery requires a database backend and multiple open ports for each slave worker. I decided to use Gearman since it seemed simpler to learn.

My code for this little project is available on GitHub in my Mars3DGearman project. I hope to eventually add HiRISE support. Here's how it plays out. All machines make a working directory that contains the CTX stereo list, the CTX URL lookup list, and the folders DEM and DRG. One machine is designated the server; in my case, my workstation at home. It starts the only instance of ctx_processor.py with the help of a backing 'gearmand' executable (the Gearman daemon). All of the slaves then SSH back to my server and forward port 4730, the port Gearman uses for communication. Then all the slave machines can start one or more instances of ctx_worker.py.
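To make the arrangement concrete, here is a rough sketch of the two halves using the python-gearman package. The task name, payload format, and function names are illustrative only; see ctx_processor.py and ctx_worker.py in Mars3DGearman for the real implementation.

```python
import gearman

# Server side (alongside gearmand): queue up every stereo pair as a job.
def submit_pairs(stereo_list_file):
    client = gearman.GearmanClient(['localhost:4730'])
    for line in open(stereo_list_file):
        left, right = line.split()[:2]
        # background=True means fire-and-forget; workers write results to disk
        client.submit_job('process_ctx_pair', '%s %s' % (left, right),
                          background=True)

# Slave side (with port 4730 forwarded over SSH back to the server).
def process_ctx_pair(worker, job):
    left, right = job.data.split()
    # ... download the .IMG files, run ISIS and ASP, drop output in DEM/ and DRG/ ...
    return 'done'

def run_worker():
    worker = gearman.GearmanWorker(['localhost:4730'])
    worker.register_task('process_ctx_pair', process_ctx_pair)
    worker.work()  # block and process jobs as they arrive
```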

The part I haven't worked out is how to send the completed results home to my main server. The ctx_worker script produces a DEM and an orthophoto and then just dumps them locally into the DEM and DRG folders. Gearman does allow sending binary strings back to the main server, but I'm betting a 100 MB string would break something. I chose a different route. Since I already have all the slaves SSH'ing back to my main server, I decided to simply rsync all the slaves' DEM and DRG folders back home. I don't have to re-enter my password, as I've enabled SSH ControlMaster, which reuses previous connections. For now, I just put that rsync in a watch command that fires every 2 hours. Alternatively, it could be inside the ctx_worker script. A better bet would be to use SSH keys.
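If it did live inside the worker, it could be as simple as a subprocess call at the end of each job. The host name and remote path below are placeholders, and this assumes ControlMaster or SSH keys are already set up so no password prompt appears.

```python
import subprocess

def push_results(remote='myserver:ctx_results/'):
    """Sync the local DEM and DRG folders back to the main server."""
    for folder in ('DEM', 'DRG'):
        subprocess.call(['rsync', '-av', '--partial', folder, remote])
```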

Another worthwhile improvement over my HiRISE processing scripts is the inclusion of a timeout for each step. When it comes to CTX, if the correlation doesn't finish in 2 hours, I'm not interested. This timeout is achieved through the run_cmd and process_timeout functions in the ctx_worker script. The Internet helped me a lot in figuring out how to make that a reality. Thanks, Internet!
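The gist is easy to approximate: launch the command as a child process, poll it, and kill it once it blows past its budget. The snippet below only illustrates that idea, it is not the actual run_cmd/process_timeout code, and the stereo command line is just an example.

```python
import subprocess
import time

def run_with_timeout(cmd, timeout_s):
    """Run cmd, killing it if it exceeds timeout_s seconds."""
    proc = subprocess.Popen(cmd)
    deadline = time.time() + timeout_s
    while proc.poll() is None:
        if time.time() > deadline:
            proc.kill()      # give up on this stereo pair
            return None
        time.sleep(30)
    return proc.returncode

# e.g. allow the correlation step two hours, as described above
run_with_timeout(['stereo', 'left.map.cub', 'right.map.cub', 'out/out'],
                 2 * 3600)
```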

Results

These last few days I've roped 4 machines into doing my bidding. So far they've produced ~260 DEMs; 5 new DEMs completed just while I was writing this article. There are still some hiccups in the process, but when the stars align I seem to produce over 50 new DEMs every day. I'm not going to show you all of them, as posting them all to the blog felt like a lot of work. Instead I'm just going to show off a couple of screenshots of some interesting places in Google Mars. The color ramp is a little funky because someday I need to learn to reference everything against the Mars areoid and not just the sphere.

Not everything looks as great as those screenshots. Here are some places that failed to correlate. I’m not sure what went wrong and I unfortunately don’t have the time to investigate.

In conclusion, this is just another proof of concept of what is possible. I hope that someday someone else will attempt this and do a better job than I did. ASP is not perfect, but it can accomplish a lot of processing on its own that could be beneficial to the scientific community.

Related

Shean, D. E., et al. “MRO CTX Stereo Image Processing and Preliminary DEM Quality Assessment.” Lunar and Planetary Institute Science Conference Abstracts. Vol. 42. 2011.

David Shean, MSSS, and Larry Edwards actually already attempted this once before! In the abstract above you can see that they went above and beyond my little demo and processed 1,180 stereo pairs. They also developed some initial steps for registering the output DEMs to MOLA and plotted the relationship between convergence angle and the outcome of a stereo pair.

Automatic Production of HiRISE DEMs 4

The server is still running. ASP has changed during this run: we introduced a new IP filtering technique, sped up MPI Parabola, and added a faster point2dem. I don't remember exactly when these changes reached my server while it was processing these images. I also added a timeout for stereo pairs that take longer than 3 days to process, to stop a bad pair from dominating the machine for 2 weeks. During this run I also changed the settings so that two stereo pairs are always being processed at the same time, to keep my CPU pegged.

HiRISE Processing Log
Left Image        Time      What Happened
ESP_011543_1665   N/A       Timed out
ESP_011563_2200   Lost      Too narrow search
ESP_011582_1730   Lost      Too narrow search
ESP_011592_1595   1d 10h    I dunno
ESP_011608_1525   3h        Too narrow search
ESP_011661_1410   N/A       Timed out
ESP_011662_1750   1d 5h     Pretty good
ESP_011672_1395   1d 12h    Awesome
ESP_011675_1470   20h       Too narrow search
ESP_011676_1700   1d 10h    Awesome
ESP_011677_1655   2h        Too narrow search
ESP_011683_1540   2d        I dunno
ESP_011688_1760   N/A       Timed out
ESP_011701_1790   11h       Okay
ESP_011715_1800   15h       Pretty good
ESP_011717_1910   2d 12h    Pretty good
ESP_011720_1835   21h       I dunno
ESP_011722_1460   N/A       Timed out
ESP_011728_0985   N/A       Timed out
ESP_011740_1830   1d 7h     Too narrow search
ESP_011743_1725   5h        Pretty good
ESP_011761_1485   13h       Too narrow search
ESP_011765_1780   N/A       Failed to download
ESP_011767_1640   4h        Pretty good
ESP_011772_1790   2h        Too narrow search
ESP_011773_1460   21h       Bad search range
ESP_011780_1515   7h        Awesome
ESP_011785_1875   4h        Too narrow search
ESP_011818_1505   2h        Awesome

Images

Automatic Production of HiRISE DEMs 3

It’s still kinda cold in San Jose so I’m still processing HiRISE DEMs. Please look at this post to see how I’m creating these elevation models.

HiRISE Processing Log
Left Image        Time       What Happened
ESP_011377_1835   1h 30m     Good
ESP_011384_2185   7h 20m     Good
ESP_011385_1695   13h 10m    Goodish
ESP_011386_2065   6h 0m      Goodish
ESP_011392_1370   2h 0m      Good
ESP_011397_1535   2.5 weeks  Correlated Noise
ESP_011399_2045   N/A        Failed to make cube files
ESP_011406_0945   4h 0m      Too small search range
ESP_011417_1755   12h 40m    Good
ESP_011425_1775   3h 20m     Good
ESP_011440_1335   3h 20m     Good
ESP_011443_1380   18h 10m    Too small of search range
ESP_011447_0950   1h 40m     Good
ESP_011494_1535   2h 30m     Good
ESP_011504_1700   1 week     Not sure – Maybe poor texture?
ESP_011523_1695   7h 20m     Not sure

Images

Prettiest DEM Award of the bunch goes to user ‘jwray’ for picking out ESP_011494_1535 and ESP_011639_1535. Supposedly this is “Sirenum crater floor LTLD with Al-OH”. I don’t know what that means.

New CTX Layer in Google Earth

The folks at Google silently released a few new features for their Mars mode in Google Earth. MER-B, Opportunity, now has an updated traverse path thanks to a fellow at the Unmanned Spaceflight forum, along with a new base map that Ross created. However, I'm really excited about a new CTX Global Map that is available. Below is a screenshot:

From this high view you can see that CTX hasn't imaged all of Mars. This is expected: CTX isn't trying to image all of Mars; it is simply the context imager for HiRISE, meaning that CTX tends to roll tape only when HiRISE does. Nevertheless, the imagery is still beautiful and, in my opinion, shows more detail than the default base layer from an HRSC composite.

This mosaic was created by simply downloading all CTX data from NASA's PDS servers and then calibrating and map-projecting the imagery with USGS's ISIS software. The composite was then made with in-house software from our team at NASA Ames Research Center: a combination of Vision Workbench and our Plate Filesystem. It is the same software we used to make global MOC-NA and HiRISE mosaics for Microsoft's WorldWide Telescope. Unfortunately, I believe those servers have bitten the dust. Regardless, if you have free time, I encourage you to check out the CTX Mosaic and the Opportunity traverse in Google Earth. Click the planet icon, select "Mars", and then in the bottom left select "CTX Mosaic" under "Global Maps".
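For anyone curious about the per-image preparation, the ISIS half of that pipeline looks roughly like the sequence below for a single CTX image: ingest, attach SPICE, calibrate, map-project. The program names are the standard ISIS3 tools for CTX, but the filenames are placeholders and the actual mosaic run may have used different parameters.

```python
import subprocess

def isis(program, **params):
    """Run an ISIS program with key=value parameters."""
    subprocess.check_call([program] + ['%s=%s' % (k, v) for k, v in params.items()])

img = 'ctx_image'  # placeholder stem for a downloaded CTX .IMG file
isis('mroctx2isis', **{'from': img + '.IMG',     'to': img + '.cub'})      # ingest
isis('spiceinit',   **{'from': img + '.cub'})                              # attach SPICE
isis('ctxcal',      **{'from': img + '.cub',     'to': img + '.cal.cub'})  # radiometric cal
isis('cam2map',     **{'from': img + '.cal.cub', 'to': img + '.map.cub'})  # map-project
```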

Planetary Data Workshop

Two weeks ago I had the pleasure of going to Flagstaff, AZ, for the Planetary Data Workshop. It was a conference for scientists to express their needs and for engineering types to discuss their solutions. As can be guessed from the name, our topics were the distribution of, the processing of, and the tools used for scientific imagery of the planets in our solar system (primarily from NASA's robotic missions).

I was at the conference to discuss my software product, Ames Stereo Pipeline. I gave two talks, and eventually they'll be available on YouTube. I also gave an hour-long tutorial that is now online; it's the video above. I'm not sure how interesting it is to watch, but performing the tutorial was a lot of fun. I now realize how much of a nerd I sound like. I ended up throwing away my prepared HiRISE and LRO-NAC imagery, since the crowd that attended seemed more interested in the mass processing of CTX imagery. I, unfortunately, did not have any CTX data on my laptop, so I asked Fred Calef from JPL for a CTX stereo pair that he wanted processed. To my benefit, ASP v2 processed it autonomously without hassle during the demo! My laptop managed to chew through the data in 15 minutes. I spent most of the tutorial just talking about the ancillary files and what users can look at to see if their output will turn out all right.

Shooting from the hip for a tutorial could have bitten me pretty badly, but I think ASP really has improved a lot and is ready for mass-production environments. I'm trying to push for one for Earth polar imagery, but there are many more datasets that could get this same treatment. I think that idea became clear to the 30 guests who attended my tutorial. We've had an uptick in downloads, and I hope that means I'll be seeing some cool 3D models in the future.

Sidenote: I found out that Jay Laura from Penn State has a blog called Spatially Unadjusted. He's a GIS guy who also uses ASP (aww yeah). He presented a poster on his experience using ASP v1.0.5 with LRO-NAC imagery.