Man on the Moon

During my internship at NASA in 2009, I helped produce an elevation model and image mosaic from Orbit 33 of Apollo 15. That mosaic was later burned into Google Earth’s Moon mode. Earlier this week, it appears, people found an image of a man walking in the region of the Moon I stitched together. Here are links to articles about this supposed extraterrestrial at The Nation, News.com.au, AOL.com, and Examiner.com. Thank you to LROC’s Jeff Plescia for bringing this to my attention.

I quickly traced this section of the image mosaic back to AS15-M-1151, a metric camera image from Apollo 15 that was scanned into digital form sometime in 2008 by ASU. What is shown in Google Earth is a reprojection of the image onto a DEM created by Ames Stereo Pipeline using that same image. The whole strip of images was then mosaicked together with ASP’s geoblend utility. So this man could have been created by an error in ASP’s projection code. Below is the man in the Moon from the raw, unprojected form of the Apollo Metric image. Little man perfectly intact.

Unfortunately, if you look at the next image in the film reel, AS15-M-1152, the man is gone. The same is true for 1153 and 1154. After that, the Apollo command module was no longer overlooking the area. The metric camera takes a picture roughly every 30 seconds, so maybe the guy (who must be something like 100 meters tall) just high-tailed it.

These images come from film that had been in storage for 40 years. The film was lightly dusted and then scanned, but unfortunately a lot of lint and hair still made it into the scans that we used for the mosaic. So much so that Ara Nefian at IRG developed the Bayes EM correlator for ASP to work around those artifacts. Thus, this little man in the image was very likely some hair or dust on the film. In fact, if you search around the little man in image 1151 (in the top left corner of the image, just off an extension of a ray from the big crater), you’ll find a few more pieces of lint. Those lint pieces are also visible in Google Moon. Still, it is pretty awesome to find out others have developed a conspiracy theory around your own work. Hopefully it won’t turn into weird house calls like it did for friends of mine over the whole hidden-nuclear-base-on-Mars idea.

Update: You can find the Bad Astronomer’s own debunking of this man here. The cool bit is that he tried to find the artifact in LRO and LO imagery. He then links to a forum where someone identifies that the dust was actually in the optics of the camera or on the scanner bed, which is why the man and other pieces of lint can be seen at roughly the same pixel location in consecutive frames.

Hah, it was even covered on SGU.

Man, I’m late to debunking my own work. 🙁

Rendering the Moon from AMC

Ames Stereo Pipeline is currently in the running for NASA’s Software of the Year award. We needed a pretty graphic and decided that a cool and possibly realistic rendering of the Moon would fit the bill. This is a little more difficult than simple hill shading because the Moon has a specular component. Hill shading can be interpreted as only the diffuse component of the Phong model. An interesting example of the Moon’s specular component is this picture taken with a Hasselblad during Apollo 17.
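To make the diffuse-versus-specular distinction concrete, here is a toy per-pixel shader sketch (not the renderer we actually used; the albedo, specular weight, and shininess values are made up for illustration). The diffuse term alone is exactly classic hill shading; the Phong specular term adds the sun-glint highlight.

```python
import numpy as np

def phong_shade(normal, light, view, albedo=0.3,
                k_spec=0.2, shininess=10.0):
    """Toy Phong shading. The diffuse term alone is classic hill shading;
    the specular term models the glint seen at low observer altitudes."""
    n = normal / np.linalg.norm(normal)
    l = light / np.linalg.norm(light)
    v = view / np.linalg.norm(view)
    diffuse = max(float(np.dot(n, l)), 0.0)     # hillshade value: cos(incidence)
    r = 2.0 * np.dot(n, l) * n - l              # light direction reflected about normal
    specular = max(float(np.dot(r, v)), 0.0) ** shininess if diffuse > 0.0 else 0.0
    return albedo * diffuse + k_spec * specular

# Sun and observer both directly overhead a flat facet:
# full diffuse plus full specular highlight.
print(phong_shade(np.array([0.0, 0.0, 1.0]),
                  np.array([0.0, 0.0, 1.0]),
                  np.array([0.0, 0.0, 1.0])))  # -> 0.5
```

Sweeping the light vector across longitudes, as in the videos below, just means re-evaluating this with a moving `light` direction per frame.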

Below are videos of my results, where the Sun’s projected coordinates sweep from 90° W longitude to 90° E. Both views show map-projected imagery, so these are not true perspective shots. The difference between the videos is the observer’s altitude above the surface: the lower the altitude, the more of the specular component can be seen.

I’m using nothing but Apollo Metric imagery for this example. The DEM source was our product for LMMP. The albedo source was the Apollo Metric albedo map that Dr. Ara Nefian produced, which will eventually be in NASA’s PDS. The photometric model was the Lunar-Lambertian model described in McEwen’s paper. Shadows were not rendered because that seemed harder than I could accomplish in 24 hours.
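For readers curious what the Lunar-Lambertian model looks like, here is a minimal sketch. It blends a Lommel-Seeliger ("lunar") term with a plain Lambertian term, weighted by a phase-dependent factor L(α). McEwen fits L(α) as a polynomial in phase angle; the linear falloff below is a stand-in of my own for illustration, not his fitted coefficients.

```python
import math

def lunar_lambert(albedo, incidence_deg, emission_deg, phase_deg):
    """Lunar-Lambertian reflectance sketch: a phase-weighted blend of
    Lommel-Seeliger and Lambertian terms. L(alpha) here is a placeholder
    linear falloff, not McEwen's fitted polynomial."""
    mu0 = math.cos(math.radians(incidence_deg))  # cos(incidence angle)
    mu  = math.cos(math.radians(emission_deg))   # cos(emission angle)
    L = max(0.0, 1.0 - phase_deg / 180.0)        # placeholder phase weight
    lunar = 2.0 * mu0 / (mu0 + mu)               # Lommel-Seeliger term
    lambert = mu0                                # Lambertian (hillshade) term
    return albedo * (L * lunar + (1.0 - L) * lambert)

# Nadir view with the sun overhead: both terms collapse to the albedo.
print(lunar_lambert(0.12, 0.0, 0.0, 0.0))  # -> 0.12
```

The practical effect is that at low phase angles the lunar term dominates, which is what keeps the rendered limb from darkening the way a purely Lambertian Moon would.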

Finished 3D from Apollo

Render of a DIM and DEM map from Apollo Metric images

It’s been a long three years in the making, but today I can finally say that I have finished my 3D reconstruction from the Apollo Metric cameras. After tens of thousands of CPU hours and several hundred liters of soda, the mapmakers at the Intelligent Robotics Group have managed to produce an image mosaic and digital elevation map. The final data products are going up on LMMP’s website for scientists to use. I encourage everyone else to instead take a look at the KML link below.

IRG’s Apollo Metric/Mapping Mosaic

It’s so pretty! But don’t be sad: IRG’s adventure with Apollo images doesn’t end here. Next year we’ll be working on a new and fancier image mosaic called an albedo map. Immediately after that, our group will be working with the folks at USGS’s Astrogeology Science Center to include more images in the Apollo mosaic. In that project we’ll include not only the images looking straight down at the Moon, but also those looking off toward the horizon.

All of the above work was produced using our open source libraries Vision Workbench and Ames Stereo Pipeline. Check them out if you find yourself producing 3D models of terrain. At the very least, our open source license allows you to look under the hood and see how we did things so that you may improve upon them!

Apollo 15 Mosaic Completed

Image of the Moon with images from Apollo 15 projected onto it.

Let me secretly show you a cool project I just finished. During the later Apollo missions of the 1970s, NASA came to understand that its funding would be cut back. In an attempt to extract as much science as possible from the last few missions to the Moon, NASA increased the astronauts’ time on the surface, gave them a car, and added a science bay to the orbiting spacecraft. Inside that science bay (called the SIM) were two repurposed spy cameras. One was the Apollo Metric Camera, whose 1,400 images from Apollo 15 are seen projected above. Recently ASU has been digitally scanning this imagery, which has allowed my colleagues and me to create a 3D model of a large section of the near side of the Moon and a beautifully stitched mosaic.

3D model of Tsiolkovsky Crater

Besides these being pretty pictures, I’m proud to say that all of this work was created with open-source software that NASA has produced and that is currently available on GitHub. Vision Workbench and Stereo Pipeline are the two projects that made this all possible. The process is computationally expensive and not reproducible at home, but a university or company with access to a cluster could easily recreate our results. So what does the process look like?

  1. Collect Imagery and Interest Points (using ipfind and ipmatch).
  2. Perform Bundle Adjustment to solve for correct location of cameras (using isis_adjust).
  3. Create 3D models from stereo pairs using correct camera models (with stereo).
  4. Create terrain-rectified imagery from original images (with orthoproject).
  5. Mosaic imagery and solve for exposure times (using PhotometryTK).
  6. Export imagery into tiles or KML (with plate2dem or image2qtree).

The long process above is not entirely documented yet, and some tools have not yet been released in the binary version of Stereo Pipeline. Still, for the ambitious, the tools are already there. Better yet, we’ll keep working to improve them, as IRG is chock-full of ideas for new algorithms and places to apply these tools.