Computer Vision on a Cellphone

I’m applying to graduate school again. To help advertise myself, I decided to take a stab at reproducing a student’s work from one of the labs I want to join. Here’s a short video demonstrating the work:

The original author’s thesis was about the application of a new style of Kalman filter; the computer vision measurements were just inputs to get his research going. Thus, this fiducial work is extremely simple and fun to implement. The only novel bit was that everything ran on a cellphone and I didn’t use any Matlab.

This entire project works by segmenting the circles from the image via a threshold operation. Then two passes of connected-component (blob) searches are applied, one for white and one for black. The code looks for blobs with aligned centers; if there are 4 in the scene, it calls it a success. Yes, there are many ways to break these assumptions.
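For the curious, the threshold-and-blob-search step can be sketched in a few dozen lines. This is not the Android/OpenCV code, just a minimal pure-Python illustration of the same idea: threshold, flood-fill connected components, then check whether a white centroid and a black centroid coincide. All names and tolerances here are made up for illustration.

```python
from collections import deque

def threshold(img, t):
    # Binary mask: True where the pixel is brighter than t ("white").
    return [[p > t for p in row] for row in img]

def find_blobs(mask):
    """Label 4-connected components in a boolean mask via breadth-first
    flood fill; return each blob's centroid and pixel area."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if not mask[y][x] or seen[y][x]:
                continue
            queue, pixels = deque([(y, x)]), []
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                pixels.append((cy, cx))
                for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                               (cy, cx - 1), (cy, cx + 1)):
                    if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            area = len(pixels)
            blobs.append({
                "center": (sum(p[0] for p in pixels) / area,
                           sum(p[1] for p in pixels) / area),
                "area": area,
            })
    return blobs

def aligned(a, b, tol=1.0):
    # A white blob and a black blob with (nearly) coincident centroids
    # form one concentric fiducial circle.
    return (abs(a["center"][0] - b["center"][0]) <= tol and
            abs(a["center"][1] - b["center"][1]) <= tol)
```

Running the white pass on the thresholded image and the black pass on its inverse, then pairing blobs whose centroids agree, gives candidate fiducials; requiring exactly 4 such pairs reproduces the success test described above.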

After locating the 4 circles, it identifies them by their ratio of black to white area. The boldest circle is defined as the origin. The identified locations of these circles, their measured locations on the paper, and the focal length of the camera are then fed to an implementation of Haralick’s iterative solution for exterior orientation. That algorithm solves for the translation and rotation of the camera by reducing projection error through the camera matrix.
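Haralick’s method is, at heart, an iterative least-squares reduction of reprojection error. As an illustration only (not the author’s implementation, and not Haralick’s exact update equations), here is a numpy sketch that refines a 6-DOF camera pose by Gauss-Newton with a finite-difference Jacobian; the Euler-angle parameterization, focal length, and damping constant are assumptions of the sketch.

```python
import numpy as np

def rot(rx, ry, rz):
    """Rotation matrix from Euler angles (applied x, then y, then z)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(pose, pts, f):
    """Rotate/translate world points into the camera frame, then apply
    the pinhole model with focal length f (in pixels)."""
    rx, ry, rz, tx, ty, tz = pose
    cam = pts @ rot(rx, ry, rz).T + np.array([tx, ty, tz])
    return f * cam[:, :2] / cam[:, 2:3]

def solve_pose(obs, pts, f, pose0, iters=25):
    """Refine a 6-DOF pose by Gauss-Newton, driving reprojection error
    toward zero. The Jacobian is built by finite differences, and the
    normal equations get a tiny damping term for safety."""
    pose = np.asarray(pose0, dtype=float)
    eps = 1e-6
    for _ in range(iters):
        r = (project(pose, pts, f) - obs).ravel()
        J = np.empty((r.size, 6))
        for k in range(6):
            dp = np.zeros(6)
            dp[k] = eps
            J[:, k] = ((project(pose + dp, pts, f) - obs).ravel() - r) / eps
        pose = pose - np.linalg.solve(J.T @ J + 1e-9 * np.eye(6), J.T @ r)
    return pose
```

With the 4 fiducial centers as `obs`, their paper coordinates as `pts`, and a rough initial depth guess, the iterations converge to a pose whose reprojections match the measurements.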

Code for the blob detection and exterior orientation solving is available here. The Java user interface is simply a straight rip from the demos that come with Android OpenCV, so it shouldn’t take long to reproduce my work.

To see what inspired me, please investigate the following papers:

[1] B. E. Tweddle, “Relative Computer Vision Based Navigation for Small Inspection Spacecraft,” presented at the AIAA Guidance, Navigation and Control Conference and Exhibition, 2011.
[2] B. E. Tweddle, “Computer Vision Based Proximity Operations for Spacecraft Relative Navigation,” Master of Science Thesis, Massachusetts Institute of Technology, 2010.

Fly-by-Night Application of ASP to Video

This is a video of horizontal disparity from a video stereo rig onboard the International Space Station. Specifically, it was this camera. I used ffmpeg to split the footage into individual frames and then applied ASP (the Ames Stereo Pipeline) to those frames. I attempted solving for the focal length and convergence angle of this set, but unfortunately I didn’t properly constrain focal length. (My algorithm cheated and brought its error to zero by focusing at infinity.) Regardless, I’m pretty happy with the result for a Tuesday night at home.
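ASP’s correlator is far more sophisticated, but the core idea of horizontal disparity can be shown with a toy sum-of-absolute-differences matcher. This is a sketch, not ASP: it assumes rectified grayscale frames, and the window size, search range, and function names are all illustrative.

```python
import numpy as np

def disparity_map(left, right, max_disp=8, win=3):
    """Brute-force horizontal disparity on rectified grayscale frames:
    for each left-image pixel, slide a win x win window along the same
    row of the right image and keep the shift with the smallest sum of
    absolute differences (SAD)."""
    h, w = left.shape
    pad = win // 2
    L = np.pad(left.astype(float), pad, mode="edge")
    R = np.pad(right.astype(float), pad, mode="edge")
    disp = np.zeros((h, w), dtype=int)
    for y in range(h):
        for x in range(w):
            best_sad, best_d = None, 0
            for d in range(min(max_disp, x) + 1):
                sad = np.abs(L[y:y + win, x:x + win]
                             - R[y:y + win, x - d:x - d + win]).sum()
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[y, x] = best_d
    return disp
```

Real correlators add subpixel refinement, consistency checks, and pyramidal search; this brute-force version just makes the definition of disparity concrete.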

Android SDK on Ubuntu 11.10

I’ve made several false starts on this, so I’ve decided to document my process in the hope of helping out some other unfortunate souls.

Word on the street is that Sun’s Java is the one to install. Those packages are no longer available in Ubuntu’s default repositories, so we’ll have to add another package repository before installing.

sudo apt-get install python-software-properties
sudo add-apt-repository ppa:ferramroberto/java
sudo apt-get update
sudo apt-get install ant sun-java6-jdk sun-java6-jre sun-java6-bin sun-java6-plugin

The Android SDK currently ships only as a 32-bit package. If you have a 64-bit OS, you should consider doing the following:

sudo apt-get install ia32-libs

Now you’re ready to install the Android SDK, which is available through this link.

tar xfz android-sdk_r16_linux.tgz
cd android-sdk-linux/tools
./android update sdk --no-ui

At this point you should have adb in ‘android-sdk-linux/platform-tools’. This is pretty much everything you need to compile, unless you want to use Eclipse. If you are just learning how to use the SDK, you should continue on to Android’s Dev Guide or perform the ‘Hello World’ tutorial.