The previously mentioned Vytas SunSpiral recently gave a Google Tech Talk on his work. Here it is, with plenty of videos of moving robots.
I know I promised a different blog post than this one, but I'm a bit crunched for time lately. However, I do have something quick to show …
This summer, my interns Abe Shultz and Benjamin Reinhardt rejuvenated an old gantry system that is used to mimic zero gravity for small spacecraft. This is part of my work with the NASA SPHERES program. In this video I demonstrate the gantry responding to external forces applied to a wooden analog for a SPHERE. In the future, a real SPHERE will be loaded up and will use CO2 thrusters to move itself about the room, with the gantry responding accordingly.
This test apparatus is one facility I plan to use to test out a new visual navigation system that my coworkers and I are developing. What I’m looking forward to is the vomit comet flight we have scheduled towards the end of the year. This is why I’ve left some of the software development of ASP with others.
I do other things besides developing the Ames Stereo Pipeline. In fact, I have to this month, because my projects' budgets are being used to pay other developers. This is a good thing: it gets dug-in developers like me out of the way for a while so that new ideas can come in. One of the projects I occasionally get to help out on is HET SPHERES.
This is a picture of that robot. The orange thingy is the SPHERE robot designed by MIT. The blue puck is an air carriage, so we can do frictionless testing in 1G. There have been 3 SPHERES robots onboard the ISS for quite some time now, and they've been hugely successful. However, we wanted to upgrade the processing power available on the SPHERES. We also wanted better wireless networking, cameras, additional sensors, and a display for interacting with the astronauts. While our manager, lord, and savior Mark listed off all these requirements, we attentively played Angry Birds. That's when it suddenly became clear that everything we ever wanted was already available in our palms. We'll use cellphones! So, though crude, we glued a cellphone to the SPHERE and called it a day.
Actually, a lot more work happened than that, and you can hear about it in Mark's Google Tech Talk. I also wasn't involved in any of that work; I tend to do other SPHERES-related development. But I spent all of last week essentially auditing the console-side code and the internal SPHERE GSP code. I remembered why I don't like Java and Eclipse. (I have to type slower so Eclipse will autocomplete. :/) This all culminated in the following video of a test flying the SPHERE around a stuffed robot. We ran out of CO2, and our PD gains for orientation control are still out of whack, but it worked!
Congrats to Cosmogia and the PhoneSat team for hitching a ride on the first Antares flight into space!
I was totally the one that went to Sports Authority for more CO2.
I’m applying to graduate school again. To help advertise myself, I decided to take a stab at reproducing a student’s work from one of the labs I want to join. Here’s a short video demonstrating the work:
The original author's thesis was about the application of a new style of Kalman filter. The computer vision measurements were just inputs to get his research going. Thus, this fiducial work is extremely simple and fun to implement. The only novel bit was that everything ran on a cellphone and I didn't use any MATLAB.
This entire project works by segmenting the circles from the image via a threshold operation. Then two passes of a connected-component blob search are applied, one for white regions and one for black. The code looks for blobs with aligned centers; if there are 4 such pairs in the scene, it calls it a success. Yes, there are many ways to break these assumptions.
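For the curious, the detection step can be sketched in plain Python. This is my own reconstruction of the idea, not the actual app code (which is Java against Android OpenCV); the function names and the BFS flood fill are my choices:

```python
from collections import deque

def label_blobs(mask):
    """4-connected component labeling via BFS flood fill.
    `mask` is a list of rows of booleans; returns a list of blobs,
    each blob being a list of (row, col) pixels."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                blob, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                blobs.append(blob)
    return blobs

def centroid(blob):
    n = len(blob)
    return (sum(p[0] for p in blob) / n, sum(p[1] for p in blob) / n)

def find_fiducials(gray, thresh=128, tol=1.0):
    """Threshold the image, run the blob search twice (white and black),
    and pair up blobs whose centroids coincide: concentric black/white
    regions are fiducial candidates. Returns (centroid, black-to-white
    area ratio) pairs; the ratio is what later identifies each circle."""
    white = label_blobs([[px >= thresh for px in row] for row in gray])
    black = label_blobs([[px < thresh for px in row] for row in gray])
    found = []
    for wb in white:
        wy, wx = centroid(wb)
        for bb in black:
            by, bx = centroid(bb)
            if abs(wy - by) < tol and abs(wx - bx) < tol:
                found.append(((by, bx), len(bb) / len(wb)))
    return found
```

On a tiny synthetic image containing one black ring with a white center, `find_fiducials` returns that single concentric pair and its area ratio. A real implementation would of course use OpenCV's thresholding and contour/component routines instead of hand-rolled flood fill.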
After locating the 4 circles, it identifies them by their ratio of black to white area. The boldest circle is defined as the origin. The identified image locations of these circles, their measured locations on the paper, and the focal length of the camera are then fed to an implementation of Haralick's iterative solution for exterior orientation. That algorithm solves for the translation and rotation of the camera by reducing projection error through the camera matrix.
Code for the blob detection and exterior-orientation solving is available here. The Java user interface is simply a straight rip from the demos that come with Android OpenCV, so it shouldn't take long to reproduce my work.
To see what inspired me, please investigate the following papers:
 B. E. Tweddle, “Relative Computer Vision Based Navigation for Small Inspection Spacecraft,” presented at the AIAA Guidance, Navigation and Control Conference and Exhibition, 2011.
 B. E. Tweddle, “Computer Vision Based Proximity Operations for Spacecraft Relative Navigation,” Master of Science Thesis, Massachusetts Institute of Technology, 2010.
This video is my friend's Google Tech Talk about cell phones in space! IRG connected one to the MIT SPHERES platform in a study of control and of what additional CPU power could provide.
Ross Beyer will hopefully be doing the next NASA @ Google Tech Talk.