Tango, Take the Wheel

Johnny Lee, my boss a couple links up the chain, recently showed off a reel of recent research improvements on Tango at Google I/O. It's possibly not as exciting as the other stuff in the video, but at the 1:19 mark you'll see some research by me and my coworkers on how well Tango works on a car. As you can see, it works really well: we even drove it 8 km through downtown San Francisco's tourist-infested areas. Surprisingly or not, neither the masses of people nor the sun managed to blind our tracking.

How did we do it? Well, we took a Tango phone and quick-clamped it to my car. Seriously. Here's a picture of a Lenovo Phab 2 Pro and an Asus ZenFone AR attached to my car, and me in my driving glasses. We ran the current release of Tango with motion tracking only, and it just worked! … As long as you shut off the safeties that reset tracking once you exceed a certain velocity. Unfortunately, users outside of Google can't access this ability; in a way, these velocity restrictions are our own COCOM limits.
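For the curious, the safety we disabled behaves conceptually like the sketch below. The function name, threshold, and units are my own illustration, not the actual Tango code:

    # Minimal sketch of a velocity safety gate like the one we disabled.
    # Names and the threshold value are illustrative, not Tango's real API.
    import numpy as np

    MAX_SPEED_M_S = 20.0  # assumed ceiling; car speeds easily exceed this

    def should_reset_tracking(velocity_estimate):
        """Return True if the estimated speed is implausible for a handheld
        phone, which is the assumption a stock safety like this makes."""
        speed = np.linalg.norm(velocity_estimate)
        return speed > MAX_SPEED_M_S

    # Walking pace passes; highway driving trips the reset.
    print(should_reset_tracking([1.2, 0.0, 0.1]))   # False
    print(should_reset_tracking([27.0, 0.5, 0.0]))  # True

On a phone in someone's hand, a high speed estimate usually means the filter diverged, so resetting is the conservative choice; on a car it's a perfectly legitimate state, which is why we had to turn it off.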

Also, this is something really only achievable with the new commercial phones. The original Tango Development Kit didn't include a good IMU intrinsics calibration. The newly produced phones are calibrated at the factory for IMU scale, bias, and misalignment: at the end of each factory line, a worker places the phone in a robot for a calibration dance. Having this calibration is required to achieve the low drift rates. Remember, our IMUs are cheap 50-cent parts and have a lot of wonkiness that the filter needs to sort out.
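Concretely, that calibration amounts to an affine correction applied to every raw IMU sample. Here is a minimal sketch of that measurement model; the matrix and bias values are made-up placeholders, not real calibration output:

    import numpy as np

    # Sketch of the IMU intrinsics being solved for: per-axis scale,
    # cross-axis misalignment, and an additive bias. Values are illustrative.
    M = np.array([[1.002, 0.001, 0.000],   # scale + misalignment
                  [0.000, 0.998, 0.002],
                  [0.001, 0.000, 1.001]])
    b = np.array([0.05, -0.02, 0.01])      # accelerometer bias, m/s^2

    def correct_accel(raw):
        """Map a raw accelerometer reading to a corrected one."""
        return M @ (np.asarray(raw) - b)

    # A phone at rest should read close to gravity after correction.
    print(correct_accel([0.0, 0.0, 9.81]))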

Goodbye and Thanks for all the Fish!

July will be my final month working at NASA Ames Research Center. It has been a great run working under the ill-defined job title of "Geospatial Software Architect". I've gotten the chance to work as a software developer, a Principal Investigator, and most recently as the Flight Software Lead on a new robot for the ISS called Astrobee. I'm leaving this comfortable blanket I've had within NASA for 6 years for a chance to do computer vision at Google on Project Tango, something I find exciting and terrifying, but I view it as an opportunity to learn even more.

Despite this new opportunity, I feel like I'm leaving my baby, the Ames Stereo Pipeline (ASP). Honestly, though, I've been doing less and less with ASP for a while now. Oleg has been lead developer for quite some time. Under his guidance the software has gained the features everyone wants, and more people have started using it for Earth science. Clearly Oleg is doing a great job! Recently another coworker, Scott, has started improving ASP as well. With those two on the job, I feel like ASP will continue to grow.

I'm extremely proud that a community has developed around ASP. I'm also grateful to APL at the UofW and the PGC for putting faith in the software. Their time spent using ASP, requesting changes, and offering solutions to bugs has made ASP a worthwhile product. I'm sad I won't get to be involved anymore, or at least hear about the new applications scientists have thought up. My time developing ASP was wonderful and perfect for honing my skills. I hope others can do the same through using it and understanding how it works.

Thank you ASP users and the Intelligent Robotics Group. It was fun!

I’m not Dead

However, I've been really busy working with Google's Project Tango. I encourage you to watch the video if you haven't already.

What is NASA doing with Project Tango? Well, currently there is only a very vague article available here. The plan is to apply Tango to the SPHERES project to perform visual navigation. Lately, I've been overwhelmed with trying to meet the schedule of 0-g testing and all the hoops involved in getting hardware and software onboard the ISS. This has left very little time to write, let alone sleep. In a few weeks, NASA export control will have gone over our collected data and I'll be able to share it here.

In the short term, Project Tango represents an amazing opportunity to perform local mapping. The current hardware has little application to the large-scale satellite mapping that I usually discuss. However, I think the ideas present in Project Tango will have application in low-cost UAV mapping, something David Shean of the U of W has been pursuing. In the more immediate term, I think the Tango hardware would have application for scientists wanting to perform local surveys of a glacial wall, a cave, or anything you can walk all over. Its ability to export its observations as a 3D model makes it perfect for sharing with others and performing long-term temporal studies. Yes, the 3D sensor won't work outside; however, stereo observations and post-processing with tools like Photoscan are still possible with the daylight imagery. Tango is then reduced to providing an amazing 6-DOF measurement of where each picture was taken. If this sounds interesting to you, I encourage you to apply for a prototype device! I'd be interested in helping you tackle a scientific objective with Project Tango.
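As a rough sketch of that workflow, you could log the pose reported at each shutter time and hand the camera positions to your photogrammetry tool as a reference file. The field layout below is an assumption for illustration, not Photoscan's actual import format:

    import csv

    # Hypothetical sketch: pair each captured image with the nearest Tango
    # pose (timestamp, x, y, z, qx, qy, qz, qw) and write a simple camera
    # reference CSV a photogrammetry tool could ingest. Layout is assumed.

    def nearest_pose(poses, t):
        """poses: list of (timestamp, x, y, z, qx, qy, qz, qw)."""
        return min(poses, key=lambda p: abs(p[0] - t))

    def write_camera_reference(images, poses, out_path="camera_reference.csv"):
        """images: list of (filename, capture_timestamp)."""
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["image", "x", "y", "z", "qx", "qy", "qz", "qw"])
            for name, t in images:
                _, x, y, z, qx, qy, qz, qw = nearest_pose(poses, t)
                writer.writerow([name, x, y, z, qx, qy, qz, qw])

The point is simply that the hard part, knowing where each picture was taken, comes for free from the tracking; the rest is bookkeeping.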

This picture is of Mark and me dealing with our preflight jitters before going onboard the "Vomit Comet" for 0-g testing of the space-rated version of Project Tango. It captures my current state of mind. Also, there aren't enough pictures of my ugly mug on this blog. I'm the guy on the right.