Follow the reluctant adventures in the life of a Welsh astrophysicist sent around the world for some reason, wherein I photograph potatoes and destroy galaxies in the name of science.

Monday, 27 May 2013

Why NASA Are Paying Me To Photograph A Potato

A PhD in astrophysics and 11 years of graphic design experience. When they asked me if I'd like to work for NASA*, photographing a potato wasn't exactly what I had in mind.

* Technically, being partly paid by a NASA grant may not be the same as actually working for NASA, but I don't care.

There's no clever wordplay or sophistry in the title. This article is exactly what it sounds like.
Sondy walks in one morning holding a potato. This isn't so strange, because Sondy is quite keen on healthy natural foods (and is on a mission to reach the number one spot for Googling "gluten free in Japan"), and can often be found wandering the corridors clutching strange things she calls "vegetables". More unusual is the resemblance of this particular asterid to an asteroid.

To cut a short story even shorter, the upshot is that Sondy thinks I should make a digital model of the potato so that we can fool hapless planetary radar astronomers into believing it's asteroid data (radar is pretty dang useful for determining the 3D structure of asteroids without having to send a spacecraft there). Or even make an online activity wherein lucky members of the public try to guess what's an asteroid and what's a root vegetable.

Asteroid 25143 Itokawa may look a lot like a potato but it's 500m long and made of rock.
A thought occurs that what we'd have here is an honest-to-God Astro Farm. Way cooler than Galaxy Zoo, obviously, because it's got its own theme tune.

Anyway, I decided to humor Sondy's harebrained scheme and went away to photograph the potato from every angle. Since there are an infinite number of angles, I decided to stop when I got bored, which took about ten minutes.

Then I fed the photographs into Autodesk's 123D Catch program, which can automatically convert image sequences into 3D models. In the past I've also used the Python Photogrammetry Toolbox with MeshLab. Normally I prefer to use open source whenever possible, but the PPT/MeshLab combination is considerably more clunky at this stage of development, and 123D gives better results. More importantly, it's just a frickin' potato.
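123D Catch's internals aren't public, but the geometric heart of any photogrammetry pipeline is the same: once features have been matched across photos and the camera positions estimated, each 3D surface point is recovered by triangulation. As a minimal sketch (toy camera matrices, not anything from 123D Catch or PPT), here's the standard linear (DLT) triangulation step in NumPy:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover a 3D point from its
    projections x1, x2 in two cameras with 3x4 matrices P1, P2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector with
    # the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point X through camera matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: one at the origin, one translated along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.3, -0.2, 4.0])   # a point on the potato's surface
x1, x2 = project(P1, X_true), project(P2, X_true)
X_rec = triangulate(P1, P2, x1, x2)
print(np.allclose(X_rec, X_true))     # True
```

Do that for thousands of matched features across dozens of photos and you get the point cloud that becomes the mesh; the ten minutes of boredom is mostly about giving the matcher enough overlapping viewpoints.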

If this isn't technology abuse then I don't know what is.
I cleaned up the mesh in Blender (having a virtual asterato/potataroid stuck to a table isn't much good), slapped on a pre-existing asteroid texture, did a little mesh sculpting and added some craters for good measure (asteroids are supposed to have craters, everyone knows this). In a short while I had this:
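In Blender this crater-stamping is done interactively with the sculpt tools, but the underlying operation is simple enough to sketch: push vertices near a chosen point inward along their normals with a smooth falloff. A hypothetical NumPy version, using a random point cloud on a unit sphere as a stand-in for the potato mesh:

```python
import numpy as np

rng = np.random.default_rng(42)

# Random points on a unit sphere, standing in for mesh vertices.
v = rng.normal(size=(2000, 3))
verts = v / np.linalg.norm(v, axis=1, keepdims=True)

def stamp_crater(verts, center, radius=0.3, depth=0.15):
    """Push vertices near `center` (a unit direction) inward along
    their normals, with a Gaussian falloff of angular width `radius`."""
    center = np.asarray(center, float)
    center /= np.linalg.norm(center)
    normals = verts / np.linalg.norm(verts, axis=1, keepdims=True)
    # Angular distance from the crater centre to each vertex direction.
    ang = np.arccos(np.clip(normals @ center, -1.0, 1.0))
    falloff = np.exp(-(ang / radius) ** 2)
    return verts - depth * falloff[:, None] * normals

cratered = stamp_crater(verts, center=[0.0, 0.0, 1.0])
r = np.linalg.norm(cratered, axis=1)
# Vertices near the north pole sink; the far hemisphere barely moves.
print(r.min(), r[verts[:, 2] < 0].mean())
```

Scatter a few dozen of these with random centres, radii and depths and you get a plausibly battered surface, which is essentially what the sculpt brush is doing one click at a time.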

Which is not too shabby, I think. But the whole project became much more interesting with a "well, actually..." moment from the planetary radar boss. Turns out having a synthetic, well-defined asteroid model is pretty darn useful. Normally the radar team spend their time turning radar maps into 3D models, but you could also go backwards and turn a model into simulated observations. Then you could add noise, try to reconstruct the 3D model from the simulated observations, and compare the result against the original model to figure out exactly what sort of problems crop up. Heck, you could even give the thing some rotation and model its optical lightcurves...
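The lightcurve part is easy to sketch. If you approximate the potataroid as a triaxial ellipsoid spinning about its short axis and assume brightness simply tracks the projected area (a crude geometric-scattering assumption, ignoring real light-scattering physics), the standard closed-form expression for an ellipsoid's projected area gives the classic double-peaked lightcurve:

```python
import numpy as np

def projected_area(a, b, c, u):
    """Projected area of the ellipsoid x^2/a^2 + y^2/b^2 + z^2/c^2 = 1
    viewed along the unit vector u (a standard closed-form result)."""
    ux, uy, uz = u
    return np.pi * np.sqrt((b*c*ux)**2 + (a*c*uy)**2 + (a*b*uz)**2)

# Elongated "potataroid" axes, spinning about z, viewed from the
# equatorial plane; the view direction rotates in the body frame.
a, b, c = 1.0, 0.5, 0.4
theta = np.linspace(0.0, 2*np.pi, 361)
flux = np.array([projected_area(a, b, c, (np.cos(t), np.sin(t), 0.0))
                 for t in theta])
flux /= flux.max()   # relative lightcurve, two maxima per rotation

# Peak-to-trough amplitude in magnitudes: 2.5*log10(a/b).
amplitude = 2.5 * np.log10(flux.max() / flux.min())
print(round(amplitude, 2))   # 0.75
```

A 2:1 axis ratio gives an amplitude of about 0.75 magnitudes, which is why elongated asteroids are so easy to spot from their lightcurves alone; the craters and lumps on the real model would add smaller wiggles on top.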

What started as a surreal - albeit hilarious - joke is rapidly turning into a fully-fledged science project. And that's how I got paid by NASA to photograph a potato. Think I'll go and update my C.V.