Projected Image
Augmented Tanks is a game I've been working on for a while now...and am still working on.

Building off of my solar-powered, iPhone-controlled Arduino tank, I've added an infrared LED to my robot so that it can be tracked, and I've also modified the tank's control system so that it now uses two XY grids and a fire button.

The first XY grid now controls the tank's direction and the second controls the tank's aim. There is also a large button spanning the entire top part of the interface which fires the weapon.
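The drive grid has to be turned into left/right motor speeds somehow. The project's actual code isn't shown here, but a common approach is arcade-style mixing, sketched below in Java (Processing's base language); the normalized 0..1 pad coordinates and the `mix` function are my assumptions, not the post's code.

```java
// Hypothetical sketch: turning a TouchOSC XY pad value (assumed normalized
// to 0..1) into left/right speeds for a differential-drive tank.
public class TankMix {
    // Map a 0..1 pad coordinate to -1..1, centered at 0.5.
    static double centered(double v) { return (v - 0.5) * 2.0; }

    // Arcade-style mixing: forward/back comes from y, turning from x.
    // Returns {left, right}, each clamped to -1..1.
    static double[] mix(double padX, double padY) {
        double throttle = centered(padY);
        double turn = centered(padX);
        return new double[] { clamp(throttle + turn), clamp(throttle - turn) };
    }

    static double clamp(double v) { return Math.max(-1.0, Math.min(1.0, v)); }

    public static void main(String[] args) {
        double[] fwd = TankMix.mix(0.5, 1.0); // pad pushed straight up
        System.out.println(fwd[0] + " " + fwd[1]); // both motors full forward
    }
}
```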
iPhone interface for tank control

The goal of the game is simple: shoot tanks on the opposite team until your team has reached a total of 10 kills. Each time a member of your team is killed, your team loses a point.

There are ammo pickups, and each tank starts out with a shield that can take 10 hits before the tank is dead.
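The rules above (10 kills to win, -1 point per death, a 10-hit shield) can be captured in a small scoreboard class. This is a minimal sketch of that logic, not the game's actual code; the class and field names are mine.

```java
// Hypothetical scoreboard implementing the rules described in the post:
// first team to 10 kills wins, each death costs the victim's team a point,
// and every tank's shield absorbs 10 hits before the tank dies.
public class GameState {
    static final int KILLS_TO_WIN = 10;
    static final int SHIELD_HITS = 10;

    int[] teamScore = new int[2];
    int[] shield; // remaining shield hits per tank
    int[] team;   // team index per tank

    GameState(int[] team) {
        this.team = team;
        this.shield = new int[team.length];
        java.util.Arrays.fill(shield, SHIELD_HITS);
    }

    // Apply one hit; returns true if it destroyed the target.
    boolean hit(int shooter, int target) {
        if (team[shooter] == team[target]) return false; // friendly fire ignored
        if (--shield[target] > 0) return false;          // shield absorbed it
        teamScore[team[shooter]] += 1; // kill for the shooter's team
        teamScore[team[target]] -= 1;  // -1 point for the victim's team
        shield[target] = SHIELD_HITS;  // respawn with a full shield
        return true;
    }

    boolean won(int t) { return teamScore[t] >= KILLS_TO_WIN; }
}
```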

Right now every bullet fired is tracked with a firedfromID and teamID so that each bullet knows to pass through teammates and only damage the opposite team. This also helps keep track of kills so that statistics can be shown at the end of the game.
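The bullet bookkeeping could look roughly like this. The two field names come from the post; everything else (the class shape, the kill map) is an assumed sketch, not the actual game code.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the bullet tagging described above: each bullet
// carries firedfromID and teamID so collisions with teammates are ignored
// and kills can be attributed for end-of-game stats.
public class Bullets {
    static class Bullet {
        final int firedfromID, teamID;
        Bullet(int firedfromID, int teamID) {
            this.firedfromID = firedfromID;
            this.teamID = teamID;
        }
    }

    // Kill count per shooter, for the end-of-game statistics screen.
    final Map<Integer, Integer> killsByPlayer = new HashMap<>();

    // A bullet only damages tanks on the other team (and never its shooter).
    boolean damages(Bullet b, int tankID, int tankTeam) {
        return b.teamID != tankTeam && b.firedfromID != tankID;
    }

    void recordKill(Bullet b) {
        killsByPlayer.merge(b.firedfromID, 1, Integer::sum);
    }
}
```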
Currently in Progress:

Of course, the fun part of this is building the actual tanks/robots and controller. I'm still working out the details of what the tanks will be made of; early tests simply used Ardubots driven by SparkFun customers during our little meetup/class. How the controller communicates with the server running Processing is pretty important, both to keep all the signals real-time and so that when a bot is killed its IR LED can be turned off for a few seconds and its controls disabled.

The students were using XBee modules with unique channels to control each robot via serial commands sent by their laptops, but that may be changing...or getting a bit more complicated, since they all need to connect to the server via one XBee.
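One way to multiplex several robots over a single shared radio link is to prefix every command with a robot ID and have each robot ignore lines addressed to others. This is just an illustrative ASCII framing I made up, not the protocol the class actually used.

```java
// Hypothetical framing for sharing one XBee among several robots:
// each command line is "<robotId>:<left>,<right>\n" and a robot only
// acts on lines carrying its own ID.
public class Packet {
    static String encode(int robotId, int left, int right) {
        return robotId + ":" + left + "," + right + "\n";
    }

    // Returns {left, right} if the line is addressed to myId, else null.
    static int[] decodeFor(int myId, String line) {
        String[] head = line.trim().split(":");
        if (Integer.parseInt(head[0]) != myId) return null;
        String[] vals = head[1].split(",");
        return new int[] { Integer.parseInt(vals[0]), Integer.parseInt(vals[1]) };
    }
}
```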


I just finished installing some hardware for SparkFun's robot tracking system. It's sort of a mixed-reality system combining homemade robots, hacked-together control systems, IR tracking, and sweet high-res projections from above.

It uses a modded PS3 Eye, an ultra-short-throw projector mounted in the ceiling, Community Core Vision, and Processing running MSAFluid.

The robots can consist of pretty much anything; all a robot needs to be tracked is a simple IR LED pointing upward toward the modded PS3 webcam, which now only sees infrared. In this case Tim has wired a small throwie-style coin cell battery, resistor, switch, and IR LED together and literally just taped it to his robot.
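Sizing the resistor in a throwie-style circuit is just Ohm's law. The post doesn't give the actual component values, so the 3 V coin cell, ~1.5 V IR LED forward drop, and 20 mA target current below are assumed example numbers.

```java
// Current-limiting resistor for an LED: R = (Vsupply - Vforward) / I.
public class LedResistor {
    static double resistorOhms(double vSupply, double vForward, double currentAmps) {
        return (vSupply - vForward) / currentAmps;
    }

    public static void main(String[] args) {
        // Assumed values: 3 V coin cell, ~1.5 V IR LED drop, 20 mA → about 75 ohms.
        System.out.println(resistorOhms(3.0, 1.5, 0.02));
    }
}
```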

Tim's sumo robot recently competed and placed at Robothon in Seattle, but it is now controlled via a hacked Wiichuck connected to an Arduino and an XBee for wireless control.

The infrared tracking is done by the PS3 Eye with its IR-blocking filter removed and an IR-pass filter added that ONLY allows infrared light through. The result is an all-black image in which only the IR LED shows up in the webcam. The webcam sits directly above the projected image, hidden in the ceiling.
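Because the filtered image is black everywhere except the IR dot, tracking it reduces to finding the brightest pixel in each frame. CCV does something far more robust than this, but a minimal stand-in looks like:

```java
// Minimal stand-in for IR-dot tracking: scan a grayscale frame (0..255)
// for the brightest pixel. Real blob tracking (as in CCV) also handles
// thresholds, multiple blobs, and frame-to-frame IDs.
public class BlobFinder {
    // Returns {x, y} of the brightest pixel.
    static int[] brightest(int[][] frame) {
        int bx = 0, by = 0, best = -1;
        for (int y = 0; y < frame.length; y++)
            for (int x = 0; x < frame[y].length; x++)
                if (frame[y][x] > best) { best = frame[y][x]; bx = x; by = y; }
        return new int[] { bx, by };
    }
}
```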

Community Core Vision is a great piece of free software with a calibration system; it has mainly been used for multi-touch screens, but it can also be used for other types of tracking. CCV can output the tracked points via Open Sound Control to Processing...just like TouchOSC in my last project. Or it can output to Flash, which is also really cool, but I don't know much Flash.

Processing can read the OSC signals and interpret them in any number of ways. My favorite easy-to-tweak demo is Memo's MegaSuperAwesomeFluid TUIO particle system. I just barely modded Memo's code, adjusting it to fit the size of the projection and to change the speed of the particles emitted from the front of the robot.
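Fitting the tracked points to the projection mostly comes down to scaling: CCV/TUIO reports coordinates normalized to 0..1, which have to be mapped to projector pixels. A sketch of that mapping (the function name and the 1280x800 resolution are assumptions for illustration):

```java
// Hypothetical helper: map a normalized TUIO/CCV coordinate (0..1)
// to a pixel position on the projected image.
public class Projection {
    static int[] toPixels(float nx, float ny, int w, int h) {
        return new int[] { Math.round(nx * w), Math.round(ny * h) };
    }

    public static void main(String[] args) {
        int[] p = toPixels(0.5f, 0.5f, 1280, 800); // center of an assumed 1280x800 projection
        System.out.println(p[0] + "," + p[1]); // prints 640,400
    }
}
```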

That's about it for now; there is definitely more to come. Also, if anyone wants to come check out this project or just hack some electronics with us, SparkFun is having a free electronics hacking meetup tomorrow, Dec 5th, from 9:30am to 2pm in Boulder, CO. Hope to see you there!