Friends, this is amazing. Let me briefly present Trackmate, an open source initiative from, guess… yes, the Tangible Media Group at the MIT Media Lab. Don't ask me how they do it, but they have left me completely stunned once again. It is really simple: just a webcam, some fiducials printed on a common office printer, glue, light, and open source code built on LusidOSC (a protocol layer for unique spatial input devices). It's so remarkable to see the approach of making it inexpensive and available to all users, developers, and anyone who is simply curious. It works like any other fiducial tracking system: you stick a printed-out marker under any object you like, put it on a translucent, frosted plexiglass plate that you film with a standard webcam from underneath… and you are ready to make sound, visuals, and interactions. Try it out and enjoy!! If you have any questions, there is of course a wiki that will clear them up easily.
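To give a feel for what the tracking layer hands your application, here is a minimal sketch of consuming tracker updates. It assumes a LusidOSC-style message carrying a unique object ID plus normalized x/y coordinates; the address pattern and argument order are illustrative, not the exact LusidOSC spec.

```python
# Hypothetical shape of one tracker update, assumed for illustration:
#   address = "/lusid/object", args = (object_id, x, y)

def parse_update(address, args):
    """Turn one OSC-style (address, args) pair into a tracked-object dict."""
    obj_id, x, y = args
    return {"id": obj_id, "x": x, "y": y}

def on_update(state, address, args):
    """Keep a running table of where every tagged object currently sits."""
    obj = parse_update(address, args)
    state[obj["id"]] = (obj["x"], obj["y"])
    return state

# Two updates for the same marker: the latest position wins.
state = {}
state = on_update(state, "/lusid/object", ("marker-42", 0.31, 0.78))
state = on_update(state, "/lusid/object", ("marker-42", 0.35, 0.80))
print(state)  # {'marker-42': (0.35, 0.8)}
```

From a table like this, mapping each object's position onto sound or visuals is a small step.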
Next to this nice video I found a link to an event taking place in New York on 6th June, during New York Internet Week: the Global Tangible Interface Hackdays. It's about hacking together a physical interface with Trackmate and sharing it with lots of people on the internet via text chat, video streams, and shared code!
When it comes to tracking with fiducial markers, everyone thinks of reacTIVision. What was known as a tabletop application is now brought up into space, with the special feature of recognizing markers even when they are not visible in one piece. With a MIDI interface linked to reacTIVision, the ball works like a MIDI instrument, in this case triggering drum loops that are modified with a Korg nano or any other MIDI device controlling Ableton Live. Additional information can be found here.
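The marker-to-MIDI step in that setup can be sketched very simply. This is a hedged example, assuming the tracker reports a normalized position between 0.0 and 1.0 (as a TUIO/reacTIVision client typically would) and that we want a MIDI control-change value in the 0–127 range to steer a parameter in Ableton Live; the exact mapping in the video is my assumption, not documented by the authors.

```python
def position_to_cc(x):
    """Map a normalized x coordinate (0.0-1.0) onto the 0-127 MIDI CC range."""
    x = max(0.0, min(1.0, x))  # clamp: markers can jitter slightly off-frame
    return round(x * 127)

# A marker sliding left to right sweeps the full CC range:
print(position_to_cc(0.0))  # -> 0
print(position_to_cc(1.0))  # -> 127
```

Feeding these values to a virtual MIDI port is then enough for any DAW to pick them up like a hardware controller.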