report, TEI'09

day 3


The last day of the conference started with a paper session on “tangible and embedded interaction – in the wild and in the lab”. There were a lot of good talks; for example, Bart Hengeveld presented his team's learning-language project LinguaBytes, which aims to stimulate and develop the language skills of toddlers. Anne Marie Skriver Hansen also presented the Pendaphonics project, which you can see in yesterday's report.

The second session was about “Tabletop Tangibles and Augmented Surfaces”. Again there was a lot of interesting and amazing news, but after overloading this blog with tabletop applications yesterday, I will only show you one nice thing, which I think displays the embedded interaction approach very well. The project is called The Other Brother, by John Helmes, a research student at Microsoft Research in Cambridge. The Other Brother project is also hosted by Caroline Hummels from the technical university in Eindhoven, who published an inspiring article about her FIDA project in Form+Zweck n°22, ”the tangibility of the digital”. There you can find a brief overview of the projects happening in Eindhoven, including FIDA. Of course, I would rather recommend ordering issue n°22 of Form+Zweck magazine.

Ok … The Other Brother is a tabletop device that captures images randomly and sends them to an internet account or a digital picture frame. What makes it a good one is that it demands attention when no one interacts with it. There are two microphones positioned roughly where human ears would be, and the object turns in the direction the sound comes from – it follows you. If even then no one reacts, it starts to emit strange sounds and weird colours, so at the latest by then you might go over and calm it down. The interaction works via two sensors attached where the human chin and forehead would be. By wiping over a sensor (a kind of petting gesture) it calms down or wakes up. One point that made it stand out from yesterday's demos is that it is very nicely designed and well thought out in its shapes, forms and human-related interactions. Here is the paper on it and a video I made yesterday during the demos.
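The behaviour described above can be modelled as a small state machine. Here is a minimal sketch of my own (not the authors' code – all names and the `patience` threshold are my assumptions): the device follows a sound source, gets agitated when ignored for too long, and is calmed by the petting gesture.

```python
IDLE, FOLLOWING, AGITATED = "idle", "following", "agitated"

class OtherBrotherSketch:
    """Toy model of The Other Brother's attention-seeking behaviour."""

    def __init__(self, patience=3):
        self.state = IDLE
        self.ignored_ticks = 0
        self.patience = patience  # ticks of being ignored before agitation

    def hear(self, left_level, right_level):
        """Turn toward the louder of the two 'ear' microphones
        (a crude direction estimate from the level difference)."""
        self.state = FOLLOWING
        self.ignored_ticks = 0
        return "left" if left_level > right_level else "right"

    def tick(self):
        """One time step with no interaction; agitation builds up."""
        if self.state == FOLLOWING:
            self.ignored_ticks += 1
            if self.ignored_ticks >= self.patience:
                self.state = AGITATED  # strange sounds, weird colours
        return self.state

    def pet(self):
        """Wiping over the chin/forehead sensor calms it down."""
        self.state = IDLE
        self.ignored_ticks = 0
        return self.state
```

In the real device the direction estimate and the timing are of course far richer; the sketch only captures the interaction loop the demo showed.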

Next to the tangible interfaces presented at TEI, embedded interaction is beginning to interest me. After the keynote by Tom Igoe, which dealt with ubiquitous computing and with tracking the components of technical devices in order to later customize or re-use them, the term SPIME came up quite often, though I had never heard of it before. So I talked to a few people, who told me that Bruce Sterling's book Shaping Things would give an overview of it. While googling for it, I stumbled over a SIGGRAPH keynote he held in 2004, where he describes the term blobject (GIZMO) as an object obviously made with CAD/CAM, with a short lifespan and low functionality. That's what we have now. A Spime, back in 2004, was still a speculative, non-existent kind of product:

“The most important thing to know about Spimes is that they are precisely located in space and time. They have histories. They are recorded, tracked, inventoried, and always associated with a story.

Spimes have identities, they are protagonists of a documented process.

They are searchable, like Google. You can think of Spimes as being auto-Googling objects.”

And there is a scenario to go with it:

You buy a Spime with a credit card. Your account info is embedded in the transaction, including a special email address set up for your Spimes. After the purchase, a link is sent to you with customer support, relevant product data, history of ownership, geographies, manufacturing origins, ingredients, recipes for customization, and bluebook value. The Spime is able to update its data in your database (via radio-frequency ID), to inform you of required service calls, with appropriate links to service centers. This removes guesswork and streamlines recycling.
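To make the scenario concrete, here is a toy data model of my own (an illustration, not anything from Sterling's book – the field names and example values are invented): a Spime record is located in space, carries its documented history, and is searchable by its own story.

```python
from dataclasses import dataclass, field

@dataclass
class SpimeRecord:
    """Hypothetical record for one tracked object."""
    product_id: str                   # unique ID, e.g. from an RFID tag
    owner_email: str                  # the special Spime address from the purchase
    location: tuple                   # (latitude, longitude): located in space
    history: list = field(default_factory=list)  # ownership / service events

    def log_event(self, event: str):
        """Record one step in the object's documented process."""
        self.history.append(event)

    def matches(self, query: str) -> bool:
        """'Auto-Googling': the object is searchable via its own data."""
        text = " ".join([self.product_id, self.owner_email] + self.history)
        return query.lower() in text.lower()
```

Used like this, the record accumulates the purchase, service calls and so on, and a search over all your Spimes becomes a search over their histories:

```python
s = SpimeRecord("spime-001", "me+spimes@example.com", (52.4, 13.1))
s.log_event("purchased with credit card")
s.log_event("service call scheduled")
print(s.matches("service"))  # the object answers queries about itself
```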


This was 2004. Today Leonardo Bonanni and his colleagues from the MIT Media Lab are working on an environment that provides tangible access to the digital information embedded in physical objects that are already enriched with global detection systems. What Sterling describes is of course very smart, but it lacks a description of how to get at that embedded information. The MIT guys, as always, have an answer to this. Their solution is to take a gizmo, blobject or everyday product and put it on a top-projected surface to get the digital and the physical information of the object simultaneously. One scenario they describe in the paper is “hyperlinking a physical object”. It works like this: you put your electronic device on the surface and it opens up all its inner information about where, when and by whom each component was made: “the full list of Spime ingredients (basically the object’s material and energy flows), its unique ID code … various handy recipes for post-purchase customization, a public site for interaction and live views of the production chain and bluebook value… At the end of its lifespan the Spime is … entirely disassembled and folded back into the manufacturing stream.”
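At its core, the “hyperlinking” step is a lookup: the surface recognises the object's tag and resolves it to the object's embedded record. A minimal sketch, assuming a hypothetical tag database (nothing here is from the MIT system itself):

```python
# Hypothetical database of embedded records, keyed by the object's tag ID.
SPIME_DB = {
    "tag-42": {
        "ingredients": ["aluminium housing", "lithium cell", "PCB"],
        "origin": "assembled 2008, Shenzhen",
        "recycling": "disassemble; return cell to manufacturer",
    },
}

def hyperlink(tag_id: str) -> dict:
    """Simulate placing an object on the top-projected surface:
    resolve its tag to the digital information projected alongside it."""
    record = SPIME_DB.get(tag_id)
    if record is None:
        return {"error": "unknown object - no embedded record found"}
    return record
```

The interesting part of the MIT work is exactly what this sketch leaves out: the tangible, co-located presentation of that record on the surface around the physical object.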

Please read the PAPER to learn more about it.

Here is also a video of a project by Brygg Ullmer. It also deals with the re-use of inner product components. Again, I recommend reading the PAPER.

additional link:

Here we have a video of Bruce Sterling talking about his book back in 2007 at the interaction symposium in Potsdam, Germany.

The last block of papers was about design techniques that can be used to display, explain and render tangible interactions. Leonardo Bonanni again showed some nice examples that inspired me to rethink the high-end, glossy-rendered, Flash-based computer animations. The stop-motion movies he showed during his talk were really nice to see and, above all, funny. Through a playful way of discovering the strange world of interaction, a funny result comes out that is so simple in appearance. Here are just a few of them.

The closing keynote was held by Durrell Bishop, the man behind the marble answering machine. He gave an inspiring talk about the design of tangible interfaces: designers should see the potential in the real world more, rather than just designing at the desk with the computer as the only object in sight… He also showed a nice rendering technique in which the interaction of a VHS recorder is explained with a kitchen sink, a grid and a glass bottle. The animation was not stop motion but really low tech, so that you don't tend to focus so much on the shapes, but more on the value of the interaction processes. He claims that all objects can be augmented with new properties.



After the last question was answered, the conference chairs went up on stage, thanked all of the people who made the conference possible and invited next year's chairs to come up too. Next year, the conference will take place at the MIT Media Lab in Cambridge, Massachusetts. The chairs Marcelo Coelho and Jamie Zigelbaum will take care of the organisation, but Hiroshi Ishii is also on the team, and many more. A few things will change next year: more design, more art, workshops and so on… I look forward to seeing whether they can keep their promise.




Tomorrow I will post my summary of TEI'09.