I’ve made a toolkit that drag-and-drop’ifies the process of building multi-touch displays in the physical environment. It works by using a Kinect and a projector to combine projection mapping with multi-touch sensing.
I think the uses for ubicomp tech like this, which aims to “disappear” (like the pen), are probably found in lots of niches that would never normally be able to access the technology and see if it helps. I hope this toolkit will help us learn what the useful applications and deployment challenges are.
Thanks Craig! Yeah, Jason is great. Do you know him from NZ?
I enjoyed your papers on software vis, btw. In case it’s of any interest, we’re thinking about using UbiDisplays to experiment with some novel UIs (a bit like CoffeeTable: highwire-dtc.com/coffeetable) in the applied setting of our new Student Software Design Studio at Lancaster.
I definitely need to look into this. I assume the limitation is where the Kinect is placed, so that nothing can block its view. I didn’t notice where the Kinect is actually placed in your example. If it’s directly above, I can imagine some possibilities for it; if it’s off to the side, that puts a pretty big limitation on what you could do. I’m imagining a table with people sitting at it and everyone interacting with the projection at once.
Yeah, but it’s not as much of an issue as I first thought. I measured accuracy at a variety of angles and distances and found that it was largely independent of angle, but better when the Kinect is placed less than 1.4m from the interaction surface. I’ve uploaded a graph!