This idea came to me quite suddenly yesterday, while I was sitting in a lecture messing with PyMT on my laptop. Without an actual multitouch interface for PyMT, I could only interact with one finger and had to simulate multiple fingers via keyboard input. The idea struck as I reached for my iPod touch and realized the potential of tethering input from the iPod to PyMT. Most computers these days are wifi capable, and since both the iPhone and iPod touch are too, we could have a wireless multitouch interface. With millions of people owning iPhones and iPod touches, it would make for a cheap route to a multitouch interface.
So I’ve devised a set of requirements for something like this to work:
-PyMT apps should be completely boxed off from any knowledge of what type of interface is being used to interact with PyMT
-The iPhone/iPod touch should be completely boxed off from any knowledge of what app is running on PyMT
-PyMT should handle input from the iPhone/iPod appropriately, translating the raw touch data into the correct gestures
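To make the first two requirements concrete, here's a rough sketch of how the device-agnostic boundary might look on the PyMT side. All class and method names here are my own invention for illustration, not actual PyMT API:

```python
class Touch:
    """A device-agnostic touch point (coordinates normalized to 0..1)."""
    def __init__(self, touch_id, x, y):
        self.id = touch_id
        self.x = x
        self.y = y

class TouchProvider:
    """Base class: any input source (mouse, keyboard simulation, or a
    tethered iPod) implements this and pushes Touch objects into a
    dispatch callback. Apps only ever see Touch events."""
    def __init__(self, dispatch):
        self.dispatch = dispatch  # callback: (event, Touch) -> None

    def start(self): ...
    def stop(self): ...

class RemoteTouchProvider(TouchProvider):
    """Would wrap the wifi listener; from the app's point of view it is
    indistinguishable from any other provider."""
    def on_message(self, touch_id, event, x, y):
        self.dispatch(event, Touch(touch_id, x, y))
```

The point is that the app subscribes to touches, not to a device, so swapping the iPod in for a real multitouch table should require no app changes.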
So the plan is to write an extension for PyMT that would act as a server/listener. The iPod would send information about finger movements to it, and the listener would decode that information and forward the results to the apps running in PyMT. This would also require building an app for the iPhone/iPod touch to capture finger movements and send them out.
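As a first pass, the listener side could be something like this. The wire format ("id event x y" per UDP datagram) and the port number are placeholders I made up; the real protocol is still to be designed:

```python
import socket

def parse_touch(line):
    """Decode one message, e.g. '3 move 0.42 0.77' into
    (3, 'move', 0.42, 0.77)."""
    tid, event, x, y = line.split()
    return int(tid), event, float(x), float(y)

def listen(dispatch, host="0.0.0.0", port=3333):
    """Receive UDP datagrams from the iPod and hand each decoded
    touch to PyMT via the dispatch callback."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _addr = sock.recvfrom(1024)
        dispatch(*parse_touch(data.decode()))
```

UDP seems like a reasonable guess for this kind of rapid-fire touch data, since a single dropped move event hardly matters, but that's an open design question too.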
Specs for the iPhone/iPod touch app:
-On startup, the server would send information about available apps to the iPhone/iPod touch tethering app.
-A user selects which PyMT app they would like to run, and it launches remotely on PyMT.
-The interface then becomes a black box, where users can use their fingers to manipulate apps remotely.
Of course, a settings menu will also be provided.
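The startup handshake from the spec above could be sketched roughly like this. The message shapes are assumptions on my part, and JSON is picked purely for illustration:

```python
import json

def app_list_message(apps):
    """Server -> device: advertise the PyMT apps available to run."""
    return json.dumps({"type": "app_list", "apps": apps})

def select_message(app_name):
    """Device -> server: the user's choice; PyMT then launches it."""
    return json.dumps({"type": "select", "app": app_name})

def handle(raw, launch):
    """Server side: launch the chosen app when a selection arrives."""
    msg = json.loads(raw)
    if msg["type"] == "select":
        launch(msg["app"])
```

After the selection is acknowledged, the device would switch into the black-box touch mode and start streaming finger events.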
So that’s the plan! Input from anyone is much appreciated. I’ll get working on mockups for the iPhone/iPod app and post them here when I have the time (I have an exam I should be studying for right now >_<)