Hi, my interest is again directed towards multitouch interfaces.
What are the current limitations of touch-interface interaction with musical instruments, DAWs, etc.?
A few days ago I tested a couple of Kivy apps, plus TouchOSC, on my Android phone and tablet PC. I found them usable, but not for live performance: fine for launching clips or anything that doesn't require low-latency interaction. Then I looked into HTML5 touch interaction, because it's portable to every device, be it PC, phone, etc.
I find that approach more universal than a binary built with Kivy.
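To make the portability point concrete, here is a minimal sketch of the kind of logic an HTML5 pad surface needs: mapping a touch coordinate to a cell in a pad grid. The function name, the 4x4 default, and the event wiring in the comment are my own assumptions, not from any particular library.

```javascript
// Sketch: map a touch/pointer coordinate to a cell index in a pad
// grid (row-major, 0-based). Assumes the surface spans width x height
// pixels; grid size defaults to 4x4 as a hypothetical layout.
function touchToPad(x, y, width, height, cols = 4, rows = 4) {
  // Clamp so touches on the far edge still land in the last cell.
  const col = Math.min(cols - 1, Math.floor((x / width) * cols));
  const row = Math.min(rows - 1, Math.floor((y / height) * rows));
  return row * cols + col;
}

// In a browser you would wire it up roughly like:
// surface.addEventListener("pointerdown", (e) => {
//   const rect = surface.getBoundingClientRect();
//   const pad = touchToPad(e.clientX - rect.left, e.clientY - rect.top,
//                          rect.width, rect.height);
//   triggerPad(pad); // hypothetical sound/OSC trigger
// });
```

The same handler code runs unchanged on desktop and mobile browsers, which is the portability argument for HTML5 over a per-platform Kivy binary.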
First I was thinking about controllers like Maschine, Ableton Push, and MIDI synths. They are good as a concept, but as dedicated plastic hardware they feel deprecated: all of them could in principle be replaced with touch surfaces and programmed interfaces.
But to simulate drum pads you have to read touch pressure in real time. How do you achieve that more cheaply, or more generically, than with a user-specific hardware controller?
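One browser-side approach, sketched below under my own assumptions: the Pointer Events API exposes `event.pressure` in the range 0..1, but on hardware without real pressure sensing browsers report a constant 0.5 for an active contact, so a fallback that estimates velocity from the finger's contact area (available as `width`/`height` on pointer events, or `radiusX`/`radiusY` on touch events) is also shown. The function names and the 40 px maximum-radius guess are hypothetical.

```javascript
// Sketch: map a Pointer Events pressure reading (0..1) to a MIDI
// velocity (1..127).
function pressureToVelocity(pressure) {
  // Clamp to the valid Pointer Events range, then scale to MIDI.
  const p = Math.min(Math.max(pressure, 0), 1);
  return Math.max(1, Math.round(p * 127));
}

// Fallback: estimate "pressure" from contact area relative to an
// assumed maximum finger radius in CSS pixels (device-specific guess).
function areaToVelocity(radiusX, radiusY, maxRadius = 40) {
  const area = radiusX * radiusY;
  const maxArea = maxRadius * maxRadius;
  return pressureToVelocity(area / maxArea);
}

// Browser wiring would look roughly like:
// pad.addEventListener("pointerdown", (e) => {
//   const velocity = e.pressure !== 0.5        // 0.5 = "no sensor"
//     ? pressureToVelocity(e.pressure)
//     : areaToVelocity(e.width / 2, e.height / 2);
//   sendNoteOn(36, velocity); // e.g. via the Web MIDI API
// });
```

This is cheaper and more general than a dedicated controller, but the contact-area heuristic needs per-device calibration, which is part of why plastic pads still win for feel.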
Am I looking in the right direction by researching HTML5, Kivy, OSC, and Max for Live? Or are there better ways to create and use universal, responsive apps?