I have built a JazzMutant-style modular controller in Max/MSP/Jitter, using OpenGL for graphics and my own Max code for relating tracked finger coordinates to the graphic sprites. Initially, the finger tracking came from Wiimote Whiteboard, because I was just using the IR cam in the Wiimote as a camera and sending the data to Max/MSP with OSC. I left it hanging at the hardware stage when I started building my FTIR setup and got discouraged.
I am interested in continuing this project, and now that the Latitude XT has multitouch, I am wondering if I could use it with my controller project.
Max/MSP has an HID (human interface device) object that reads whatever raw data comes from the USB HID devices it detects. If the DuoSense technology spits out separate coordinates for each finger in a simple-to-read way, then I could definitely use it with my setup. I have tried interfacing with touchscreen digitizers before, and they don't always spit out easily interpretable coordinate info.
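For anyone curious what "easily interpretable" would mean in practice, here is a rough sketch of the kind of parsing I'd hope to do on the raw report bytes. The byte layout here is purely an assumption for illustration (a count byte followed by per-finger records of contact ID plus 16-bit x/y); N-trig's actual HID report descriptor may be laid out completely differently, so you'd need to inspect the real reports first.

```python
import struct

def parse_report(report: bytes):
    """Parse a HYPOTHETICAL multitouch HID report:
    byte 0 = finger count, then per finger:
    contact_id (u8), x (u16 little-endian), y (u16 little-endian)."""
    finger_count = report[0]
    fingers = []
    offset = 1
    for _ in range(finger_count):
        contact_id, x, y = struct.unpack_from("<BHH", report, offset)
        fingers.append({"id": contact_id, "x": x, "y": y})
        offset += 5  # size of one per-finger record
    return fingers

# Two fingers: id 0 at (100, 200), id 1 at (300, 400)
sample = bytes([2, 0, 100, 0, 200, 0, 1, 44, 1, 144, 1])
print(parse_report(sample))
```

If the XT's reports turn out to be anywhere near this simple, unpacking them inside Max (or a [js] object) would be straightforward; if the device multiplexes fingers across alternating reports or hides them behind vendor-specific usages, it gets much hairier.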
Can someone with the XT check what kind of data they get from the HID driver? Do you get separate coordinates for each finger? I assume the HID object in Max would receive the same data that the Raw Input API receives in Windows.
BTW, where can we find the SDK for N-trig's multitouch DuoSense?