This document details a project proposal to the NUI Group (Natural User Interface Group) for the 2011 edition of the Google Summer of Code. The project's goal is to create a musical instrument using Kivy, a multi-touch application development framework; specifically, an interface widget implementing a keyboard-like, continuous instrument. The implementation will take advantage of the flexibility and ease of use provided by a graphical, natural user interface environment for music production. The widget will output a message format used for controlling electronic music, such as MIDI (Musical Instrument Digital Interface) or OSC (Open Sound Control). A sound module, developed in a technology called PureData and included out of the box, will receive this data and produce the resulting sound. The motivation behind this project is to study the applicability of a multi-touch interface to music production and to implement an innovative instrument with great potential that has not become popular due to its high cost. Future work will involve adding more features to the interface, improving the latency of the setup, and adding more effects/samples to the sound module. The sound module could also be used to control, for example, graphics or video, providing a visual experience in addition to the audio.
Name: Rui Reis Costa Campos
Location/Timezone: Porto, Portugal/GMT
Education/Qualifications: Student of the Integrated Masters in Informatics and Computing Engineering at the University of Porto, Portugal (5th year)
Hello, I’m currently in the final year of my degree and working on my master’s thesis, titled “Framework for Role Playing Games in Multi-Touch Multi-User Devices”. Over the past year my interest in the area of interaction has grown, mainly thanks to the opportunity of working on this thesis. I have worked with PyMT for most of the development of my thesis and am now switching to Kivy.
My recent interest in HCI, combined with my music hobby, gave me an idea for a side project: to develop a platform for controlling MIDI via touch and, most importantly, to implement a system similar to the Haken Continuum (http://www.youtube.com/watch?v=Mrmp2EaVChI&feature=related). The project also includes implementing a series of algorithms for things like pitch rounding and automatic pitch sliding between consecutive contact points; see http://www.youtube.com/watch?v=yCM-WBqDZ-Q for a few of the cool features that can be implemented.
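To give an idea of what I mean by pitch rounding, here is a rough sketch (the pixel-to-semitone mapping and the rounding-strength parameter are my own assumptions for illustration, not the Continuum's actual algorithm):

```python
# Hypothetical sketch of pitch rounding for a continuous keyboard.
# The layout constants and rounding model are assumptions, not the
# Continuum's real algorithm.

SEMITONE_WIDTH = 40.0  # assumed pixels per semitone on the widget

def x_to_semitone(x):
    """Map a horizontal touch position to a fractional MIDI-style note number."""
    return 36.0 + x / SEMITONE_WIDTH  # 36 = C2, an arbitrary base note

def round_pitch(raw_note, strength):
    """Pull a fractional note toward the nearest semitone.

    strength = 0.0 -> fully continuous (no rounding)
    strength = 1.0 -> hard-quantized to the nearest semitone
    """
    nearest = round(raw_note)
    return raw_note + (nearest - raw_note) * strength

# A touch slightly above C3 (note 48) gets pulled toward it:
note = x_to_semitone(485.0)        # 48.125
snapped = round_pitch(note, 0.8)   # 48.025
```

The same rounding strength could be varied over time to implement the automatic pitch sliding: start fully continuous while the finger is moving and increase the strength once it settles.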
Ever since I saw this instrument being played, I have had the impression that it could be the next-generation keyboard, because the possibilities are tremendous. However, it has not seen widespread adoption, mainly because it is quite expensive.
So my project proposal is to implement a Haken Continuum-like system with Kivy and a set of MIDI controls, all touch-based. My implementation will have three main tasks:
Implementation of the Interface:
This represents the interaction component, where I will implement the continuous keyboard and its various controls. I already know how to work with PyMT/Kivy, so I expect this to be the least time-consuming task: one week for an initial prototype, so I can start working on the other modules, plus two weeks spread across the other tasks for improving the user experience and any additional controls that may arise from the rest of the development.
Touch information treatment and sound production:
This component will be responsible for processing the received touch information and generating the sound. Since TUIO is based on OSC, it might be possible to simply send the TUIO data directly to the synth software and implement the touch-information parsing in the synth itself, but I’m afraid this could restrict the implementation to that specific software. Given the analysis of these tradeoffs, along with the fact that this is the module where I have the least know-how (I will learn about OSC and how the synth software works), I consider this the most time-consuming task and estimate six weeks for its completion.
Optimizations for sound production:
Here I will measure how much latency the current system has and try to reduce it through code optimizations, setup analysis, etc. I know there are some proposals for using the GPU and for using regions of interest; the latter is especially interesting to me because the main interaction will happen in a specific region of the display, so there may be a considerable performance gain there. I will address this at the end of the implementation period, by which time there may be further advancements. I expect to spend 2-3 weeks on this issue.
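As an illustration of the kind of message the second task would emit, here is a minimal, stdlib-only sketch of how a touch event could be encoded as an OSC packet (the /touch address and the float arguments are hypothetical; a real implementation would likely use an existing OSC library rather than hand-encoding packets):

```python
import struct

def osc_string(s):
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *floats):
    """Build a raw OSC packet: address, type-tag string, big-endian floats."""
    tags = "," + "f" * len(floats)
    return (osc_string(address) + osc_string(tags) +
            b"".join(struct.pack(">f", f) for f in floats))

# Hypothetical message carrying one touch: id, normalized x, normalized y
packet = osc_message("/touch", 1.0, 0.25, 0.75)
```

The resulting bytes could then be sent over UDP to whatever synth or sound module is listening, which is exactly what keeps the interface decoupled from any specific synth software.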
This idea was tested at the beginning of PyMT with Flipo. However, the code was never published for everyone.
I would love to see a musical instrument like the Continuum available to everyone.
Making the instrument as generic as possible (i.e., as you said, touch parsing inside Kivy, not inside the synth) is perfect.
However, do you intend to do both MIDI & OSC?
For sound generation, do you have an idea of what you will use yet? PureData?
Is there any possibility that you could share this code? Were there any problems that kept the code from being published?
As for the music side of the implementation, my knowledge is quite fuzzy; I haven’t addressed these topics before and would like to use this project to learn more about them. From what I could gather in my brief research on the matter, MIDI is the current standard and has a lot of software/hardware compatibility, so it would be easier to use: I could just plug it into any available VST or hardware. OSC, from what I understand, allows controlling not only the sound output but anything at all, because it uses user-defined messages, so I could control the synth and the effects used as well. The disadvantage of OSC, as I understand it, is that I would have to ensure compatibility with the sound generation module I use, because it needs to understand my defined messages, or I would have to comply with that module’s defined messages. I need to analyse these two possibilities further; if implementing both is not overwhelming, I may do both.
As for the sound generation, I still don’t know what I will use. If I generate MIDI, I guess I don’t have to worry about which program to use, because most of them support it. If I use OSC, I will have to ensure some compatibility. I didn’t know about PureData; I will investigate it.
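One thing I found while researching the MIDI side: since MIDI note numbers are discrete, a continuous pitch is typically sent as a note-on plus a 14-bit pitch-bend message. A rough sketch of the raw bytes (the channel and the bend range of 2 semitones are assumptions; the receiving synth would have to be configured to match):

```python
def note_on(channel, note, velocity):
    """Raw MIDI note-on bytes: status 0x90 | channel, then note and velocity."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

def pitch_bend(channel, semitones, bend_range=2.0):
    """Encode a fractional pitch offset as a 14-bit pitch-bend message.

    bend_range is the synth's configured bend range in semitones
    (2 is a common default, but this is an assumption).
    """
    value = int(8192 + 8192 * semitones / bend_range)
    value = max(0, min(16383, value))  # clamp to the 14-bit range
    return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

# A continuous pitch of 48.25 semitones: note-on for note 48
# plus a +0.25 semitone bend on the same channel
msgs = note_on(0, 48, 100) + pitch_bend(0, 0.25)
```

This also shows the limitation compared with OSC: pitch bend applies per channel, not per note, so truly polyphonic continuous pitch would need one MIDI channel per finger.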
uaua, that seems like a good idea. You will have to build a setup with maximum precision in finger-position recognition, and a pressure-sensitive setup too, I think.
I would recommend that you try to use OSC and perhaps MIDI, but OSC will give you much more flexibility to send data to any synth.
Have you thought about which audio software you are going to use?
I can help you with PureData or Max/MSP if you need it.
Also, via OSC you can communicate with Ableton Live and have all its instruments, etc.
OSC can also give you control over the graphics if you want to display some sound-graphics interaction.
Hope you can do it!!
If you have any question about sound programming, you can ask here or ask me. I am not a very good programmer in PureData or Max/MSP, but I have some experience.
Great proposal… it should be used as an example for other students. On the concept, though, I am worried it is a bit too narrow… how could this be expanded to cover broader signal-processing concepts, now or in the future, etc. (through modules)?
Thanks for the replies; tomorrow I will include a future-work section in the proposal. My initial idea was to address only the interface and OSC/MIDI output, using any available synth, but since a lot of people have talked about actually implementing a sound output module, I included it in the proposal; maybe I should address it even more.
I have changed the proposal. I will now address the sound module with a different approach, since the interest seems considerable. My know-how in this area is lacking, which is good, because I will get to learn more about these technologies. I didn’t know about PureData, but since it was mentioned here I have been exploring it and it looks really interesting. There will also be a workshop about it here in Portugal on the 15th of April; I won’t miss it.
Thanks for posting the PDF of the proposal! The idea is interesting and the way you presented it is great! I hope you get a place in GSoC.
Also if you don’t mind I will use the structure of your proposal as a template for my own.
Thanks for the compliment! Feel free to use the structure
As an additional note, I will be building a simple prototype that uses Kinect and a library called TouchKinect, which maps a table and gives the positions of the fingers touching it; I will create a MIDI output based on the positions of the fingers on the table. The idea is to do something very simple; it’s just a tech demo to be developed in one day. I hope to learn more about PureData with this.
I managed to make a simple prototype that uses Kinect for detecting touches on a surface and Pd to produce the sound. It came out better than I expected; the latency of the Kinect is almost unnoticeable.
I learned Pd in the process, at least the programming part; creating the sound itself and shaping it to my needs is more challenging, as it requires knowledge of signal processing in a musical context, which is trickier… Regardless, I now clearly know how to make it easy to add more sounds or effects to the Pd patch, and how to use OSC messages to enable or disable these effects. I did this with TUIO, but I assume the OSC module in PureData works very similarly.
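As an example of the kind of mapping such a patch performs, a normalized TUIO coordinate can be mapped to an oscillator frequency. Here is a sketch in Python (the base frequency and the four-octave range are assumptions for illustration, not necessarily what my patch uses):

```python
# Hypothetical mapping from a normalized TUIO x-coordinate (0.0-1.0)
# to an oscillator frequency, spanning four octaves above a base pitch.
BASE_HZ = 110.0   # A2, an assumed base frequency
OCTAVES = 4.0     # assumed range

def x_to_hz(x):
    """Exponential mapping so equal distances give equal musical intervals."""
    return BASE_HZ * 2.0 ** (x * OCTAVES)

x_to_hz(0.0)   # 110.0
x_to_hz(0.5)   # 440.0 (two octaves up)
x_to_hz(1.0)   # 1760.0
```

An exponential mapping matters here: a linear one would compress the low notes into a tiny sliver of the surface, since pitch perception is logarithmic in frequency.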
I know some people in the sound business; I can ask them later to make me a patch that produces a less annoying sound. Right now I’m speaking with whales.
The patch is attached; just plug it into any TUIO provider and it plays something.
It works very well as a workaround if you don’t have a multi-touch table system working yet. I just pointed the Kinect at a simple 20" TFT screen and voilà, multi-touch system :D Kinect is awesome…