[GSoC 2011 Proposal] Continuous Musical Fingerboard for Kivy
Posted: 01 April 2011 08:30 AM
Joined 2011-01-22, New Member

Edit:
I’ve finished the proposal document; it’s here: http://dl.dropbox.com/u/16643844/gsoc2011.pdf

Abstract:

This document details a project proposal to the NUI Group (Natural User Interface Group) for the 2011 edition of the Google Summer of Code. The project’s goal is to create a musical instrument using Kivy, a multi-touch application development framework: specifically, an interface widget implementing a keyboard-like, continuous instrument. The implementation will take advantage of the flexibility and ease of use that a graphical, natural user interface environment provides for music production. The widget will output a message format used for controlling electronic music, such as MIDI (Musical Instrument Digital Interface) or OSC (Open Sound Control). A sound module, developed in a technology called PureData and included out of the box, will receive this data and produce the resulting sound. The motivation behind this project is to study the applicability of a multi-touch interface to music production and to implement an innovative instrument with great potential that has not become popular due to its high cost. Future work will involve adding more features to the interface, improving the latency of the setup, and adding more effects/samples to the sound module. The sound module could also be used to control, for example, graphics or video, providing a visual experience in addition to the audio.

-------------------------------------------------------

Name: Rui Reis Costa Campos
Email:
Location/Timezone: Porto, Portugal/GMT
Age: 22
Education/Qualifications: Student of the Integrated Master’s in Informatics and Computing Engineering at the University of Porto, Portugal (5th year)
Project Proposal:
Hello, I’m currently in the final year of my degree and working on my master’s thesis, titled “Framework for Role Playing Games in Multi-Touch Multi-User Devices”. Over the past year my interest in the area of interaction has been piqued, mainly due to the opportunity of writing this thesis. I have worked with PyMT for most of the development of my thesis and am now switching to Kivy.
Joined with my recent interest in HCI is my music hobby, which gave me an idea for a side project: to develop a platform for controlling MIDI via touch and, most importantly, to implement a system similar to the Haken Continuum (http://www.youtube.com/watch?v=Mrmp2EaVChI&feature=related). The project also includes implementing a series of algorithms for things like pitch rounding or automatic pitch sliding between consecutive contact points; see http://www.youtube.com/watch?v=yCM-WBqDZ-Q for a few of the cool features that can be implemented.
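The pitch-rounding idea can be sketched in a few lines; the function name and the 0-to-1 `amount` parameter below are illustrative, not taken from the Continuum:

```python
def round_pitch(pitch, amount):
    """Pull a continuous pitch (in semitones, MIDI-style) toward the
    nearest semitone. amount=0.0 leaves glides untouched, amount=1.0
    snaps fully to the equal-tempered grid; values in between blend."""
    nearest = round(pitch)                     # nearest semitone
    return pitch + (nearest - pitch) * amount

# A finger resting slightly sharp of middle C (MIDI note 60):
assert round_pitch(60.3, 1.0) == 60.0         # full snap to C
assert round_pitch(60.3, 0.0) == 60.3         # free glide, untouched
```

Varying `amount` over time is one way to get the "settle onto the note, then glide" behaviour shown in the video.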
Ever since I saw this instrument being played I have had the impression that it could be the next-generation keyboard, because the possibilities are tremendous. However, it has not seen widespread adoption, mainly because it is quite expensive.
So my project proposal is to implement a Haken Continuum-like system with Kivy, together with a set of entirely touch-based MIDI controls. My implementation will have three main tasks:
Implementation of the Interface:
This is the interaction component, where I will implement the continuous keyboard and its various controls. I already know how to work with PyMT/Kivy, so I expect this task to be the least time-consuming: one week for an initial prototype so I can start working on the other modules, plus two weeks spread across the other tasks for improving the user experience and adding controls that may arise from the rest of the development.
Touch information treatment and sound production:
This component will be responsible for processing the received touch information and generating the sound. Since TUIO is based on OSC, it might be possible to send the TUIO data directly to the synth software and parse the touch information in the synth itself, but I’m afraid this might tie the implementation to that specific software. Given these trade-offs, and the fact that this is the module where I have the least know-how (I will need to learn about OSC and how the synth software works), I consider this the most time-consuming task and estimate 6 weeks for completion.
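A rough sketch of the first step of that processing, before any transport (MIDI or OSC) is chosen, is mapping a horizontal touch position onto a continuous pitch; the pixel width and note range below are invented for illustration:

```python
def touch_to_pitch(x, width, low_note=36, high_note=96):
    """Map a horizontal touch position (in pixels) onto a continuous
    MIDI-style note number spanning the keyboard's range."""
    return low_note + (x / width) * (high_note - low_note)

# A touch in the exact middle of an 800-px-wide keyboard widget:
pitch = touch_to_pitch(400.0, 800.0)   # halfway through the note range
assert pitch == 66.0
```

Keeping this mapping inside the widget, rather than in the synth, is what keeps the instrument synth-agnostic.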
Optimizations for sound production:
Here I will measure how much latency the current system has and try to reduce it through code optimizations, setup analysis, etc. I know there are proposals for using the GPU and for using regions of interest; the latter is especially interesting to me, since the main interaction happens in a specific region of the display, so there may be a considerable performance gain there. I will address this at the end of the implementation period, so there may be further advancements by then. I expect to spend 2-3 weeks on this issue.

I believe this project will benefit the community by bringing even more people from the music community into the NUI field of research. The main problem with this kind of system is latency; I consider it the biggest challenge to making this project realistically usable by musicians. However, I hope the project can raise more interest in this area, and since processing technology is constantly evolving, future advancements will eventually make it a real success.
Bibliography/Inspirations:
http://www.cerlsoundgroup.org/Continuum/
http://opensoundcontrol.org/sensor-gesture-based-electronic-musical-instruments
http://www.jazzmutant.com/lemur_gallery_videos.php
http://www.livelab.dk/tablet2midi.php

Posted: 01 April 2011 09:14 AM   [ # 1 ]
Joined 2008-12-27, Sr. Member

Hi Rui,

This idea was tested at the beginning of PyMT with Flipo. However, the code was never published for everyone.
I would love to see a musical instrument like the Continuum available to everyone.
Making the instrument as generic as possible (i.e., as you said, touch parsing inside Kivy, not inside the synth) is perfect.
However, do you intend to support both MIDI and OSC?

For sound generation, do you have an idea yet of what you will use? PureData?

Good luck with your proposal!

 Signature 

Kivy | PyMT | Movid

Posted: 01 April 2011 10:15 AM   [ # 2 ]
Joined 2011-01-22, New Member

Thanks for the quick reply :)

Is there any chance you could share this code? Were there any problems that kept it from being published?

As for the music side of the implementation, my knowledge is quite fuzzy; I haven’t touched these topics before and would like to use this project to learn more about them. From my brief research, MIDI is the current standard and has a lot of software/hardware compatibility, so it would be easier to use: I could just plug it into any available VST or hardware. OSC, on the other hand, allows controlling not only the sound output but anything at all, because its messages are user-defined, so I could control the synth as well as the effects used. The disadvantage of OSC, as I understand it, is that I would have to ensure compatibility with the sound-generation module I use: either it needs to understand my defined messages, or I have to comply with that module’s messages. I need to analyse these two possibilities further; if implementing both is not overwhelming, maybe I can support both.
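The trade-off can be seen in the wire formats themselves; here is a minimal sketch (the `/continuum/pitch` address is made up, exactly the kind of user-defined message described above, while the MIDI bytes have fixed meanings):

```python
import struct

def midi_note_on(channel, note, velocity):
    """A MIDI note-on is three bytes with fixed meanings:
    status (0x9n), note number, velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def osc_message(address, value):
    """A minimal OSC message: a user-defined address pattern plus one
    typed float argument, every field padded to a 4-byte boundary."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

note_on = midi_note_on(0, 60, 100)               # middle C, fixed semantics
glide = osc_message("/continuum/pitch", 60.5)    # meaning defined by you
```

MIDI works with any synth out of the box; the OSC message only works with a receiver that understands the chosen address, which is the compatibility question raised above.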

As for sound generation, I still don’t know what I will use. If I generate MIDI, I guess I don’t have to worry about which program to use, because most of them support it. If I use OSC, I will have to ensure compatibility. I didn’t know about PureData; I will investigate it :)

Posted: 01 April 2011 10:55 AM   [ # 3 ]
Joined 2008-12-27, Sr. Member

I mistyped: it’s not “with” Flippo, but “by” Flippo. I don’t have any code :)

 
Posted: 03 April 2011 03:06 AM   [ # 4 ]
Joined 2010-09-14, New Member

Hi,

Wow, that seems like a good idea. You will need a setup with maximum precision in finger-position recognition, and a pressure-sensitive setup too, I think.
I would recommend that you try to use OSC, and perhaps MIDI as well, but OSC will give you much more flexibility to send data to any synth.
Have you thought about which audio software you are going to use?
I can help you with PureData or Max/MSP if you need it.
Also, via OSC you can communicate with Ableton Live and have all of its instruments, etc.
OSC can also give you control over graphics, if you want to display some sound-graphics interaction.

Hope you can do it!
If you have any questions about sound programming, you can ask here or ask me. I am not a very good programmer in Pure Data or Max/MSP, but I have some experience.

As for the hardware, have you thought about a setup?

Posted: 05 April 2011 11:25 AM   [ # 5 ]
Joined 2011-01-22, New Member

I’ve finished my proposal and edited the topic post. I’ve added the link to the proposal pdf and included the abstract.

Looking forward to all the feedback I can get :)

Posted: 05 April 2011 12:53 PM   [ # 6 ]
Joined 2008-12-27, Sr. Member

Don’t forget to submit on the Melange app before the rush, to ensure you’ll be in the competition :)

 
 
Posted: 05 April 2011 01:13 PM   [ # 7 ]
Joined 2006-11-09, Administrator

Great proposal… it should be used as an example for other students :) On the concept, though, I am worried it is a bit too narrow… how could this be expanded, now or in the future, to cover broader signal-processing concepts (e.g. through modules)?

 
 
Posted: 05 April 2011 06:29 PM   [ # 8 ]
Joined 2011-01-22, New Member

Thanks for the replies; tomorrow I will include a Future Work section in the proposal. My initial idea was to address only the interface and OSC/MIDI output, using any available synth, but since a lot of people have talked about actually implementing a sound-output module, I included it in the proposal; maybe I should address it even more.

Posted: 06 April 2011 06:50 AM   [ # 9 ]
Joined 2011-01-22, New Member

I have changed the proposal: I will now address the sound module with a different approach, since the interest seems considerable. My know-how in this area is lacking, which is good, because I will get to learn more about these technologies. I didn’t know about PureData but, since it was mentioned here, I have been exploring it and it looks really interesting. There will also be a workshop about it here in Portugal on the 15th of April; I won’t miss it :)

I added a Future Work section to the proposal.

Thanks for the great feedback so far smile

Posted: 06 April 2011 07:33 AM   [ # 10 ]
Joined 2011-01-22, New Member

I also added the proposal to the Melange app. I made a mistake and had to create two; I believe I successfully unsubmitted the first one, but if you can still see it, please ignore it.

Posted: 06 April 2011 03:10 PM   [ # 11 ]
Joined 2011-03-20, New Member

Thanks for posting the PDF of the proposal! The idea is interesting and the way you presented it is great! I hope you get a place in GSoC.
Also, if you don’t mind, I will use the structure of your proposal as a template for my own.

Posted: 06 April 2011 03:24 PM   [ # 12 ]
Joined 2011-01-22, New Member

Thanks for the compliment! Feel free to use the structure :)

As an additional note, I will be building a simple prototype using Kinect and a library called TouchKinect, which maps a table surface and gives the positions of the fingers touching it; I will create a MIDI output from this, based on the positions of the fingers on the table. The idea is to do something very simple; it’s just a tech demo to be developed in one day. I hope to learn more about PureData with it :)
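One way the position-to-MIDI step of such a demo could look is this sketch: a continuous pitch is split into a note-on plus a 14-bit pitch-bend value, the usual MIDI trick for gliding pitches (the ±2 semitone bend range is an assumption, a common synth default):

```python
def to_midi(pitch, bend_range=2.0):
    """Split a continuous pitch into the nearest MIDI note plus a
    14-bit pitch-bend value (8192 = no bend), assuming the synth's
    bend range is set to +/- bend_range semitones."""
    note = int(round(pitch))
    offset = pitch - note                        # -0.5 .. +0.5 semitones
    bend = 8192 + int(offset / bend_range * 8192)
    return note, max(0, min(16383, bend))

note, bend = to_midi(60.25)    # a quarter-tone above middle C
assert (note, bend) == (60, 9216)
```

One limitation of this approach: pitch bend is per MIDI channel, so independent per-finger glides need one channel per finger.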

Posted: 06 April 2011 04:20 PM   [ # 13 ]
Joined 2010-09-14, New Member

Hi,
yes, it is a good idea, keep going! Just one piece of advice: remember that programming, design, etc. usually take much more time than expected, so be careful.

To begin with Pure Data, I recommend these links:

http://www.pd-tutorial.com/english/ : Programming Electronic Music in Pd, by Johannes Kreidler.
You can also get a PDF of this great book.

The Floss Manual of Pure Data: an excellent intro to Pure Data and digital music.

Videos:
The cheetomoskeeto channel on YouTube has amazing PD tutorials. Begin with the first one:

Pure Data Lesson 1: Hello World.

With this you have more than enough to become an expert PD programmer!

Good luck!

Posted: 06 April 2011 07:00 PM   [ # 14 ]
Joined 2011-01-22, New Member

Wow! Thanks a lot, I will definitely need this tomorrow when I start working on the prototype I talked about!

Posted: 14 April 2011 05:53 AM   [ # 15 ]
Joined 2011-01-22, New Member

I managed to make a simple prototype that uses Kinect to detect touches on a surface and pd to produce the sound. It came out better than I expected; the latency with Kinect is almost unnoticeable.

I learned pd in the process, at least the programming part; creating the sound itself and shaping it to suit my needs is more challenging, since it requires knowledge of signal processing in a musical context, which is trickier… Regardless, I now know clearly how to make it easy to add more sounds or effects to the pd patch, and how OSC messages can be used to enable or disable these effects. I did this with TUIO, but I assume the OSC module in PureData works very similarly.

I know some people in the sound business; I can ask them later on to make me a patch that produces a less annoying sound. Right now I’m speaking with whales :P

The patch is attached; just plug it into any TUIO provider and it will play something.

File Attachments
TUIO sound player.zip  (File Size: 172KB - Downloads: 166)
Posted: 14 April 2011 06:21 AM   [ # 16 ]
Joined 2010-09-14, New Member

Great!
I will look at the pd patch and tell you about it… Yes, cheetomoskeeto is great!
How do you get Kinect to TUIO?

Posted: 14 April 2011 06:52 AM   [ # 17 ]
Joined 2011-01-22, New Member

It’s with this: https://github.com/robbeofficial/KinectTouch

It works very well as a workaround if you don’t have a multi-touch table system running yet. I just had the Kinect pointing at a simple 20" TFT screen and voilà, multi-touch system :D Kinect is awesome…

Posted: 11 February 2012 07:10 PM   [ # 18 ]
Joined 2011-12-17, New Member

Hey, nice idea! Can anyone help me, though?

I am working on a project in which I want to control the frequency of audio. Can anyone tell me how I could do that in Kivy?
