Here is an idea for my GSoC proposal. I wanted to know what you all thought.
The proposed project is the development of a user proximity sensing application. The basic idea behind user proximity sensing is that when a user approaches a surface, the environment/objects change to match that user's preferences and personality. This application would bring a whole new meaning to the "natural" in a NUI environment. Proximity sensing would be accomplished over Bluetooth, and the user would be identified by information found on their device (e.g. device name, device type, device address). The interactive objects and display would then be transformed to suit/attract that specific user (e.g. photos, videos, feeds, etc.).
Using C#/WPF, I would develop an application, with the help of a few libraries such as InTheHand and Brecham Obex, that detects a user via their Bluetooth device. The user would be identified by the device name, device type, and device address obtained over Bluetooth. That information would then be looked up in a database of some sort (e.g. MySQL, XML, etc.), and from this database the user's preferences would be extracted, loaded, and applied to the user interface.
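To make the flow concrete, here is a rough, untested sketch of the detection step. It assumes the `BluetoothClient` discovery API from the 32feet.NET (InTheHand) library; `LoadPreferences` and `ApplyToUserInterface` are hypothetical placeholders for the database lookup and UI update described above, not real library calls:

```csharp
using System;
using System.Collections.Generic;
using InTheHand.Net.Sockets;   // 32feet.NET (InTheHand) Bluetooth library

class ProximityScanner
{
    static void Scan()
    {
        // Discover nearby Bluetooth devices in range of the surface.
        var client = new BluetoothClient();
        BluetoothDeviceInfo[] devices = client.DiscoverDevices();

        foreach (BluetoothDeviceInfo device in devices)
        {
            Console.WriteLine("Found: {0} ({1})",
                device.DeviceName, device.DeviceAddress);

            // Key the preference lookup on the device address rather than
            // the name, since names are user-editable and may collide.
            var prefs = LoadPreferences(device.DeviceAddress.ToString());
            ApplyToUserInterface(prefs);
        }
    }

    // Placeholder: would query the preference store (MySQL, XML, ...).
    static Dictionary<string, string> LoadPreferences(string address)
    {
        return new Dictionary<string, string>();
    }

    // Placeholder: would rebind the WPF interface to the loaded preferences.
    static void ApplyToUserInterface(Dictionary<string, string> prefs) { }
}
```

In a real application the scan would run periodically on a background thread, with the UI update marshalled back onto the WPF dispatcher.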
In the future I see this really working alongside devices that communicate with each other to bring users what they desire: a blogger who just has to bring their phone to the surface and let the devices recreate their day before their eyes, a new student who finds his way to class from a generated map showing him everything he needs, or a family member whose multi-touch surface can manipulate and display DVR settings depending on who's around.