Thursday, February 12, 2015

Look inward (angel)

Another idea: what if we positioned a tiny camera on the outside of the heel of each thumb, pointing inward, directly at the other thumb? These two cameras should be able to monitor thumb activity very closely, and this seems less intrusive than the accelerometer.

So we would have two cameras looking down at the keys and fingers, two cameras looking inward at the thumbs, and globs of (UV?) paint as color markers.
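As a very rough feasibility sketch, here is how a per-frame color-marker detector might look in Python with OpenCV. Everything here is an assumption: the HSV bounds are placeholders that would need calibration under the actual lighting (or a UV lamp), and the camera index is whatever the PS3 Eye (or similar) enumerates as.

    import cv2
    import numpy as np

    # Hypothetical HSV bounds for one paint color; these placeholders
    # would need calibration under the actual lab lighting (or UV lamp).
    LOWER = np.array([40, 80, 80])
    UPPER = np.array([80, 255, 255])

    def find_marker(frame):
        """Return the (x, y) centroid of the largest paint blob, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                                np.ones((5, 5), np.uint8))
        # [-2] keeps this working across OpenCV versions' return signatures
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]
        if not contours:
            return None
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    cap = cv2.VideoCapture(0)  # whatever index the PS3 Eye shows up as
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        pos = find_marker(frame)
        if pos is not None:
            print("marker at (%.1f, %.1f)" % pos)

One detector instance per camera, two markers per thumb cam, and we would have raw 2D tracks to start playing with.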

Just a thought. I really need some advice on cameras to prove this concept.


Compare this with the IMU-based picture:


All of which makes me start thinking about scalability and money. Even assuming we can build this with cheap components (PS3 Eye cameras or similar), how many units can we afford to build?

How much CPU do we need? Can we share CPUs in a piano-lab environment? Is that the best environment in which to collect data?

Could we set up in a practice room or two to which only project participants have access? Do I need to be there whenever data are collected? What about a take-home system for a smaller, more expert set of pianists (with larger repertoires)?

Do we need to control the repertoire and display it on a screen? This is not really necessary for the collection system: we don't really care what the user plays, since we can characterize it later from the MIDI stream. We might suggest they play predominantly finger-legato single-note lines using both hands. Initially, at least, I am thinking of ignoring chords and focusing on the melodic interplay between the two hands. This was suggested by someone at DHCS.
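To make the "characterize it later" step concrete, here is a hedged sketch of the kind of post-hoc analysis I have in mind, assuming the mido Python library (any MIDI event stream with note-on/off times would do). It flags near-simultaneous onsets as chords and counts overlapping successive notes as legato transitions; the 30 ms chord window is a guess.

    import mido  # assumed MIDI library; any note-on/off stream would do

    def characterize(midi_path, chord_window=0.03):
        """Rough post-hoc profile of what the player actually played:
        how many notes, how chordal, and how legato the lines were."""
        notes = []    # (onset, offset, pitch), times in seconds
        pending = {}  # pitch -> onset of the currently sounding note
        t = 0.0
        for msg in mido.MidiFile(midi_path):  # yields messages in order
            t += msg.time
            if msg.type == "note_on" and msg.velocity > 0:
                pending[msg.note] = t
            elif msg.type in ("note_off", "note_on"):  # vel 0 == note off
                if msg.note in pending:
                    notes.append((pending.pop(msg.note), t, msg.note))
        notes.sort()
        pairs = list(zip(notes, notes[1:]))
        chords = sum(1 for a, b in pairs if b[0] - a[0] < chord_window)
        legato = sum(1 for a, b in pairs
                     if a[1] >= b[0] > a[0] + chord_window)
        return {"notes": len(notes), "chordal_onsets": chords,
                "legato_transitions": legato}

On a take-home system this could run after each session and tell us whether the suggested single-line, two-hand material is what we actually got.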

Another thought is a three-camera, one-hand tracking system, with one camera staring down the length of the keyboard in lieu of the hand-mounted cameras. Come to think of it, having the cameras in motion may itself be a significant problem.

One last interesting approach is worth mentioning. Check out this video:

And this one:

On the surface, this looks insanely promising for our purposes. (I especially like how he complains about the thumb occluding the other fingers.) I am thinking about contacting David Kim, who is now a post-doc with Microsoft Research in Cambridge, UK.

More details in the following article:

Kim, D., Hilliges, O., Izadi, S., Butler, A., Chen, J., Oikonomidis, I., & Olivier, P. (2012). Digits: Freehand 3D interactions anywhere using a wrist-worn gloveless sensor. In UIST ’12: Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology (pp. 167–176). Honolulu, Hawaii, USA: Association for Computing Machinery.

1 comment:

  1. Watching some people actually play the piano, I have serious reservations about the wrist- and thumb-cam ideas. I think the keys and the edge of the piano would often occlude the wrist cam, and there would likely be occasions where the thumb cam can't see the other thumb. Also, it is no less intrusive than the IMU approach and is likely to be more error-prone.

    And about displaying the score (and controlling the repertoire): this would probably be necessary at least in the early prototyping stages, as we will want the players to correct the system to help with error analysis. This might also be a core component in a semi-automated annotation workflow, which might be the best we are able to come up with. I somewhat doubt it will be enough to show the player a raw sequence of pitches (even on the staff) with a sequence of detected fingerings to be verified. I would think they would want the result we show to look like the printed music they are playing, with rhythms, measures, note groupings, etc. Otherwise, the verification step will be too hard. But we should do some sort of usability study with our elite corps of pianists.
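    If we do go the semi-automated route, the record handed to the verification step might look something like this minimal Python sketch (names illustrative, not a committed design): each detected note carries the system's fingering guess plus a slot for the pianist's correction.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class FingeringAnnotation:
            onset: float          # seconds, from the MIDI stream
            pitch: int            # MIDI note number
            hand: str             # "L" or "R"
            detected_finger: int  # 1-5, the system's guess
            confidence: float     # detector's own confidence estimate
            corrected_finger: Optional[int] = None  # set by the pianist

            @property
            def finger(self):
                """Verified fingering: the player's correction wins."""
                return self.corrected_finger or self.detected_finger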
