Mobile devices have become a promising platform for musical performance thanks to the variety of sensors available on board. In particular, mobile cameras provide rich input, as they can capture a wide range of user gestures and environment dynamics. However, raw camera input only yields continuous parameters and requires expensive computation. With Phone with the Flow (PwF), we propose combining camera-based motion/gesture input with touch input in order to filter movement information both temporally and spatially, increasing expressiveness while reducing computation time.
PwF is currently implemented as an Android app. You can download the APK file here. The source code will be available soon.
Every touch input activates a region of interest (ROI) in the camera image, and the movements captured in each ROI are then sonified. Sound synthesis can either run directly on the mobile device, with restrictions on the complexity of the synthesis due to limited computing capabilities, or the motion features can be sent to external musical software via Open Sound Control (OSC) messages. For further information about the mappings used in the app, you can download the Pure Data patch and control it by enabling the OSC output of PwF.
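To illustrate the idea of touch-activated ROIs, here is a minimal sketch (not taken from the PwF source) of how motion inside a ROI could be extracted by simple frame differencing. The function name `roi_motion` and the ROI size are assumptions for illustration; the resulting value could then be mapped to a synthesis parameter or sent out as an OSC message.

```python
import numpy as np

def roi_motion(prev: np.ndarray, curr: np.ndarray,
               x: int, y: int, size: int = 32) -> float:
    """Mean absolute pixel change in a ROI centered on the touch point,
    normalized to [0, 1]. `prev`/`curr` are grayscale frames (2-D uint8)."""
    h, w = curr.shape
    # Clamp the ROI to the frame borders.
    x0, x1 = max(0, x - size // 2), min(w, x + size // 2)
    y0, y1 = max(0, y - size // 2), min(h, y + size // 2)
    # Cast to a signed type before subtracting to avoid uint8 wraparound.
    a = prev[y0:y1, x0:x1].astype(np.int16)
    b = curr[y0:y1, x0:x1].astype(np.int16)
    return float(np.abs(b - a).mean() / 255.0)

# Example: a bright patch appears inside the ROI between two frames.
prev = np.zeros((240, 320), dtype=np.uint8)
curr = prev.copy()
curr[100:120, 150:170] = 255
amount = roi_motion(prev, curr, x=160, y=110)
```

Restricting the computation to the touched ROI is what keeps the per-frame cost low: only a small window of pixels is processed instead of the full camera image.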
For the built-in sound synthesis to work, a smartphone with an octa-core processor and 4 GB of RAM is recommended. Otherwise, you can use the OSC output together with the patch provided above. If you are interested in porting the app to iOS, you are more than welcome to contact me.