1. Singing in birds is accompanied by beak, head and throat movements. These visual cues have long been hypothesised to facilitate vocal communication, including social interactions and song acquisition, but their role has seen little experimental study.
2. To address whether audio-visual cues are relevant for birdsong, we used high-speed video recording, 3D scanning, 3D printing and colour-realistic painting to create RoboFinch, an open-source, adult-mimicking robot that matches the temporal and chromatic properties of songbird vision. We exposed several groups of juvenile zebra finches during their song developmental phase to one of six singing robots that moved their beaks in synchrony with song playback, and compared them with birds in an asynchronous treatment and two control treatments.
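The published RoboFinch software is not reproduced here, but the core synchronisation idea, streaming a beak-gape trajectory extracted from high-speed video to a servo while the tutor song plays from a shared clock, can be illustrated with a minimal Python sketch. The file names, CSV format, audio backend and `set_beak_angle` driver below are assumptions for illustration, not the actual RoboFinch implementation.

```python
# Minimal sketch (not the published RoboFinch code): play a tutor song while
# streaming a precomputed beak-gape trajectory to a servo, keeping motion
# locked to the audio clock. File names and the servo write are assumptions.
import csv
import time

import simpleaudio  # plays the tutor song WAV; any audio backend would do


def set_beak_angle(angle_deg: float) -> None:
    """Placeholder for the servo driver (e.g. a PWM servo board)."""
    print(f"beak -> {angle_deg:5.1f} deg")


def load_trajectory(path: str):
    """Read (time_s, gape_deg) pairs extracted from high-speed video."""
    with open(path, newline="") as f:
        return [(float(t), float(a)) for t, a in csv.reader(f)]


def sing(song_wav: str, trajectory_csv: str) -> None:
    trajectory = load_trajectory(trajectory_csv)
    playback = simpleaudio.WaveObject.from_wave_file(song_wav).play()
    t0 = time.monotonic()  # audio and motion share this reference clock
    for t, angle in trajectory:
        # Sleep until this frame's timestamp relative to playback onset,
        # so beak movement stays synchronised with the song.
        delay = t - (time.monotonic() - t0)
        if delay > 0:
            time.sleep(delay)
        set_beak_angle(angle)
    playback.wait_done()


if __name__ == "__main__":
    sing("tutor_song.wav", "beak_trajectory.csv")
```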
3. Juveniles in the synchronised treatment approached the robot setup from the start of the experiment and progressively increased the time they spent singing, in contrast to the other treatment groups. Interestingly, birds in the synchronised group appeared to listen actively during tutor song playback, as they sang less during the playback itself than birds in the asynchronous and audio-only control treatments.
4. Our open-source RoboFinch setup thus provides an unprecedented tool for the systematic study of the functionality and integration of audio-visual cues associated with song behaviour. Realistic head and beak movements aligned to specific song elements may allow future studies to assess the importance of multisensory cues during song development, sexual signalling and social behaviour. All software and assembly instructions are open source, and the robot can be easily adapted to other species. Experimental manipulations of stimulus combinations and synchronisation, as sketched below, can further elucidate how audio-visual cues are integrated by receivers and how they may enhance signal detection, recognition, learning and memory.
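As one hedged illustration of such a manipulation, building on the earlier sketch: shifting every trajectory timestamp by a fixed offset turns the synchronised stimulus into an asynchronous one while leaving the audio untouched. The `shift_trajectory` helper is hypothetical, not part of the published software.

```python
# Hypothetical manipulation of audio-visual synchrony: delay (positive offset)
# or advance (negative offset) beak motion relative to the song playback.
def shift_trajectory(trajectory, offset_s: float):
    """Return a copy of (time_s, gape_deg) pairs shifted by offset_s seconds."""
    return [(t + offset_s, angle) for t, angle in trajectory]
```

Varying `offset_s` across treatment groups would let an experimenter probe how much temporal misalignment receivers tolerate before the visual cue stops aiding detection, recognition or learning.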