Posted by: boinappi, 1 month ago
Image by Olga Toltinova
What are the design potentials of sonification for human-robot interaction? Can sound design change how we encounter robots? Can we listen to their data and their knowledge, and how would this change our interaction with them? In our practice-led research project (Bauhaus-University Weimar/D-LAB & Technical University of Applied Sciences Augsburg/Hybrid Things Lab), we focus on sonified sounds for non-anthropomorphic robots (e.g. industrial manipulators) and investigate human-robot interaction through interactive demonstrators, explorations, and playful experiences.
In Robosonic Play we build on the promising results of our previous work (https://doi.org/10.1145/3611646) on nonlinear human-robot-material interaction scenarios, incorporating the performative aspects of human-robot collaboration into our further research. We focus on moments of vagueness, autonomy, embodied interaction, and the machine's data space as material for an augmented sonic presence. By turning the inside of the machine (information in the form of data) out (in the form of sound), we create an additional layer of information that can help human collaborators better understand their computational counterparts: their autonomous actions (e.g. movement data) and their representation of the world (e.g. sensory data).
Image by Timo Holzmann
Together we will listen to a moving robot, create mappings between sound and machine, and tweak some knobs on a synth.
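To give a flavor of what such a mapping can look like, here is a minimal sketch in Python. It is purely illustrative and not the project's actual pipeline: the velocity range, pitch range, and the exponential pitch scale are assumptions chosen for the example, standing in for a joint's angular velocity being sonified as pitch and loudness.

```python
# Hypothetical sketch: map a robot joint's angular velocity (rad/s)
# to synthesizer parameters (pitch in Hz, amplitude 0..1).
# All parameter ranges here are illustrative assumptions.

def map_velocity_to_sound(velocity, v_max=2.0, f_min=110.0, f_max=880.0):
    """Map |velocity| to a pitch on an exponential scale; faster motion is louder."""
    norm = min(abs(velocity) / v_max, 1.0)       # normalize speed to 0..1
    pitch = f_min * (f_max / f_min) ** norm      # exponential (musical) pitch scale
    amplitude = norm                             # faster motion -> louder sound
    return pitch, amplitude

# A short stream of velocity samples, as might arrive from a manipulator:
for v in [0.0, 0.5, 1.0, 2.0]:
    pitch, amp = map_velocity_to_sound(v)
    print(f"v={v:+.2f} rad/s -> {pitch:6.1f} Hz, amp {amp:.2f}")
```

In a live setup, the computed parameters would typically be streamed to a synthesizer (for example via OSC or MIDI) rather than printed; the exponential pitch scale is used so that equal changes in speed are heard as equal musical intervals.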