Roberto Bresin

Orcid: 0000-0002-3086-0322

According to our database, Roberto Bresin authored at least 51 papers between 1992 and 2023.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2023
Introduction to the special issue on design and perception of interactive sonification.
J. Multimodal User Interfaces, December, 2023

Correction to: PepperOSC: enabling interactive sonification of a robot's expressive movement.
J. Multimodal User Interfaces, December, 2023

PepperOSC: enabling interactive sonification of a robot's expressive movement.
J. Multimodal User Interfaces, December, 2023

Hearing it Out: Guiding Robot Sound Design through Design Thinking.
Proceedings of the 32nd IEEE International Conference on Robot and Human Interactive Communication, 2023

Persuasive Polite Robots in Free-Standing Conversational Groups.
Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2023

Investigating the Role of Robot Voices and Sounds in Shaping Perceived Intentions.
Proceedings of the International Conference on Human-Agent Interaction, 2023

2022
Perceptual Evaluation of Blended Sonification of Mechanical Robot Sounds Produced by Emotionally Expressive Gestures: Augmenting Consequential Sounds to Improve Non-verbal Robot Communication.
Int. J. Soc. Robotics, 2022

2020
Sonification of the self vs. sonification of the other: Differences in the sonification of performed vs. observed simple hand movements.
Int. J. Hum. Comput. Stud., 2020

2019
Introduction to the special issue on interactive sonification.
J. Multimodal User Interfaces, 2019

Correction to: Haptic feedback combined with movement sonification using a friction sound improves task performance in a virtual throwing task.
J. Multimodal User Interfaces, 2019

Haptic feedback combined with movement sonification using a friction sound improves task performance in a virtual throwing task.
J. Multimodal User Interfaces, 2019

Interactive sonification of a fluid dance movement: an exploratory study.
J. Multimodal User Interfaces, 2019

Sound Forest: Evaluation of an Accessible Multisensory Music Installation.
Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2019

2016
Mind the Gap: A SIG on Bridging the Gap in Research on Body Sensing, Body Perception and Multisensory Feedback.
Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 2016

2015
Effects of Interactive Sonification on Emotionally Expressive Walking Styles.
IEEE Trans. Affect. Comput., 2015

Demo hour.
Interactions, 2015

Nebula: An Interactive Garment Designed for Functional Aesthetics.
Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, 2015

2014
Modes of Sonic Interaction in Circus: Three Proofs of Concept.
Proceedings of the Music Technology meets Philosophy, 2014

2013
Systems for Interactive Control of Computer Generated Music Performance.
Guide to Computing for Expressive Music Performance, 2013

Evaluation of Computer Systems for Expressive Music Performance.
Guide to Computing for Expressive Music Performance, 2013

2012
Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices.
J. Multimodal User Interfaces, 2012

Interactive sonification of expressive hand gestures on a handheld device.
J. Multimodal User Interfaces, 2012

Interactive sonification.
J. Multimodal User Interfaces, 2012

2011
MoodifierLive: Interactive and Collaborative Expressive Music Performance on Mobile Devices.
Proceedings of the 11th International Conference on New Interfaces for Musical Expression, 2011

2010
Communication of musical expression by means of mobile robot gestures.
J. Multimodal User Interfaces, 2010

The Skipproof Virtual Turntable for High-Level Control of Scratching.
Comput. Music. J., 2010

2009
Sound design and perception in walking interactions.
Int. J. Hum. Comput. Stud., 2009

User-Centric Context-Aware Mobile Applications for Embodied Music Listening.
Proceedings of the User Centric Media - First International Conference, 2009

2008
Sonic interaction design: sound, information and experience.
Extended Abstracts of the 2008 Conference on Human Factors in Computing Systems, 2008

2007
A Virtual Head Driven by Music Expressivity.
IEEE Trans. Speech Audio Process., 2007

Expressive Control of Music and Visual Media by Full-Body Movement.
Proceedings of the Seventh International Conference on New Interfaces for Musical Expression, 2007

Emerging Sounds for Disappearing Computers.
The Disappearing Computer, 2007

Sound Design for Affective Interaction.
Proceedings of the Affective Computing and Intelligent Interaction, 2007

User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements.
Proceedings of the Affective Computing and Intelligent Interaction, 2007

Displaying Expression in Musical Performance by Means of a Mobile Robot.
Proceedings of the Affective Computing and Intelligent Interaction, 2007

2006
Mapping strategies in DJ scratching.
Proceedings of the New Interfaces for Musical Expression, 2006

Affective diary: designing for bodily expressiveness and self-reflection.
Extended Abstracts of the 2006 Conference on Human Factors in Computing Systems, 2006

2005
What is the Color of that Music Performance?
Proceedings of the 2005 International Computer Music Conference, 2005

From Acoustic Cues to an Expressive Agent.
Proceedings of the Gesture in Human-Computer Interaction and Simulation, 2005

2004
Rencon 2004: Turing Test for Musical Expression.
Proceedings of the New Interfaces for Musical Expression, 2004

2003
Sounding Objects.
IEEE Multim., 2003

After the first year of Rencon.
Proceedings of the 2003 International Computer Music Conference, 2003

Analysis of a Genuine Scratch Performance.
Proceedings of the Gesture-Based Communication in Human-Computer Interaction, 2003

2002
Automatic Real-Time Extraction of Musical Expression.
Proceedings of the 2002 International Computer Music Conference, 2002

2001
Articulation Rules For Automatic Music Performance.
Proceedings of the 2001 International Computer Music Conference, 2001

2000
Emotional Coloring of Computer-Controlled Music Performances.
Comput. Music. J., 2000

Software Tools for Musical Expression.
Proceedings of the 2000 International Computer Music Conference, 2000

Rule-Based Emotional Coloring of Music Performance.
Proceedings of the 2000 International Computer Music Conference, 2000

1994
Neural Networks for Musical Tones Compression, Control and Synthesis.
Proceedings of the 1994 International Computer Music Conference, 1994

Neural Networks vs. Rules System: Evaluation of Test of Automatic Performance.
Proceedings of the 1994 International Computer Music Conference, 1994

1992
Symbolic and Sub-Symbolic Rules Systems for Real-Time Score Performance.
Proceedings of the 1992 International Computer Music Conference, 1992
