Name: MAURIZIO MANCINI
Affiliation: Casa Paganini – InfoMus, DIST, University of Genova, Viale Causa 13, 16145 Genova, Italy
Papers: 73
Collaborators: 119
Citations: 597
PageRank: 55.25
Referrers: 1041
Referees: 995
References: 628
Title | Citations | PageRank | Year
Social Interaction Data-sets in the Age of Covid-19: a Case Study on Digital Commensality | 0 | 0.34 | 2022
How ECA vs Human Leaders Affect the Perception of Transactive Memory System (TMS) in a Team | 0 | 0.34 | 2021
A Hitchhiker’s Guide towards Transactive Memory System Modeling in Small Group Interactions | 0 | 0.34 | 2021
Get Together in the Middle-earth: a First Step Towards Hybrid Intelligence Systems | 0 | 0.34 | 2021
When Emotions are Triggered by Single Musical Notes: Revealing the Underlying Factors of Auditory-Emotion Associations | 0 | 0.34 | 2021
Sonification of the self vs. sonification of the other: Differences in the sonification of performed vs. observed simple hand movements | 0 | 0.34 | 2020
Does embodied training improve the recognition of mid-level expressive movement qualities sonification? | 0 | 0.34 | 2019
A Computational Model for Managing Impressions of an Embodied Conversational Agent in Real-Time | 0 | 0.34 | 2019
Understanding Chromaesthesia by Strengthening Auditory-Visual-Emotional Associations | 0 | 0.34 | 2019
A VR Game-based System for Multimodal Emotion Data Collection | 0 | 0.34 | 2019
Managing an Agent's Self-Presentational Strategies During an Interaction | 0 | 0.34 | 2019
Computational Commensality: from theories to computational models for social food preparation and consumption in HCI | 2 | 0.51 | 2019
A Framework For Creative Embodied Interfaces | 0 | 0.34 | 2018
Change management in the ATM system: integrating information in the preliminary system safety assessment | 0 | 0.34 | 2016
The Dancer in the Eye: Towards a Multi-Layered Computational Framework of Qualities in Movement | 13 | 0.80 | 2016
A system to support the learning of movement qualities in dance: a case study on dynamic symmetry | 2 | 0.38 | 2016
Analysis of Intrapersonal Synchronization in Full-Body Movements Displaying Different Expressive Qualities | 1 | 0.37 | 2016
Automated Laughter Detection From Full-Body Movements | 7 | 0.51 | 2016
Designing Multimodal Interactive Systems using EyesWeb XMI | 3 | 0.53 | 2016
Towards a Multimodal Repository of Expressive Movement Qualities in Dance | 2 | 0.39 | 2016
Laughing with a Virtual Agent | 1 | 0.36 | 2015
Perception of intensity incongruence in synthesized multimodal expressions of laughter | 1 | 0.35 | 2015
Automated Detection of Impulsive Movements in HCI | 6 | 0.56 | 2015
Gesture mimicry in expression of laughter | 0 | 0.34 | 2015
Social retrieval of music content in multi-user performance | 0 | 0.34 | 2015
Lol - Laugh Out Loud | 1 | 0.35 | 2015
How is your laugh today? | 3 | 0.53 | 2014
Rhythmic Body Movements of Laughter | 7 | 0.53 | 2014
Effects of Gender Mapping on the Perception of Emotion from Upper Body Movement in Virtual Characters | 0 | 0.34 | 2014
Multimodal Analysis of Laughter for an Interactive System | 6 | 0.53 | 2013
Laugh When You're Winning | 5 | 0.48 | 2013
Towards Automated Full Body Detection of Laughter Driven by Human Expert Annotation | 4 | 0.47 | 2013
How Action Adapts to Social Context: The Movements of Musicians in Solo and Ensemble Conditions | 2 | 0.38 | 2013
MMLI: Multimodal Multiperson Corpus of Laughter in Interaction | 13 | 0.70 | 2013
Interactive reflexive and embodied exploration of sound qualities with BeSound | 0 | 0.34 | 2013
A system for mobile music authoring and active listening | 2 | 0.45 | 2013
Studying the Effect of Creative Joint Action on Musicians' Behavior | 0 | 0.34 | 2013
The 3rd international workshop on social behaviour in music: SBM2012 | 0 | 0.34 | 2012
Computing and evaluating the body laughter index | 17 | 1.13 | 2012
Embodied cooperation using mobile devices: presenting and evaluating the Sync4All application | 0 | 0.34 | 2012
Modelling and Analysing Creative Communication within Groups of People: The Artistic Event at FET11 | 0 | 0.34 | 2011
Evaluation of the Mobile Orchestra Explorer Paradigm | 2 | 0.38 | 2011
A System for Mobile Active Music Listening Based on Social Interaction and Embodiment | 9 | 0.66 | 2011
The Mobile Orchestra Explorer | 0 | 0.34 | 2011
An Invisible Line: Remote Communication Using Expressive Behavior | 0 | 0.34 | 2011
Realtime Expressive Movement Detection Using the EyesWeb XMI Platform | 1 | 0.35 | 2011
User-Centered Evaluation of the Virtual Binocular Interface | 1 | 0.40 | 2011
Human movement expressivity for mobile active music listening | 2 | 0.45 | 2010
Cross-disciplinary approaches to multimodal user interfaces | 1 | 0.48 | 2010
Greta, une plateforme d'agent conversationnel expressif et interactif | 3 | 0.40 | 2010