Asset Details
When Action Speaks Louder than Words: Exploring Non-Verbal and Paraverbal Features in Dyadic Collaborative VR
by Osei Tutu, Dennis; Habibiabad, Sepideh; Bombeke, Klaas; Van den Noortgate, Wim; Saldien, Jelle
in Adult / Artificial intelligence / Attention - physiology / Collaboration / Communication / Comparative analysis / Cooperative Behavior / Design / Facial Expression / Feedback / Female / Gestures / Humans / Male / Mechanics / multimodal behavior tracking / Nonverbal Communication - physiology / Sensors / Soft skills / soft skills training / Speech - physiology / User behavior / Virtual Reality / VR interaction analysis / Young Adult
2025
Journal Article
Overview
Soft skills such as communication and collaboration are vital in both professional and educational settings, yet difficult to train and assess objectively. Traditional role-playing scenarios rely heavily on subjective trainer evaluations—either in real time, where subtle behaviors are missed, or through time-intensive post hoc analysis. Virtual reality (VR) offers a scalable alternative by immersing trainees in controlled, interactive scenarios while simultaneously capturing fine-grained behavioral signals. This study investigates how task design in VR shapes non-verbal and paraverbal behaviors during dyadic collaboration. We compared two puzzle tasks: Task 1, which provided shared visual access and dynamic gesturing, and Task 2, which required verbal coordination through separation and turn-taking. From multimodal tracking data, we extracted features including gaze behaviors (eye contact, joint attention), hand gestures, facial expressions, and speech activity, and compared them across tasks. A clustering analysis explored whether tasks could be differentiated by their behavioral profiles. Results showed that Task 2, the more constrained condition, led participants to focus more visually on their own workspaces, suggesting that interaction difficulty can reduce partner-directed attention. Gestures were more frequent in shared-visual tasks, while speech became longer and more structured when turn-taking was enforced. Joint attention increased when participants relied on verbal descriptions rather than on a visible shared reference. These findings highlight how VR can elicit distinct soft skill behaviors through scenario design, enabling data-driven analysis of collaboration. This work contributes to scalable assessment frameworks with applications in training, adaptive agents, and human-AI collaboration.
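The clustering analysis mentioned in the abstract can be illustrated with a minimal sketch: per-dyad behavioral profiles are clustered to see whether the two task conditions separate. All feature names and values below are invented for illustration (loosely following the reported trends of more gestures in Task 1 and longer utterances in Task 2); this is not the authors' pipeline.

```python
# Hypothetical per-dyad feature vectors:
# [eye_contact_rate, gesture_count, mean_utterance_len_s, joint_attention_rate]
task1 = [[0.42, 31, 3.1, 0.18], [0.39, 28, 2.8, 0.21], [0.45, 35, 3.4, 0.16]]  # shared view
task2 = [[0.21, 12, 6.9, 0.34], [0.18, 10, 7.4, 0.31], [0.24, 14, 6.2, 0.37]]  # turn-taking
X = task1 + task2

def kmeans2(points, iters=20):
    """Tiny 2-means with deterministic init (first and last point)."""
    centers = [points[0], points[-1]]
    for _ in range(iters):
        groups = ([], [])
        # Assign each profile to its nearest center (squared Euclidean distance).
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[dists.index(min(dists))].append(p)
        # Recompute each center as the mean of its assigned profiles.
        centers = [
            [sum(col) / len(g) for col in zip(*g)] if g else c
            for g, c in zip(groups, centers)
        ]
    return [
        min(range(2), key=lambda k: sum((a - b) ** 2 for a, b in zip(p, centers[k])))
        for p in points
    ]

labels = kmeans2(X)
print(labels)  # with these synthetic profiles, the two tasks land in different clusters
```

In practice one would standardize the features first (gesture counts dominate rates on a raw scale) and use a library implementation, but the sketch shows how distinct behavioral profiles can make the task conditions recoverable without labels.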
Publisher
MDPI AG