Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
21 result(s) for "Human-robot interaction Safety measures."
Analysis of Deep-Learning Methods in an ISO/TS 15066–Compliant Human–Robot Safety Framework
2025
In recent years, collaborative robots have seen great success in manufacturing applications where human and robot work together in close proximity. However, current ISO/TS 15066-compliant implementations often limit the efficiency of collaborative tasks due to conservative speed restrictions. This paper therefore introduces a deep-learning-based human–robot safety framework (HRSF) that dynamically adapts robot velocities to the separation distance between human and robot while respecting maximum biomechanical force and pressure limits. The applicability of the framework was investigated for four deep learning approaches to human body extraction: human body recognition, human body segmentation, human pose estimation, and human body part segmentation. Unlike conventional industrial safety systems, the proposed HRSF differentiates individual human body parts from other objects, enabling optimized robot process execution. Experiments demonstrated a quantitative reduction in cycle time of up to 15% compared to conventional safety technology.
Journal Article
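The core idea of scaling robot velocity with human–robot separation distance can be sketched as follows. This is a minimal illustration only: the stop and full-speed distances are made-up placeholders, not values from the paper's HRSF or from the ISO/TS 15066 formula.

```python
def speed_scale(d, d_stop=0.5, d_full=2.0):
    """Linear velocity scale factor from human-robot separation d (m).

    d_stop and d_full are illustrative placeholder distances, not
    values from ISO/TS 15066 or the paper's HRSF.
    """
    if d <= d_stop:
        return 0.0          # human too close: stop the robot
    if d >= d_full:
        return 1.0          # far enough away: full programmed speed
    return (d - d_stop) / (d_full - d_stop)  # linear ramp in between
```

A body-part-aware framework would select different limits per body region before applying such a scaling, which is what distinguishes the HRSF from treating the human as a single obstacle.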
Considerations on the Dynamics of Biofidelic Sensors in the Assessment of Human–Robot Impacts
by
Faglia, Rodolfo
,
Fassi, Irene
,
Valori, Marcello
in
biofidelic sensor
,
Biomechanics
,
Collaboration
2024
Ensuring the safety of physical human–robot interaction (pHRI) is of utmost importance for industries and organisations seeking to incorporate robots into their workspaces. To address this concern, ISO/TS 15066:2016 outlines hazard analysis and preventive measures for ensuring safety in Human–Robot Collaboration (HRC). To analyse human–robot contact, it is common practice to evaluate the “transient” and “quasi-static” contact phases separately. Accurately measuring transient forces during close human–robot collaboration requires so-called “biofidelic” sensors that closely mimic human tissue properties, featuring adequate bandwidth and balanced damping. This research explores the dynamics of physical human–robot interaction as measured with such biofidelic devices. In this paper, one biofidelic sensor is tested to analyse its dynamic characteristics and to identify the main factors influencing its performance and its practical applications for testing. To this aim, sensor parameters such as the natural frequency and damping coefficient are estimated by using a custom physical pendulum setup to impact the sensor. Mathematical models developed to characterise the sensor system and pendulum dynamics are also presented.
Journal Article
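Natural frequency and damping coefficient can be estimated from the decaying free response recorded after a pendulum impact, via the logarithmic decrement. A minimal sketch, assuming a single-mode, linearly damped response (not the paper's actual identification procedure):

```python
import math

def estimate_params(peak1, peak2, period):
    """Estimate damping ratio and natural frequency (rad/s) from two
    successive peak amplitudes of a damped free oscillation and the
    time between them, via the logarithmic decrement. Assumes a
    single-mode, linearly damped response."""
    delta = math.log(peak1 / peak2)                  # logarithmic decrement
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
    omega_d = 2 * math.pi / period                   # damped frequency
    omega_n = omega_d / math.sqrt(1 - zeta ** 2)     # natural frequency
    return zeta, omega_n
```

In practice one would average over several peak pairs to reduce noise sensitivity before trusting the estimate.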
VR Human-Centric Winter Lane Detection: Performance and Driving Experience Evaluation
by
Paderewski, Patricia
,
Gutierrez-Vela, Francisco
,
Ortegon-Sarmiento, Tatiana
in
Adult
,
advanced driver assistance systems
,
Algorithms
2025
Driving in snowy conditions challenges both human drivers and autonomous systems. Snowfall and ice accumulation impair vehicle control and affect driver perception and performance. Road markings are often obscured, forcing drivers to rely on intuition and memory to stay in their lane, which can lead to encroachment into adjacent lanes or sidewalks. Current lane detectors assist in lane keeping, but their performance is compromised by visual disturbances such as ice reflection, snowflake movement, fog, and snow cover. Furthermore, testing these systems with users on actual snowy roads involves risks to driver safety, equipment integrity, and ethical compliance. This study presents a low-cost virtual reality simulation for evaluating winter lane detection in controlled and safe conditions from a human-in-the-loop perspective. Participants drove in a simulated snowy scenario with and without the detector while quantitative and qualitative variables were monitored. Results showed a 49.9% reduction in unintentional lane departures with the detector and significantly improved user experience, as measured by the UEQ-S (p = 0.023, Cohen’s d = 0.72). Participants also reported higher perceived safety, situational awareness, and confidence. These findings highlight the potential of vision-based lane detection systems adapted to winter environments and demonstrate the value of immersive simulations for user-centered testing of ADASs.
Journal Article
Human-Robot Interaction and Collaboration (HRI-C) Utilizing Top-View RGB-D Camera System
2021
In this study, a smart and affordable system that uses an RGB-D camera to measure the exact position of an operator relative to an adjacent robotic manipulator was developed. The technology was implemented in a simulated human operation alongside an automated manufacturing robot to achieve two goals: enhancing the safety measures around the robot with an affordable smart system for human detection and robot control, and developing a system that allows human–robot collaboration to complete a predefined task. The system used an Xbox Kinect V2 sensor/camera and a Scorbot ER-V Plus to model and mimic the selected applications. To achieve these goals, a geometric model for the Scorbot and Xbox Kinect V2 was developed, a robot joint calibration was applied, a background-segmentation algorithm was used to detect the operator, a dynamic binary mask for the robot was implemented, and the efficiency of both systems was analysed in terms of response time and localization error. The first application, an add-on safety device, monitors the workspace and controls the robot to avoid collisions when an operator enters or approaches. This application reduces or removes physical barriers around robots, expands the physical work area, reduces proximity limitations, and enhances human–robot interaction (HRI) in an industrial environment while sustaining a low cost. The system responded to human intrusion and prevented collisions within 500 ms on average, and its bottleneck was found to be the PC–robot inter-communication speed. The second application was a successful collaborative scenario between a robot and a human operator, in which the robot deposits an object on the operator’s hand, mimicking a real-life human–robot collaboration (HRC) task.
The system detected the operator’s hand and its location and then commanded the robot to place an object on the hand; it placed the object with a mean error of 2.4 cm, the limiting factors being internal variables and the data transmission speed between the robot controller and the main computer. These results are encouraging, and ongoing work aims to experiment with different operations and implement gesture detection in real-time collaboration tasks while keeping the human operator safe and predicting their behavior.
Journal Article
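The detection principle described above, background subtraction on a depth image with the robot masked out, can be sketched as follows. This is a simplified stand-in using plain thresholding; the threshold and pixel-count parameters are illustrative, not from the study.

```python
import numpy as np

def intrusion_mask(depth, background, robot_mask, thresh=0.05):
    """Pixels closer to the camera than the empty-scene background by
    more than thresh (m), excluding pixels covered by the robot's
    dynamic binary mask. Threshold value is illustrative."""
    changed = (background - depth) > thresh   # something entered the scene
    return changed & ~robot_mask

def operator_present(depth, background, robot_mask, min_pixels=50):
    """Declare an intrusion when enough foreground pixels remain."""
    return int(intrusion_mask(depth, background, robot_mask).sum()) >= min_pixels
```

A production system would additionally filter the mask morphologically and track blobs over time before commanding the robot to slow or stop.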
Mixed reality representation of hazard zones while collaborating with a robot: sense of control over own safety
by
San Martin, Ane
,
Lazkano, Elena
,
Kildal, Johan
in
Artificial Intelligence
,
Collaboration
,
Computer aided design
2025
Safety is the main concern in human-robot collaboration (HRC) in work environments. Standard safety measures based on reducing robot speed affect the productivity of collaboration and do not adequately inform workers about the state of the robot, leading to stressful situations due to uncertainty. To grant the user control over safety, we investigate using audio, visual and audio-visual mixed reality displays that communicate the boundaries of zones with different levels of hazard. We describe the design of the hazard displays for a scenario of collaboration with a real robot. We then report an experimental user study with 24 users, comparing the performance and user experience (UX) obtained with the auditory display, the visual display, and the audio-visual display that combines both. Findings suggest that all modalities are suitable for HRC scenarios, yielding similar performance during collaboration. However, distinct qualitative results were observed across displays, indicating differences in the UX obtained.
Journal Article
Safety Engineering for Humanoid Robots in Everyday Life—Scoping Review
2025
As humanoid robots move from controlled industrial environments into everyday human life, their safe integration is essential for societal acceptance and effective human–robot interaction (HRI). This scoping review examines engineering safety frameworks for humanoid robots across four core domains: (1) physical safety in HRI, (2) cybersecurity and software robustness, (3) safety standards and regulatory frameworks, and (4) ethical and societal implications. In the area of physical safety, recent research trends emphasize proactive, multimodal perception-based collision avoidance, the use of compliance mechanisms, and fault-tolerant control to handle hardware failures and falls. In cybersecurity and software robustness, studies increasingly address the full threat landscape, secure real-time communication, and reliability of artificial intelligence (AI)-based control. The analysis of standards and regulations reveals a lag between technological advances and the adaptation of key safety standards in current research. Ethical and societal studies show that safety is also shaped by user trust, perceived safety, and data protection. Within the corpus of 121 peer-reviewed studies published between 2021 and 2025 and included in this review, most work concentrates on physical safety, while cybersecurity, standardization, and socio-ethical aspects are addressed less frequently. These gaps point to the need for more integrated, cross-domain approaches to safety engineering for humanoid robots.
Journal Article
Safe pHRI via the Variable Stiffness Safety-Oriented Mechanism (V2SOM): Simulation and Experimental Validations
2020
Robots are gaining a foothold day-by-day in different areas of people’s lives. Collaborative robots (cobots) need to display human-like dynamic performance. Thus, the question of safety during physical human–robot interaction (pHRI) arises. Herein, we propose making serial cobots intrinsically compliant to guarantee safe pHRI via our novel designed device, V2SOM (variable stiffness safety-oriented mechanism). Integrating this new device at each rotary joint of the serial cobot ensures a safe pHRI and reduces the drawbacks of making robots compliant. Thanks to its two continuously linked functional modes—high and low stiffness—V2SOM presents a high inertia decoupling capacity, which is a necessary condition for safe pHRI. The high stiffness mode eases the control without disturbing the safety aspect. Once a human–robot (HR) collision occurs, a spontaneous and smooth shift to low stiffness mode is passively triggered to safely absorb the impact. To highlight V2SOM’s effect in safety terms, we consider two complementary safety criteria: impact force (ImpF) criterion and head injury criterion (HIC) for external and internal damage evaluation of blunt shocks, respectively. A pre-established HR collision model is built in Matlab/Simulink (v2018, MathWorks, France) in order to evaluate the latter criterion. This paper presents the first V2SOM prototype, with quasi-static and dynamic experimental evaluations.
Journal Article
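The head injury criterion (HIC) mentioned above is a standard functional of the head's acceleration trace: the maximum over time windows of the window length times the mean acceleration in the window raised to the power 2.5. A brute-force sketch of its evaluation; the sampling step and window cap are illustrative assumptions, not values from the paper.

```python
def hic(accel_g, dt, max_window=0.015):
    """Head Injury Criterion of a sampled acceleration trace (in g):
    HIC = max over [t1, t2] of (t2 - t1) * (mean accel in window) ** 2.5,
    searched by brute force over all windows up to max_window seconds."""
    n = len(accel_g)
    cum = [0.0]                                   # trapezoidal integral of a(t)
    for i in range(1, n):
        cum.append(cum[-1] + 0.5 * (accel_g[i] + accel_g[i - 1]) * dt)
    max_steps = int(round(max_window / dt))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(n, i + max_steps + 1)):
            t = (j - i) * dt
            # mean acceleration in the window; negative means don't contribute
            avg = max((cum[j] - cum[i]) / t, 0.0)
            best = max(best, t * avg ** 2.5)
    return best
```

For long traces an O(n log n) or sliding-window formulation would replace the quadratic search, but the brute-force version makes the definition explicit.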
Implementation of a Sponge-Based Flexible Electronic Skin for Safe Human–Robot Interaction
2022
In current industrial production, robots have increasingly been taking the place of manual workers. With the improvements in production efficiency, accidents involving operators occur frequently. In this study, a flexible sensor system was designed to improve the safety performance of a collaborative robot. The flexible sensors, which were made by adsorbing graphene into a sponge, could accurately convert the pressure on a contact surface into a numerical signal. Ecoflex was selected as the substrate material for the sensing array so that the sensors could better adapt to the sensing application scenario of the robot arm. A 3D-printed mold was used to prepare the flexible substrate of the sensors, which made the positioning of each part within the sensors more accurate and ensured the uniformity of the sensing array. The sensing unit showed a correspondence between the input force and the output resistance in the range of 0–5 N. Our stability and reproducibility experiments indicated that the sensors had good stability. In addition, a tactile acquisition system was designed to sample the tactile data from the sensor array. Our interaction experiment results showed that the proposed electronic skin could provide an efficient approach to secure human–robot interaction.
Journal Article
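A sensing unit with a force-resistance correspondence like this is typically used with a calibration curve that maps resistance readings back to force. A minimal least-squares sketch, assuming a roughly linear response over the 0–5 N range; the linear model and the example values are assumptions, not data from the paper.

```python
def calibrate(forces, resistances):
    """Least-squares fit of R = a * F + b from calibration pairs,
    assuming a roughly linear force-resistance response."""
    n = len(forces)
    mf = sum(forces) / n
    mr = sum(resistances) / n
    a = (sum((f - mf) * (r - mr) for f, r in zip(forces, resistances))
         / sum((f - mf) ** 2 for f in forces))
    b = mr - a * mf
    return a, b

def force_from_resistance(r, a, b):
    """Invert the fitted line to recover force from a resistance reading."""
    return (r - b) / a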
Dynamic Human–Robot Collision Risk Based on Octree Representation
by
Michalakis, George
,
Paraskevopoulos, Giorgos
,
Michalellis, Isidoros
in
Accidental collisions
,
AR applications
,
Augmented reality
2023
The automation of manufacturing applications where humans and robots operate in a shared environment imposes new challenges for presenting the operator’s safety and robot’s efficiency. Common solutions relying on isolating the robots’ workspace from human access during their operation are not applicable for HRI. This paper presents an extended reality-based method to enhance human cognitive awareness of the potential risk due to dynamic robot behavior towards safe human–robot collaborative manufacturing operations. A dynamic and state-aware occupancy probability map indicating the forthcoming risk of human–robot accidental collision in the 3D workspace of the robot is introduced. It is determined using octrees and is rendered in a virtual or augmented environment using Unity 3D. A combined framework allows the generation of both static zones (taking into consideration the entire configuration space of the robot) and dynamic zones (generated in real time by fetching the occupancy data corresponding to the robot’s current configuration), which can be utilized for short-term collision risk prediction. This method is then applied in a virtual environment of the workspace of an industrial robotic arm, and we also include the necessary technical adjustments for the method to be applied in an AR setting.
Journal Article