735 result(s) for "Extended reality"
Integrating Virtual, Mixed, and Augmented Reality to Human–Robot Interaction Applications Using Game Engines: A Brief Review of Accessible Software Tools and Frameworks
This article identifies and summarizes software tools and frameworks proposed in the Human–Robot Interaction (HRI) literature for developing extended reality (XR) experiences using game engines. This review includes primary studies proposing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions in which humans can control or interact with real robotic platforms using devices that extend the user's reality. The objective of this article is not to present an extensive list of applications and tools. Instead, we present recent, relevant, common, and accessible frameworks and software tools implemented in research articles published in high-impact robotics conferences and journals. For this, we searched papers published during the seven-year period between 2015 and 2022 in databases relevant to robotics (Science Direct, IEEE Xplore, ACM Digital Library, Springer Link, and Web of Science). Additionally, we present and classify the application context of the reviewed articles into four groups: social robotics, programming of industrial robots, teleoperation of industrial robots, and Human–Robot Collaboration (HRC).
Empirical Research on the Metaverse User Experience of Digital Natives
The metaverse has become established as a platform widely embraced by digital natives who are familiar with mobile devices and immersive content. Because its protocols enable hedonic interaction, the user experience derives significant value from communication, enabling learning experiences anytime and anywhere. However, research to date has focused on promoting technology development, marketing effects, and investment consensus; notably absent is research from the perspective of the young generation who mainly use the metaverse. This paper examines the usability experience of digital-native participants in detail and suggests how immersive content, the usage environment, and interface aspects should be designed from their point of view. Individual interviews, combined with heuristic usability evaluation of content and user control, revealed the significant engagement factors and needed improvements, while the elements to be supplemented in the user experience were derived from the information-architecture and usage-environment categories. In conclusion, this research provides a theoretical basis for empirical usability evaluation of metaverse platforms, along with recommendations carrying practical implications.
Mixed Reality in the Operating Room: A Systematic Review
Mixed Reality is a technology that has gained attention due to its unique capabilities for accessing and visualizing information. When integrated with voice control mechanisms, gestures, and even iris movement, it becomes a valuable tool for medicine. These features are particularly appealing for the operating room and surgical learning, where access to information and freedom of hand operation are fundamental. This study examines the most significant research on mixed reality in the operating room over the past five years to identify trends, use cases, applications, and limitations. A systematic review was conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines to answer research questions established using the PICO (Population, Intervention, Comparator and Outcome) framework. Although implementing Mixed Reality applications in the operating room presents some challenges, when used appropriately it can yield remarkable results: it can ease learning, flatten the learning curve for several procedures, and facilitate various aspects of the surgical process. The articles' conclusions highlight the potential benefits of these innovations in surgical practice while acknowledging the challenges that must be addressed. Technical complexity, equipment costs, and steep learning curves present significant obstacles to the widespread adoption of Mixed Reality and computer-assisted evaluation. The need for more flexible approaches and comprehensive studies is underscored by the specificity of procedures and limited sample sizes. The integration of imaging modalities and innovative functionalities holds promise for clinical applications; however, it is important to consider issues related to usability, bias, and statistical analyses. Mixed Reality offers significant benefits, but open challenges such as ergonomic issues, limited field of view, and battery autonomy must still be addressed to ensure widespread acceptance.
Augmented Reality, Virtual Reality and Artificial Intelligence in Orthopedic Surgery: A Systematic Review
Background: The application of virtual and augmented reality technologies to orthopaedic surgery training and practice aims to increase the safety and accuracy of procedures while reducing complications and costs. The purpose of this systematic review is to summarise the present literature on this topic while providing a detailed analysis of current flaws and benefits. Methods: A comprehensive search of the PubMed, Cochrane, CINAHL, and Embase databases was conducted from inception to February 2021. The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were used to improve the reporting of the review. The Cochrane Risk of Bias Tool and the Methodological Index for Non-Randomized Studies (MINORS) were used to assess the quality and potential bias of the included randomized and non-randomized controlled trials, respectively. Results: Virtual reality has proven revolutionary for both resident training and preoperative planning. Thanks to augmented reality, orthopaedic surgeons could carry out procedures faster and more accurately, improving overall safety. Artificial intelligence (AI) is a promising technology with limitless potential, but its use in orthopaedic surgery is currently limited to preoperative diagnosis. Conclusions: Extended reality technologies have the potential to reform orthopaedic training and practice, providing an opportunity for unidirectional growth towards a patient-centred approach.
Facilitating and enhancing co-creation through multi-sensory mixed reality experiences: a paradigm shift in stakeholder engagement
Co-creation has become increasingly important as companies strive to develop more customer-centric products and services by involving stakeholders throughout the design process. This paper introduces XR-CO, a multi-sensory, multi-user mixed reality co-creation platform. XR-CO utilizes an optical see-through head-mounted display to create a 1:1 immersive mixed reality environment over essential physical prototypes, allowing stakeholders to physically interact with functional prototypes inside the digital environment. By incorporating the visual, auditory, somatosensory, and vestibular senses, XR-CO maximizes collaboration and triggers meaningful interactions among stakeholders, leading to improved co-creation quality. To validate XR-CO's effectiveness, a study was conducted involving 32 participants in co-creation sessions for the development of cabin interiors for a new concept aircraft, the Flying-V. Half of the participants used the XR-CO platform, while the other half used conventional desktop co-creation setups. The participants appreciated the XR-CO platform, as indicated by their high scores in the Creativity Support Index (CSI), the System Usability Scale (SUS), co-creation experience, and quantity of ideas in the quantitative data analysis. Additionally, 4D scanning of participants' movements demonstrated that XR-CO provided a true-to-life perception of the digital space. XR-CO facilitated the discovery of more design issues than the conventional setups, a finding affirmed by an expert panel using qualitative Delphi techniques. The positive user experience and meaningful outcomes observed in these co-creation sessions serve as strong evidence of the efficacy of the XR-CO platform, particularly the newly introduced somatosense. In summary, our study highlights the significance of XR-CO in co-creation processes. The platform's immersive nature, combined with the integration of multiple sensory inputs, fosters collaboration, enriches interactions, and ultimately leads to improved outcomes.
PowVRtool: a handheld haptic device for realistic power tool feedback in VR-based manufacturing training
In VR-based manufacturing training that employs Oculus controllers for power tool operation, users consistently encounter a glaring impediment: the conspicuous absence of haptic feedback. This critical shortfall significantly hinders the transfer of acquired skills from the virtual realm to real-world application, posing a substantial challenge to effective training. To overcome this obstacle, we introduce PowVRtool, a handheld haptic device meticulously crafted for immersive virtual reality simulations of power tools. Designed in collaboration with industry experts, this tool excels at providing crucial elements such as vibrotactile feedback, weight perception, and mass distribution, empowering users to develop muscle memory and refine their skills effectively. We demonstrated (1) realistic vibrotactile feedback derived from an extensive frequency-domain database and state-of-the-art actuators, (2) accurate weight perception achieved through weight addition, and (3) an evaluation conducted across four user studies, demonstrating: (a) PowVRtool's adeptness at replicating tool weight and balance, (b) user preferences for combined vibrotactile feedback and weight simulation, and (c) effectiveness across various power tools. These results show PowVRtool's potential as a valuable training tool, particularly within the manufacturing industry's VR-based training programs.
Extended Reality (XR) Engines for Developing Gamified Apps and Serious Games: A Scoping Review
Extended Reality (XR) is an emerging technology that enables enhanced interaction between the real world and virtual environments. In this study, we conduct a scoping review of XR engines for developing gamified apps and serious games. Our study revolves around four aspects: (1) existing XR game engines, (2) their primary features, (3) supported serious game attributes, and (4) supported learning activities. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) model to conduct the scoping review, which included 40 primary studies published between 2019 and 2023. Our findings help us understand how current XR engines support the development of XR-enriched serious games and gamified apps for specific learning activities. Additionally, based on our findings, we suggest a set of pre-established game attributes that could be commonly supported by all XR game engines across the different game categories proposed by Lameras. Hence, this scoping review can help developers (1) select important game attributes for their new games and (2) choose the game engine that provides the most support to these attributes.
Augmented Reality Support for Anterior Decompression and Fusion Using Floating Method for Cervical Ossification of the Posterior Longitudinal Ligament
Anterior decompression and fusion (ADF) using the floating method for cervical ossification of the posterior longitudinal ligament (OPLL) is an ideal surgical technique, but it has a specific risk of insufficient decompression caused by the impingement of residual ossification. Augmented reality (AR) support is a novel technology that enables the superimposition of images onto the view of a surgical field. AR technology was applied to ADF for cervical OPLL to facilitate intraoperative anatomical orientation and OPLL identification. In total, 14 patients with cervical OPLL underwent ADF with microscopic AR support. The outline of the OPLL and the bilateral vertebral arteries was marked after intraoperative CT, and the reconstructed 3D image data were transferred and linked to the microscope. The AR microscopic view enabled us to visualize the ossification outline, which could not be seen directly in the surgical field, and allowed sufficient decompression of the ossification. Neurological disturbances were improved in all patients. No cases of serious complications, such as major intraoperative bleeding or reoperation due to the postoperative impingement of the floating OPLL, were registered. To our knowledge, this is the first report of the introduction of microscopic AR into ADF using the floating method for cervical OPLL with favorable clinical results.
Metaverse
The Metaverse is the post-reality universe, a perpetual and persistent multiuser environment merging physical reality with digital virtuality. It is based on the convergence of technologies that enable multisensory interactions with virtual environments, digital objects and people such as virtual reality (VR) and augmented reality (AR). Hence, the Metaverse is an interconnected web of social, networked immersive environments in persistent multiuser platforms. It enables seamless embodied user communication in real-time and dynamic interactions with digital artifacts. Its first iteration was a web of virtual worlds where avatars were able to teleport among them. The contemporary iteration of the Metaverse features social, immersive VR platforms compatible with massive multiplayer online video games, open game worlds and AR collaborative spaces.
Effect of an Extended Reality Simulation Intervention on Midwifery Students’ Anxiety: Systematic Review
Midwifery students often experience anxiety due to several factors, such as the clinical situations they face. Simulation-based learning in nursing and midwifery studies using extended reality (XR) tools offers the opportunity to better manage educational processes while reducing this anxiety. This study aims to evaluate current knowledge and understanding of how XR gesture-simulation-based tools allow a better understanding of the anxiety levels of midwives and nurses in educational settings. We conducted a systematic review of the scientific literature following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Searching PubMed, IEEE, Scopus, and Web of Science up to March 2024 yielded 1005 articles, from which we identified studies reporting the effectiveness of these technologies for gesture simulation in education and training with respect to nursing and midwifery student anxiety. The inclusion and exclusion criteria were based on the PICO (population, intervention, control, and outcomes) framework: the population included nurses, midwives, and nursing and midwifery students of any kind using any virtual, augmented, or mixed reality simulation training tool to perform a procedure aimed at reducing anxiety. In addition, the Cochrane risk of bias tool was used to evaluate the quality of the systematic review and the bias in the included studies. A narrative synthesis was conducted due to the heterogeneity of study designs and outcome measures; key findings were summarized in a structured table and grouped according to the learning objective, simulating and performing procedures in an educational setting. Overall, 7 articles, involving a total of 428 participants, were included in this review. The findings indicate that XR can effectively reduce anxiety in midwifery and nursing education. The included studies suggest that XR-enhanced training provides a more immersive and controlled learning environment, helping students manage stress and improve procedural confidence. However, the limited number of studies, methodological variations, and the underrepresentation of mixed reality applications highlight a research gap in the field and indicate the need for further exploration. Future studies should focus on standardized anxiety measurement tools, larger sample sizes, and long-term impact assessments to strengthen the evidence base. Expanding research in this field could enhance the integration of XR technologies into midwifery and nursing education, ultimately improving both learning experiences and clinical preparedness.