Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
2 result(s) for "fusion-based positioning methods"
Indoor Localization Methods for Smartphones with Multi-Source Sensors Fusion: Tasks, Challenges, Strategies, and Perspectives
2025
Positioning information greatly enhances the convenience of people’s lives and the efficiency of societal operations. However, due to the impact of complex indoor environments, GNSS signals suffer from multipath effects, blockages, and attenuation, making it difficult to provide reliable positioning services indoors. Smartphone indoor positioning and navigation is a crucial technology for enabling indoor location services. Nevertheless, relying solely on a single positioning technique can hardly achieve accurate indoor localization. We reviewed several of the main methods for indoor positioning using smartphone sensors, including Wi-Fi, Bluetooth, cameras, microphones, inertial sensors, and others. Among these, wireless medium-based positioning methods are prone to interference from signals and obstacles in the indoor environment, while inertial sensors are limited by error accumulation. The fusion of multi-source sensors in complex indoor scenarios benefits from the complementary advantages of the various sensors and has become a research hotspot in the field of pervasive indoor localization applications for smartphones. In this paper, we extensively review the current mainstream sensors and indoor positioning methods for smartphone multi-source sensor fusion. We summarize the recent research progress in this domain along with the characteristics of the relevant techniques and their applicable scenarios. Finally, we collate and organize the key issues and technological outlooks of this field.
Journal Article
A Flying Robot Localization Method Based on Multi-Sensor Fusion
2014
This paper proposes a novel localization method for a power-tower-inspection flying robot based on the fusion of vision, IMU, and GPS. First, the research background is introduced in relation to a visual localization algorithm derived from 3D-model-based tracking and a coordinate transformation model for the related coordinate frames. Then, a multi-sensor fusion-based localization method is presented, in which two collaborative Kalman filters are designed to fuse IMU/GPS and visual information. Finally, experimental results are presented to show the robustness and precision of the proposed method.
Journal Article