Search Results
Filters: Discipline, Is Peer Reviewed, Item Type, Subject, Year (From/To), Source, Language
614 results for "sensor simulator"
On the Testing of Advanced Automotive Radar Sensors by Means of Target Simulators
The rapid development and wide commercial implementation of automotive radar sensors are strengthening the already considerable interest in matching radar target simulators. Such simulators boast promising results when used for both essential functional inspections of active sensors and the high-speed testing of numerous traffic scenarios while examining complex reactions of automobile electronic systems. For these purposes, advanced versions of target simulators enabling a generation of multiple targets moving at different velocities and ranges are required. The design, practical implementation and system programming of advanced sensor simulator setups require a detailed analytical description concerning all important technical aspects. An abundance of detailed information on the behavior and parameters of automotive radar sensors can be found in the references, but similar knowledge on sensor simulator setups is lacking. This article presents detailed analyses of the all-important RF parameters, where special attention is paid to phase noise, and its analytical description takes into account an even greater number of simulated targets. The derived analytical formulas enable both an optimal setup implementation and system programming of a wide range of practical testing procedures.
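For illustration only, the following is a minimal sketch of the core idea a radar target simulator implements: superposing delayed, Doppler-shifted, attenuated copies of the transmit waveform, here with a crude random-walk phase-noise term. It is not the authors' implementation; the carrier frequency, sampling rate, amplitudes, and noise level are assumptions.

```python
import numpy as np

C = 3e8    # speed of light, m/s
FC = 77e9  # assumed automotive radar carrier frequency, Hz
FS = 50e6  # assumed baseband sampling rate, Hz

def multi_target_echo(tx, targets):
    """targets: list of (range_m, velocity_mps, relative_amplitude) tuples."""
    t = np.arange(tx.size) / FS
    rx = np.zeros(tx.size, dtype=complex)
    for rng, vel, amp in targets:
        delay_samples = int(round(2.0 * rng / C * FS))      # round-trip delay
        doppler_hz = 2.0 * vel * FC / C                      # Doppler shift
        # crude phase-noise model: small random-walk phase per sample
        phase_noise = np.cumsum(np.random.normal(0.0, 1e-4, tx.size))
        echo = np.roll(tx, delay_samples) * np.exp(1j * (2 * np.pi * doppler_hz * t + phase_noise))
        rx += amp * echo
    return rx

# Example: three simulated targets at different ranges and velocities.
tx = np.exp(2j * np.pi * 1e6 * np.arange(4096) / FS)        # placeholder baseband tone
rx = multi_target_echo(tx, [(30.0, 10.0, 1.0), (80.0, -5.0, 0.5), (150.0, 25.0, 0.2)])
```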
Development and Validation of LiDAR Sensor Simulators Based on Parallel Raycasting
Three-dimensional (3D) imaging technologies have been increasingly explored in academia and industry, especially those yielding point clouds. However, obtaining these data can still be expensive and time-consuming, reducing the efficiency of procedures that depend on large datasets, such as generating data for machine-learning training, forest canopy calculation, and subsea surveying. A trending solution is to develop simulators for imaging systems that virtually scan a digital world and generate synthetic point clouds of the targets. This work presents a guideline for the development of modular Light Detection and Ranging (LiDAR) system simulators based on parallel raycasting algorithms, with the sensor modeled by metrological parameters and error models. A procedure for calibrating the sensor is also presented, based on comparison with measurements made by a commercial LiDAR sensor. The sensor simulator developed as a case study produced robust synthetic point clouds in different scenarios, enabling the creation of datasets for concept tests that combine real and virtual data, among other applications.
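As a rough illustration of the raycasting idea (not the paper's simulator), the sketch below casts rays over an azimuth/elevation grid, intersects them with a ground plane, and perturbs the ranges with a Gaussian error model; the scan pattern, mounting height, and noise sigma are assumptions.

```python
import numpy as np

def simulate_lidar(origin, az_deg, el_deg, ground_z=0.0, range_sigma=0.02, max_range=100.0):
    az = np.deg2rad(az_deg)[:, None]
    el = np.deg2rad(el_deg)[None, :]
    # unit ray directions for every (azimuth, elevation) pair
    dirs = np.stack([np.cos(el) * np.cos(az),
                     np.cos(el) * np.sin(az),
                     np.sin(el) * np.ones_like(az)], axis=-1)
    # ray-plane intersection with the plane z = ground_z
    denom = dirs[..., 2]
    t = np.where(np.abs(denom) > 1e-9, (ground_z - origin[2]) / denom, np.inf)
    t = np.where((t > 0) & (t < max_range), t, np.nan)           # drop misses
    t_noisy = t + np.random.normal(0.0, range_sigma, t.shape)     # range error model
    points = origin + t_noisy[..., None] * dirs
    return points[~np.isnan(t)]                                   # synthetic point cloud

cloud = simulate_lidar(np.array([0.0, 0.0, 1.8]),
                       az_deg=np.arange(0, 360, 0.5),
                       el_deg=np.arange(-15, 0, 2.0))
```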
A LiDAR System Simulator Using Parallel Raytracing and Validated by Comparison with a Real Sensor
The advances in 3D imaging and computer vision are allowing a massive acquisition of data, especially in the form of point clouds. However, scanning with these sensors can be costly and time-consuming, reducing the efficiency of certain procedures, such as deep-learning dataset generation and forest canopy calculation. One trending solution is the creation of 3D imaging sensor simulators. This paper presents a simulator for light detection and ranging (LiDAR) systems with a parallel raytracing approach and a flexible scene creator. Finally, the simulated sensor is validated against real LiDAR data.
Software-In-Loop Simulation of an Underwater Wireless Sensor Network for Monitoring Seawater Quality: Parameter Selection and Performance Validation
In this work, a real-time software-in-loop simulation technique was employed to test and analyse an underwater wireless sensor network. This simulation should facilitate the deployment of the real network and help guarantee the network's expected behaviour. We study duplicated packets, one-way delay, and power consumption to analyse the network's leading parameters. Evaluating production-ready software under simulated conditions eases effective deployment. This method will ultimately allow us to establish these parameters, test the software before deployment, and gain an excellent understanding of the network's behaviour.
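To show how such metrics might be extracted, here is a minimal sketch (illustrative, not the paper's tooling) that computes duplicated packets, mean one-way delay, and a rough per-node energy estimate from a hypothetical packet log; the log format and the per-transmission energy cost are assumptions.

```python
from collections import Counter

# Assumed log format: (packet_id, source_node, tx_time_s, rx_time_s)
log = [
    ("p1", "node-a", 0.00, 1.35),
    ("p2", "node-a", 0.50, 1.90),
    ("p2", "node-a", 0.50, 2.40),   # duplicate delivery
    ("p3", "node-b", 1.00, 2.10),
]

counts = Counter(pid for pid, *_ in log)
duplicates = sum(c - 1 for c in counts.values())

delays = [rx - tx for _, _, tx, rx in log]
mean_one_way_delay = sum(delays) / len(delays)

TX_ENERGY_J = 0.05   # assumed energy cost per acoustic transmission
energy_per_node = Counter()
for _, src, *_ in log:
    energy_per_node[src] += TX_ENERGY_J

print(duplicates, round(mean_one_way_delay, 2), dict(energy_per_node))
```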
A Digital Sensor Simulator of the Pushbroom Offner Hyperspectral Imaging Spectrometer
Sensor simulators can be used to forecast the imaging quality of a new hyperspectral imaging spectrometer and to generate simulated data for the development and validation of data processing algorithms. This paper presents a novel digital sensor simulator for the pushbroom Offner hyperspectral imaging spectrometer, which is widely used in hyperspectral remote sensing. Based on the imaging process, the sensor simulator consists of a spatial response module, a spectral response module, and a radiometric response module. In order to enhance the simulation accuracy, a spatial interpolation-resampling step, implemented before the spatial degradation, is developed to balance the direction error against the extra aliasing effect. Instead of using the spectral response function (SRF), the dispersive imaging characteristics of the Offner convex grating optical system are accurately modeled by its configuration parameters. The non-uniformity characteristics, such as keystone and smile effects, are simulated in the corresponding modules. In this work, the spatial, spectral and radiometric calibration processes are simulated to provide the modulation transfer function (MTF), SRF and radiometric calibration parameters of the sensor simulator. Some uncertainty factors (the stability and bandwidth of the monochromator for the spectral calibration, and the integrating sphere uncertainty for the radiometric calibration) are considered in the simulation of the calibration process. With the calibration parameters, several experiments were designed to validate the spatial, spectral and radiometric responses of the sensor simulator, respectively. The experimental results indicate that the sensor simulator is valid.
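For orientation, the following sketch chains the three response stages named in the abstract (spatial, spectral, radiometric) in the simplest possible form; it is not the paper's model, and the blur sigmas, gain, offset, and noise level are placeholder assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_sensor(radiance_cube, spatial_sigma=1.2, spectral_sigma=0.8,
                    gain=0.9, offset=5.0, noise_std=0.5):
    # radiance_cube axes: (along-track, across-track, band)
    blurred = gaussian_filter(radiance_cube, sigma=(spatial_sigma, spatial_sigma, 0))  # MTF-like spatial blur
    spectral = gaussian_filter(blurred, sigma=(0, 0, spectral_sigma))                  # Gaussian SRF along bands
    dn = gain * spectral + offset                                                      # radiometric response
    dn += np.random.normal(0.0, noise_std, dn.shape)                                   # sensor noise
    return dn

scene = np.random.rand(64, 64, 120) * 100.0      # placeholder at-sensor radiance cube
digital_numbers = simulate_sensor(scene)
```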
Automatic Sensor and Meter Arrangement System for Building Energy Management
A Building Energy Management System (BEMS) can save energy and minimize environmental impact through energy-efficiency technologies and systems. The core tasks of building energy management are monitoring the indoor environment of a building with various sensors and measuring its energy consumption with various meters. Sensors and meters therefore need to be arranged automatically, which calls for an automatic sensor and meter arrangement system usable over any kind of wired or wireless network. In this paper, we propose such an automatic sensor and meter arrangement system for building energy management and explain the operation of each of its components, including a sensor arrangement simulator and a meter arrangement simulator.
High-Fidelity Interactive Motorcycle Driving Simulator with Motion Platform Equipped with Tension Sensors
This paper presents an innovative approach to a high-fidelity motorcycle riding simulator based on VR (Virtual Reality) visualization and equipped with a Gough-Stewart 6-DOF (Degrees of Freedom) motion platform. The solution integrates a real-time tension sensor system as a source for highly realistic motion cueing control, as well as a servomotor integrated into the steering system. Tension forces are measured at four points on the mock-up chassis, allowing a comprehensive analysis of rider interaction during various maneuvers. The simulator is designed to reproduce realistic riding scenarios with immersive motion and visual feedback, enhanced with the simulation of external influences such as headwind. The paper presents the results of a validation study: pilot experiments conducted to evaluate selected riding scenarios and validate the innovative simulator setup, focusing on force distribution and system responsiveness to support further research in motorcycle HMI (Human-Machine Interaction), rider behavior, and training.
Fidelity Assessment of Motion Platform Cueing: Comparison of Driving Behavior under Various Motion Levels
The present paper focuses on vehicle simulator fidelity, particularly the effect of motion cue intensity on driver performance. A 6-DOF motion platform was used in the experiment; however, we focused mainly on one characteristic of driving behavior. The braking performance of 24 participants in a car simulator was recorded and analyzed. The experiment scenario consisted of acceleration to 120 km/h followed by smooth deceleration to a stop line, with prior warning signs at distances of 240, 160, and 80 m from the finish line. To assess the effect of the motion cues, each driver performed the run three times with different motion platform settings: no motion, moderate motion, and the maximal possible response and range. The results from the driving simulator were compared with data acquired in an equivalent driving scenario performed in real conditions on a polygon track, which served as reference data. The driving simulator and real car accelerations were recorded using the Xsens MTi-G sensor. The outcomes confirmed the hypothesis that driving with a higher level of motion cues in the driving simulator produced more natural braking behavior from the experimental drivers, better correlated with the real-car driving test data, although exceptions were found.
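A minimal sketch of the kind of comparison described above, assuming two deceleration traces sampled on different time grids: resample both onto a common grid and compute their Pearson correlation. The function name and the synthetic traces are illustrative, not the study's data.

```python
import numpy as np

def braking_correlation(t_sim, a_sim, t_real, a_real, n=200):
    t_common = np.linspace(max(t_sim[0], t_real[0]), min(t_sim[-1], t_real[-1]), n)
    sim = np.interp(t_common, t_sim, a_sim)      # simulator deceleration on common grid
    real = np.interp(t_common, t_real, a_real)   # real-car deceleration on common grid
    return np.corrcoef(sim, real)[0, 1]

t = np.linspace(0.0, 10.0, 100)
r = braking_correlation(t, -0.6 * t, t, -0.55 * t + 0.1 * np.sin(t))
```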
GAN-Based LiDAR Translation between Sunny and Adverse Weather for Autonomous Driving and Driving Simulation
Autonomous driving requires robust and highly accurate perception technologies. Various deep learning algorithms based only on image processing satisfy this requirement, but few such algorithms are based on LiDAR. However, cameras are only one of the perception sensors in an autonomous driving vehicle; LiDAR is also essential for the recognition of driving environments. The main reason why few deep learning algorithms are based on LiDAR is a lack of data. Recent translation technology using generative adversarial networks (GANs) has been proposed to deal with this problem. However, these technologies focus only on image-to-image translation, although a lack of data occurs more often with LiDAR than with images. LiDAR translation technology is required not only for data augmentation, but also for driving simulation, which allows algorithms to practice driving as if they were commanding a real vehicle before doing so in the real world. In other words, driving simulation is a key technology for evaluating and verifying algorithms that are practically applied to vehicles. In this paper, we propose a GAN-based LiDAR translation algorithm for autonomous driving and driving simulation. It is the first empirically based LiDAR translation approach that can deal with various types of weather. We tested the proposed method on the JARI data set, which was collected under various adverse weather scenarios with diverse precipitation and visible distance settings. The proposed method was also applied to the real-world Spain data set. Our experimental results demonstrate that the proposed method can generate realistic LiDAR data under adverse weather conditions.
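To make the translation idea concrete, here is a minimal GAN sketch operating on LiDAR range images (points projected to a 2D grid); this is not the paper's architecture, and the layer sizes, channel counts, and tensor shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps clear-weather range images to adverse-weather-looking range images."""
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores whether a range image looks like real adverse-weather data."""
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Flatten(), nn.LazyLinear(1),
        )

    def forward(self, x):
        return self.net(x)   # real/fake logit

G, D = Generator(), Discriminator()
clear = torch.randn(4, 1, 64, 512)   # batch of clear-weather range images (assumed shape)
fake_adverse = G(clear)
logit = D(fake_adverse)
```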