Journal articles on the topic 'Slam LiDAR'

The top 50 journal articles for research on the topic 'Slam LiDAR' are listed below; the abstract of each publication is included where available in the metadata.

1

Jie, Lu, Zhi Jin, Jinping Wang, Letian Zhang, and Xiaojun Tan. "A SLAM System with Direct Velocity Estimation for Mechanical and Solid-State LiDARs." Remote Sensing 14, no. 7 (2022): 1741. http://dx.doi.org/10.3390/rs14071741.

Abstract:
Simultaneous localization and mapping (SLAM) is essential for intelligent robots operating in unknown environments. However, existing algorithms are typically developed for specific types of solid-state LiDARs, leading to weak feature representation abilities for new sensors. Moreover, LiDAR-based SLAM methods are limited by distortions caused by LiDAR ego motion. To address the above issues, this paper presents a versatile and velocity-aware LiDAR-based odometry and mapping (VLOM) system. A spherical projection-based feature extraction module is utilized to process the raw point cloud generat
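The spherical projection mentioned in this abstract is a common way to organise a raw point cloud before feature extraction: each 3D return is binned by azimuth and elevation into a 2D range image. A minimal sketch, with illustrative angular resolutions (not the parameters of the VLOM system):

```python
import math

def spherical_project(points, h_res_deg=0.2, v_res_deg=2.0):
    """Bin 3D LiDAR points (x, y, z) into (row, col) range-image cells.

    h_res_deg / v_res_deg are illustrative resolutions, not those of any
    particular sensor or SLAM system.
    """
    cells = {}
    for x, y, z in points:
        r = math.sqrt(x * x + y * y + z * z)
        if r == 0.0:
            continue  # skip degenerate returns at the sensor origin
        azimuth = math.degrees(math.atan2(y, x))    # -180 .. 180
        elevation = math.degrees(math.asin(z / r))  # -90 .. 90
        key = (int(elevation // v_res_deg), int(azimuth // h_res_deg))
        # keep the closest return per cell, as a range image would
        if key not in cells or r < cells[key]:
            cells[key] = r
    return cells

cells = spherical_project([(1.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 1.0, 0.1)])
```

Edge- and plane-like features are then extracted from neighbouring cells of this image rather than from the unordered cloud.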
2

Sier, Ha, Qingqing Li, Xianjia Yu, Jorge Peña Queralta, Zhuo Zou, and Tomi Westerlund. "A Benchmark for Multi-Modal LiDAR SLAM with Ground Truth in GNSS-Denied Environments." Remote Sensing 15, no. 13 (2023): 3314. http://dx.doi.org/10.3390/rs15133314.

Abstract:
LiDAR-based simultaneous localization and mapping (SLAM) approaches have obtained considerable success in autonomous robotic systems. This is in part owing to the high accuracy of robust SLAM algorithms and the emergence of new and lower-cost LiDAR products. This study benchmarks the current state-of-the-art LiDAR SLAM algorithms with a multi-modal LiDAR sensor setup, showcasing diverse scanning modalities (spinning and solid state) and sensing technologies, and LiDAR cameras, mounted on a mobile sensing and computing platform. We extend our previous multi-modal multi-LiDAR dataset with additi
3

Zhao, Yu-Lin, Yi-Tian Hong, and Han-Pang Huang. "Comprehensive Performance Evaluation between Visual SLAM and LiDAR SLAM for Mobile Robots: Theories and Experiments." Applied Sciences 14, no. 9 (2024): 3945. http://dx.doi.org/10.3390/app14093945.

Abstract:
SLAM (Simultaneous Localization and Mapping), primarily relying on camera or LiDAR (Light Detection and Ranging) sensors, plays a crucial role in robotics for localization and environmental reconstruction. This paper assesses the performance of two leading methods, namely ORB-SLAM3 and SC-LeGO-LOAM, focusing on localization and mapping in both indoor and outdoor environments. The evaluation employs artificial and cost-effective datasets incorporating data from a 3D LiDAR and an RGB-D (color and depth) camera. A practical approach is introduced for calculating ground-truth trajectories and duri
4

Chen, Shoubin, Baoding Zhou, Changhui Jiang, Weixing Xue, and Qingquan Li. "A LiDAR/Visual SLAM Backend with Loop Closure Detection and Graph Optimization." Remote Sensing 13, no. 14 (2021): 2720. http://dx.doi.org/10.3390/rs13142720.

Abstract:
LiDAR (light detection and ranging), as an active sensor, is investigated in the simultaneous localization and mapping (SLAM) system. Typically, a LiDAR SLAM system consists of front-end odometry and back-end optimization modules. Loop closure detection and pose graph optimization are the key factors determining the performance of the LiDAR SLAM system. However, the LiDAR works at a single wavelength (905 nm), and few textures or visual features are extracted, which restricts the performance of point clouds matching based loop closure detection and graph optimization. With the aim of improving
5

Peng, Gang, Yicheng Zhou, Lu Hu, et al. "VILO SLAM: Tightly Coupled Binocular Vision–Inertia SLAM Combined with LiDAR." Sensors 23, no. 10 (2023): 4588. http://dx.doi.org/10.3390/s23104588.

Abstract:
Existing visual–inertial SLAM algorithms suffer from low accuracy and poor robustness when the robot moves at a constant speed or purely rotates and encounters scenes with insufficient visual features. To address these problems, a tightly coupled vision-IMU-2D lidar odometry (VILO) algorithm is proposed. Firstly, low-cost 2D lidar observations and visual–inertial observations are fused in a tightly coupled manner. Secondly, the low-cost 2D lidar odometry model is used to derive the Jacobian matrix of th
6

Dang, Xiangwei, Zheng Rong, and Xingdong Liang. "Sensor Fusion-Based Approach to Eliminating Moving Objects for SLAM in Dynamic Environments." Sensors 21, no. 1 (2021): 230. http://dx.doi.org/10.3390/s21010230.

Abstract:
Accurate localization and reliable mapping are essential for the autonomous navigation of robots. As one of the core technologies for autonomous navigation, Simultaneous Localization and Mapping (SLAM) has attracted widespread attention in recent decades. Based on vision or LiDAR sensors, great efforts have been devoted to achieving real-time SLAM that can support a robot’s state estimation. However, most mature SLAM methods work under the assumption that the environment is static, and in dynamic environments they yield degraded performance or even fail. In this paper, fir
7

Debeunne, César, and Damien Vivet. "A Review of Visual-LiDAR Fusion based Simultaneous Localization and Mapping." Sensors 20, no. 7 (2020): 2068. http://dx.doi.org/10.3390/s20072068.

Abstract:
Autonomous navigation requires both a precise and robust mapping and localization solution. In this context, Simultaneous Localization and Mapping (SLAM) is a very well-suited solution. SLAM is used for many applications including mobile robotics, self-driving cars, unmanned aerial vehicles, or autonomous underwater vehicles. In these domains, both visual and visual-IMU SLAM are well studied, and improvements are regularly proposed in the literature. However, LiDAR-SLAM techniques seem to be relatively the same as ten or twenty years ago. Moreover, few research works focus on vision-LiDAR appr
8

Xu, Xiaobin, Lei Zhang, Jian Yang, et al. "A Review of Multi-Sensor Fusion SLAM Systems Based on 3D LIDAR." Remote Sensing 14, no. 12 (2022): 2835. http://dx.doi.org/10.3390/rs14122835.

Abstract:
The demand for intelligent unmanned platforms to achieve autonomous navigation and positioning in large-scale environments has grown increasingly, and LIDAR-based Simultaneous Localization and Mapping (SLAM) is the mainstream research approach. However, LIDAR-based SLAM systems degenerate in extreme environments with high dynamics or sparse features, affecting localization and mapping quality. In recent years, a large number of LIDAR-based multi-sensor fusion SLAM works have emerged in order to obtain a more stable and robust system. In this work, the development
9

Bu, Zean, Changku Sun, and Peng Wang. "Semantic Lidar-Inertial SLAM for Dynamic Scenes." Applied Sciences 12, no. 20 (2022): 10497. http://dx.doi.org/10.3390/app122010497.

Abstract:
Over the past few years, many impressive lidar-inertial SLAM systems have been developed and perform well under static scenes. However, most tasks are under dynamic environments in real life, and the determination of a method to improve accuracy and robustness poses a challenge. In this paper, we propose a semantic lidar-inertial SLAM approach with the combination of a point cloud semantic segmentation network and lidar-inertial SLAM LIO mapping for dynamic scenes. We import an attention mechanism to the PointConv network to build an attention weight function to improve the capacity to predict
10

Abdelhafid, El Farnane, Youssefi My Abdelkader, Mouhsen Ahmed, Dakir Rachid, and El Ihyaoui Abdelilah. "Visual and light detection and ranging-based simultaneous localization and mapping for self-driving cars." International Journal of Electrical and Computer Engineering (IJECE) 12, no. 6 (2022): 6284. http://dx.doi.org/10.11591/ijece.v12i6.pp6284-6292.

Abstract:
In recent years, there has been a strong demand for self-driving cars. For safe navigation, self-driving cars need both precise localization and robust mapping. While the global navigation satellite system (GNSS) can be used to locate vehicles, it has some limitations, such as satellite signal absence (tunnels and caves), which restrict its use in urban scenarios. Simultaneous localization and mapping (SLAM) is an excellent solution for identifying a vehicle’s position while at the same time constructing a representation of the environment. SLAM-based visual and light det
11

Soebhakti, Hendawan, and Robbi Hermawansya Pangantar. "Simulation of Mobile Robot Navigation System using Hector SLAM on ROS." JURNAL INTEGRASI 16, no. 1 (2024): 11–20. http://dx.doi.org/10.30871/ji.v16i1.5755.

Abstract:
The ability to move autonomously from one point to a destination point is essential in AMR robots. To achieve this, the robot must detect its surrounding environment and know its location within that environment, so the Hector SLAM algorithm is added using the LIDAR sensor. To determine the capability of the LIDAR sensor with Hector SLAM, and the computer specifications required to process it properly, a simulation of Hector SLAM with the LIDAR sensor was made. The simulation is carried out by creating an environment map in Gazebo. Then explore environmental mapping using Hok
12

Huang, Baichuan, Jun Zhao, Sheng Luo, and Jingbin Liu. "A Survey of Simultaneous Localization and Mapping with an Envision in 6G Wireless Networks." Journal of Global Positioning Systems 17, no. 2 (2021): 206–36. http://dx.doi.org/10.5081/jgps.17.2.206.

Abstract:
Simultaneous Localization and Mapping (SLAM) achieves simultaneous positioning and map construction based on self-perception. The paper provides an overview of SLAM, including Lidar SLAM, visual SLAM, and their fusion. For Lidar and visual SLAM, the survey covers the basic types and products of sensors, open-source systems and their history, embedded deep learning, and the challenges and future directions. Additionally, visual inertial odometry is covered. For fused Lidar-visual SLAM, the paper highlights multi-sensor calibration and fusion at the hardware, data, and task layers. The open
13

Chen, Zhijian, Aigong Xu, Xin Sui, et al. "Improved-UWB/LiDAR-SLAM Tightly Coupled Positioning System with NLOS Identification Using a LiDAR Point Cloud in GNSS-Denied Environments." Remote Sensing 14, no. 6 (2022): 1380. http://dx.doi.org/10.3390/rs14061380.

Abstract:
Reliable absolute positioning is indispensable in long-term positioning systems. Although simultaneous localization and mapping based on light detection and ranging (LiDAR-SLAM) is effective in global navigation satellite system (GNSS)-denied environments, it can provide only local positioning results, with error divergence over distance. Ultrawideband (UWB) technology is an effective alternative; however, non-line-of-sight (NLOS) propagation in complex indoor environments severely affects the precision of UWB positioning, and LiDAR-SLAM typically provides more robust results under such condit
14

Frosi, Matteo, and Matteo Matteucci. "ART-SLAM: Accurate Real-Time 6DoF LiDAR SLAM." IEEE Robotics and Automation Letters 7, no. 2 (2022): 2692–99. http://dx.doi.org/10.1109/lra.2022.3144795.

15

Lou, Lu, Yitian Li, Qi Zhang, and Hanbing Wei. "SLAM and 3D Semantic Reconstruction Based on the Fusion of Lidar and Monocular Vision." Sensors 23, no. 3 (2023): 1502. http://dx.doi.org/10.3390/s23031502.

Abstract:
Monocular camera and Lidar are the two most commonly used sensors in unmanned vehicles. Combining the advantages of the two is the current research focus of SLAM and semantic analysis. In this paper, we propose an improved SLAM and semantic reconstruction method based on the fusion of Lidar and monocular vision. We fuse the semantic image with the low-resolution 3D Lidar point clouds and generate dense semantic depth maps. Through visual odometry, ORB feature points with depth information are selected to improve positioning accuracy. Our method uses parallel threads to aggregate 3D semantic po
16

Huang, Baichuan, Jun Zhao, Sheng Luo, and Jingbin Liu. "A Survey of Simultaneous Localization and Mapping with an Envision in 6G Wireless Networks." Journal of Global Positioning Systems 17, no. 1 (2021): 94–127. http://dx.doi.org/10.5081/jgps.17.1.94.

Abstract:
Simultaneous Localization and Mapping (SLAM) achieves simultaneous positioning and map construction based on self-perception. The paper provides an overview of SLAM, including Lidar SLAM, visual SLAM, and their fusion. For Lidar and visual SLAM, the survey covers the basic types and products of sensors, open-source systems and their history, embedded deep learning, and the challenges and future directions. Additionally, visual inertial odometry is covered. For fused Lidar-visual SLAM, the paper highlights multi-sensor calibration and fusion at the hardware, data, and task layers. The open
17

Wei, Weichen, Bijan Shirinzadeh, Rohan Nowell, Mohammadali Ghafarian, Mohamed M. A. Ammar, and Tianyao Shen. "Enhancing Solid State LiDAR Mapping with a 2D Spinning LiDAR in Urban Scenario SLAM on Ground Vehicles." Sensors 21, no. 5 (2021): 1773. http://dx.doi.org/10.3390/s21051773.

Abstract:
Solid-State LiDAR (SSL) takes an increasing share of the LiDAR market. Compared with traditional spinning LiDAR, SSLs are more compact, energy-efficient, and cost-effective. Generally, the current study of SSL mapping is limited to adapting existing SLAM algorithms to an SSL sensor. However, compared with spinning LiDARs, SSLs are different in terms of their irregular scan patterns and limited FOV. Directly applying existing SLAM approaches to them often increases the instability of the mapping process. This study proposes a systematic design, which consists of a dual-LiDAR mapping system and a th
18

Vultaggio, F., F. d’Apolito, C. Sulzbachner, and P. Fanta-Jende. "SIMULATION OF LOW-COST MEMS-LIDAR AND ANALYSIS OF ITS EFFECT ON THE PERFORMANCES OF STATE-OF-THE-ART SLAMS." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-1/W1-2023 (May 25, 2023): 539–45. http://dx.doi.org/10.5194/isprs-archives-xlviii-1-w1-2023-539-2023.

Abstract:
Indoor Unmanned Aerial Vehicles have often been tasked with performing SLAM, and the sensors most used in the literature and industry have been cameras. Spanning from stereo to event cameras, visual algorithms have often been the de facto choice for localization. While visual SLAM has reached a high level of accuracy in localization, accurate map reconstruction still proves challenging. Meanwhile, LiDAR sensors have been used for years to obtain accurate maps, first in surveying applications and, in the past ten years, in the automotive sector. The weight, power, and size constraints
19

Chen, Weifeng, Chengjun Zhou, Guangtao Shang, et al. "SLAM Overview: From Single Sensor to Heterogeneous Fusion." Remote Sensing 14, no. 23 (2022): 6033. http://dx.doi.org/10.3390/rs14236033.

Abstract:
After decades of development, LIDAR and visual SLAM technology has relatively matured and been widely used in the military and civil fields. SLAM technology enables the mobile robot to have the abilities of autonomous positioning and mapping, which allows the robot to move in indoor and outdoor scenes where GPS signals are scarce. However, SLAM technology relying only on a single sensor has its limitations. For example, LIDAR SLAM is not suitable for scenes with highly dynamic or sparse features, and visual SLAM has poor robustness in low-texture or dark scenes. However, through the fusion of
20

Ismail, Hasan, Rohit Roy, Long-Jye Sheu, Wei-Hua Chieng, and Li-Chuan Tang. "Exploration-Based SLAM (e-SLAM) for the Indoor Mobile Robot Using Lidar." Sensors 22, no. 4 (2022): 1689. http://dx.doi.org/10.3390/s22041689.

Abstract:
This paper attempts to uncover one possible method for the IMR (indoor mobile robot) to perform indoor exploration associated with SLAM (simultaneous localization and mapping) using LiDAR. Specifically, the IMR is required to construct a map when it has landed on an unexplored floor of a building. We implemented the e-SLAM (exploration-based SLAM) using the coordinate transformation and the navigation prediction techniques to achieve that purpose in the engineering school building, which consists of many 100-m² labs, corridors, elevator waiting space, and the lobby. We first derive the LiDAR
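The coordinate transformation that e-SLAM builds on is the standard 2D rigid-body transform mapping LiDAR points from the robot frame into the world frame. A minimal sketch with hypothetical values, not the authors' derivation:

```python
import math

def transform_scan(points, theta, tx, ty):
    """Rotate scan points by theta and translate by (tx, ty).

    Maps LiDAR points from the robot frame to the world frame, where
    (theta, tx, ty) is the robot pose. Illustrative sketch only.
    """
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

# a point 1 m ahead of a robot at (2, 3) facing +90 degrees
world = transform_scan([(1.0, 0.0)], math.pi / 2, 2.0, 3.0)
```

Chaining such transforms over successive pose estimates is what lets consecutive scans be registered into one map frame.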
21

Harish, I. "Slam Using LIDAR For UGV." International Journal for Research in Applied Science and Engineering Technology V, no. III (2017): 1157–60. http://dx.doi.org/10.22214/ijraset.2017.3211.

22

Roy, Rohit, You-Peng Tu, Long-Jye Sheu, Wei-Hua Chieng, Li-Chuan Tang, and Hasan Ismail. "Path Planning and Motion Control of Indoor Mobile Robot under Exploration-Based SLAM (e-SLAM)." Sensors 23, no. 7 (2023): 3606. http://dx.doi.org/10.3390/s23073606.

Abstract:
Indoor mobile robot (IMR) motion control for e-SLAM techniques with limited sensors, i.e., only LiDAR, is proposed in this research. The path was initially generated from simple floor plans constructed by the IMR exploration. The path planning starts from the vertices which can be traveled through, proceeds to the velocity planning on both cornering and linear motion, and reaches the interpolated discrete points joining the vertices. The IMR recognizes its location and environment gradually from the LiDAR data. The study imposes the upper rings of the LiDAR image to perform localization while
23

Chen, Guangrong, and Liang Hong. "Research on Environment Perception System of Quadruped Robots Based on LiDAR and Vision." Drones 7, no. 5 (2023): 329. http://dx.doi.org/10.3390/drones7050329.

Abstract:
Due to their high stability and adaptability, quadruped robots are currently widely discussed in the robotics field. To cope with complicated indoor or outdoor environments, quadruped robots should be configured with an environment perception system, which mostly contains LiDAR or a vision sensor, and SLAM (Simultaneous Localization and Mapping) is deployed. In this paper, comparative experimental platforms, including a quadruped robot and a vehicle, with LiDAR and a vision sensor are first established. Secondly, single-sensor SLAM, including LiDAR SLAM and Visual SLAM, are investig
24

Bolkas, D., M. O’Banion, and C. J. Belleman. "COMBINATION OF TLS AND SLAM LIDAR FOR LEVEE MONITORING." ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences V-3-2022 (May 17, 2022): 641–47. http://dx.doi.org/10.5194/isprs-annals-v-3-2022-641-2022.

Abstract:
Monitoring of engineering structures is important for ensuring safety of operation. Traditional surveying methods have proven to be reliable; however, the advent of new point cloud technologies such as terrestrial laser scanning (TLS) and small unmanned aerial systems (sUAS) has provided an unprecedented wealth of data. Furthermore, simultaneous localization and mapping (SLAM) is now able to facilitate the collection of registered point clouds on the fly. SLAM is most successful when applied to indoor environments where the algorithm can identify primitives (points, planes, lines) f
25

Wen, Weisong, and Li-Ta Hsu. "AGPC-SLAM: Absolute Ground Plane Constrained 3D Lidar SLAM." NAVIGATION: Journal of the Institute of Navigation 69, no. 3 (2022): navi.527. http://dx.doi.org/10.33012/navi.527.

26

Chen, Wenqiang, Yu Wang, Haoyao Chen, and Yunhui Liu. "EIL‐SLAM: Depth‐enhanced edge‐based infrared‐LiDAR SLAM." Journal of Field Robotics 39, no. 2 (2021): 117–30. http://dx.doi.org/10.1002/rob.22040.

27

Shin, Young-Sik, Yeong Sang Park, and Ayoung Kim. "DVL-SLAM: sparse depth enhanced direct visual-LiDAR SLAM." Autonomous Robots 44, no. 2 (2019): 115–30. http://dx.doi.org/10.1007/s10514-019-09881-0.

28

Brindza, Ján, Pavol Kajánek, and Ján Erdélyi. "Lidar-Based Mobile Mapping System for an Indoor Environment." Slovak Journal of Civil Engineering 30, no. 2 (2022): 47–58. http://dx.doi.org/10.2478/sjce-2022-0014.

Abstract:
The article deals with developing and testing a low-cost measuring system for simultaneous localisation and mapping (SLAM) in an indoor environment. The measuring system consists of three orthogonally-placed 2D lidars, a robotic platform with two wheel speed sensors, and an inertial measuring unit (IMU). The paper describes the data processing model used for both the estimation of the trajectory of SLAM and the creation of a 3D model of the environment based on the estimated trajectory of the SLAM. The main problem of SLAM usage is the accumulation of errors caused by the imperfect tr
29

Chang, Le, Xiaoji Niu, and Tianyi Liu. "GNSS/IMU/ODO/LiDAR-SLAM Integrated Navigation System Using IMU/ODO Pre-Integration." Sensors 20, no. 17 (2020): 4702. http://dx.doi.org/10.3390/s20174702.

Abstract:
In this paper, we proposed a multi-sensor integrated navigation system composed of GNSS (global navigation satellite system), IMU (inertial measurement unit), odometer (ODO), and LiDAR (light detection and ranging)-SLAM (simultaneous localization and mapping). The dead reckoning results were obtained using IMU/ODO in the front-end. The graph optimization was used to fuse the GNSS position, IMU/ODO pre-integration results, and the relative position and relative attitude from LiDAR-SLAM to obtain the final navigation results in the back-end. The odometer information is introduced in the pre-inte
30

Park, K. W., and S. Y. Park. "VISUAL LIDAR ODOMETRY USING TREE TRUNK DETECTION AND LIDAR LOCALIZATION." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-1/W2-2023 (December 13, 2023): 627–32. http://dx.doi.org/10.5194/isprs-archives-xlviii-1-w2-2023-627-2023.

Abstract:
This paper presents a method of visual LiDAR odometry and forest mapping, leveraging tree trunk detection and LiDAR localization techniques. In environments like dense forests, where smooth GPS signals are unreliable, we employ camera and LiDAR sensors to accurately estimate the robot's position. However, forested or orchard settings introduce unique challenges, including a diverse mixture of trees, tall grass, and uneven terrain. To address these complexities, we propose a distance-based filtering method to extract data composed solely of tree trunk information from 2D LiDAR. By res
31

Wen, Jingren, Chuang Qian, Jian Tang, Hui Liu, Wenfang Ye, and Xiaoyun Fan. "2D LiDAR SLAM Back-End Optimization with Control Network Constraint for Mobile Mapping." Sensors 18, no. 11 (2018): 3668. http://dx.doi.org/10.3390/s18113668.

Abstract:
Simultaneous localization and mapping (SLAM) has been investigated in the field of robotics for two decades, as it is considered to be an effective method for solving the positioning and mapping problem in a single framework. In the SLAM community, the Extended Kalman Filter (EKF) based SLAM and particle filter SLAM are the most mature technologies. After years of development, graph-based SLAM is becoming the most promising technology and a lot of progress has been made recently with respect to accuracy and efficiency. No matter which SLAM method is used, loop closure is a vital part for overc
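The graph-based back end with loop closure described above is, at its core, least squares over robot poses. A deliberately minimal 1D sketch using plain gradient descent (illustrative only; real systems optimize SE(2)/SE(3) poses with Gauss-Newton or Levenberg-Marquardt and weight edges by their covariances):

```python
def optimize_pose_graph(odom, loops, iters=5000, lr=0.1):
    """Minimal 1D pose-graph optimization via gradient descent.

    odom:  relative odometry measurements u_i between consecutive poses
    loops: (i, j, meas) loop-closure constraints x_j - x_i = meas
    Pose 0 is anchored at the origin. Illustrative sketch only.
    """
    n = len(odom) + 1
    x = [0.0]
    for u in odom:                 # initialize by chaining odometry
        x.append(x[-1] + u)
    for _ in range(iters):
        grad = [0.0] * n
        for i, u in enumerate(odom):
            e = (x[i + 1] - x[i]) - u   # residual of edge i -> i+1
            grad[i + 1] += e
            grad[i] -= e
        for i, j, m in loops:
            e = (x[j] - x[i]) - m       # loop-closure residual
            grad[j] += e
            grad[i] -= e
        grad[0] = 0.0                   # keep the first pose fixed
        x = [xi - lr * g for xi, g in zip(x, grad)]
    return x

# odometry claims three 1.0 m steps, but a loop closure back to the
# start reports only 2.7 m of net travel
poses = optimize_pose_graph([1.0, 1.0, 1.0], [(0, 3, 2.7)])
```

The optimizer spreads the 0.3 m discrepancy evenly across the chain instead of dumping it on the last pose, which is the essential benefit of back-end optimization over dead reckoning.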
32

Filip, Iulian, Juhyun Pyo, Meungsuk Lee, and Hangil Joe. "LiDAR SLAM with a Wheel Encoder in a Featureless Tunnel Environment." Electronics 12, no. 4 (2023): 1002. http://dx.doi.org/10.3390/electronics12041002.

Abstract:
Simultaneous localization and mapping (SLAM) represents a crucial algorithm in the autonomous navigation of ground vehicles. Several studies were conducted to improve the SLAM algorithm using various sensors and robot platforms. However, only a few works have focused on applications inside low-illuminated featureless tunnel environments. In this work, we present an improved SLAM algorithm using wheel encoder data from an autonomous ground vehicle (AGV) to obtain robust performance in a featureless tunnel environment. The improved SLAM system uses FAST-LIO2 LiDAR SLAM as the baseline algorithm,
33

Abdelaziz, Nader, and Ahmed El-Rabbany. "Deep Learning-Aided Inertial/Visual/LiDAR Integration for GNSS-Challenging Environments." Sensors 23, no. 13 (2023): 6019. http://dx.doi.org/10.3390/s23136019.

Abstract:
This research develops an integrated navigation system, which fuses the measurements of the inertial measurement unit (IMU), LiDAR, and monocular camera using an extended Kalman filter (EKF) to provide accurate positioning during prolonged GNSS signal outages. The system features the use of an integrated INS/monocular visual simultaneous localization and mapping (SLAM) navigation system that takes advantage of LiDAR depth measurements to correct the scale ambiguity that results from monocular visual odometry. The proposed system was tested using two datasets, namely, the KITTI and the Leddar P
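For a single linear state, the EKF fusion described in that abstract reduces to the familiar Kalman predict/update cycle. A minimal scalar sketch with made-up noise values (the actual filter fuses IMU, visual, and LiDAR measurements over a much larger state, with nonlinear models linearized at each step):

```python
def kalman_step(x, p, u, z, q=0.01, r=0.25):
    """One predict/update cycle of a scalar Kalman filter.

    x, p: state estimate and its variance
    u:    control input (e.g. odometry increment) for the predict step
    z:    position measurement (e.g. from LiDAR) for the update step
    q, r: process and measurement noise variances (illustrative values)
    """
    # predict: propagate the state with the motion model
    x_pred = x + u
    p_pred = p + q
    # update: blend in the measurement via the Kalman gain
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0
for u, z in [(1.0, 1.1), (1.0, 2.05), (1.0, 2.95)]:
    x, p = kalman_step(x, p, u, z)
```

Each update pulls the prediction toward the measurement by an amount proportional to the gain, and the state variance shrinks as evidence accumulates.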
34

He, Yuhang, Bo Li, Jianyuan Ruan, Aihua Yu, and Beiping Hou. "ZUST Campus: A Lightweight and Practical LiDAR SLAM Dataset for Autonomous Driving Scenarios." Electronics 13, no. 7 (2024): 1341. http://dx.doi.org/10.3390/electronics13071341.

Abstract:
This research proposes a lightweight and applicable dataset with a precise elevation ground truth and extrinsic calibration toward the LiDAR (Light Detection and Ranging) SLAM (Simultaneous Localization and Mapping) task in the field of autonomous driving. Our dataset focuses on more cost-effective platforms with limited computational power and low-resolution three-dimensional LiDAR sensors (16-beam LiDAR), and fills the gaps in the existing literature. Our data include abundant scenarios that include degenerated environments, dynamic objects, and large slope terrain to facilitate the investig
35

Wu, H., R. Zhong, D. Xie, et al. "MR-MD: MULTI-ROBOT MAPPING WITH MANHATTAN DESCRIPTOR." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-1/W2-2023 (December 13, 2023): 687–92. http://dx.doi.org/10.5194/isprs-archives-xlviii-1-w2-2023-687-2023.

Abstract:
Simultaneous Localization and Mapping (SLAM) technology, utilizing Light Detection and Ranging (LiDAR) sensors, is crucial for 3D environment perception and mapping. However, the absence of absolute observations and the inefficiency of single-robot perception present challenges for LiDAR SLAM in indoor environments. In this paper, we propose a multi-robot (MR) collaborative mapping method based on the Manhattan descriptor (MD) named MR-MD to overcome these limitations and improve the perception accuracy of LiDAR SLAM in indoor environments. The proposed method consists of two modules
36

Karam, S., V. Lehtola, and G. Vosselman. "STRATEGIES TO INTEGRATE IMU AND LIDAR SLAM FOR INDOOR MAPPING." ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences V-1-2020 (August 3, 2020): 223–30. http://dx.doi.org/10.5194/isprs-annals-v-1-2020-223-2020.

Abstract:
In recent years, the importance of indoor mapping increased in a wide range of applications, such as facility management and mapping hazardous sites. The essential technique behind indoor mapping is simultaneous localization and mapping (SLAM) because SLAM offers suitable positioning estimates in environments where satellite positioning is not available. State-of-the-art indoor mobile mapping systems employ Visual-based SLAM or LiDAR-based SLAM. However, Visual-based SLAM is sensitive to textureless environments and, similarly, LiDAR-based SLAM is sensitive to a number of pose config
37

Peng, Hongrui, Ziyu Zhao, and Liguan Wang. "A Review of Dynamic Object Filtering in SLAM Based on 3D LiDAR." Sensors 24, no. 2 (2024): 645. http://dx.doi.org/10.3390/s24020645.

Abstract:
SLAM (Simultaneous Localization and Mapping) based on 3D LiDAR (Laser Detection and Ranging) is an expanding field of research with numerous applications in the areas of autonomous driving, mobile robotics, and UAVs (Unmanned Aerial Vehicles). However, in most real-world scenarios, dynamic objects can negatively impact the accuracy and robustness of SLAM. In recent years, the challenge of achieving optimal SLAM performance in dynamic environments has led to the emergence of various research efforts, but there has been relatively little relevant review. This work delves into the development pro
38

Jiang, Guolai, Lei Yin, Shaokun Jin, Chaoran Tian, Xinbo Ma, and Yongsheng Ou. "A Simultaneous Localization and Mapping (SLAM) Framework for 2.5D Map Building Based on Low-Cost LiDAR and Vision Fusion." Applied Sciences 9, no. 10 (2019): 2105. http://dx.doi.org/10.3390/app9102105.

Abstract:
The method of simultaneous localization and mapping (SLAM) using a light detection and ranging (LiDAR) sensor is commonly adopted for robot navigation. However, consumer robots are price sensitive and often have to use low-cost sensors. Due to the poor performance of a low-cost LiDAR, error accumulates rapidly during SLAM, which may cause a large error when building a larger map. To cope with this problem, this paper proposes a new graph optimization-based SLAM framework combining a low-cost LiDAR sensor and a vision sensor. In the SLAM framework, a new cost-function considering bot
APA, Harvard, Vancouver, ISO, and other styles
39

Collings, Simon, Tara J. Martin, Emili Hernandez, et al. "Findings from a Combined Subsea LiDAR and Multibeam Survey at Kingston Reef, Western Australia." Remote Sensing 12, no. 15 (2020): 2443. http://dx.doi.org/10.3390/rs12152443.

Full text
Abstract:
Light Detection and Ranging (LiDAR), a comparatively new technology in the field of underwater surveying, has principally been used for taking precise measurement of undersea structures in the oil and gas industry. Typically, the LiDAR is deployed on a remotely operated vehicle (ROV), which will “land” on the seafloor in order to generate a 3D point cloud of its environment from a stationary position. To explore the potential of subsea LiDAR on a moving platform in an environmental context, we deployed an underwater LiDAR system simultaneously with a multibeam echosounder (MBES), surveying Kin
APA, Harvard, Vancouver, ISO, and other styles
40

Karimi, Mojtaba, Martin Oelsch, Oliver Stengel, Edwin Babaians, and Eckehard Steinbach. "LoLa-SLAM: Low-Latency LiDAR SLAM Using Continuous Scan Slicing." IEEE Robotics and Automation Letters 6, no. 2 (2021): 2248–55. http://dx.doi.org/10.1109/lra.2021.3060721.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Ai, M., M. Elhabiby, I. Asl Sabbaghian Hokmabadi, and N. El-Sheimy. "LIDAR-INERTIAL NAVIGATION BASED ON MAP AIDED DISTANCE CONSTRAINT AND FACTOR GRAPH OPTIMIZATION." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-1/W2-2023 (December 13, 2023): 875–80. http://dx.doi.org/10.5194/isprs-archives-xlviii-1-w2-2023-875-2023.

Full text
Abstract:
Abstract. Simultaneous localization and mapping (SLAM) is a well-developed positioning technology that provides high-accuracy and reliable positioning for automated vehicles and robotics applications. Integrating Light Detection and Ranging (LiDAR) with an Inertial Measurement Unit (IMU) has emerged as a promising technique for achieving stable navigation results in dense urban environments, outperforming vision-based or pure Inertial Navigation System (INS) solutions. However, conventional LiDAR-Inertial SLAM systems often suffer from limited perception of surrounding geometri
APA, Harvard, Vancouver, ISO, and other styles
42

Xu, Y., C. Chen, Z. Wang, et al. "PMLIO: PANORAMIC TIGHTLY-COUPLED MULTI-LIDAR-INERTIAL ODOMETRY AND MAPPING." ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences X-1/W1-2023 (December 5, 2023): 703–8. http://dx.doi.org/10.5194/isprs-annals-x-1-w1-2023-703-2023.

Full text
Abstract:
Abstract. The limited field of view (FoV) of a single LiDAR poses challenges for robots to achieve comprehensive environmental perception. Incorporating multiple LiDAR sensors can effectively broaden the FoV of robots, providing abundant measurements to facilitate simultaneous localization and mapping (SLAM). In this paper, we propose a panoramic tightly-coupled multi-LiDAR-inertial odometry and mapping framework, which fully leverages the properties of solid-state LiDAR and spinning LiDAR. The key to the proposed framework lies in the effective completion of multi-LiDAR spatial-temporal fusion.
APA, Harvard, Vancouver, ISO, and other styles
43

Sun, Y., F. Huang, W. Wen, L. T. Hsu, and X. Liu. "MULTI-ROBOT COOPERATIVE LIDAR SLAM FOR EFFICIENT MAPPING IN URBAN SCENES." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVIII-1/W1-2023 (May 25, 2023): 473–78. http://dx.doi.org/10.5194/isprs-archives-xlviii-1-w1-2023-473-2023.

Full text
Abstract:
Abstract. We first use the multi-robot SLAM framework DiSCo-SLAM to evaluate the performance of cooperative SLAM on a complicated dataset in urban scenes. In addition, we compare single-robot SLAM and multi-robot SLAM to explore whether the cooperative framework can noticeably improve robot localization performance, and to examine the influence of inter-robot constraints in the local pose graph, using an identical dataset generated via the Carla simulator. Our findings indicate that under specific conditions, the integration of inter-robot constraints may effectively mitigate drift in l
APA, Harvard, Vancouver, ISO, and other styles
44

Suleymanoglu, B., M. Soycan, and C. Toth. "INDOOR MAPPING: EXPERIENCES WITH LIDAR SLAM." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B1-2022 (May 30, 2022): 279–85. http://dx.doi.org/10.5194/isprs-archives-xliii-b1-2022-279-2022.

Full text
Abstract:
Abstract. Indoor mapping is gaining more interest in research as well as in emerging applications. Building information systems (BIM) and indoor navigation are probably the driving forces behind this trend. For accurate mapping, the platform trajectory reconstruction, or in other words sensor orientation, is essential to reduce or even eliminate the need for extensive ground control. Simultaneous localization and mapping (SLAM) is the computational problem of how to simultaneously estimate the platform/sensor trajectory while reconstructing the object space; usually, a real-time operation is assumed.
APA, Harvard, Vancouver, ISO, and other styles
45

Seki, Hiroshi, Yuhi Yamamoto, and Sumito Nagasawa. "The Influence of Micro-Hexapod Walking-Induced Pose Changes on LiDAR-SLAM Mapping Performance." Sensors 24, no. 2 (2024): 639. http://dx.doi.org/10.3390/s24020639.

Full text
Abstract:
Micro-hexapods, well-suited for navigating tight or uneven spaces and suitable for mass production, hold promise for exploration by robot groups, particularly in disaster scenarios. However, research on simultaneous localization and mapping (SLAM) for micro-hexapods has been lacking. Previous studies have not adequately addressed the development of SLAM systems considering changes in the body axis, and there is a lack of comparative evaluation with other movement mechanisms. This study aims to assess the influence of walking on SLAM capabilities in hexapod robots. Experiments were conducted us
APA, Harvard, Vancouver, ISO, and other styles
46

Messbah, Hind, Mohamed Emharraf, and Mohammed Saber. "Robot Indoor Navigation: Comparative Analysis of LiDAR 2D and Visual SLAM." IAES International Journal of Robotics and Automation (IJRA) 13, no. 1 (2024): 41. http://dx.doi.org/10.11591/ijra.v13i1.pp41-49.

Full text
Abstract:
Robot indoor navigation has become a significant area of research and development for applications such as autonomous robots, smart homes, and industrial automation. This article presents an in-depth comparative analysis of LiDAR 2D and visual sensor simultaneous localization and mapping (SLAM) approaches for robot indoor navigation. The increasing demand for autonomous robots in indoor environments has led to the development of various SLAM techniques for mapping and localization. LiDAR 2D and visual sensor-based SLAM methods are widely used due to their low cost and
APA, Harvard, Vancouver, ISO, and other styles
47

He, Jionglin, Jiaxiang Fang, Shuping Xu, and Dingzhe Yang. "Indoor Robot SLAM with Multi-Sensor Fusion." International Journal of Advanced Network, Monitoring and Controls 9, no. 1 (2024): 10–21. http://dx.doi.org/10.2478/ijanmc-2024-0002.

Full text
Abstract:
Abstract. To solve the problems of large positioning errors and incomplete mapping in SLAM based on two-dimensional LiDAR in indoor environments, a multi-sensor fusion SLAM algorithm for indoor robots is proposed. To address the mismatch problem of the traditional ICP algorithm in the front end of LiDAR SLAM, the algorithm adopts the PL-ICP algorithm, which is better suited to indoor environments, and uses an extended Kalman filter to fuse the wheel odometer and IMU to provide an initial motion estimate. Then, during the mapping phase, the pseudo 2D laser data converted fr
APA, Harvard, Vancouver, ISO, and other styles
48

Song, Chengqun, Bo Zeng, Jun Cheng, Fuxiang Wu, and Fusheng Hao. "PSMD-SLAM: Panoptic Segmentation-Aided Multi-Sensor Fusion Simultaneous Localization and Mapping in Dynamic Scenes." Applied Sciences 14, no. 9 (2024): 3843. http://dx.doi.org/10.3390/app14093843.

Full text
Abstract:
Multi-sensor fusion is pivotal in augmenting the robustness and precision of simultaneous localization and mapping (SLAM) systems. The LiDAR–visual–inertial approach has been empirically shown to adeptly amalgamate the benefits of these sensors for SLAM across various scenarios. Furthermore, methods of panoptic segmentation have been introduced to deliver pixel-level semantic and instance segmentation data in a single instance. This paper delves deeper into these methodologies, introducing PSMD-SLAM, a novel panoptic segmentation assisted multi-sensor fusion SLAM approach tailored for dynamic
APA, Harvard, Vancouver, ISO, and other styles
49

Wen, Weisong, Li-Ta Hsu, and Guohao Zhang. "Performance Analysis of NDT-based Graph SLAM for Autonomous Vehicle in Diverse Typical Driving Scenarios of Hong Kong." Sensors 18, no. 11 (2018): 3928. http://dx.doi.org/10.3390/s18113928.

Full text
Abstract:
Robust and lane-level positioning is essential for autonomous vehicles. As an irreplaceable sensor, Light detection and ranging (LiDAR) can provide continuous and high-frequency pose estimation by means of mapping, on condition that enough environment features are available. The error of mapping can accumulate over time. Therefore, LiDAR is usually integrated with other sensors. In diverse urban scenarios, the environment feature availability relies heavily on the traffic (moving and static objects) and the degree of urbanization. Common LiDAR-based simultaneous localization and mapping (SLAM)
APA, Harvard, Vancouver, ISO, and other styles
50

Yang, Xin, Xiaohu Lin, Wanqiang Yao, Hongwei Ma, Junliang Zheng, and Bolin Ma. "A Robust LiDAR SLAM Method for Underground Coal Mine Robot with Degenerated Scene Compensation." Remote Sensing 15, no. 1 (2022): 186. http://dx.doi.org/10.3390/rs15010186.

Full text
Abstract:
Simultaneous localization and mapping (SLAM) is the key technology for the automation of intelligent mining equipment and the digitization of the mining environment. However, the shotcrete surface and symmetrical roadway in underground coal mines make light detection and ranging (LiDAR) SLAM prone to degeneration, which leads to the failure of mobile robot localization and mapping. To address these issues, this paper proposes a robust LiDAR SLAM method which detects and compensates for the degenerated scenes by integrating LiDAR and inertial measurement unit (IMU) data. First, the disturbance
APA, Harvard, Vancouver, ISO, and other styles