MAPPING THE WHEEL ROBOT WORKING ENVIRONMENT WITH SLAM GMAPPING METHOD USING LIDAR SENSOR
Keywords: IMU, ROS, SLAM, RPLIDAR, Mapping, Raspberry Pi 4

Abstract
A wheeled robot used to deliver documents between rooms must be able to move according to the conditions of its work area, so it needs knowledge of the environment it will traverse. In this final project, the work-area environment of a wheeled robot is mapped using the Simultaneous Localization and Mapping (SLAM) method with a lidar sensor. The robot is built around a Raspberry Pi 4, which serves as the main controller, and carries two sensors: a lidar sensor that measures the distance to objects in front of the robot, and an IMU sensor that detects changes in the robot's orientation and position. A motor driver on the robot processes the control signals that drive the DC motors. The map is built by having the lidar sensor scan the robot's working environment; the sensor output is processed with the SLAM gmapping method, which uses the laser scans to produce a two-dimensional (2D) map while estimating the robot's position on the map with a particle filter. This simultaneous mapping runs the SLAM gmapping algorithm on the Raspberry Pi 4, and the resulting maps are rendered in grayscale. In addition to SLAM gmapping, this article also presents 2D position tests of the robot in three test arenas.
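As described above, gmapping estimates the robot's pose on the map with a particle filter. The following is a minimal illustrative sketch of one predict/weight/resample cycle in Python with NumPy; the planar pose model, noise levels, and the single-beam range measurement to a landmark at the origin are simplifying assumptions for illustration, not the actual gmapping implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, control, measurement,
                         motion_noise=0.05, meas_noise=0.1):
    """One predict/update/resample cycle for a toy planar example.

    particles:   (N, 3) array of [x, y, theta] pose hypotheses.
    control:     (dx, dy) odometry increment (e.g. from IMU/encoders).
    measurement: scalar range to a landmark at the origin, standing in
                 for one lidar beam.
    """
    # Predict: propagate each particle through the motion model plus noise.
    particles[:, :2] += control + rng.normal(0, motion_noise, (len(particles), 2))
    # Update: weight particles by how well they explain the range reading.
    expected = np.linalg.norm(particles[:, :2], axis=1)
    weights *= np.exp(-0.5 * ((measurement - expected) / meas_noise) ** 2)
    weights += 1e-300            # guard against an all-zero weight vector
    weights /= weights.sum()
    # Resample: draw particles proportionally to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Toy usage: robot starts near (1, 0), moves +0.5 in x, true range is 1.5.
N = 500
particles = np.hstack([rng.normal([1.0, 0.0], 0.2, (N, 2)), np.zeros((N, 1))])
weights = np.full(N, 1.0 / N)
particles, weights = particle_filter_step(particles, weights,
                                          control=np.array([0.5, 0.0]),
                                          measurement=1.5)
estimate = particles[:, :2].mean(axis=0)
```

After the update, the particle cloud concentrates around poses consistent with both the odometry and the range reading, which is the same predict-weight-resample pattern gmapping applies to full lidar scans.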