Agricultural Robot Dataset for Visual Simultaneous Localization and Mapping (VSLAM) in Orchard Farms
Anditya Sridamar Pratyasta, Xiongzhe Han, Jongwoo Ha
UCI I410-ECN-151-24-02-088679943
This article is 4 pages or less.

Agricultural robotics and precision farming have gained increasing attention, including the development of autonomous robot sprayers based on Visual Simultaneous Localization and Mapping (VSLAM). However, datasets for developing autonomous robot algorithms in outdoor environments remain scarce, as their collection requires specific land management and careful timing. This study presents a small-scale fruit orchard visual dataset for localization and mapping, captured before planting, to support the development of an autonomous spraying robot. The platform was implemented with a visual SLAM framework and recorded environmental data at a fruit orchard near Chuncheon, South Korea, over two weeks in the spring of 2023, with data collected twice per week. The robotic platform carries an RGB-D camera to capture detailed information about the orchard's land use, as well as an IMU and GPS for localization, navigation, and mapping; all sensors were pre-calibrated. From the collected data, a 3D map of the orchard is expected to be reconstructed, and an image dataset of the orchard layout will be obtained to support simulation-based development of autonomous driving in fruit orchard environments. This work aims to assist other researchers in developing autonomous robots, particularly for fruit orchard environments.
