Agricultural robotics and precision farming have attracted increasing attention, including the development of autonomous robot sprayers based on Visual Simultaneous Localization and Mapping (VSLAM). However, datasets for developing and testing autonomous robot algorithms in outdoor environments remain limited, as their acquisition requires specific land management and timing. This study presents a small-scale visual dataset of a fruit orchard for localization and mapping, capturing the orchard before planting, to support the development of an autonomous spraying robot. The platform was implemented with a visual SLAM framework and recorded environmental data at a fruit orchard near Chuncheon, South Korea, over two weeks in the spring of 2023, with recordings made twice per week. The robotic platform carries an RGB-D camera to capture detailed information about the orchard's land use, together with an IMU and GPS for localization, navigation, and mapping; all sensors were pre-calibrated. From the collected data, a 3D map of the orchard is expected to be reconstructed, and an image dataset of the orchard layout will be obtained, providing a resource for simulating autonomous driving in fruit orchard environments. This research aims to assist other researchers in developing autonomous robots, particularly for the fruit orchard environment.
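A practical step when using a multi-sensor dataset like the one described (RGB-D camera, IMU, GPS) is associating frames from independently timestamped streams. The sketch below is not part of the dataset's tooling; it is a minimal, hypothetical example of nearest-timestamp association between an RGB stream and a depth stream, assuming each stream provides sorted timestamps in seconds and that pairs are only accepted within a tolerance.

```python
from bisect import bisect_left

def associate(rgb_ts, depth_ts, max_diff=0.02):
    """Pair each RGB timestamp with the nearest depth timestamp
    within max_diff seconds. depth_ts must be sorted ascending.
    (Illustrative helper; names and tolerance are assumptions.)"""
    pairs = []
    for t in rgb_ts:
        i = bisect_left(depth_ts, t)
        # Candidates are the depth timestamps on either side
        # of the insertion point; keep the closer one.
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(depth_ts):
                d = abs(depth_ts[j] - t)
                if best is None or d < best[0]:
                    best = (d, depth_ts[j])
        if best is not None and best[0] <= max_diff:
            pairs.append((t, best[1]))
    return pairs

# Example: a 30 Hz RGB stream against a slightly offset depth stream
rgb = [0.000, 0.033, 0.066, 0.100]
depth = [0.001, 0.034, 0.068, 0.099]
matched = associate(rgb, depth)
```

A similar nearest-neighbor association (as popularized by the TUM RGB-D benchmark tools) could likewise align camera frames with the closest IMU or GPS readings before feeding them to a SLAM pipeline.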