Horticulture and viticulture


Navigation of robotic platforms in commercial horticulture: a comparative analysis of transformers for semantic segmentation

https://doi.org/10.31676/0235-2591-2025-4-51-59

Abstract

The article presents the results of research on applying deep learning methods based on SegFormer models to semantic image segmentation and autonomous navigation of robotic platforms in rows of orchard plantings. The study compared SegFormer model versions pretrained on ADE20K and CityScapes. The pretraining datasets differ in the number of object classes and yield accuracy differences of up to 4–7 %, while the models differ in parameter count (from 3.7 million for B0 to 82 million for B5). For transfer learning, a dataset of 1200 images of orchard rows was prepared and annotated in Roboflow with six object classes: Tree (apple trees ≥1.5 m tall), Near-Trunk (zones with a 0.5 m radius around the trunk), Pole (support structures ≥2 m tall), Sky (the sky area, including clouds), Track (row spacing 3 m wide), and Background (the surrounding environment and other background objects). To expand the dataset and improve model quality, data augmentation (image rotation and brightness correction) was performed. The experiments showed that increasing the dimensionality of hidden layers in the SegFormer B0–B5 models enhanced feature extraction from images, which correlated with growth in semantic segmentation metrics. A comparative analysis of 12 SegFormer model versions (B0–B5) identified the optimal trade-off between accuracy and performance. The B4 version achieves the highest accuracy (Val Dice = 0.7927) and is recommended for high-detail mapping tasks, while the B0 version provides the highest processing speed (1.52 FPS) and is suitable for real-time navigation. The designed trajectory construction algorithm, based on DBSCAN clustering and RANSAC approximation, proved highly effective at eliminating segmentation noise and generating movement routes, enabling robotic platforms to adapt to dynamic conditions such as changes in row geometry and background interference. The developed algorithm will improve positioning accuracy and reduce the dependence of robotic platforms on expensive multisensor systems.
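
As a concrete illustration of the segmentation step, the sketch below runs SegFormer on a single frame with the Hugging Face transformers library. The paper's fine-tuned orchard checkpoints are not public, so the publicly available ADE20K-pretrained B0 variant stands in, and the input file name is hypothetical.

```python
# Minimal SegFormer inference sketch (Hugging Face `transformers`).
# The ADE20K-pretrained B0 checkpoint stands in for the paper's
# fine-tuned orchard models, which are not publicly available.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

CKPT = "nvidia/segformer-b0-finetuned-ade-512-512"
processor = SegformerImageProcessor.from_pretrained(CKPT)
model = SegformerForSemanticSegmentation.from_pretrained(CKPT).eval()

image = Image.open("orchard_row.jpg")  # hypothetical input frame
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_classes, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False)
seg_map = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```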
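
The augmentation step (image rotation and brightness correction) can be reproduced along these lines. The paper does not name its tooling, so the choice of the albumentations library and all parameter values are assumptions; the key point is that image and mask must be transformed together so pixel labels stay aligned.

```python
# Rotation + brightness augmentation sketch using `albumentations`.
# Library choice and parameter values are illustrative, not the paper's.
import albumentations as A

augment = A.Compose([
    A.Rotate(limit=15, p=0.5),  # small in-plane rotations
    A.RandomBrightnessContrast(brightness_limit=0.2, contrast_limit=0.0, p=0.5),
])

# Passing image and mask together keeps pixel-level labels aligned
# under the geometric transform (both are HxW[xC] numpy arrays).
out = augment(image=image_np, mask=mask_np)
image_aug, mask_aug = out["image"], out["mask"]
```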
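
The trajectory construction algorithm can be sketched as follows: DBSCAN labels sparse, noisy pixels of the segmented Track region as outliers, and RANSAC then fits a robust centerline through the surviving points. The class index, the eps/min_samples values, and the linear row model below are illustrative assumptions, not parameters taken from the paper.

```python
# Trajectory sketch: DBSCAN noise removal + RANSAC centerline fit
# (scikit-learn). TRACK_CLASS_ID and all thresholds are hypothetical.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.linear_model import RANSACRegressor

TRACK_CLASS_ID = 4  # assumed index of the Track class
ys, xs = np.nonzero(seg_map.numpy() == TRACK_CLASS_ID)
points = np.column_stack([xs, ys]).astype(float)

# Subsample: clustering every track pixel of a full frame is wasteful.
if len(points) > 5000:
    points = points[np.random.choice(len(points), 5000, replace=False)]

# 1. DBSCAN marks sparse segmentation noise with the label -1.
labels = DBSCAN(eps=5.0, min_samples=20).fit_predict(points)
track = points[labels != -1]

# 2. RANSAC fits x = f(y): the row centerline as a function of image row,
#    ignoring any outliers that survived clustering.
ransac = RANSACRegressor().fit(track[:, [1]], track[:, 0])
rows = np.arange(track[:, 1].min(), track[:, 1].max())
centerline_x = ransac.predict(rows[:, None])  # waypoints along the row
```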

About the Authors

A. I. Kutyrev
Federal Scientific Agroengineering Center VIM
Russian Federation

Kutyrev A. I., PhD (Tech.), Leading Researcher

5, 1st Institute Passage, Moscow, 109428



N. A. Andriyanov
Federal Scientific Agroengineering Center VIM
Russian Federation

Andriyanov N. A., PhD (Tech.), Leading Researcher

Moscow





For citations:


Kutyrev A.I., Andriyanov N.A. Navigation of robotic platforms in commercial horticulture: a comparative analysis of transformers for semantic segmentation. Horticulture and viticulture. 2025;(4):51-59. (In Russ.) https://doi.org/10.31676/0235-2591-2025-4-51-59



ISSN 0235-2591 (Print)
ISSN 2618-9003 (Online)