THE USE OF OMNIDIRECTIONAL CAMERAS FOR AGRICULTURAL LAND GROUND-BASED REMOTE SENSING SYSTEMS

Main Article Content

Ignat Nikolaevich Penshin (Игнат Николаевич Пеньшин)

Abstract

Today, technologies for terrestrial and near-Earth remote sensing are entering a phase of active development. These systems are designed to obtain detailed and accurate information about the environment in the immediate vicinity of the Earth's surface. A key advantage of near-Earth remote sensing systems is the availability and high frequency with which images and other field data can be acquired at a high level of accuracy, which allows them to be used in a variety of agricultural tasks.
The use of omnidirectional cameras in agriculture is a new way to capture high-quality, detailed images of the environment: their 360° horizontal and 180° vertical field of view makes them a suitable tool for monitoring and analyzing agricultural processes such as crop growth, soil conditions, and irrigation management.
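As an illustrative sketch (not part of the original study), the full 360° × 180° coverage described above corresponds to an equirectangular panorama, in which each pixel maps linearly to a pair of viewing angles. A minimal, uncalibrated version of that mapping can be written as follows; real omnidirectional cameras additionally require geometric calibration to correct distortion:

```python
def equirect_pixel_to_angles(u, v, width, height):
    """Map a pixel (u, v) in a full equirectangular panorama
    (360° horizontal x 180° vertical) to viewing angles.

    Returns (azimuth, elevation) in degrees:
    azimuth in [-180, 180), elevation in [-90, 90].
    Illustrative sketch only: assumes an ideal, distortion-free
    equirectangular projection with no calibration applied.
    """
    azimuth = (u / width) * 360.0 - 180.0     # left edge = -180°, right edge = +180°
    elevation = 90.0 - (v / height) * 180.0   # top edge = +90° (zenith), bottom = -90° (nadir)
    return azimuth, elevation

# The image center looks straight ahead at the horizon:
print(equirect_pixel_to_angles(1920, 960, 3840, 1920))  # (0.0, 0.0)
```

This linear pixel-to-angle relationship is what makes a single omnidirectional frame sufficient to observe the entire surroundings of a ground-based platform at once.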
In this paper, the use of omnidirectional cameras is considered in the context of their integration with other technologies, such as machine learning algorithms for building intelligent agricultural systems, as well as autonomous and automated devices for monitoring and treating agricultural land, which make it possible to optimize processes and increase yields. Analyzing the technology and evaluating the quality indicators of omnidirectional cameras for agriculture will accelerate the adoption of this solution and improve the efficiency and quality of field work, thereby increasing productivity.

Article Details

Section
Land economics and policies

References

1. Stepanov D. N. Mathematical models for obtaining stereo images from two-mirror catadioptric systems taking into account lens distortion // Computer Optics. – 2019. – Vol. 43. – No. 1. – pp. 105–114.
2. Bakushinsky A. B. On the problem of convergence of the iteratively regularized Gauss-Newton method // Journal of Computational Mathematics and Mathematical Physics. – 1992. – Vol. 32. – No. 9. – pp. 1503–1509.
3. Zhimbueva L. D. Method for determining the total distortion of digital images // Computer Optics. – 2011. – Vol. 35. – No. 3. – pp. 347–355.
4. Borodulina S. V., Zaitsev Yu. A. Theoretical foundations for constructing nonlinear perspective images // Bulletin of the Saratov State Technical University. – 2006. – Vol. 4. – No. 2 (17). – pp. 67–76.
5. Zhang Z., Rebecq H., Forster C., Scaramuzza D. Benefit of large field-of-view cameras for visual odometry // 2016 IEEE International Conference on Robotics and Automation (ICRA). – IEEE, 2016. – pp. 801–808.
6. Scaramuzza D., Martinelli A., Siegwart R. A flexible technique for accurate omnidirectional camera calibration and structure from motion // Fourth IEEE International Conference on Computer Vision Systems (ICVS'06). – IEEE, 2006. – p. 45.
7. Aghayari S. et al. Geometric calibration of full spherical panoramic Ricoh-Theta camera // ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. – 2017. – Vol. IV-1/W1. – pp. 237–245.
8. Huang M. et al. High-precision calibration of wide-angle fisheye lens with radial distortion projection ellipse constraint (RDPEC) // Machine Vision and Applications. – 2022. – Vol. 33. – No. 3. – pp. 1–23.
9. Zhang Y., Huang F. Panoramic visual SLAM technology for spherical images // Sensors. – 2021. – Vol. 21. – No. 3. – p. 705.
10. Furgale P., Rehder J., Siegwart R. Unified temporal and spatial calibration for multi-sensor systems // 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. – IEEE, 2013. – pp. 1280–1286.
11. Geyer C., Daniilidis K. A unifying theory for central panoramic systems and practical implications // European Conference on Computer Vision. – Springer, Berlin, Heidelberg, 2000. – pp. 445–461.
12. Wei L. et al. Intelligent vehicle localization in urban environments using EKF-based visual odometry and GPS fusion // IFAC Proceedings Volumes. – 2011. – Vol. 44. – No. 1. – pp. 13776–13781.