Japan Geoscience Union Meeting 2022

Presentation information

[J] Oral

M (Multidisciplinary and Interdisciplinary) » M-TT Technology & Techniques

[M-TT46] Introducing metaverse to agriculture. Are we ready?

Thu. May 26, 2022 10:45 AM - 12:15 PM 202 (International Conference Hall, Makuhari Messe)

Convener: Seishi Ninomiya (Graduate School of Agriculture and Life Sciences, the University of Tokyo), Convener: Yukihiro Takahashi (Department of Cosmosciences, Graduate School of Science, Hokkaido University), Chairperson: Nobuyasu Naruse (Faculty of Medicine, Shiga University of Medical Science)

12:00 PM - 12:15 PM

[MTT46-06] Stable small UAV plant imaging and handling system in greenhouse for precision horticulture

*Kunihiro Kodama1, Masanori Ishii2, Takanari Tanabata1, Sachiko Isobe1, Wei Guo2 (1.Kazusa DNA Research Institute, 2.The University of Tokyo)

Keywords:drone, structure from motion, indoor, automatic flight, precision agriculture

In precision agriculture, it is important to obtain the phenotypic traits of individual plants throughout the growth period and to apply optimal cultivation management according to their growth. Automated measurement and data analysis are also important because the number of measurement targets and traits in agricultural fields tends to be huge. Automatic imaging systems using drones and robots are well established for open fields; however, the indoor use of drones is still at the development stage. Drones offer obstacle tolerance and flexibility in data-gathering position and angle that ground robots do not, and they are especially useful for plants that grow tall. We have therefore developed an automatic plant imaging and image analysis system that uses a small drone to record the detailed growth status of plants.
We developed an image analysis method that allows users to efficiently select desired images from a large set taken by a small drone in a greenhouse, without tags indicating each plant's position. Because GPS signals are often unavailable in a greenhouse, we do not use GPS for plant recognition. Instead, we use SfM (Structure from Motion) to reconstruct the scene at the time the images were taken; SfM can estimate the positional relationships of the captured images without GPS information. Because repeated similar features in the greenhouse structure and plant organs often caused the SfM reconstruction to fail, columns with dot patterns were installed to increase the number of distinctive feature points.
The target position is then expressed as a bounding box with known coordinates, and each corner point is projected into every image to test whether it falls inside the image frame. If it does, the projected box is painted white to produce a mask image; comparing the original image with its mask image reveals where the target appears in that image.
We applied this image analysis method to a tomato plant bed (15 m x 2 m) in the greenhouse, obtaining image data sets several times during the growth period of the plants. Accuracy was evaluated by comparing known marker distances with the distances reconstructed by SfM; the difference was at most 0.5%. The system is therefore sufficiently accurate for trait measurement.
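The bounding-box projection step can be sketched with a simple pinhole camera model. This is a minimal illustration, not the actual implementation: the intrinsic matrix `K`, the pose `R, t`, and the axis-aligned mask fill are all assumptions made for the example (a real pipeline would use the camera parameters estimated by SfM and a convex-hull fill of the projected corners).

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 world points into pixel coordinates with a pinhole model."""
    cam = R @ points_3d.T + t.reshape(3, 1)   # world frame -> camera frame
    uv = K @ cam                              # camera frame -> homogeneous pixels
    return (uv[:2] / uv[2]).T                 # normalize to Nx2 pixel coordinates

def box_mask(box_corners, K, R, t, width, height):
    """Paint the projected bounding box white in an otherwise black mask image."""
    uv = project_points(box_corners, K, R, t)
    mask = np.zeros((height, width), dtype=np.uint8)
    # keep only corners that land inside the image frame
    inside = ((uv[:, 0] >= 0) & (uv[:, 0] < width) &
              (uv[:, 1] >= 0) & (uv[:, 1] < height))
    if not inside.any():
        return mask  # target is not visible in this view
    # fill the axis-aligned extent of the visible corners (illustrative simplification)
    u = np.clip(uv[inside, 0], 0, width - 1).astype(int)
    v = np.clip(uv[inside, 1], 0, height - 1).astype(int)
    mask[v.min():v.max() + 1, u.min():u.max() + 1] = 255
    return mask

# Example: camera at the origin looking along +z, box 1 m in front of it
K = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0,   0.0,  1.0]])
corners = np.array([[x, y, 1.0] for x in (-0.2, 0.2) for y in (-0.2, 0.2)])
mask = box_mask(corners, K, np.eye(3), np.zeros(3), 100, 100)
```

Comparing such a mask against the original photograph tells, per image, whether and where the target plant appears, which is what enables image selection without per-plant tags.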
This system enables the efficient selection of images taken by angle-flexible drones, which can contribute to the development of precision agriculture in the future.
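The accuracy check described above, comparing known marker distances against distances reconstructed by SfM, can be sketched as follows. The least-squares scale-fitting step and the example values are assumptions for illustration (SfM reconstructions are in arbitrary units, so some scale alignment is needed before comparing distances); the abstract does not specify the exact procedure used.

```python
import numpy as np

def scale_errors_percent(known, reconstructed):
    """Relative errors (%) between known marker distances and SfM distances,
    after fitting a single scale factor by least squares."""
    known = np.asarray(known, dtype=float)
    recon = np.asarray(reconstructed, dtype=float)
    s = (known @ recon) / (recon @ recon)  # scale aligning SfM units to metric units
    return np.abs(s * recon - known) / known * 100.0

# Hypothetical marker distances: known values in mm vs. SfM units
errors = scale_errors_percent([1000.0, 2000.0, 500.0], [1.002, 1.997, 0.501])
```

With errors of at most 0.5%, as reported, distances measured in the reconstruction can be trusted for plant trait measurement.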
(https://github.com/kkodamakazusa/JpGU2022_materials)