Japan Geoscience Union Meeting 2022

Presentation information

[J] Poster

M (Multidisciplinary and Interdisciplinary) » M-TT Technology & Techniques

[M-TT46] Introducing metaverse to agriculture. Are we ready?

Fri. Jun 3, 2022 11:00 AM - 1:00 PM Online Poster Zoom Room (35) (Ch.35)

Convener: Seishi Ninomiya (Graduate School of Agriculture and Life Sciences, The University of Tokyo), Convener: Yukihiro Takahashi (Department of Cosmosciences, Graduate School of Science, Hokkaido University), Chairperson: Seishi Ninomiya (Graduate School of Agriculture and Life Sciences, The University of Tokyo)

11:00 AM - 1:00 PM

[MTT46-P01] Data collection and xR platform for digital clone of agricultural fields and plants

*Masayuki Hirafuji1, Wei Guo1, Seishi Ninomiya1 (1.The University of Tokyo)

Keywords: agriculture, digital clone, plant

Introduction
We have been conducting research for about 20 years on outdoor IoT devices such as Field Server, on the integration of weather data, and on agricultural applications. In recent years, we have worked on constructing big data and discovering knowledge from plant phenotypic data (3D data of individual plants, plant coverage, flowering date, etc.) and environmental data collected with drones and IoT devices (JST CREST, "Construction of agricultural big data mainly based on field sensing time series data and discovery of new knowledge", 2015-2021). Next-generation DNA sequencers have made it possible to collect genomic information on plants and symbiotic microorganisms at low cost, and we are automating the mechanical sampling of plant tissues and other materials.
The disruptive power of AI, and of the big data that supports it, is thought to rest on exponentially increasing computational speed and storage capacity; as these grow, the shortage of data itself has become the more significant bottleneck.

Multidimensional data
In addition to ordinary RGB images, image data is becoming multidimensional with multispectral and hyperspectral cameras. Furthermore, thanks to the rapid development of low-cost vector network analyzers (VNA) and software-defined radios (SDR), we can expect to collect multidimensional data on plant and soil properties using electromagnetic waves over a wide band. However, there is a relative lack of sensor data that can serve as ground truth for these measurements. We are therefore working on a method to integrate and view the various data as digital clones in the metaverse, analyzing them intuitively while growing the ground-truth data in the process (JST AIP Acceleration, "Studies of CPS platform to raise big-data-driven AI agriculture", 2021-2023).

Heterogeneous massive data files
The image data (about 100 million images) and sensor data collected to date consist of many files. Simply copying them to hard disks takes more than a month, and the files have different formats and attributes. Because they are difficult to store in an RDBMS, they are categorized and stored in Hadoop, a KVS, or folders.
To automate part of the metadata assignment, an object-detection model (YOLOv4) is run in batch over all image files (processing the 100 million images took about six months on a typical PC), and the detected objects are used to retrieve images. We believe that a platform equipped with a 3D viewer is necessary for making comprehensive and intuitive decisions in smart agriculture and agronomic studies.
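The batch-tagging workflow above can be sketched as follows. This is a minimal illustration, not the project's actual code: `detect_objects` is a hypothetical stand-in for a YOLOv4 forward pass (returning dummy labels so the sketch runs without model weights), and the KVS-style label store is emulated with SQLite.

```python
import json
import sqlite3
from pathlib import Path

def detect_objects(image_path):
    """Hypothetical stand-in for a YOLOv4 forward pass (in a real
    pipeline the network would be loaded via a DNN framework).
    Returns dummy labels so the sketch stays runnable."""
    return ["plant"] if "field" in image_path.name else []

def build_index(image_paths, db_path=":memory:"):
    """Batch-tag every image and store its labels in a simple
    key-value table, so images can later be retrieved by object."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS tags (path TEXT PRIMARY KEY, labels TEXT)")
    for img in image_paths:
        con.execute("INSERT OR REPLACE INTO tags VALUES (?, ?)",
                    (str(img), json.dumps(detect_objects(img))))
    con.commit()
    return con

def find_images(con, label):
    """Return image paths whose detected labels include `label`."""
    return [p for (p, ls) in con.execute("SELECT path, labels FROM tags")
            if label in json.loads(ls)]
```

Decoupling detection (a one-time batch job) from retrieval (cheap key-value lookups) is what makes the months-long tagging pass worthwhile: once labels are stored, queries over the whole collection are fast.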

Digital clones
Since plant structure varies with the environment, we cultivate crops under both outdoor and indoor conditions to collect digital-clone data covering a variety of growth states. We are also developing an open-source platform that reproduces the collected data with a sense of "realism", as if the user were standing in the field. The core of digital cloning is 3D modeling that realistically visualizes the image data of the field: 3D point-cloud and mesh models of the field are generated from data acquired by Lidar and cameras mounted on drones and ground robots. In addition, we aim to generate more complete digital clones by estimating the parts of plants that cannot be photographed, in conjunction with plant growth simulation.
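A typical first step when turning a raw Lidar scan into a point-cloud or mesh model is voxel downsampling, which thins the cloud while preserving its shape. The sketch below is a minimal NumPy illustration of the idea, not the project's actual reconstruction pipeline (which would use dedicated 3D tooling); the 5 cm voxel size is an arbitrary example value.

```python
import numpy as np

def voxel_downsample(points, voxel=0.05):
    """Reduce an (N, 3) point cloud (coordinates in metres) by
    averaging all points that fall inside the same cubic voxel.
    This keeps the field's geometry while shrinking the data."""
    # Integer voxel coordinates for each point.
    keys = np.floor(points / voxel).astype(np.int64)
    # Group points by voxel: `inv[i]` is the group index of point i.
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inv)
    out = np.zeros((counts.size, 3))
    for d in range(3):  # average each coordinate within a voxel
        out[:, d] = np.bincount(inv, weights=points[:, d]) / counts
    return out
```

The averaged point per voxel (rather than, say, the first point seen) reduces sensor noise at the same time as it reduces volume, which matters for the subsequent meshing step.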
The user interface for visualization and data manipulation will be realized with xR (VR, AR, MR) using camera-equipped smart glasses and Android devices. By using real-world general object detection and gesture recognition to operate an overlay display of current, past, and future digital clones and their related data in a given scene, users can intuitively grasp the heterogeneous big data.

Conclusion
It is expected that big data collected within a specific field can serve as ground-truth data for remote sensing by satellites and other platforms, thereby improving the practicality of remote-sensing technology in agriculture. To promote data integration, data ownership must be clarified, and we are currently considering the use of NFTs for this purpose.