CIGR VI 2019

Presentation information

Oral Session

Postharvest/Food Technology and Process Engineering

[6-1015-C] Postharvest/Food Technology and Process Engineering (6)

Fri. Sep 6, 2019 10:15 AM - 11:30 AM Room C (3rd room)

Chair: Xujun Ye (Hirosaki University, Japan)

10:45 AM - 11:00 AM

[6-1015-C-03] Evaluating the Performance of Unmanned Crop Sensing Robot for Rice

*Dhirendranath Singh1, Shigeru Ichiura1, Mitsuhiko Katahira2,1 (1. United Graduate School of Agriculture, Iwate University (Japan), 2. Faculty of Agriculture, Yamagata University (Japan))

Keywords: Crop Sensing, Unmanned Ground Vehicle (UGV), Precision Agriculture, Rice

Precision agriculture has emerged as a scientific field that seeks to drive agricultural productivity while minimizing environmental impacts. As the demand for food increases, farmers are searching for technology that would allow them to cultivate more land with less labour while increasing their productivity. In rice cultivation, this has led to the adoption of technologies such as unmanned aerial vehicles (UAVs) for crop monitoring. While UAVs have improved precision over traditional satellite imagery, they remain restricted to capturing images of the crop canopy. Unmanned ground vehicles (UGVs), on the other hand, have the potential to capture a wider range of data with pinpoint accuracy. This paper reports on the work done thus far in evaluating the performance of a field robot developed by the World Wide Food Platform, Japan, for rice crop sensing. The study was conducted in three rice fields at Yamagata University's farm in Takasaka, Tsuruoka, Japan, and in a farmer's field in Mikawa, Yamagata, Japan. The cultivation systems were transplanting, hill drop, and broadcasting at Takasaka, and drill seeding at Mikawa. The robot is equipped with sensors for temperature, humidity, sunlight, wind speed, soil temperature, and water level and temperature, as well as cameras (Sony FDR-X3000) for image capture. RTK-GPS was used for location logging with an accuracy of 5 cm. The captured data were mapped in QGIS 3.4 for visualization and analysis of growth parameters every two weeks after germination, and observations were made on the robot's maneuverability in the various field conditions. Plant height, leaf and tiller number, and SPAD values were collected manually in each field for comparison with the image data. It was found that the robot was able to maneuver in the different field conditions without major issues; using the reverse function instead of turning a full circle appears to be the most efficient turning method and causes minimal damage to young seedlings. Weight distribution will have to be considered to obtain optimum performance in deep fields. The data collected from the array of sensors and cameras provide location-specific information throughout the field and can be used to guide farmers in precision management.
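As an illustration of the mapping workflow described in the abstract, the sketch below shows one way georeferenced sensor readings of this kind could be assembled into a GeoJSON point layer for visualization in QGIS 3.4. This is a minimal Python sketch for illustration only; the record structure, field names, and coordinate values are assumptions, not the authors' actual data format.

import json

# Hypothetical records: one entry per logging point, pairing an RTK-GPS
# fix (WGS84 longitude/latitude) with the robot's sensor readings.
# All field names and values here are illustrative assumptions.
readings = [
    {"lon": 139.8205, "lat": 38.7103, "temp_c": 27.4, "humidity_pct": 68.0,
     "wind_ms": 1.2, "soil_temp_c": 24.1, "water_level_cm": 3.5},
    {"lon": 139.8206, "lat": 38.7103, "temp_c": 27.5, "humidity_pct": 67.6,
     "wind_ms": 1.4, "soil_temp_c": 24.0, "water_level_cm": 3.4},
]

# Each logging point becomes a GeoJSON Point feature whose properties
# carry the sensor values, so QGIS can symbolize them per location.
features = [
    {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
        "properties": {k: v for k, v in r.items() if k not in ("lon", "lat")},
    }
    for r in readings
]

with open("field_sensors.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)

The resulting file can be loaded in QGIS as a vector layer and styled by any sensor attribute to produce location-specific field maps of the kind described above.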