Japan Geoscience Union Meeting 2025

Presentation information

[J] Oral

M (Multidisciplinary and Interdisciplinary) » M-IS Intersection

[M-IS21] Understanding plastic pollution: The reality and countermeasures

Sun. May 25, 2025 9:00 AM - 10:30 AM 103 (International Conference Hall, Makuhari Messe)

convener: Shinichiro Kako (Graduate School of Science and Engineering, Kagoshima University), Atsuhiko Isobe (Kyushu University, Research Institute for Applied Mechanics), Toshiaki Sasao (Ritsumeikan University), Masashi Yamamoto (Kanagawa University); Chairperson: Shinichiro Kako (Graduate School of Science and Engineering, Kagoshima University), Atsuhiko Isobe (Kyushu University, Research Institute for Applied Mechanics)

10:00 AM - 10:15 AM

[MIS21-05] A trial to detect underwater plastic litter using a high-resolution acoustic video camera

*Katsunori Mizuno1, Yilong Zhang1, Xiaoteng Zhou1 (1.Graduate School of Frontier Sciences, The University of Tokyo)

Keywords:Acoustic video camera, plastic waste, image classification, remote sensing

Clarifying the "plastic waste flow", that is, how much plastic waste enters the ocean via rivers, is key to implementing effective pollution countermeasures. At present, however, there is no unified method for accurately surveying the amount of plastic waste in cities, rivers, and coastal areas that is needed to obtain this information. As a result, accurate data are not available for most of the physical quantities related to plastic waste, either in Japan or overseas. Indeed, for lack of concrete measurements, some studies rely on highly uncertain and unrealistic values to estimate the amount of plastic waste flowing from cities to the ocean. For example, there is no solid quantitative evidence for either the global environmental discharge of dumped plastic waste, estimated at around 30 million tons, or the annual discharge of plastic waste into the ocean, estimated at around 2 million tons. To overcome this situation, methods are being developed to quantify plastic waste in rivers and on beaches using remote sensing and image analysis with platforms such as drones and webcams.
So far, although sensing of plastic waste distributed on land and on the water surface has seen active technological development, sensing of plastic waste in the water column remains limited. The main reason is that light attenuates strongly in water, so conventional optical remote sensing cannot be applied directly. Sound, by contrast, which is routinely used for seabed surveys and resource exploration, attenuates far less underwater than light and is hardly affected by illumination or turbidity. However, acoustic images carry no color information and have low resolution, and they suffer from issues of their own, such as speckle noise, acoustic shadowing, and reverberation. In this study, we are developing a method for detecting plastic waste with acoustic measurement technology while accounting for these issues. In particular, we have conducted basic experiments on detecting plastic waste with a high-resolution acoustic video camera that uses high-frequency sound, and we introduce our approach and results.
In this study, the acoustic video camera ARIS was installed at the bottom of the circulating water tank (25 m long × 1.8 m wide × 1.4 m deep) owned by the Institute of Industrial Science, The University of Tokyo, and recordings were made with the sound beam directed about 20° upward. The center frequency of ARIS was set to 3.0 MHz, the recording range was about 5 m, and the frame rate was 5 fps. Under these settings, the resolution of the recorded acoustic images was 2 mm/pixel. Plastic bottles, plastic bags, and cans were prepared as measurement targets, and the flow velocity of the tank was set to 0.1–0.3 m/s. The targets were released repeatedly from upstream and imaged with the acoustic video camera, yielding a total of more than 1000 acoustic images.
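As a rough sanity check (this calculation is ours, not part of the abstract), the stated flow velocity, frame rate, and image resolution imply how far a drifting target moves between consecutive frames, which is relevant to tracking an object across a scene:

```python
# Per-frame displacement of a drifting target, from the setup described above:
# flow velocity 0.1-0.3 m/s, frame rate 5 fps, resolution 2 mm/pixel.
FRAME_RATE_FPS = 5.0
RESOLUTION_M_PER_PX = 0.002  # 2 mm/pixel

def displacement_px(flow_m_per_s: float) -> float:
    """Pixels a target moves between two consecutive frames."""
    return flow_m_per_s / FRAME_RATE_FPS / RESOLUTION_M_PER_PX

# At 0.1 m/s a target shifts about 10 px (2 cm) per frame;
# at 0.3 m/s, about 30 px (6 cm) per frame.
for v in (0.1, 0.3):
    print(f"{v} m/s -> {displacement_px(v):.0f} px/frame")
```

So a target crossing the roughly 5 m recording range appears in many consecutive frames, each offering a slightly different view.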
Next, we examined a method for detecting plastic waste in the acquired acoustic images. Specifically, we used the YOLO object-detection model (version 11). We labeled plastic bottles, plastic bags, and cans in the acoustic images to create 800 training images. Because waves on the water surface also appeared in the acoustic images, we added "wave" as a label and trained the model. We further augmented the training data with transformations such as color-space shifts, rotation, translation, flipping, and scaling.
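The augmentation step above can be sketched in plain NumPy; in practice a detection framework such as YOLO typically applies such transformations during training, so this is an illustrative sketch of the idea, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img: np.ndarray) -> list[np.ndarray]:
    """Return simple geometric variants of one single-channel acoustic
    image: the original, flips, a 90-degree rotation, and a translation."""
    out = [img,
           np.fliplr(img),   # horizontal flip
           np.flipud(img),   # vertical flip
           np.rot90(img)]    # 90-degree rotation
    # Small random translation (np.roll wraps at the edges; fine for a sketch).
    dy, dx = rng.integers(-5, 6, size=2)
    out.append(np.roll(img, (int(dy), int(dx)), axis=(0, 1)))
    return out

# Dummy "acoustic image": intensity only, no color channels.
frame = rng.random((64, 64), dtype=np.float32)
variants = augment(frame)
print(len(variants))  # 5 images from one labelled frame
```

Each variant inherits the original frame's bounding-box labels (suitably transformed), multiplying the effective size of a small labelled dataset.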
Although classification accuracy for plastic bottles and cans was initially low, it improved once wave labels were added and the data were augmented. We also found that accuracy was higher when continuous scenes were classified together as a single group than when objects were classified from individual snapshots as they flowed past. This is because an object's appearance in the acoustic image changes with the relative position of the acoustic camera and the measurement target.
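One simple way to realize this grouping, sketched here under our own assumptions rather than taken from the abstract, is to pool per-frame predictions for a tracked object and take a majority vote:

```python
from collections import Counter

def classify_track(frame_labels: list[str]) -> str:
    """Majority vote over per-frame predictions for one tracked object.
    Frames where the detector abstained are marked "none" and ignored."""
    votes = Counter(label for label in frame_labels if label != "none")
    winner, _count = votes.most_common(1)[0]
    return winner

# A drifting object looks different frame to frame, so single snapshots
# disagree; pooling the whole track recovers a stable label.
track = ["can", "bottle", "bottle", "none", "bottle", "wave", "bottle"]
print(classify_track(track))  # -> bottle
```

Confidence-weighted voting over detector scores would be a natural refinement of the same idea.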
In this study, we tested the detection of plastic waste in a water channel using a high-resolution acoustic video camera. We found that classification could be performed with high accuracy when the processing accounted for the characteristics of sound. In future work, we plan to test this technology in real field environments.