10:20 AM - 10:40 AM
[4Q1-IS-2c-05] Human Activity Recognition Framework Based on Generative Imputation of Missing Modalities
Keywords: Human Activity Recognition, Deep Learning, Missing Modality Imputation, Wearable Sensors
Human Activity Recognition (HAR) is important for understanding and assisting humans and is therefore required in applications such as healthcare and security. In particular, HAR that applies machine learning to sensor data measured with wearable devices has attracted attention because of its high potential. For noise- and fault-tolerant HAR, we propose a framework that imputes missing modalities in sensor data and recognizes human activities simultaneously. Our framework consists of feature extraction by an autoencoder (AE), activity classification by a multilayer perceptron (MLP), and missing modality generation by a conditional generative adversarial network (CGAN), trained jointly with multitask learning. In our experiment, the framework was applied to the CogAge dataset, whose task is the recognition of six state activities from two modalities. When given only one of the two modalities, our framework performed comparably to an MLP and to an AE-plus-MLP combination given both modalities.
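A minimal sketch (PyTorch) of the kind of architecture the abstract describes: an AE encoder for feature extraction, an MLP for activity classification, and a CGAN generator that imputes a missing modality, combined in a multitask objective. All layer sizes, module names, and the loss composition below are illustrative assumptions, not the authors' actual configuration, and the adversarial discriminator term of the CGAN is omitted for brevity.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """AE encoder: maps one modality to a shared feature space."""
    def __init__(self, in_dim, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, feat_dim))
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """AE decoder: reconstructs the modality from its features."""
    def __init__(self, feat_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, out_dim))
    def forward(self, z):
        return self.net(z)

class Generator(nn.Module):
    """CGAN generator: imputes the missing modality conditioned on the observed one."""
    def __init__(self, cond_dim, noise_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(cond_dim + noise_dim, 128), nn.ReLU(), nn.Linear(128, out_dim))
    def forward(self, cond, noise):
        return self.net(torch.cat([cond, noise], dim=-1))

class Classifier(nn.Module):
    """MLP: maps fused features to logits over the six state activities."""
    def __init__(self, feat_dim, n_classes=6):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, n_classes))
    def forward(self, z):
        return self.net(z)

# Example forward pass with modality B missing (hypothetical dimensions).
dim_a, dim_b, noise_dim = 30, 24, 16
enc_a, enc_b = Encoder(dim_a), Encoder(dim_b)
dec_a = Decoder(64, dim_a)
gen_b = Generator(cond_dim=dim_a, noise_dim=noise_dim, out_dim=dim_b)
clf = Classifier(feat_dim=64 * 2)

x_a = torch.randn(8, dim_a)                       # observed modality
x_b_hat = gen_b(x_a, torch.randn(8, noise_dim))   # imputed missing modality
z = torch.cat([enc_a(x_a), enc_b(x_b_hat)], dim=-1)
logits = clf(z)

# Multitask objective (sketch): AE reconstruction + activity classification;
# the CGAN adversarial losses would be added during training.
recon_loss = nn.functional.mse_loss(dec_a(enc_a(x_a)), x_a)
cls_loss = nn.functional.cross_entropy(logits, torch.randint(0, 6, (8,)))
loss = recon_loss + cls_loss
```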