MetaSense

Few-Shot Adaptation to Untrained Conditions in Deep Mobile Sensing



Abstract


Recent advances in deep learning and hardware support offer a new breakthrough in mobile sensing: context-aware services and mobile healthcare powered by artificial intelligence can now run on mobile devices. However, most related studies perform well only when the training and target data distributions are sufficiently similar, while in practice each user's behaviors and devices make sensor inputs different. Consequently, the performance of such applications might suffer under diverse user and device conditions, as training deep models for every such condition is infeasible. To mitigate the issue, we propose MetaSense, an adaptive deep mobile sensing system that utilizes only a few (e.g., one or two) data instances from the target user. MetaSense employs meta learning that learns how to adapt to the target user's condition by rehearsing multiple similar tasks generated from our unique task generation strategies during offline training. The trained model can then rapidly adapt to the target user's condition once a few data instances are available. Our evaluation with real-world traces of motion and audio sensors shows that MetaSense not only outperforms state-of-the-art transfer learning by 18% and meta-learning-based approaches by 15% in terms of accuracy, but also requires significantly less adaptation time for the target user.
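
To illustrate the kind of meta-training the abstract describes, the sketch below shows a generic MAML-style inner/outer loop in PyTorch. It is only a conceptual illustration under assumed names (SensingNet, inner_adapt, meta_train_step) and a toy two-layer classifier; it is not the actual MetaSense implementation or its task generation strategies, which are available in the released code linked under Publications.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SensingNet(nn.Module):
    # Toy classifier over flattened sensor windows (placeholder architecture).
    def __init__(self, in_dim=128, num_classes=6):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, num_classes))

    def forward(self, x):
        return self.net(x)

def inner_adapt(model, support_x, support_y, inner_lr=0.01):
    # One gradient step on a task's support set; returns adapted parameters while
    # keeping the graph so the outer (meta) update can differentiate through it.
    loss = F.cross_entropy(model(support_x), support_y)
    grads = torch.autograd.grad(loss, list(model.parameters()), create_graph=True)
    return [p - inner_lr * g for p, g in zip(model.parameters(), grads)]

def functional_forward(params, x):
    # Forward pass of the two-layer MLP above with externally supplied parameters.
    w1, b1, w2, b2 = params
    return F.linear(F.relu(F.linear(x, w1, b1)), w2, b2)

def meta_train_step(model, meta_opt, tasks, inner_lr=0.01):
    # Outer update: adapt on each task's support set, evaluate on its query set,
    # and update the shared initialization with the averaged query loss.
    meta_opt.zero_grad()
    meta_loss = 0.0
    for support_x, support_y, query_x, query_y in tasks:
        adapted = inner_adapt(model, support_x, support_y, inner_lr)
        meta_loss = meta_loss + F.cross_entropy(functional_forward(adapted, query_x), query_y)
    (meta_loss / len(tasks)).backward()
    meta_opt.step()

In this sketch, each task would be a (support, query) split drawn from one source condition by the task generation step, and meta_opt could be a standard optimizer such as torch.optim.Adam over the model's parameters.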



Publications


Adapting to Unknown Conditions in Learning-based Mobile Sensing
Taesik Gong, Yeonsu Kim, Ryuhaerang Choi, Jinwoo Shin, and Sung-Ju Lee
IEEE Transactions on Mobile Computing 2021.
PDF

MetaSense: Few-Shot Adaptation to Untrained Conditions in Deep Mobile Sensing
Taesik Gong, Yeonsu Kim, Jinwoo Shin, and Sung-Ju Lee
Proceedings of ACM SenSys 2019.
PDF Slides Code

Towards Condition-Independent Deep Mobile Sensing
Taesik Gong, Yeonsu Kim, Jinwoo Shin, and Sung-Ju Lee
Proceedings of ACM MobiSys 2019 (Poster).
PDF


People


Taesik Gong

KAIST

Yeonsu Kim

KAIST

Ryuhaerang Choi

KAIST

Jinwoo Shin

KAIST

Sung-Ju Lee

KAIST

Awards


Excellence Award | KAIST Undergraduate Research Program Workshop



Dataset


We provide our datasets, Individual-Condition Human Activity Recognition (ICHAR) and Individual-Condition Speech Recognition (ICSR), to foster further research. For more information, please refer to the README.md file in our code repository.

Dataset Link
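
As an illustration of how a meta-trained model might be adapted to a new condition captured in ICHAR or ICSR, the sketch below fine-tunes a copy of the model on a handful of labeled target-user samples. The function name few_shot_adapt, the tensor inputs, and the hyperparameters are assumptions for illustration only; the actual data format and adaptation procedure are documented in the repository's README.md and code.

import copy
import torch
import torch.nn.functional as F

def few_shot_adapt(meta_trained_model, support_x, support_y, steps=5, lr=0.01):
    # Fine-tune a copy of the meta-trained model on the target user's few
    # labeled samples (e.g., one or two per class) and return the adapted model.
    model = copy.deepcopy(meta_trained_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        F.cross_entropy(model(support_x), support_y).backward()
        opt.step()
    return model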