TY - GEN
T1 - On-Device Learning for Model Personalization with Large-Scale Cloud-Coordinated Domain Adaption
AU - Yan, Yikai
AU - Niu, Chaoyue
AU - Gu, Renjie
AU - Wu, Fan
AU - Tang, Shaojie
AU - Hua, Lifeng
AU - Lyu, Chengfei
AU - Chen, Guihai
N1 - Publisher Copyright: © 2022 ACM.
PY - 2022/8/14
Y1 - 2022/8/14
N2 - Cloud-based learning is currently the mainstream in both academia and industry. However, the global data distribution used for training a global model, as a mixture of all users' data distributions, may deviate from each user's local distribution at inference time, making the global model non-optimal for individual users. To mitigate this distribution discrepancy, on-device training over local data for model personalization is a potential solution, but it suffers from serious overfitting. In this work, we propose a new device-cloud collaborative learning framework under the paradigm of domain adaptation, called MPDA, to break the dilemmas of purely cloud-based learning and on-device training. From the perspective of an individual user, the general idea of MPDA is to retrieve similar data from the cloud's global pool, which functions as large-scale source domains, to augment the user's local data as the target domain. The key criterion for choosing outside data is whether a model trained over these data generalizes well over the local data. We theoretically show that MPDA can reduce distribution discrepancy and overfitting risk. We extensively evaluate MPDA on the public MovieLens 20M and Amazon Electronics datasets, as well as an industrial dataset collected from Mobile Taobao over a period of 30 days. We further build a device-tunnel-cloud system pipeline, deploy MPDA in the icon area of Mobile Taobao for click-through rate prediction, and conduct online A/B testing. Both offline and online results demonstrate that MPDA outperforms the baselines of cloud-based learning and of on-device training only over local data on multiple metrics.
KW - device-cloud collaborative learning
KW - domain adaptation
KW - model personalization
KW - recommender systems
UR - https://www.scopus.com/pages/publications/85137145445
U2 - 10.1145/3534678.3539263
DO - 10.1145/3534678.3539263
M3 - Conference contribution
T3 - Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
SP - 2180
EP - 2190
BT - KDD 2022 - Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
PB - Association for Computing Machinery
T2 - 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2022
Y2 - 14 August 2022 through 18 August 2022
ER -