Pre-training General User Representation with Multi-type APP Behaviors

Yuren Zhang, Min Hou, Kai Zhang, Yuqing Yuan, Chao Song, Zhihao Ye, Enhong Chen, Yang Yu

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 5535-5544. https://doi.org/10.24963/ijcai.2024/612

In numerous user-centric services on mobile applications (apps), accurately mining user interests and generating effective user representations are paramount. Traditional approaches, which often involve training task-specific user representations, are becoming increasingly impractical due to their high computational costs and limited adaptability. This paper introduces a novel solution to this challenge: the Multi-type App-usage Fusion Network (MAFN). MAFN pre-trains universal user representations by leveraging multi-type app behaviors, overcoming key limitations of existing methods. We address two primary challenges: 1) the varying frequency of user behaviors, ranging from low-frequency actions such as (un)installations to high-frequency yet insightful app launches; and 2) the integration of multi-type behaviors into a cohesive representation. Our approach introduces novel pre-training tasks that harness self-supervised signals from diverse app behaviors, capturing both long-term and short-term user interests. MAFN's fusion approach then amalgamates these interests into a unified vector space, yielding a versatile, general-purpose user representation. Combined with a practical deployment workflow, extensive experiments on three typical downstream tasks over real-world datasets verify the effectiveness of our approach.
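To make the core idea concrete, the sketch below illustrates one way separately encoded low-frequency behaviors (e.g., (un)installations, reflecting long-term interests) and high-frequency behaviors (app launches, reflecting short-term interests) could be fused into a single user-representation vector. This is a minimal illustration under assumed choices (shared app embeddings, GRU encoders, a gated fusion, 64-dimensional vectors); it is not the MAFN architecture or its pre-training tasks as described in the paper.

# Minimal sketch (PyTorch) of fusing multi-type app-behavior encodings into one
# user vector. All module names, dimensions, and the gated fusion are
# illustrative assumptions, not the authors' MAFN design.
import torch
import torch.nn as nn

class MultiTypeBehaviorFusion(nn.Module):
    def __init__(self, num_apps: int, dim: int = 64):
        super().__init__()
        self.app_emb = nn.Embedding(num_apps, dim)                # shared app vocabulary
        self.long_term_enc = nn.GRU(dim, dim, batch_first=True)   # low-frequency (un)install sequence
        self.short_term_enc = nn.GRU(dim, dim, batch_first=True)  # high-frequency launch sequence
        self.fusion_gate = nn.Sequential(                         # per-dimension mixing weights
            nn.Linear(2 * dim, dim), nn.Sigmoid()
        )

    def forward(self, install_seq: torch.Tensor, launch_seq: torch.Tensor) -> torch.Tensor:
        _, h_long = self.long_term_enc(self.app_emb(install_seq))   # (1, B, dim)
        _, h_short = self.short_term_enc(self.app_emb(launch_seq))  # (1, B, dim)
        h_long, h_short = h_long.squeeze(0), h_short.squeeze(0)
        gate = self.fusion_gate(torch.cat([h_long, h_short], dim=-1))
        return gate * h_long + (1.0 - gate) * h_short               # unified user representation

# Toy usage: 2 users, install history of length 5, launch history of length 20.
model = MultiTypeBehaviorFusion(num_apps=1000)
installs = torch.randint(0, 1000, (2, 5))
launches = torch.randint(0, 1000, (2, 20))
user_repr = model(installs, launches)  # shape (2, 64), reusable across downstream tasks

In practice the fused vector would be produced by the paper's pre-trained model and then fed, frozen or fine-tuned, to each downstream task head; the gated mixing above merely stands in for the fusion step the abstract describes.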
Keywords:
Machine Learning: ML: Representation learning
Data Mining: DM: Mining heterogeneous data
Machine Learning: ML: Self-supervised Learning
Machine Learning: ML: Applications