Towards Flexible and Adaptive Neural Process for Cold-Start Recommendation

Author(s)
Lin, X
Zhou, C
Wu, J
Zou, L
Pan, S
Cao, Y
Wang, B
Wang, S
Yin, D
Griffith University Author(s)
Editor(s)
Date
2023
Abstract

Recommender systems have been widely adopted in online personalized e-commerce applications to improve user experience. A long-standing challenge in recommender systems is how to provide accurate recommendations to users in cold-start situations, where only a few user-item interactions can be observed. Recently, meta-learning methods have offered a promising solution, and most of them follow a parameter-initialization paradigm in which predictions can be quickly adapted via multiple gradient-descent steps. While these meta-learning recommenders improve model performance, how to derive a fundamental paradigm that enables both flexible approximation of complex user-interaction distributions and effective task adaptation of global knowledge remains a critical yet under-explored problem. To this end, we present the Flow-based Adaptive Neural Process (FANP), a new probabilistic meta-learning model in which the estimation of each user's preferences is governed by an underlying stochastic process. Following an encoder-decoder generative framework, FANP is an effective few-shot function estimator that directly maps limited user interactions to a predictive distribution without complicated gradient updates. By introducing a conditional normalizing-flow-based encoder, FANP avoids model bias on the latent variables and thereby derives more flexible variational distributions. Meanwhile, we propose a task-adaptive mechanism that captures the relevance among different tasks to improve the adaptation ability of global knowledge. The learned task-specific and task-relevant representations are simultaneously exploited to generate the decoder parameters via a novel modulation-augmented hypernetwork. FANP is evaluated on both scenario-specific and user-specific cold-start recommendation over various real-world datasets. Extensive experimental results and detailed model analyses demonstrate that our model yields superior performance compared with multiple state-of-the-art meta-learning recommenders.
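To make the abstract's encoder-decoder formulation concrete, the sketch below shows a generic neural-process-style recommender that maps a small context set of observed (item features, rating) pairs to a predictive rating distribution for unseen items in a single forward pass, i.e., without per-task gradient updates. This is an illustrative assumption rather than the authors' FANP implementation: it uses a plain diagonal-Gaussian latent posterior in place of the conditional normalizing-flow encoder, omits the task-adaptive mechanism and the modulation-augmented hypernetwork, and all module names and dimensions are hypothetical.

```python
import torch
import torch.nn as nn


class SimpleNeuralProcess(nn.Module):
    """Neural-process-style few-shot rating predictor (illustrative sketch only)."""

    def __init__(self, x_dim=16, y_dim=1, r_dim=32, z_dim=32):
        super().__init__()
        # Encoder: embed each (features, rating) context pair before aggregation.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, 64), nn.ReLU(), nn.Linear(64, r_dim))
        # Amortized posterior over the latent task variable z. FANP replaces this
        # simple diagonal Gaussian with a conditional normalizing flow.
        self.to_mu = nn.Linear(r_dim, z_dim)
        self.to_logvar = nn.Linear(r_dim, z_dim)
        # Decoder: predicts mean and log-variance of the rating for a target item.
        self.decoder = nn.Sequential(
            nn.Linear(x_dim + z_dim, 64), nn.ReLU(), nn.Linear(64, 2 * y_dim))

    def forward(self, x_context, y_context, x_target):
        # Permutation-invariant aggregation over the few observed interactions.
        r = self.encoder(torch.cat([x_context, y_context], dim=-1)).mean(dim=0)
        mu, logvar = self.to_mu(r), self.to_logvar(r)
        # Reparameterized sample of the task-level latent variable.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        z_rep = z.unsqueeze(0).expand(x_target.size(0), -1)
        out = self.decoder(torch.cat([x_target, z_rep], dim=-1))
        pred_mu, pred_logvar = out.chunk(2, dim=-1)
        return pred_mu, pred_logvar


# Cold-start usage: 5 observed interactions as context, predict 3 target items.
model = SimpleNeuralProcess()
pred_mu, pred_logvar = model(
    torch.randn(5, 16), torch.randn(5, 1), torch.randn(3, 16))
```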

Journal Title

IEEE Transactions on Knowledge and Data Engineering

Conference Title
Book Title
Edition
Volume
Issue
School
Publisher link
Funder(s)
Grant identifier(s)
Rights Statement
Item Access Status
Note

This publication has been entered in Griffith Research Online as an advance online version.

Access the data
Related item(s)
Subject

Neural networks

Deep learning

Information and computing sciences

Persistent link to this record
Citation

Lin, X; Zhou, C; Wu, J; Zou, L; Pan, S; Cao, Y; Wang, B; Wang, S; Yin, D, Towards Flexible and Adaptive Neural Process for Cold-Start Recommendation, IEEE Transactions on Knowledge and Data Engineering, 2023

Collections