A Multimodal User-Adaptive Recommender System
Abstract
Traditional recommendation systems have predominantly relied on user-provided ratings as explicit input. Meanwhile, visually aware recommender systems harness visual cues inherent in the data to decode item characteristics and deduce user preferences. However, the potential of incorporating item images into the recommendation process remains largely untapped and warrants investigation. This paper introduces an original convolutional neural network (CNN) architecture that leverages multimodal information, connecting user ratings with product images to enhance item recommendations. A central innovation of the proposed model is the User-Adaptive Filtering Module, a dynamic component that uses user profiles to generate personalized filters. A visual influence analysis demonstrates the effectiveness of these filters. Furthermore, experimental results show that the approach is competitive with traditional collaborative filtering methods, offering a promising avenue for personalized recommendations. By capitalizing on user adaptation patterns, the approach deepens the understanding of user preferences and visual attributes.
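The idea of a user-adaptive filtering module can be illustrated with a minimal sketch: a small "hyper-network" maps a user profile embedding to a personalized convolution kernel, which is then applied to an item's image feature map. All names, dimensions, and the linear generator below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions for illustration only).
USER_DIM = 8            # size of a user profile embedding
KERNEL = 3              # spatial size of each generated filter
IN_CH, OUT_CH = 4, 2    # channels of the item's visual feature map

# A linear "hyper-network": maps a user embedding to flattened filter weights.
W = rng.standard_normal((USER_DIM, OUT_CH * IN_CH * KERNEL * KERNEL)) * 0.1

def personalized_filters(user_embedding):
    """Generate per-user convolution filters from a user profile vector."""
    flat = user_embedding @ W
    return flat.reshape(OUT_CH, IN_CH, KERNEL, KERNEL)

def conv2d_valid(feature_map, filters):
    """Naive 'valid' 2-D cross-correlation over all input channels."""
    in_ch, h, w = feature_map.shape
    out_ch, _, k, _ = filters.shape
    out = np.zeros((out_ch, h - k + 1, w - k + 1))
    for o in range(out_ch):
        for i in range(h - k + 1):
            for j in range(w - k + 1):
                patch = feature_map[:, i:i + k, j:j + k]
                out[o, i, j] = np.sum(patch * filters[o])
    return out

user = rng.standard_normal(USER_DIM)              # one user's profile embedding
image_feats = rng.standard_normal((IN_CH, 8, 8))  # item image feature map
filters = personalized_filters(user)              # filters specific to this user
response = conv2d_valid(image_feats, filters)     # per-user visual response map
```

Because the filters depend on the user embedding, two different users convolve the same item image with different kernels, so the visual response, and hence the recommendation signal, is personalized.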
More information
Title according to WOS: A Multimodal User-Adaptive Recommender System
Journal title: ELECTRONICS
Volume: 12
Issue: 17
Publisher: MDPI
Publication date: 2023
DOI: 10.3390/electronics12173709
Notes: ISI