Content-based artwork recommendation: integrating painting metadata with neural and manually-engineered visual features
Keywords: recommender systems, metadata, visual features, artwork, deep neural networks, content-based recommenders, hybrid recommendations
Recommender Systems help us deal with information overload by suggesting relevant items based on our personal preferences. Although there is a large body of research in areas such as movies or music, artwork recommendation has received comparatively little attention, despite the continuous growth of the artwork market. Most previous research has relied on ratings and metadata, and a few recent works have exploited visual features extracted with deep neural networks (DNN) to recommend digital art. In this work, we contribute to the area of content-based artwork recommendation of physical paintings by studying the impact of the aforementioned features (artwork metadata, neural visual features), as well as manually-engineered visual features, such as naturalness, brightness and contrast. We implement and evaluate our method using transactional data from UGallery.com, an online artwork store. Our results show that artwork recommendations based on a hybrid combination of artist preference, curated attributes, deep neural visual features and manually-engineered visual features produce the best performance. Moreover, we discuss the trade-off between automatically obtained DNN features and manually-engineered visual features for the purpose of explainability, as well as the impact of user profile size on predictions. Our research informs the development of next-generation content-based artwork recommenders which rely on different types of data, from text to multimedia.
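The abstract describes a hybrid content-based score combining artist preference, curated attributes, deep neural visual features, and manually-engineered visual features such as brightness and contrast. A minimal sketch of that idea is shown below; the weights, field names, and the simple luminance-based brightness/contrast measures are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of a hybrid content-based artwork scorer.
# All weights, dict keys, and helper definitions are hypothetical,
# chosen only to mirror the four signals named in the abstract.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def brightness_contrast(luminances):
    """Toy manually-engineered features: mean luminance (brightness)
    and its standard deviation (contrast) over an image's pixels."""
    mean = sum(luminances) / len(luminances)
    var = sum((v - mean) ** 2 for v in luminances) / len(luminances)
    return mean, math.sqrt(var)

def hybrid_score(user, item, weights=(0.4, 0.3, 0.2, 0.1)):
    """Weighted blend of the four content signals from the abstract:
    artist preference, curated attributes, DNN visual features, and
    manually-engineered visual features."""
    w_artist, w_attr, w_dnn, w_manual = weights
    artist = 1.0 if item["artist"] in user["liked_artists"] else 0.0
    attrs = cosine(user["attr_profile"], item["attrs"])
    dnn = cosine(user["dnn_profile"], item["dnn_features"])
    manual = cosine(user["manual_profile"], item["manual_features"])
    return w_artist * artist + w_attr * attrs + w_dnn * dnn + w_manual * manual
```

Candidate artworks would then be ranked by `hybrid_score` against a profile aggregated from the user's purchase history; the weight vector itself is something one would tune on held-out transactions.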
|Title according to WOS:||Content-based artwork recommendation: integrating painting metadata with neural and manually-engineered visual features|
|Journal:||USER MODELING AND USER-ADAPTED INTERACTION|
|Publication date:||2019|
|Starting page:||251|