A Multiarmed Bandit Approach for House Ads Recommendations

Schiappacasse, Mario; Goic, Marcel

Abstract

Nowadays, websites use a variety of recommendation systems to decide what content to display to their visitors. In this work, we use a multiarmed bandit approach to dynamically select the combination of house ads to exhibit to a heterogeneous set of customers visiting the website of a large retailer. House ads correspond to promotional information displayed on the website to highlight specific products and are an important marketing tool for online retailers. Because the number of clicks they receive depends not only on their own attractiveness but also on how attractive the other products displayed around them are, we decide on complete collections of ads that capture those interactions. Moreover, as ads can wear out, our recommendations allow for nonstationary rewards. Furthermore, given the sparsity of customer-level information, we embed a deep neural network within a bandit scheme to provide personalized recommendations. We tested our methods in controlled experiments, comparing them against decisions made by an experienced team of managers and against the recommendations of a variety of other bandit policies. Our results show a more active exploration of the decision space and a significant increase in click-through and add-to-cart rates.
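To make the nonstationary-bandit idea concrete, the following is a minimal illustrative sketch of one standard policy for this setting: Thompson sampling with exponential discounting, where each arm is a complete collection of house ads and discounting lets the policy forget old clicks as ads wear out. This is a generic textbook policy for exposition, not the paper's actual method (which embeds a deep neural network for personalization); all names and the discount parameter here are assumptions.

```python
import random


class DiscountedThompsonBandit:
    """Illustrative Thompson sampling with exponential discounting.

    Each arm represents one complete collection of house ads, so
    interactions among ads shown together are captured at the
    collection level. Discounting past clicks handles ad wear-out
    (nonstationary rewards).
    """

    def __init__(self, n_arms, gamma=0.99):
        self.gamma = gamma                # discount factor: older evidence counts less
        self.successes = [0.0] * n_arms   # discounted click counts per arm
        self.failures = [0.0] * n_arms    # discounted no-click counts per arm

    def select_arm(self):
        # Sample a plausible click-through rate from each arm's Beta
        # posterior and play the arm with the highest sample.
        samples = [random.betavariate(s + 1.0, f + 1.0)
                   for s, f in zip(self.successes, self.failures)]
        return max(range(len(samples)), key=samples.__getitem__)

    def update(self, arm, clicked):
        # Decay all past evidence, then record the new observation.
        for a in range(len(self.successes)):
            self.successes[a] *= self.gamma
            self.failures[a] *= self.gamma
        if clicked:
            self.successes[arm] += 1.0
        else:
            self.failures[arm] += 1.0
```

In a simulation with two ad collections of differing true click-through rates, the policy quickly concentrates traffic on the better collection while still exploring occasionally, which is the behavior the abstract contrasts with static managerial decisions.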

More information

Title according to WOS: A Multiarmed Bandit Approach for House Ads Recommendations
Journal title: MARKETING SCIENCE
Volume: 42
Issue: 2
Publisher: INFORMS
Publication date: 2023
Start page: 271
End page: 292
DOI: 10.1287/mksc.2022.1378

Notes: ISI