- Wednesday 25 May 2022, 12:00 - 13:00
- Theil Building
We propose a new method for learning product complementarity patterns in shopping baskets, inspired by Google’s Bidirectional Encoder Representations from Transformers (BERT) for natural language processing.
We reformulate BERT’s masked learning task in a marketing context and learn to accurately identify missing products from a real-life grocery shopping basket based on the other products purchased in that same basket.
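To make the masked-item idea concrete, here is a minimal illustrative sketch: one product is hidden from a basket and a candidate is recovered from the remaining items. This toy version scores candidates by simple co-occurrence counts rather than a transformer, and all product names are hypothetical; it is not the BaskERT model itself, only the shape of the prediction task.

```python
from collections import defaultdict

# Toy baskets (hypothetical products; not the paper's data).
baskets = [
    ["pasta", "tomato_sauce", "parmesan"],
    ["pasta", "tomato_sauce", "basil"],
    ["bread", "butter", "jam"],
    ["bread", "butter", "cheese"],
]

# Count how often each pair of products appears in the same basket.
cooc = defaultdict(int)
for basket in baskets:
    for i in basket:
        for j in basket:
            if i != j:
                cooc[(i, j)] += 1

def predict_masked(context, vocab):
    """Score each candidate product by its co-occurrence with the
    unmasked products, mimicking the masked-item objective."""
    return max(vocab, key=lambda c: sum(cooc[(c, p)] for p in context))

vocab = sorted({p for b in baskets for p in b})
# Mask "tomato_sauce" out of a pasta basket and try to recover it
# from the two products that remain visible.
print(predict_masked(["pasta", "parmesan"], vocab))  # -> tomato_sauce
```

In the actual model, the co-occurrence scorer is replaced by a transformer trained end-to-end, which lets predictions depend on the whole basket jointly rather than on pairwise counts.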
The resulting model, which we call BaskERT, can be used by retailers for personalized product recommendations and for analyzing product complementarity patterns across the assortment. BaskERT outperforms several state-of-the-art benchmarks on a basket completion task. Different procedures for sampling the missing product affect the variety of recommendations the model returns, favoring either more popular or less popular products.
The model scales easily to large assortments with thousands of products. Because it only requires basket data from the current shopping trip, it is applicable in many settings, including when personal or past purchase data is unavailable, for example due to privacy regulations.