Improving conversion rate prediction via self-supervised pre-training in online advertising

2023 IEEE International Conference on Big Data (BigData) (2024)

Abstract
The task of predicting conversion rates (CVR) lies at the heart of online advertising systems, which optimize bids to meet advertiser performance requirements. Even with the recent rise of deep neural networks, these predictions are often made by factorization machines (FM), especially in commercial settings where inference latency is key. These models are trained under the logistic regression framework on labeled tabular data formed from past user activity relevant to the task at hand. Many advertisers only care about click-attributed conversions. A major challenge in training models that predict conversions given clicks comes from data sparsity: clicks are rare, and conversions attributed to clicks are rarer still. However, mitigating sparsity by adding conversions that are not click-attributed to the training set impairs model calibration, and since calibration is critical to achieving advertiser goals, this approach is infeasible. In this work we apply the well-known idea of self-supervised pre-training, using an auxiliary auto-encoder model trained on all conversion events, both click-attributed and not, as a feature extractor to enrich the main CVR prediction model. Since the main model does not train on non-click-attributed conversions, calibration is not impaired. We adapt the basic self-supervised pre-training idea to our online advertising setup by using a loss function designed for tabular data, facilitating continual learning by ensuring auto-encoder stability, and incorporating a neural network into a large-scale real-time ad auction that ranks tens of thousands of ads, under strict latency constraints and without incurring a major engineering cost. We show improvements both offline, during training, and in an online A/B test. Following its success in A/B tests, our solution is now fully deployed to the Yahoo native advertising system.
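The two-stage scheme described above can be sketched in a toy form: pre-train an auto-encoder on all conversion events, then freeze it and feed its latent vector into a CVR model trained only on click-attributed data. The dimensions, learning rates, the linear auto-encoder, and the logistic-regression stand-in for the production FM are all illustrative assumptions, not the paper's actual architecture or tabular loss.

```python
import numpy as np

# Hypothetical toy sketch, NOT the paper's implementation:
# Stage 1 pre-trains on ALL conversions; Stage 2 trains the CVR model
# only on click-attributed conversions, so calibration is untouched.
rng = np.random.default_rng(0)
n_all, n_clk, d, k = 1000, 200, 8, 3       # all events, clicked subset, feature dim, latent dim
X_all = rng.normal(size=(n_all, d))        # stand-in tabular conversion features
X_clk = X_all[:n_clk]                      # click-attributed subset
y_clk = (rng.random(n_clk) < 0.3).astype(float)  # toy conversion-given-click labels

# Stage 1: self-supervised pre-training (squared reconstruction loss here).
W_enc = rng.normal(scale=0.1, size=(d, k))
W_dec = rng.normal(scale=0.1, size=(k, d))
lr, loss_history = 0.05, []
for _ in range(500):
    Z = X_all @ W_enc                      # latent representation
    err = Z @ W_dec - X_all                # reconstruction error
    loss_history.append(0.5 * (err ** 2).mean())
    grad_dec = Z.T @ err / n_all           # compute both grads before updating
    grad_enc = X_all.T @ (err @ W_dec.T) / n_all
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

# Stage 2: the frozen encoder enriches the main CVR model
# (logistic regression stands in for the production FM).
F = np.hstack([X_clk, X_clk @ W_enc])      # raw features + frozen latent vector
w, b = np.zeros(F.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    g = p - y_clk                          # log-loss gradient
    w -= 0.1 * F.T @ g / n_clk
    b -= 0.1 * g.mean()
```

Because the encoder is frozen before Stage 2, the latent features act purely as extra inputs, and the main model's log-loss training on click-attributed data keeps its mean prediction anchored to the observed conversion rate.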
Keywords
Advertising, Online Advertising, Neural Network, Training Set, Deep Neural Network, Main Model, Sparse Data, Auction, Strict Constraints, Autoencoder Model, Latency Constraints, Positive Bias, Multilayer Perceptron, Online Learning, Latent Space, Prediction Task, Random Data, Embedding Dimension, Latent Representation, Kernel Methods, Latent Vector, Log Loss, Worth Of Data, Self-supervised Learning, Embedding Vectors, Reconstruction Loss, Test Split