Generative Pretraining at Scale: Transformer-Based Encoding of Transactional Behavior for Fraud Detection
Abstract: We introduce an autoregressive model based on Generative Pretrained Transformer (GPT) architectures, tailored to fraud detection in payment systems. Our approach addresses token explosion and reconstructs behavioral sequences, yielding a nuanced understanding of transactional behavior through temporal and contextual analysis. Through unsupervised pretraining, the model learns rich feature representations without labeled data. We further integrate a differential convolutional approach to strengthen anomaly detection, improving the security and efficacy of one of the largest online payment merchants in China. The model's scalability and adaptability promise broad applicability across transactional contexts.
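The abstract does not include implementation details, but the core idea (unsupervised, autoregressive pretraining over tokenized transaction events with a causal Transformer) can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration, not the authors' code: the model name `TinyTransactionGPT`, the vocabulary size, the hyperparameters, and the random stand-in batch are all hypothetical.

```python
import torch
import torch.nn as nn

class TinyTransactionGPT(nn.Module):
    """Minimal causal Transformer over tokenized transaction events (illustrative sketch)."""
    def __init__(self, vocab_size, d_model=128, n_heads=4, n_layers=2, max_len=256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)   # embeds discretized transaction fields
        self.pos_emb = nn.Embedding(max_len, d_model)      # embeds position within the behavior sequence
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True, norm_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer ids for tokenized transaction events
        b, t = tokens.shape
        pos = torch.arange(t, device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        # causal mask so each event attends only to earlier events in the sequence
        mask = nn.Transformer.generate_square_subsequent_mask(t).to(tokens.device)
        h = self.encoder(x, mask=mask)
        return self.lm_head(h)

# Unsupervised pretraining: predict the next transaction token, so no fraud labels are needed.
vocab_size = 1000                                   # assumed size of the shared field vocabulary
model = TinyTransactionGPT(vocab_size)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

batch = torch.randint(0, vocab_size, (8, 64))       # stand-in for tokenized behavior sequences
logits = model(batch[:, :-1])                       # predict token t+1 from tokens <= t
loss = loss_fn(logits.reshape(-1, vocab_size), batch[:, 1:].reshape(-1))
loss.backward()
optimizer.step()
```

In practice, the pretrained representations (e.g., the final hidden states) would feed a downstream anomaly-scoring head; the paper's differential convolutional component is not reproduced here.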