GCOF: Self-iterative Text Generation for Copywriting Using Large Language Model (2402.13667v1)
Abstract: Large language models (LLMs) such as ChatGPT have substantially simplified the generation of marketing copy, yet producing content that satisfies domain-specific requirements, such as effectively engaging customers, remains a significant challenge. In this work, we introduce the Genetic Copy Optimization Framework (GCOF), designed to improve both the efficiency and the engagement of marketing copy creation. We conduct explicit feature engineering within the prompts of the LLM. Additionally, we modify the crossover operator of the Genetic Algorithm (GA) and integrate it into GCOF to enable automatic feature engineering, facilitating self-iterative refinement of the marketing copy. Online results indicate that copy produced by our framework achieves an average click-through rate (CTR) increase of over $50\%$ compared to human-curated copy.
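The abstract describes a GA-style loop in which copy features are recombined via a modified crossover operator and scored for engagement. The sketch below illustrates the general shape of such a loop under stated assumptions: the `evolve`, `crossover`, and `fitness` names are hypothetical (not from the paper), features are plain keyword lists rather than LLM prompt components, and the fitness function stands in for the CTR-oriented scoring that GCOF would perform.

```python
import random


def crossover(parent_a, parent_b, rng):
    """Recombine two parent feature sets (a toy stand-in for GCOF's
    modified GA crossover, which the paper applies inside LLM prompts)."""
    pool = list(dict.fromkeys(parent_a + parent_b))  # dedupe, keep order
    k = max(1, (len(parent_a) + len(parent_b)) // 2)
    return rng.sample(pool, min(k, len(pool)))


def evolve(population, fitness, generations=10, seed=0):
    """Keep the fittest feature sets each generation and refill the
    population with crossover children; return the best individual."""
    rng = random.Random(seed)
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: max(2, len(ranked) // 2)]  # elitist selection
        children = [
            crossover(*rng.sample(parents, 2), rng)
            for _ in range(len(population) - len(parents))
        ]
        population = parents + children
    return max(population, key=fitness)


# Usage with a toy fitness that rewards overlap with "engaging" keywords;
# in GCOF this role would be played by a CTR predictor or LLM judge.
pop = [["discount"], ["free", "shipping"], ["limited", "offer"], ["now"]]
target = {"free", "shipping", "limited", "offer"}
best = evolve(pop, lambda feats: len(set(feats) & target), generations=5, seed=1)
```

Because the parents survive each generation unchanged, the best fitness in the population never decreases, which mirrors the self-iterative refinement the abstract claims.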