Generative Job Recommendations with Large Language Model (2307.02157v1)

Published 5 Jul 2023 in cs.IR and cs.CL

Abstract: The rapid development of online recruitment services has encouraged the utilization of recommender systems to streamline the job seeking process. Predominantly, current job recommendations deploy either collaborative filtering or person-job matching strategies. However, these models tend to operate as "black-box" systems and lack the capacity to offer explainable guidance to job seekers. Moreover, conventional matching-based recommendation methods are limited to retrieving and ranking existing jobs in the database, restricting their potential as comprehensive career AI advisors. To this end, here we present GIRL (GeneratIve job Recommendation based on LLMs), a novel approach inspired by recent advancements in the field of LLMs. We initially employ a Supervised Fine-Tuning (SFT) strategy to instruct the LLM-based generator in crafting suitable Job Descriptions (JDs) based on the Curriculum Vitae (CV) of a job seeker. Moreover, we propose to train a model which can evaluate the matching degree between CVs and JDs as a reward model, and we use a Proximal Policy Optimization (PPO)-based Reinforcement Learning (RL) method to further fine-tune the generator. This aligns the generator with recruiter feedback, tailoring the output to better meet employer preferences. In particular, GIRL serves as a job seeker-centric generative model, providing job suggestions without the need for a candidate set. This capability also enhances the performance of existing job recommendation models by supplementing job seeking features with generated content. With extensive experiments on a large-scale real-world dataset, we demonstrate the substantial effectiveness of our approach. We believe that GIRL introduces a paradigm-shifting approach to job recommendation systems, fostering a more personalized and comprehensive job-seeking experience.

The paper "Generative Job Recommendations with Large Language Model" introduces a novel approach to job recommendation using generative LLMs. Traditional job recommendation methods rely predominantly on collaborative filtering or deterministic person-job matching paradigms, which are limited by their black-box nature and focus on ranking existing jobs rather than generating novel job opportunities.

Key Contributions:

  1. Generative Approach: The authors propose a generative job recommendation method named GIRL (GeneratIve job Recommendation based on LLMs), which utilizes LLMs to create personalized job descriptions (JDs) directly from job seekers' curriculum vitae (CVs). This generative model is designed to provide job seekers with tailored job opportunities that align closely with their skills and aspirations, potentially filling gaps that traditional methods leave open.
  2. Training Methodology: The core of GIRL's approach is a three-step training process (illustrated by the code sketches after this list):
    • Supervised Fine-Tuning (SFT): The LLM generator is initially trained to generate appropriate JDs based on existing CV-JD pairs, leveraging a dataset of historical matches.
    • Reward Model Training (RMT): A reward model is trained to predict matching scores between CVs and JDs, mimicking recruiter feedback. This is done using a dataset of both matched and mismatched CV-JD pairs.
    • Reinforcement Learning (RL): Proximal Policy Optimization (PPO) is employed to refine the JD generator further. The LLM is trained to align its outputs with the reward model's predictions, incorporating recruiter preferences into the generative process.
  3. Augmenting Discriminative Models: Beyond direct job generation, GIRL also strengthens traditional job recommendation systems: features derived from the generated JDs supplement the input of discriminative matching models, improving matching accuracy.
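
To make the first two stages concrete, here is a minimal sketch assuming a HuggingFace-style causal LM as the generator and a BERT-style cross-encoder with a scalar head as the reward model. The checkpoint names, prompt template, and the pointwise matched/mismatched objective are illustrative assumptions, not the paper's exact setup.

```python
# Sketch of the SFT and reward-model stages (hypothetical names, not the authors' code).
# Assumes PyTorch + HuggingFace transformers; "gpt2"/"bert-base-uncased" are small stand-ins.
import torch
from torch import nn
from transformers import AutoModel, AutoModelForCausalLM, AutoTokenizer

gen_tok = AutoTokenizer.from_pretrained("gpt2")
generator = AutoModelForCausalLM.from_pretrained("gpt2")

def sft_loss(cv_text: str, jd_text: str) -> torch.Tensor:
    """Supervised fine-tuning: next-token loss on 'CV prompt -> JD' sequences."""
    prompt = f"Given the following CV, write a suitable job description.\nCV: {cv_text}\nJD:"
    enc = gen_tok(prompt + " " + jd_text, return_tensors="pt", truncation=True)
    # Labels equal the input ids: the model learns to continue the CV prompt with the JD.
    return generator(**enc, labels=enc["input_ids"]).loss

class RewardModel(nn.Module):
    """Cross-encoder that scores how well a JD matches a CV (a proxy for recruiter feedback)."""
    def __init__(self, encoder_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.tok = AutoTokenizer.from_pretrained(encoder_name)
        self.score_head = nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, cv_text: str, jd_text: str) -> torch.Tensor:
        enc = self.tok(cv_text, jd_text, return_tensors="pt", truncation=True)
        h = self.encoder(**enc).last_hidden_state[:, 0]   # [CLS] representation of the pair
        return self.score_head(h).squeeze(-1)             # scalar matching score

def reward_loss(rm: RewardModel, cv: str, jd: str, matched: bool) -> torch.Tensor:
    """Pointwise objective: matched CV-JD pairs get label 1, mismatched pairs get label 0."""
    target = torch.tensor([1.0 if matched else 0.0])
    return nn.functional.binary_cross_entropy_with_logits(rm(cv, jd), target)
```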

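The PPO stage can then be sketched with the `trl` library: the SFT generator is wrapped with a value head, a JD is sampled for each CV prompt, the reward model scores the resulting CV-JD pair, and a PPO step pushes the generator toward higher-scoring outputs. The snippet follows the quickstart pattern of older `trl` releases (roughly 0.11 and earlier; the API has since changed) and reuses the hypothetical `RewardModel` from the previous sketch; it illustrates the general recipe, not the authors' implementation.

```python
# PPO fine-tuning sketch (hypothetical names; API follows older trl releases, e.g. <= 0.11).
import torch
from transformers import AutoTokenizer
from trl import AutoModelForCausalLMWithValueHead, PPOConfig, PPOTrainer
from trl.core import respond_to_batch

sft_ckpt = "gpt2"                                        # stand-in; in practice the SFT checkpoint
model = AutoModelForCausalLMWithValueHead.from_pretrained(sft_ckpt)
ref_model = AutoModelForCausalLMWithValueHead.from_pretrained(sft_ckpt)   # frozen KL reference
tokenizer = AutoTokenizer.from_pretrained(sft_ckpt)
tokenizer.pad_token = tokenizer.eos_token

ppo_trainer = PPOTrainer(PPOConfig(batch_size=1, mini_batch_size=1), model, ref_model, tokenizer)
reward_model = RewardModel()                              # from the previous sketch
reward_model.eval()

cv_text = "..."                                           # one job seeker's CV
prompt = f"Given the following CV, write a suitable job description.\nCV: {cv_text}\nJD:"
query_tensor = tokenizer.encode(prompt, return_tensors="pt")

# Sample a JD from the current policy and score it with the reward model.
response_tensor = respond_to_batch(model, query_tensor, txt_len=128)
generated_jd = tokenizer.decode(response_tensor[0], skip_special_tokens=True)
with torch.no_grad():
    reward = reward_model(cv_text, generated_jd).squeeze()

# One PPO update: nudge the generator toward JDs the recruiter-feedback proxy prefers.
stats = ppo_trainer.step([query_tensor[0]], [response_tensor[0]], [reward])
```
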
Experimental Insights:

  • Generative Quality: Through extensive experiments, including evaluations by external judges such as ChatGPT, the generated job descriptions are shown to be of high quality, clarity, and relevance, significantly outperforming general-purpose LLM baselines without domain-specific fine-tuning.
  • Enhanced Recommendation Performance: When integrated into traditional recommendation systems, the generated JDs improve predictive performance (higher AUC, lower LogLoss), demonstrating GIRL's ability to strengthen conventional approaches, especially in cold-start scenarios with new job seekers.
  • Different Predictor Approaches: The paper compares different predictors, including MLP-based and dot-product-based methods, for incorporating generated JDs into traditional models; both show performance gains when using the generation-enhanced job seeker representations (see the sketch below).
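
As an illustration of how generated JDs can augment a discriminative matcher, the sketch below concatenates a CV embedding with the generated-JD embedding to form an enhanced job-seeker representation and scores it against a candidate job's embedding with either a dot product or an MLP head, trained with the binary cross-entropy (LogLoss) objective the paper evaluates. The module layout and dimensions are assumptions for illustration, not the paper's exact architecture.

```python
# Generation-enhanced match predictor sketch (assumed architecture, not the authors' exact model).
import torch
from torch import nn

class MatchPredictor(nn.Module):
    def __init__(self, text_dim: int = 768, hidden: int = 256, mode: str = "mlp"):
        super().__init__()
        self.mode = mode
        # Seeker side: CV embedding concatenated with the generated-JD embedding.
        self.seeker_proj = nn.Linear(2 * text_dim, hidden)
        # Job side: embedding of the real JD being ranked.
        self.job_proj = nn.Linear(text_dim, hidden)
        self.mlp_head = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, cv_emb, gen_jd_emb, job_emb):
        seeker = self.seeker_proj(torch.cat([cv_emb, gen_jd_emb], dim=-1))
        job = self.job_proj(job_emb)
        if self.mode == "dot":
            logit = (seeker * job).sum(dim=-1, keepdim=True)         # dot-product predictor
        else:
            logit = self.mlp_head(torch.cat([seeker, job], dim=-1))  # MLP predictor
        return logit.squeeze(-1)                                     # trained with BCE (LogLoss)

# Usage with placeholder embeddings (batch of 4, 768-d text features).
cv, gen_jd, job = (torch.randn(4, 768) for _ in range(3))
model = MatchPredictor(mode="dot")
loss = nn.functional.binary_cross_entropy_with_logits(model(cv, gen_jd, job), torch.ones(4))
```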

The proposed framework, GIRL, represents a shift towards more personalized and comprehensive job-seeking assistance by exploiting the generative capabilities of LLMs, marking a substantial enhancement over existing deterministic and discriminative approaches in job recommendations.

Authors (6)
  1. Zhi Zheng (46 papers)
  2. Zhaopeng Qiu (13 papers)
  3. Xiao Hu (151 papers)
  4. Likang Wu (25 papers)
  5. Hengshu Zhu (66 papers)
  6. Hui Xiong (244 papers)
Citations (16)