Can Large Language Models be Used to Provide Psychological Counselling? An Analysis of GPT-4-Generated Responses Using Role-play Dialogues (2402.12738v1)
Abstract: Mental health care poses an increasingly serious challenge to modern societies. In this context, research that uses information technology to address mental health problems has surged, including work on counseling dialogue systems. However, the performance of counseling dialogue systems built on large language models (LLMs) has not yet been evaluated sufficiently. For this study, we collected counseling dialogue data through role-play scenarios involving expert counselors and annotated the utterances with the counselors' intentions. To assess the feasibility of such a dialogue system in real-world counseling, third-party counselors rated the appropriateness of responses from human counselors and of responses generated by GPT-4 in identical contexts drawn from the role-play dialogues. Analysis of the evaluation results showed that the GPT-4-generated responses were competitive with those of the human counselors.
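The abstract describes generating GPT-4 responses in the same dialogue contexts in which the human counselors responded, so that both can be rated by third-party counselors. Below is a minimal sketch of that setup, assuming the OpenAI Chat Completions API; the system prompt, model name, and the mapping of role-play speakers onto chat roles are illustrative assumptions, not the authors' actual configuration.

```python
# Sketch: produce the next counselor utterance for a given role-play context,
# mirroring the "identical context" comparison described in the abstract.
# The prompt wording and speaker-to-role mapping are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def generate_counselor_response(dialogue_context: list[dict[str, str]]) -> str:
    """dialogue_context: list of {"speaker": "client" | "counselor", "utterance": str}."""
    messages = [
        {
            "role": "system",
            "content": (
                "You are an empathetic professional counselor. "
                "Reply to the client with a single, appropriate counselor utterance."
            ),
        }
    ]
    # Map the role-play transcript onto chat roles: client -> user, counselor -> assistant.
    for turn in dialogue_context:
        role = "assistant" if turn["speaker"] == "counselor" else "user"
        messages.append({"role": role, "content": turn["utterance"]})

    response = client.chat.completions.create(model="gpt-4", messages=messages)
    return response.choices[0].message.content


# Example: generate a GPT-4 response at the same point where the human counselor
# responded, so both responses can be rated against an identical context.
context = [
    {"speaker": "client", "utterance": "I haven't been able to sleep and I feel anxious at work."},
]
print(generate_counselor_response(context))
```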
- Michimasa Inaba
- Mariko Ukiyo
- Keiko Takamizo