
Exploring Semi-supervised Hierarchical Stacked Encoder for Legal Judgement Prediction (2311.08103v1)

Published 14 Nov 2023 in cs.CL, cs.AI, and cs.IR

Abstract: Predicting the judgment of a legal case from its unannotated case facts is a challenging task. The lengthy and non-uniform structure of legal documents makes extracting information for decision prediction even harder. In this work, we explore and propose a two-level classification mechanism with both supervised and unsupervised components: a domain-specific pre-trained BERT extracts sentence embeddings from long documents, these embeddings are further processed by a transformer encoder layer, and unsupervised clustering extracts hidden labels from them to better predict the judgment of a legal case. Our experiments with this mechanism show higher performance gains than previously proposed methods on the ILDC dataset. The results also demonstrate the importance of domain-specific pre-training of transformer encoders for legal information processing.
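The abstract describes a hierarchical flow: chunk a long case document, embed sentences with a domain-specific BERT, aggregate with a transformer encoder layer, and cluster the resulting embeddings into hidden labels. The toy sketch below illustrates only that data flow; every component is a hypothetical stand-in (a seeded pseudo-random "embedder", mean pooling instead of a learned encoder, a naive k-means), not the authors' implementation.

```python
# Toy sketch of the two-level pipeline described in the abstract.
# All components are illustrative stand-ins: no real BERT, no learned
# encoder; function names (embed_sentence, hidden_labels, ...) are assumptions.
import math
import random

DIM = 8  # toy embedding dimension

def embed_sentence(sentence):
    """Stand-in for a domain-specific pre-trained BERT sentence embedder:
    a deterministic pseudo-random vector seeded by the sentence text."""
    rng = random.Random(sentence)  # same sentence -> same vector
    return [rng.uniform(-1.0, 1.0) for _ in range(DIM)]

def chunk_document(text, max_sentences=3):
    """Split a long case document into fixed-size sentence chunks,
    the usual workaround for BERT's input-length limit."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    return [sentences[i:i + max_sentences]
            for i in range(0, len(sentences), max_sentences)]

def encode_chunks(chunks):
    """Mean-pool sentence embeddings per chunk (a crude stand-in for
    the transformer encoder layer over sentence embeddings)."""
    encoded = []
    for chunk in chunks:
        vecs = [embed_sentence(s) for s in chunk]
        encoded.append([sum(v[i] for v in vecs) / len(vecs) for i in range(DIM)])
    return encoded

def _dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def hidden_labels(points, k=2, iters=10):
    """Unsupervised step: naive k-means assigning each chunk a
    'hidden label', mirroring the clustering stage in the abstract."""
    centroids = [list(p) for p in points[:k]]
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(len(centroids)),
                      key=lambda c: _dist(p, centroids[c])) for p in points]
        for c in range(len(centroids)):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = [sum(m[i] for m in members) / len(members)
                                for i in range(DIM)]
    return labels

# Invented example text, for illustration only.
doc = ("The appellant was convicted under section 302. "
       "The trial court relied on circumstantial evidence. "
       "The defence argued the chain of evidence was broken. "
       "The high court upheld the conviction. "
       "Witness testimony was found inconsistent. "
       "The sentence was commuted on appeal.")

chunks = chunk_document(doc)       # level 1: chunked long document
encodings = encode_chunks(chunks)  # level 2: one vector per chunk
labels = hidden_labels(encodings)  # unsupervised hidden labels per chunk
```

In the paper's actual setup these hidden labels and chunk encodings feed a supervised classifier that predicts the final judgment; that last step is omitted here since the abstract does not specify its form.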

Authors (3)
  1. Nishchal Prasad (3 papers)
  2. Mohand Boughanem (4 papers)
  3. Taoufiq Dkaki (2 papers)
Citations (1)