
Russian SuperGLUE 1.1: Revising the Lessons not Learned by Russian NLP models (2202.07791v1)

Published 15 Feb 2022 in cs.CL and cs.AI

Abstract: In the last year, new neural architectures and multilingual pre-trained models have been released for Russian, which led to performance evaluation problems across a range of language understanding tasks. This paper presents Russian SuperGLUE 1.1, an updated benchmark styled after GLUE for Russian NLP models. The new version includes a number of technical, user experience and methodological improvements, including fixes of the benchmark vulnerabilities unresolved in the previous version: novel and improved tests for understanding the meaning of a word in context (RUSSE) along with reading comprehension and common sense reasoning (DaNetQA, RuCoS, MuSeRC). Together with the release of the updated datasets, we improve the benchmark toolkit based on the jiant framework for consistent training and evaluation of NLP models of various architectures, which now supports the most recent models for Russian. Finally, we provide the integration of Russian SuperGLUE with a framework for industrial evaluation of the open-source models, MOROCCO (MOdel ResOurCe COmparison), in which the models are evaluated according to the weighted average metric over all tasks, the inference speed, and the occupied amount of RAM. Russian SuperGLUE is publicly available at https://russiansuperglue.com/.
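The MOROCCO-style evaluation described in the abstract combines a weighted average of per-task quality metrics with the model's inference speed and RAM footprint. The following is a minimal sketch of such an aggregation; the task names, uniform weights, reference values, and the multiplicative penalty form are illustrative assumptions, not the benchmark's actual formula.

```python
# Illustrative sketch of a MOROCCO-style aggregate score: a weighted
# average of per-task metrics, scaled down for slow inference or a
# large memory footprint. The penalty form and reference values
# (speed_ref, ram_ref) are hypothetical, chosen only for illustration.

def aggregate_score(task_scores, task_weights, tokens_per_sec, ram_gb,
                    speed_ref=50.0, ram_ref=16.0):
    """Weighted average over tasks, multiplied by speed/RAM factors."""
    total_weight = sum(task_weights[t] for t in task_scores)
    quality = sum(task_scores[t] * task_weights[t]
                  for t in task_scores) / total_weight
    speed_factor = min(1.0, tokens_per_sec / speed_ref)  # slower -> penalized
    ram_factor = min(1.0, ram_ref / ram_gb)              # more RAM -> penalized
    return quality * speed_factor * ram_factor

# Hypothetical per-task scores on four of the updated tasks.
scores = {"RUSSE": 0.72, "DaNetQA": 0.65, "RuCoS": 0.58, "MuSeRC": 0.61}
weights = {"RUSSE": 1.0, "DaNetQA": 1.0, "RuCoS": 1.0, "MuSeRC": 1.0}
print(round(aggregate_score(scores, weights, tokens_per_sec=56, ram_gb=12), 3))
```

With uniform weights and a model fast and small enough to avoid penalties, the score reduces to the plain mean of the task metrics (0.64 here); a slower or larger model would see that quality score scaled down.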

Authors (9)
  1. Alena Fenogenova (17 papers)
  2. Maria Tikhonova (10 papers)
  3. Vladislav Mikhailov (31 papers)
  4. Tatiana Shavrina (18 papers)
  5. Anton Emelyanov (4 papers)
  6. Denis Shevelev (5 papers)
  7. Alexandr Kukushkin (1 paper)
  8. Valentin Malykh (24 papers)
  9. Ekaterina Artemova (53 papers)
Citations (2)
