
A Simple Geometric Method for Cross-Lingual Linguistic Transformations with Pre-trained Autoencoders (2104.03630v2)

Published 8 Apr 2021 in cs.CL and cs.LG

Abstract: Powerful sentence encoders trained for multiple languages are on the rise. These systems are capable of embedding a wide range of linguistic properties into vector representations. While explicit probing tasks can be used to verify the presence of specific linguistic properties, it is unclear whether the vector representations can be manipulated to indirectly steer such properties. For efficient learning, we investigate the use of a geometric mapping in embedding space to transform linguistic properties, without any tuning of the pre-trained sentence encoder or decoder. We validate our approach on three linguistic properties using a pre-trained multilingual autoencoder and analyze the results in both monolingual and cross-lingual settings.
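The abstract's core idea, steering a linguistic property purely by a geometric operation on frozen sentence embeddings, can be illustrated with a minimal sketch. The simplest such mapping is a mean offset vector between two property classes; this is an assumption for illustration (the paper may learn a more general mapping), and random vectors stand in for real encoder outputs:

```python
import numpy as np

# Hypothetical sketch of a geometric mapping in embedding space:
# steer a linguistic property by translating embeddings along the mean
# offset between two property classes. Encoder and decoder stay frozen;
# random vectors stand in for real sentence embeddings here.
rng = np.random.default_rng(0)
dim = 8

# Embeddings of sentences with the source property (e.g. present tense).
src = rng.normal(size=(100, dim))
# Embeddings of paraphrases with the target property (e.g. past tense),
# simulated as a shifted, slightly noisy copy.
tgt = src + 0.5 + rng.normal(scale=0.05, size=(100, dim))

# The mapping is the mean offset between the two classes.
offset = tgt.mean(axis=0) - src.mean(axis=0)

def transform(embedding: np.ndarray) -> np.ndarray:
    """Steer an embedding toward the target property (no model tuning)."""
    return embedding + offset

moved = transform(src)
# Transformed embeddings should land near the target class on average.
print(np.allclose(moved.mean(axis=0), tgt.mean(axis=0), atol=0.1))  # True
```

In a real pipeline, `transform` would be applied to the encoder's output before decoding, which matches the paper's stated constraint of not tuning the pre-trained autoencoder.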

Authors (5)
  1. Maarten De Raedt (4 papers)
  2. Fréderic Godin (23 papers)
  3. Pieter Buteneers (1 paper)
  4. Chris Develder (59 papers)
  5. Thomas Demeester (76 papers)
Citations (1)