MetaMT, a MetaLearning Method Leveraging Multiple Domain Data for Low Resource Machine Translation (1912.05467v1)
Published 11 Dec 2019 in cs.CL and cs.LG
Abstract: Manipulating training data leads to robust neural models for MT.
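The truncated abstract gives only this opening claim; the title indicates a meta-learning method that draws on data from several domains so a translation model can adapt to a low-resource one. As a rough illustration of that idea, here is a minimal first-order meta-learning sketch in a Reptile style. This is not the paper's algorithm: the toy model, the synthetic domains, and the update rule are all hypothetical stand-ins chosen only to make the inner-adapt/outer-update pattern concrete.

```python
# Minimal first-order meta-learning (Reptile-style) sketch over multiple
# synthetic "domains". Illustrative only: the paper's actual method is not
# specified in the truncated abstract, so everything here is a stand-in.
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_domain_batch(slope: float, n: int = 32):
    """Toy stand-in for a domain: each domain is a different linear map."""
    x = torch.randn(n, 1)
    return x, slope * x

model = nn.Linear(1, 1)            # stand-in for a full NMT model
meta_lr, inner_lr, inner_steps = 0.1, 0.01, 5
domain_slopes = [0.5, 1.0, 2.0]    # three "high-resource" training domains

for meta_step in range(200):
    # Inner loop: adapt a copy of the model on one sampled domain.
    slope = domain_slopes[meta_step % len(domain_slopes)]
    fast = copy.deepcopy(model)
    opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        x, y = make_domain_batch(slope)
        loss = nn.functional.mse_loss(fast(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Outer update: move the shared initialization toward the adapted weights.
    with torch.no_grad():
        for p, fp in zip(model.parameters(), fast.parameters()):
            p += meta_lr * (fp - p)

# Few-shot adaptation to an unseen "low-resource" domain.
x, y = make_domain_batch(1.5, n=8)   # tiny adaptation set
opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
for _ in range(inner_steps):
    loss = nn.functional.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"post-adaptation loss: {loss.item():.4f}")
```

The outer update nudges the shared initialization toward parameters that adapt well in a few gradient steps, which is the property a low-resource target domain benefits from.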