Amadeus-Verbo Technical Report: The powerful Qwen2.5 family models trained in Portuguese (2506.00019v1)
Published 20 May 2025 in cs.CL and cs.AI
Abstract: This report presents the development of Amadeus Verbo, a family of LLMs for Brazilian Portuguese. To cover diverse use cases, Amadeus Verbo includes base-tuned, merged, and instruction-tuned models with 0.5B, 1.5B, 3B, 7B, 14B, 32B, and 72B parameters. The main objective is to show how straightforward it is to fine-tune foundation models, and thereby democratize open-source development of Brazilian Portuguese LLMs, when data and compute resources are available. All Amadeus-Verbo models are available on Hugging Face at https://huggingface.co/collections/amadeusai/amadeus-verbo-qwen25-67cf2e7aae69ce2b3bcdcfda.
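Since the models are distributed as standard Hugging Face checkpoints, a minimal loading sketch is shown below. The repository ID is a hypothetical example introduced here for illustration; the actual model names are listed in the collection linked above.

```python
# Minimal usage sketch (assumptions noted): load an Amadeus-Verbo checkpoint
# with the Hugging Face transformers library and generate a short completion
# in Brazilian Portuguese. The repo ID below is hypothetical; consult the
# linked collection for the actual model names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "amadeusai/amadeus-verbo-7b-instruct"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explique em poucas palavras o que é um modelo de linguagem."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```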