Automatic Composition of Guitar Tabs by Transformers and Groove Modeling (2008.01431v1)

Published 4 Aug 2020 in cs.SD and eess.AS

Abstract: Deep learning algorithms are increasingly developed for learning to compose music in the form of MIDI files. However, whether such algorithms work well for composing guitar tabs, which are quite different from MIDIs, remains relatively unexplored. To address this, we build a model for composing fingerstyle guitar tabs with Transformer-XL, a neural sequence model architecture. With this model, we investigate the following research questions. First, whether the neural net generates note sequences with meaningful note-string combinations, which is important for the guitar but not for other instruments such as the piano. Second, whether it generates compositions with a coherent rhythmic groove, crucial for fingerstyle guitar music. And, finally, how pleasant the composed music is in comparison to real, human-made compositions. Our work provides preliminary empirical evidence of the promise of deep learning for tab composition, and suggests areas for future study.
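As a rough illustration of how such a sequence model might consume guitar-tab data, the sketch below encodes tab notes as (string, fret, beat-position) tokens and samples the next event from a small autoregressive Transformer. The token design and architecture here are assumptions for demonstration only; the paper itself uses Transformer-XL and its own event representation.

```python
# Illustrative sketch only: a toy tab-event vocabulary and a minimal causal
# Transformer standing in for Transformer-XL. The (string, fret, position)
# token scheme is an assumption, not the authors' exact representation.
import torch
import torch.nn as nn

N_STRINGS, N_FRETS, N_POSITIONS = 6, 20, 16   # assumed vocabulary sizes

def encode_note(string, fret, position):
    """Flatten a (string, fret, 16th-note position) triple into one token id."""
    return (string * N_FRETS + fret) * N_POSITIONS + position

class TinyTabModel(nn.Module):
    """Minimal causal Transformer over tab-event tokens."""
    def __init__(self, vocab_size, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        seq_len = tokens.size(1)
        # Causal mask so each position only attends to earlier tab events.
        mask = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        h = self.encoder(self.embed(tokens), mask=mask)
        return self.head(h)  # next-token logits per position

vocab_size = N_STRINGS * N_FRETS * N_POSITIONS
model = TinyTabModel(vocab_size)
seed = torch.tensor([[encode_note(5, 3, 0), encode_note(2, 0, 4)]])  # two seed notes
logits = model(seed)
next_token = torch.multinomial(torch.softmax(logits[0, -1], dim=-1), 1)
print("sampled next tab event id:", next_token.item())
```

A model trained this way could be evaluated along the paper's research questions, e.g. whether sampled (string, fret) pairs are physically playable and whether the beat positions form a coherent groove.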

Authors (4)
  1. Yu-Hua Chen (14 papers)
  2. Yu-Hsiang Huang (4 papers)
  3. Wen-Yi Hsiao (11 papers)
  4. Yi-Hsuan Yang (89 papers)
Citations (29)
