Systematically Deriving Domain-Specific Transformation Languages (1511.05366v1)

Published 17 Nov 2015 in cs.SE

Abstract: Model transformations are helpful to evolve, refactor, refine and maintain models. While domain-specific languages are normally intuitive for modelers, common model transformation approaches (regardless of whether they transform graphical or textual models) are based on the modeling language's abstract syntax, requiring the modeler to learn the internal representation of the model in order to describe transformations. This paper presents a process for systematically deriving a textual domain-specific transformation language from the grammar of a given textual modeling language. As an example, we apply this systematic derivation to UML class diagrams to obtain a domain-specific transformation language called CDTrans. CDTrans incorporates the concrete syntax of class diagrams, which is already familiar to the modeler, and extends it with a few transformation operators. To demonstrate the usefulness of the derived transformation language, we describe several refactoring transformations.
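To make the derivation idea more concrete, the following is a minimal Python sketch of how a modeling-language grammar could be mechanically extended into a transformation grammar. The toy grammar, the `$...` pattern variables, and the `[[ ... :- ... ]]` replacement operator are hypothetical illustrations, not the actual CDTrans syntax or the paper's algorithm.

```python
# Hypothetical sketch: derive a "transformation grammar" from a
# modeling-language grammar by adding (1) pattern variables for every
# nonterminal and (2) a generic replacement operator, so transformation
# rules can be written in the modeling language's concrete syntax.

# Toy grammar for a fragment of textual UML class diagrams:
# nonterminal -> list of alternatives (each a list of symbols).
model_grammar = {
    "ClassDiagram": [["classdiagram", "NAME", "{", "Class*", "}"]],
    "Class":        [["class", "NAME", "{", "Attribute*", "}"]],
    "Attribute":    [["NAME", ":", "TYPE", ";"]],
}

def derive_transformation_grammar(grammar):
    """Return a grammar that accepts the original concrete syntax plus
    pattern variables ($X) and a replacement block ([[ lhs :- rhs ]])."""
    tg = {}
    for nonterminal, alternatives in grammar.items():
        extended = [list(alt) for alt in alternatives]
        # A pattern variable may stand in for any occurrence of this nonterminal.
        extended.append(["$" + nonterminal.upper()])
        # A replacement block rewrites a matched fragment into a new one.
        extended.append(["[[", nonterminal, ":-", nonterminal, "]]"])
        tg[nonterminal + "_Pattern"] = extended
    return tg

if __name__ == "__main__":
    for nt, alts in derive_transformation_grammar(model_grammar).items():
        for alt in alts:
            print(f"{nt} ::= {' '.join(alt)}")
```

Under these assumptions, a refactoring such as renaming an attribute would be written directly in class-diagram notation with a replacement block, rather than against the metamodel's abstract syntax.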

Citations (28)
