
Proceedings Tenth International Workshop on Graph Computation Models (1912.08966v1)

Published 19 Dec 2019 in cs.LO and cs.FL

Abstract: This volume contains the post-proceedings of the Tenth International Workshop on Graph Computation Models (GCM 2019: http://gcm2019.imag.fr). The workshop was held in Eindhoven, The Netherlands, on July 17th, 2019, as part of STAF 2019 (Software Technologies: Applications and Foundations). Graphs are common mathematical structures that are visual and intuitive. They constitute a natural and seamless way to model systems in science, engineering and beyond, including computer science, biology, business process modelling, etc. Graph computation models constitute a class of very high-level models in which graphs are first-class citizens. The aim of the International GCM Workshop series is to bring together researchers interested in all aspects of computation models based on graphs and graph transformation. It promotes the cross-fertilizing exchange of ideas and experiences among senior and young researchers from the different communities interested in the foundations, applications, and implementations of graph computation models and related areas. These post-proceedings contain four selected papers from the GCM 2019 proceedings and an invited presentation that gives an account of the very successful panel discussion on the Analysis of Graph Transformation Systems, which took place during the workshop and was moderated by Reiko Heckel, Leen Lambers and Maryam Ghaffari Saadat. All submissions were subject to careful refereeing. The topics of the accepted papers include theoretical aspects of graph transformation and parsing techniques, as well as an application to model-driven engineering.
