Transformation of Turing Machines into Context-Dependent Fusion Grammars

Aaron Lye

Context-dependent fusion grammars were recently introduced as devices for the generation of hypergraph languages. In this paper, we show that this new type of hypergraph grammar, in which the application of fusion rules is restricted by positive and negative context conditions, is a universal computation model. Our main result is that Turing machines can be transformed into these grammars such that the recognized language of the Turing machine and the generated language of the corresponding context-dependent fusion grammar coincide up to the representation of strings as graphs. As a corollary, we obtain that context-dependent fusion grammars can generate all recursively enumerable string languages.
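
For the phrase "up to the representation of strings as graphs": in the hypergraph-grammar literature a string is commonly encoded as a string graph, i.e. a simple path whose consecutive edges carry the symbols of the string. The following is a minimal sketch of that standard convention; the exact encoding used in the paper (e.g. whether hyperedges with designated begin/end nodes are used) may differ in detail:

\[
sg(a_1 a_2 \cdots a_n) \;=\; v_0 \xrightarrow{\,a_1\,} v_1 \xrightarrow{\,a_2\,} \cdots \xrightarrow{\,a_n\,} v_n,
\qquad
sg(L) \;=\; \{\, sg(w) \mid w \in L \,\}.
\]

Under such an encoding, the main result states that for every Turing machine M there is a context-dependent fusion grammar whose generated graph language equals the set of string graphs of the words accepted by M.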

In Rachid Echahed and Detlef Plump: Proceedings Tenth International Workshop on Graph Computation Models (GCM 2019), Eindhoven, The Netherlands, 17th July 2019, Electronic Proceedings in Theoretical Computer Science 309, pp. 53–70.
Published: 20th December 2019.

ArXived at: https://dx.doi.org/10.4204/EPTCS.309.3