Learning Functors using Gradient Descent

Bruno Gavranović

Neural networks are a general framework for differentiable optimization that includes many other machine learning approaches as special cases. In this paper we build a category-theoretic formalism around the neural network system CycleGAN, a general approach to unpaired image-to-image translation that has received considerable attention in recent years. Inspired by categorical database systems, we show that CycleGAN is a "schema", i.e. a specific category presented by generators and relations, whose particular parameter instantiations are just set-valued functors on this schema. We show that enforcing cycle-consistencies amounts to enforcing composition invariants in this category. We generalize the learning procedure to arbitrary such categories and show that a special class of functors, rather than mere functions, can be learned using gradient descent. Using this framework we design a novel neural network system capable of learning to insert and delete objects from images without paired data. We qualitatively evaluate the system on the CelebA dataset and obtain promising results.
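The cycle-consistency condition mentioned above can be made concrete with a small sketch (not the paper's code; the names `F`, `G`, and `cycle_consistency_loss` are illustrative): two learned mappings F : A → B and G : B → A are trained so that the composite G ∘ F stays close to the identity, which is the composition invariant the abstract refers to.

```python
import numpy as np

def cycle_consistency_loss(F, G, xs):
    """Mean L1 distance between x and G(F(x)) over a batch xs.

    Penalises any deviation of the composite G . F from the identity;
    when G . F = id the loss is exactly zero.
    """
    return float(np.mean([np.abs(G(F(x)) - x).mean() for x in xs]))

# Toy example: F doubles, G halves, so G . F = id and the loss vanishes.
F = lambda x: 2.0 * x
G = lambda x: 0.5 * x
xs = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
print(cycle_consistency_loss(F, G, xs))  # 0.0
```

In the actual system F and G are neural networks and this term is added to the adversarial losses and minimised by gradient descent.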

In John Baez and Bob Coecke: Proceedings Applied Category Theory 2019 (ACT 2019), University of Oxford, UK, 15-19 July 2019, Electronic Proceedings in Theoretical Computer Science 323, pp. 230–245.
This paper is a condensed version of the author's master's thesis (https://arxiv.org/abs/1907.08292).
Published: 15th September 2020.

ArXived at: http://dx.doi.org/10.4204/EPTCS.323.15