
This tutorial will bridge the two often separate communities of tensor factorizations and circuit representations, which investigate concepts that are intimately related.

By connecting these fields, we highlight a series of opportunities that can benefit both communities. First, we will draw theoretical as well as practical connections, e.g., in efficient probabilistic inference, reliable neuro-symbolic AI and scalable statistical modeling. Second, we will introduce a modular “Lego block” approach to building tensorized circuit architectures in a unified way. This, in turn, allows us to systematically construct and explore various circuit and tensor factorization models while maintaining tractability. By the end, the audience will have learned about the state of the art in representing, learning and scaling tensor factorizations and circuits.

Outline. The tutorial will start from classical tensor factorizations and extend them to a hierarchical setting, where the connection to circuit representations will be highlighted. Then, we will present several opportunities that arise from bridging the two communities, such as using hierarchical tensor factorizations for neuro-symbolic inference or exploiting algorithms from the tensor network community to learn circuits. Lastly, we will show how many recent algorithms and representations for learning circuits from data can be understood as hierarchical tensor factorizations.
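As a concrete touchstone for the classical factorizations the tutorial starts from, here is a minimal sketch of a rank-R CP (CANDECOMP/PARAFAC) factorization of a 3-way tensor in NumPy. The shapes, rank and variable names below are illustrative choices, not material from the tutorial itself:

```python
import numpy as np

# A rank-R CP factorization represents a 3-way tensor T as a sum of R
# rank-1 terms: T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r].
rng = np.random.default_rng(0)
R = 2  # illustrative rank
A = rng.standard_normal((4, R))
B = rng.standard_normal((5, R))
C = rng.standard_normal((6, R))

# Materialize the full tensor from its factor matrices.
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# The factorization stores (4 + 5 + 6) * R = 30 parameters instead of
# the 4 * 5 * 6 = 120 entries of the dense tensor, and T has CP rank
# at most R by construction.
print(T.shape)                    # (4, 5, 6)
print(A.size + B.size + C.size)   # 30
```

The hierarchical factorizations covered later in the tutorial generalize this flat sum of rank-1 terms by nesting factorizations along a tree, which is where the correspondence to circuit representations emerges.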

Prerequisite knowledge. The tutorial requires basic knowledge of machine learning, probability theory and linear algebra at the level of an introductory course. All other required notions will be provided during the tutorial.

✨ Check out also the Workshop on Connecting Low-Rank Representations in AI at AAAI-25! ✨

News

Materials

Stay tuned!

Speakers


University of Edinburgh

University of Edinburgh

Recommended reading

Last build date: 2024-11-11.