Negative Mixture Models via Squaring: Representation and Learning

Published in TPM 2023

Authors: Lorenzo Loconte, Stefan Mengel, Nicolas Gillis, Antonio Vergari

Abstract: Negative mixture models (NMMs) can potentially be more expressive than classical non-negative ones by allowing negative coefficients, thus greatly reducing the number of components and parameters to fit. However, modeling NMMs poses a number of challenges, from ensuring that negative combinations still encode valid densities or masses, to effectively learning them from data. In this paper, we investigate how to model both shallow and hierarchical NMMs in a generic framework, via squaring. We do so by representing NMMs as probabilistic circuits (PCs) – structured computational graphs that ensure tractability. Then, we show when and how these squared NMMs can be represented efficiently as tensorized computational graphs, while theoretically proving that, for certain function classes, including negative parameters can exponentially reduce the model size.
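The core trick the abstract refers to — squaring a linear combination whose weights may be negative so that the result is a valid (non-negative, normalized) density — can be sketched numerically. The sketch below is illustrative and not taken from the paper: the Gaussian components, the weights, and the closed-form normalizer (which uses the standard Gaussian product-integral identity) are all assumptions chosen for a minimal 1D example.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Univariate Gaussian density N(x; mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Illustrative two-component mixture with one NEGATIVE weight.
mus = np.array([-1.0, 1.0])
sigmas = np.array([1.0, 1.0])
w = np.array([1.0, -0.6])

def sq_mix_unnorm(x):
    """Unnormalized squared mixture: (sum_i w_i N_i(x))^2 >= 0 by construction."""
    comps = np.stack([gauss(x, m, s) for m, s in zip(mus, sigmas)])
    return (w @ comps) ** 2

# Normalizer in closed form, using
#   integral N(x; mu_i, s_i^2) N(x; mu_j, s_j^2) dx = N(mu_i; mu_j, s_i^2 + s_j^2):
Z = sum(
    w[i] * w[j] * gauss(mus[i], mus[j], np.sqrt(sigmas[i] ** 2 + sigmas[j] ** 2))
    for i in range(len(w)) for j in range(len(w))
)

xs = np.linspace(-10.0, 10.0, 10001)
density = sq_mix_unnorm(xs) / Z

# Despite the negative weight, the squared mixture is a valid density:
# everywhere non-negative and integrating (numerically) to ~1.
print(density.min() >= 0.0)
print(np.sum(density) * (xs[1] - xs[0]))
```

Note that the squared mixture expands into component products, so the normalizer involves all pairwise cross terms — this quadratic blow-up is exactly what the paper's tensorized circuit representation is concerned with keeping efficient.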

Bibtex:
@inproceedings{loconte2023nmm,
  title={Negative Mixture Models via Squaring: Representation and Learning},
  author={Lorenzo Loconte and Stefan Mengel and Nicolas Gillis and Antonio Vergari},
  booktitle={The 6th Workshop on Tractable Probabilistic Modeling},
  year={2023}
}