Invited Talk: "Forward and Backward Mappings for Quantum Graphical Models"


    Bio

    Zhengfeng Ji is …

    Abstract

    Graphical models offer a unifying framework for a wide range of statistical learning algorithms and models. Central to these models are the forward and backward mapping problems, which have been studied through both exact and approximate algorithms. This talk explores these mapping problems in the context of quantum graphical models, where quantum states generalize classical probability distributions. The forward mapping problem derives mean parameters from model parameters and is closely linked to approximating the partition function, a typically hard task that often requires heuristics and approximations. We'll discuss quantum belief propagation, which has proven successful for one-dimensional systems, as well as variational methods such as Markov entropy decomposition that approach the problem from an optimization perspective. The backward mapping problem computes model parameters from mean parameters and is related to the Hamiltonian learning problem, a topic of growing interest in quantum information science. We'll review existing algorithms and introduce the quantum iterative scaling (QIS) algorithm, which reduces the backward mapping problem to a series of forward mapping problems. We'll present a convergence proof for QIS and demonstrate its advantages over gradient descent methods. Finally, we'll explore how quasi-Newton methods can enhance both QIS and gradient descent, yielding significant efficiency improvements.
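
    As a rough classical analogue of the forward/backward mapping loop sketched in the abstract, the snippet below fits a tiny log-linear model by iterative scaling, where each backward-mapping iteration calls the forward map once. The model, features, and brute-force forward map are illustrative assumptions for a toy product distribution; this is not the quantum QIS algorithm or quantum belief propagation discussed in the talk.

    import numpy as np

    def forward_map(theta, features):
        """Forward mapping: model parameters -> mean parameters E_theta[phi]."""
        logits = features @ theta            # unnormalized log-probabilities
        p = np.exp(logits - logits.max())
        p /= p.sum()                         # explicit partition function (tractable for this toy state space)
        return features.T @ p                # expected feature values

    def iterative_scaling(mu, features, iters=200):
        """Backward mapping via a simplified iterative-scaling update:
        mean parameters -> model parameters. Each iteration calls the
        forward map once, mirroring (classically) how QIS reduces the
        backward problem to a sequence of forward problems."""
        theta = np.zeros(features.shape[1])
        for _ in range(iters):
            m = forward_map(theta, features)
            theta += np.log(mu / m)          # multiplicative update for 0/1 features
        return theta

    # Toy example: 3 binary variables with single-variable indicator features.
    states = np.array([[int(b) for b in f"{i:03b}"] for i in range(8)], dtype=float)
    target_mu = np.array([0.7, 0.4, 0.55])   # desired marginals (assumed for illustration)
    theta = iterative_scaling(target_mu, states)
    print(forward_map(theta, states))        # approximately equals target_mu

    In this toy product model the fixed point of the update is the logit of each target marginal, so the scaling loop converges; the interest of QIS, as described in the abstract, is that an analogous reduction works for quantum states, where the forward map itself is the hard subproblem.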