Yadav, Anuj Kumar; Shkel, Yanina
2023-11-11 · 2023-08-22
DOI: 10.1109/ISIT54713.2023.10206466
https://infoscience.epfl.ch/handle/20.500.14299/202095

Title: Information Spectrum Converse for Minimum Entropy Couplings and Functional Representations
Type: text::conference output::conference paper not in proceedings

Abstract: Given two jointly distributed random variables (X, Y), a functional representation of X is a random variable Z independent of Y, together with a deterministic function g(·,·) such that X = g(Y, Z). The problem of finding a minimum-entropy functional representation is known to be equivalent to the problem of finding a minimum-entropy coupling: given a collection of probability distributions P1, …, Pm, the goal is to find a coupling X1, …, Xm (with Xi ∼ Pi) of smallest entropy Hα(X1, …, Xm). This paper presents a new information spectrum converse and applies it to obtain direct lower bounds on the minimum entropy in both problems. The new results improve on all known lower bounds, including previous lower bounds based on the concept of majorization. In particular, the presented proofs leverage both the information spectrum and the majorization perspectives on minimum entropy couplings and functional representations.
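
To make the coupling notion in the abstract concrete, here is a minimal illustrative sketch (not from the paper): for two equal Bernoulli(1/2) marginals, the independent coupling has joint Shannon entropy 2 bits, while the diagonal coupling X1 = X2 achieves 1 bit, which is the minimum since H(X1, X2) ≥ H(X1). The distributions and the diagonal construction here are an assumed toy example chosen only to show the entropy gap between couplings with the same marginals.

```python
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two marginal distributions (illustrative choice, not taken from the paper)
P1 = [0.5, 0.5]
P2 = [0.5, 0.5]

# Independent coupling: each joint probability is the product of marginals
independent = [p * q for p in P1 for q in P2]

# Diagonal coupling X1 = X2, valid here because the two marginals are equal;
# it concentrates mass on (0,0) and (1,1) while preserving both marginals
diagonal = [0.5, 0.0, 0.0, 0.5]

print(entropy(independent))  # 2.0 bits
print(entropy(diagonal))     # 1.0 bit -- the minimum, since H(X1,X2) >= H(X1)
```

The paper's lower bounds address exactly this question in general: how small the joint entropy of such a coupling can be made given only the marginals.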