Abstract

We investigate two closely related successive refinement (SR) coding problems: 1) In the hypothesis testing (HT) problem, the bivariate hypothesis test H_0 : P_{XY} against H_1 : P_X P_Y, i.e., testing against independence, is considered. One remote sensor collects data stream X and sends summary information, constrained by SR coding rates, to a decision center which observes data stream Y directly. 2) In the one-helper (OH) problem, X and Y are encoded separately and the receiver seeks to reconstruct Y losslessly. Multiple levels of coding rates are allowed at the two sensors, and the transmissions are performed in an SR manner. We show that the SR-HT rate-error-exponent region and the SR-OH rate region can be reduced to essentially the same entropy characterization form. Single-letter solutions are thus provided in a unified fashion, and the connection between them is discussed. These problems are also related to the information bottleneck (IB) problem, and through this connection we provide a straightforward operational meaning for the IB method. Connections to the pattern recognition problem, the notion of successive refinability, and two specific sources are also discussed. A strong converse for the SR-HT problem is proved by generalizing the image size characterization method, which shows that the optimal type-two error exponents under constant type-one error constraints are independent of the exact values of those constants.
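
As background for the IB connection mentioned above, a minimal sketch of the classical single-rate special case (the Ahlswede-Csiszar test against independence; the auxiliary variable U and rate R are notation introduced here, not taken from the paper): with one-sided compression of X at rate R, the optimal type-two error exponent is

\[
  \theta(R) \;=\; \max_{P_{U|X}\,:\; I(U;X) \le R} \; I(U;Y),
\]

which is the constrained form of the information bottleneck objective, maximizing the relevance I(U;Y) subject to a compression budget I(U;X) \le R. This correspondence is what makes an operational interpretation of the IB method plausible in the hypothesis testing setting.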
