Infoscience
conference paper

Successive refinement for hypothesis testing and lossless one-helper problem

Tian, Chao • Chen, Jun
2008
IEEE Transactions on Information Theory
IEEE International Symposium on Information Theory

We investigate two closely related successive refinement (SR) coding problems: 1) In the hypothesis testing (HT) problem, the bivariate hypothesis H0 : P_XY against H1 : P_X P_Y, i.e., testing against independence, is considered. One remote sensor collects the data stream X and sends summary information, constrained by SR coding rates, to a decision center which observes the data stream Y directly. 2) In the one-helper (OH) problem, X and Y are encoded separately and the receiver seeks to reconstruct Y losslessly. Multiple levels of coding rates are allowed at the two sensors, and the transmissions are performed in an SR manner. We show that the SR-HT rate-error-exponent region and the SR-OH rate region can be reduced to essentially the same entropy characterization form. Single-letter solutions are thus provided in a unified fashion, and the connection between them is discussed. These problems are also related to the information bottleneck (IB) problem, and through this connection we provide a straightforward operational meaning for the IB method. Connections to the pattern recognition problem, to the notion of successive refinability, and to two specific sources are also discussed. A strong converse for the SR-HT problem is proved by generalizing the image-size characterization method, which shows that the optimal type-two error exponents under constant type-one error constraints are independent of the exact values of those constants.
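As a minimal numerical illustration of the quantities in the abstract: in testing against independence (H0 : P_XY vs. H1 : P_X P_Y), the divergence D(P_XY || P_X P_Y) equals the mutual information I(X;Y), which is the classical benchmark type-two error exponent when the decision center has full access to X. The joint distribution below is a hypothetical toy example, not taken from the paper.

```python
import numpy as np

# Toy joint distribution P_XY over a 2x2 alphabet (hypothetical values,
# chosen only to illustrate the quantities named in the abstract).
P_XY = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

P_X = P_XY.sum(axis=1)  # marginal distribution of X
P_Y = P_XY.sum(axis=0)  # marginal distribution of Y

# D(P_XY || P_X P_Y) = I(X;Y): the divergence between the joint
# distribution (H0) and the product of marginals (H1), in nats.
ind = np.outer(P_X, P_Y)
I_XY = float(np.sum(P_XY * np.log(P_XY / ind)))
print(f"I(X;Y) = D(P_XY || P_X P_Y) = {I_XY:.4f} nats")
```

The rate-constrained settings studied in the paper replace the full observation of X with SR-coded summaries, which in general reduces the achievable exponent below I(X;Y).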

Type
conference paper
DOI
10.1109/TIT.2008.928951
Web of Science ID
WOS:000259407000016
Author(s)
Tian, Chao
Chen, Jun
Date Issued
2008
Published in
IEEE Transactions on Information Theory
Volume
54
Start page
4666
End page
4681

Subjects
entropy characterization • error exponent • hypothesis testing • image size characterization • information bottleneck • one-helper problem • successive refinement • Side Information • Rate-Distortion • Achievable Rates • Error Exponents • Channels • Quantization • Noise

Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
IC
Event name
IEEE International Symposium on Information Theory
Event place
Nice, France
Event date
Jun 24-29, 2007
Available on Infoscience
November 30, 2010
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/61021
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.