
ON THE INEQUALITIES IN INFORMATION THEORY

Pulikkoonattu, Rethnakaran  
2007

In 1948, Claude Elwood Shannon, then of Bell Labs, published one of the groundbreaking papers in the history of engineering [1]. This paper ("A Mathematical Theory of Communication", Bell System Tech. Journal, Vol. 27, July and October 1948, pp. 379-423 and pp. 623-656) laid the groundwork of an entirely new scientific discipline, Information Theory, which enabled engineers for the first time to deal quantitatively with the elusive concept of information. In his celebrated work, Shannon cleanly laid the foundation for the transmission and storage of information. Using a probabilistic model, his theory gave insight into the achievable limits of information transfer over a perturbed medium called a channel. Indeed, the very same concepts are used to predict the limits of data compression and the achievable transmission rate over a probabilistic channel. These underlying concepts can be thought of as inequalities involving measures of probability distributions. Shannon defined several such basic measures in his original work.

The field of Information Theory grew as researchers found more results and insights into the fundamental problems of transmission and storage under probabilistic models. By the nature of the subject, the results obtained are usually inequalities involving basic Shannon measures such as entropies; some are elementary, others rather complicated expressions. Proving further theorems likewise requires checking whether certain expressions are true in an information-theoretic sense. This motivated researchers to seek a formal method for checking all possible inequalities. In 1998, Raymond Yeung [2] came out with a remarkable framework that can verify many of the inequalities in this field; it enables the verification of all inequalities derivable from the basic properties of Shannon's measures.

A central notion of Information Theory is entropy, which Shannon defined as the measure of information itself. Given a set of jointly distributed random variables X_1, X_2, ..., X_n, we can consider the entropies of all single random variables H(X_i), the entropies of all pairs H(X_i, X_j), and so on: 2^n − 1 entropy values in all, one for each nonempty subset of {X_1, X_2, ..., X_n}. Every n-tuple of random variables thus yields a point in R^(2^n − 1) representing the entropies of the given distribution. Following [2], we call a point in R^(2^n − 1) constructible if it represents the entropy values of some collection of n random variables. The set of all constructible points is denoted by Γ*_n. It is hard to characterize Γ*_n for an arbitrary n (for n ≥ 3, it is not even closed [?]). A more feasible (but still highly non-trivial) problem is to describe the closure Γ̄*_n of the set Γ*_n. The set Γ̄*_n is a convex cone [?], and to characterize it we should describe the class of all linear inequalities of the form

    λ_1 H(X_1) + ... + λ_n H(X_n) + λ_{1,2} H(X_1, X_2) + ... + λ_{1,2,3} H(X_1, X_2, X_3) + ... + λ_{1,2,...,n} H(X_1, X_2, ..., X_n) ≥ 0

which hold for any random variables X_1, X_2, ..., X_n (the λ's are real coefficients).

Information inequalities are widely used for proving converse coding theorems in Information Theory. Recently, interesting applications of information inequalities beyond Information Theory have been found [10], [12], [14], so the investigation of the class of all valid information inequalities is an interesting problem. We refer the reader to [15] for a comprehensive treatment of the subject. Yeung's framework thus helped to verify all the Shannon-type inequalities.
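As a concrete illustration of the entropy vector described above, the sketch below computes all 2^n − 1 joint entropies of a given joint distribution, producing one constructible point. This is an illustrative Python sketch, not part of the software discussed below; the function name entropy_vector and the array layout of the pmf are our own choices, assuming numpy is available.

    import itertools
    import numpy as np

    def entropy_vector(p):
        """Joint entropies H(X_S), in bits, for every nonempty subset S.

        p is a joint pmf stored as an n-dimensional numpy array,
        one axis per random variable.
        """
        n = p.ndim
        h = {}
        for r in range(1, n + 1):
            for S in itertools.combinations(range(n), r):
                # Marginalize out the variables not in S, then take the entropy.
                marg = p.sum(axis=tuple(i for i in range(n) if i not in S))
                q = marg[marg > 0]
                h[S] = float(-(q * np.log2(q)).sum())
        return h

    # Example: X1, X2 independent fair bits, X3 = X1 XOR X2 (n = 3).
    p = np.zeros((2, 2, 2))
    for x in (0, 1):
        for y in (0, 1):
            p[x, y, x ^ y] = 0.25
    print(entropy_vector(p))
    # Each singleton has entropy 1, each pair (and the triple) entropy 2:
    # one constructible point in R^7.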
Yeung and Yan also developed software to computationally verify such inequalities. Since that software is rather outdated, we have made an attempt at a more efficient and user-friendly implementation, building on the original work of Yeung. The software, which we call the information inequality solver (iis), is freely available for download from the EPFL website. The new software suite has the added advantage of being free of dependencies on licensed products such as Matlab (or its toolboxes).
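For context, the heart of an ITIP-style verifier such as iis is Yeung's linear-programming test: an inequality is Shannon-type exactly when it is implied by the elemental Shannon inequalities (non-negativity of conditional entropies and of conditional mutual informations). The following is a minimal sketch of that test, not the actual iis implementation; it assumes numpy and scipy are available, and the function names elemental_inequalities and is_shannon_type are our own.

    import itertools
    import numpy as np
    from scipy.optimize import linprog

    def elemental_inequalities(n):
        """Matrix G with G @ h >= 0 encoding the elemental Shannon inequalities.

        h is indexed by nonempty subsets of {0, ..., n-1}: the subset with
        bitmask m occupies coordinate m - 1.
        """
        full = (1 << n) - 1
        rows = []
        # H(X_i | all other variables) >= 0
        for i in range(n):
            row = np.zeros(full)
            row[full - 1] += 1.0
            rest = full & ~(1 << i)
            if rest:
                row[rest - 1] -= 1.0
            rows.append(row)
        # I(X_i ; X_j | X_K) >= 0 for i < j and every K disjoint from {i, j}
        for i, j in itertools.combinations(range(n), 2):
            others = [k for k in range(n) if k not in (i, j)]
            for r in range(len(others) + 1):
                for K in itertools.combinations(others, r):
                    m = sum(1 << k for k in K)
                    row = np.zeros(full)
                    row[(m | (1 << i)) - 1] += 1.0
                    row[(m | (1 << j)) - 1] += 1.0
                    row[(m | (1 << i) | (1 << j)) - 1] -= 1.0
                    if m:
                        row[m - 1] -= 1.0
                    rows.append(row)
        return np.array(rows)

    def is_shannon_type(c, n):
        """Check whether sum_S c[mask(S)-1] * H(X_S) >= 0 is Shannon-type.

        Minimizes c . h over the cone cut out by the elemental inequalities;
        the minimum is 0 iff the inequality holds on the whole cone
        (otherwise the LP is unbounded below).
        """
        G = elemental_inequalities(n)
        # h >= 0 is itself Shannon-type, so these bounds are redundant but
        # keep the LP well-behaved.
        res = linprog(c=c, A_ub=-G, b_ub=np.zeros(len(G)),
                      bounds=[(0, None)] * ((1 << n) - 1), method="highs")
        return res.status == 0 and abs(res.fun) < 1e-9

    # Example: I(X1; X2) >= 0, i.e. H(X1) + H(X2) - H(X1, X2) >= 0 (n = 2).
    print(is_shannon_type([1.0, 1.0, -1.0], 2))   # True
    # The reverse, H(X1, X2) - H(X1) - H(X2) >= 0, is not valid:
    print(is_shannon_type([-1.0, -1.0, 1.0], 2))  # False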

Type
semester or other student projects
Author(s)
Pulikkoonattu, Rethnakaran
Advisors
Perron, Etienne • Diggavi, Suhas
Date Issued
2007
Subjects
xitip • Information theoretic inequalities
URL
http://xitip.epfl.ch/
Written at
EPFL
EPFL units
LICOS
Available on Infoscience
December 29, 2011
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/76187