Near-Optimal Noisy Group Testing via Separate Decoding of Items

In this paper, we revisit an efficient algorithm for noisy group testing in which each item is decoded separately (Malyutov and Mateev, 1980), and develop novel performance guarantees via an information-theoretic framework for general noise models. For the noiseless and symmetric noise models, we find that the asymptotic number of tests required for vanishing error probability is within a factor log 2 ≈ 0.7 of the information-theoretic optimum at low sparsity levels, and that when a small fraction of incorrectly-decoded items is allowed, this guarantee extends to all sublinear sparsity levels. In many scaling regimes, these are the best known theoretical guarantees for any noisy group testing algorithm.
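The idea of separate decoding is that each item's status is decided on its own, by comparing that item's test-participation pattern against the observed outcomes, rather than searching jointly over subsets of items. A minimal sketch of this idea for the noiseless OR model with a Bernoulli test design is below; the per-item log-likelihood-ratio statistic and all parameter choices here are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k, p = 500, 5, 0.2            # items, defectives, Bernoulli design probability
T = 200                          # number of tests

defective = rng.choice(n, size=k, replace=False)
X = rng.random((T, n)) < p       # test matrix: X[t, i] = item i included in test t
y = X[:, defective].any(axis=1)  # noiseless OR outcomes

# Probabilities under the Bernoulli(p) design (assumed known for illustration):
q_other = 1 - (1 - p) ** (k - 1)  # P(test positive due to the other k-1 defectives)
q_all = 1 - (1 - p) ** k          # marginal P(test positive)

def llr(i):
    """Per-item statistic: log P(y | X[:, i], item i defective) - log P(y),
    summed over tests. Each item is scored independently of all others."""
    s = 0.0
    for t in range(T):
        if X[t, i]:
            # If item i is defective and in the test, the outcome is surely positive.
            cond = 1.0 if y[t] else 0.0
        else:
            cond = q_other if y[t] else 1 - q_other
        marg = q_all if y[t] else 1 - q_all
        if cond == 0.0:
            return -np.inf  # outcome impossible under "item i defective"
        s += np.log(cond / marg)
    return s

scores = np.array([llr(i) for i in range(n)])
decoded = set(np.argsort(scores)[-k:].tolist())  # k highest-scoring items
print(decoded == set(defective.tolist()))
```

Because each of the n statistics is computed independently over T tests, the decoder runs in O(nT) time, which is the source of its efficiency relative to joint (subset-search) decoding.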


Published in:
Proceedings of the 2018 IEEE International Symposium on Information Theory (ISIT), 2311-2315
Presented at:
IEEE International Symposium on Information Theory (ISIT), Colorado, USA, June 17-22, 2018
Year:
2018
Publisher:
IEEE
ISBN:
978-1-5386-4781-3
 Record created 2018-05-14, last modified 2019-08-12
