
Information-Theoretic Methods in Data Science

Miguel R. D. Rodrigues, University College London
Yonina C. Eldar , Weizmann Institute of Science, Israel
June 2021
In stock
Hardback
9781108427135
$106.00 USD
Also available as an eBook

    Learn about the state-of-the-art at the interface between information theory and data science with this first unified treatment of the subject. Written by leading experts in a clear, tutorial style, and using consistent notation and definitions throughout, it shows how information-theoretic methods are being used in data acquisition, data representation, data analysis, and statistics and machine learning. Coverage is broad, with chapters on signal acquisition, data compression, compressive sensing, data communication, representation learning, emerging topics in statistics, and much more. Each chapter includes a topic overview, definition of the key problems, emerging and open problems, and an extensive reference list, allowing readers to develop in-depth knowledge and understanding. Providing a thorough survey of the current research area and cutting-edge trends, this is essential reading for graduate students and researchers working in information theory, signal processing, machine learning, and statistics.

    • The first book covering the interface between information theory and data science
    • Provides a tutorial approach to the subject, with each chapter including a topic overview, definition of the key problems, emerging and open problems, and an extensive reference list
    • Uses consistent notation and definitions throughout

    Product details

    June 2021
    Hardback
    9781108427135
    560 pages
    250 × 176 × 34 mm
    1.1kg
    74 b/w illus.
    Available

    Table of Contents

    • 1. Introduction Miguel Rodrigues, Stark Draper, Waheed Bajwa and Yonina Eldar
    • 2. An information theoretic approach to analog-to-digital compression Alon Kipnis, Yonina Eldar and Andrea Goldsmith
    • 3. Compressed sensing via compression codes Shirin Jalali and Vincent Poor
    • 4. Information-theoretic bounds on sketching Mert Pilanci
    • 5. Sample complexity bounds for dictionary learning from vector- and tensor-valued data Zahra Shakeri, Anand Sarwate and Waheed Bajwa
    • 6. Uncertainty relations and sparse signal recovery Erwin Riegler and Helmut Bölcskei
    • 7. Understanding phase transitions via mutual information and MMSE Galen Reeves and Henry Pfister
    • 8. Computing choice: learning distributions over permutations Devavrat Shah
    • 9. Universal clustering Ravi Raman and Lav Varshney
    • 10. Information-theoretic stability and generalization Maxim Raginsky, Alexander Rakhlin and Aolin Xu
    • 11. Information bottleneck and representation learning Pablo Piantanida and Leonardo Rey Vega
    • 12. Fundamental limits in model selection for modern data analysis Jie Ding, Yuhong Yang and Vahid Tarokh
    • 13. Statistical problems with planted structures: information-theoretical and computational limits Yihong Wu and Jiaming Xu
    • 14. Distributed statistical inference with compressed data Wenwen Zhao and Lifeng Lai
    • 15. Network functional compression Soheil Feizi and Muriel Médard
    • 16. An introductory guide to Fano's inequality with applications in statistical estimation Jonathan Scarlett and Volkan Cevher.
      Contributors
    • Miguel Rodrigues, Stark Draper, Waheed Bajwa, Yonina Eldar, Alon Kipnis, Andrea Goldsmith, Shirin Jalali, Vincent Poor, Mert Pilanci, Zahra Shakeri, Anand Sarwate, Erwin Riegler, Helmut Bölcskei, Galen Reeves, Henry Pfister, Devavrat Shah, Ravi Raman, Lav Varshney, Maxim Raginsky, Alexander Rakhlin, Aolin Xu, Pablo Piantanida, Leonardo Rey Vega, Jie Ding, Yuhong Yang, Vahid Tarokh, Yihong Wu, Jiaming Xu, Wenwen Zhao, Lifeng Lai, Soheil Feizi, Muriel Médard, Jonathan Scarlett, Volkan Cevher

    • Editors
    • Miguel R. D. Rodrigues

      Miguel R. D. Rodrigues is a Reader in Information Theory and Processing in the Department of Electronic and Electrical Engineering, University College London, and a Faculty Fellow at the Alan Turing Institute, London.

    • Yonina C. Eldar

      Yonina C. Eldar is a Professor in the Faculty of Mathematics and Computer Science at the Weizmann Institute of Science, a Fellow of the IEEE and EURASIP, and a member of the Israel Academy of Sciences and Humanities. She is the author of Sampling Theory (Cambridge, 2015) and co-editor of Convex Optimization in Signal Processing and Communications (Cambridge, 2009) and Compressed Sensing (Cambridge, 2012).
