EE Seminar: Recent information-theoretic contributions to statistical inference
Registration for the seminar will take place at the start of the seminar by scanning the barcode into Moodle (please log in to Moodle in advance via a browser, not the app)
(The talk will be given in English)
Speaker: Prof. Sergio Verdú
U.S. National Academy of Sciences and National Academy of Engineering
Hall 011, Electrical Engineering (Kitot) Building
Monday, December 8th, 2025, 13:00 - 14:00
Recent information-theoretic contributions to statistical inference
Abstract
Information measures, such as relative entropy (Kullback-Leibler divergence), Chernoff information, and f-divergences, have a long track record of contributing to the analysis of the fundamental limits of hypothesis testing and to necessary and sufficient conditions for a statistic of the data to be sufficient.
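For reference, writing P and Q for two probability measures and f for a convex function with f(1) = 0 (symbols introduced here for illustration), these measures take their standard forms

    D(P \| Q) = \mathbb{E}_P\!\left[ \log \frac{dP}{dQ} \right], \qquad
    D_f(P \| Q) = \mathbb{E}_Q\!\left[ f\!\left( \frac{dP}{dQ} \right) \right].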
This talk gives an account of several recent information-theoretic contributions to hypothesis testing and to the theory of sufficient statistics, obtained through the analysis of the relative information spectrum, i.e., the distribution of the log-likelihood ratio.
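In the usual information-spectrum formulation (with P and Q as above and the notation F_{P\|Q} introduced here for concreteness), the relative information spectrum is the cumulative distribution function of the log-likelihood ratio evaluated at a sample from P:

    F_{P\|Q}(x) = P\!\left[ \log \frac{dP}{dQ}(X) \le x \right], \qquad X \sim P.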
We introduce a new measure of discrepancy between probability measures, the NP-divergence, and show that it gives, non-asymptotically, the fundamental limit on the largest area of the region of conditional error probabilities achieved by the optimal Neyman-Pearson tests.
We also introduce a new, easy-to-check criterion for sufficient statistics and explore its relationship to the criteria introduced by Fisher, Blackwell, and Kolmogorov.
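For context, Fisher's classical criterion is captured by the Fisher-Neyman factorization theorem: under mild regularity conditions, a statistic T is sufficient for a family of densities {p_\theta} if and only if the densities factor as

    p_\theta(x) = g_\theta\big(T(x)\big)\, h(x)

for some nonnegative functions g_\theta and h, where h does not depend on \theta.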
Short Bio
Sergio Verdú is a member of the U.S. National Academy of Sciences and the National Academy of Engineering. He is also the recipient of the 2007 Claude E. Shannon Award.

