Research
My research lies at the intersection of Information Theory, Probability Theory, and Functional Analysis. For my views on the fundamental connections among these fields, see my thesis. Applications of my results range from Learning Theory (see here, here, here, here, and here) to Probability Theory (see here and here) and Estimation Theory (see here and here).
The work I am most proud of is, at the moment, available only as a conference paper (a longer version is on its way). It encapsulates my view that information measures are simply a bridge between spaces of measures and spaces of functions, and that therein lies their power.
My current interests lie in a deeper exploration of the framework I set up during my PhD, and I am grateful to be able to pursue this with Marco Mondelli.
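To make this bridge concrete, a classical instance (a standard result, stated here purely as an illustration and not as a summary of any one paper below) is the Donsker–Varadhan variational representation of the Kullback–Leibler divergence, which expresses a quantity defined on measures as a supremum over a space of functions:

```latex
D(P \,\|\, Q) = \sup_{f \in \mathcal{F}} \Bigl\{ \mathbb{E}_P[f] - \log \mathbb{E}_Q\bigl[e^{f}\bigr] \Bigr\},
```

where the supremum can be taken, for instance, over all bounded measurable functions f. Analogous variational formulas hold for Rényi and f-divergences, and this is the sense in which information measures connect spaces of measures to spaces of functions.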
Preprints
- "Lower Bounds on the Bayesian Risk via Information Measures", Amedeo Roberto Esposito, Adrien Vandenbroucque, Michael Gastpar, Arxiv Pre-print, LINK
- "Concentration without Independence via Information Measures", Amedeo Roberto Esposito, Marco Mondelli, Arxiv Pre-print, LINK
- "A new approach to adaptive data analysis and learning via maximal leakage", Amedeo Roberto Esposito, Ibrahim Issa, Michael Gastpar, Arxiv Pre-print, LINK
Journals & Conferences
- "Generalization Error Bounds Via Rényi-, f-Divergences and Maximal Leakage", Amedeo Roberto Esposito, Ibrahim Issa, Michael Gastpar, in IEEE Transactions on Information Theory, vol. 67, no. 8, LINK
- "Asymptotically Optimal Generalization Error Bounds for Noisy, Iterative Algorithms", Ibrahim Issa, Amedeo Roberto Esposito, Michael Gastpar, accepted for presentation at the 2023 Conference on Learning Theory (COLT), LINK
- "Concentration without Independence via Information Measures", Amedeo Roberto Esposito, Marco Mondelli, accepted for presentation at the 2023 IEEE International Symposium on Information Theory (ISIT), LINK
- "From Generalisation Error to Transportation-cost Inequalities and Back", Amedeo Roberto Esposito, Michael Gastpar, in 2022 IEEE International Symposium on Information Theory (ISIT), LINK
- "On Sibson’s α-Mutual Information", Amedeo Roberto Esposito, Adrien Vandenbroucque, Michael Gastpar, in 2022 IEEE International Symposium on Information Theory (ISIT), LINK
- "Lower-bounds on the Bayesian Risk in Estimation Procedures via f-Divergences", Adrien Vandenbroucque, Amedeo Roberto Esposito, Michael Gastpar, in 2022 IEEE International Symposium on Information Theory (ISIT), LINK
- "Towards a Standard Testing Data Set in Privacy", Amedeo Roberto Esposito, in BCS Learning & Development, LINK
- "Lower-bounds on the Bayesian Risk in Estimation Procedures via Sibson's α-Mutual Information", Amedeo Roberto Esposito, Michael Gastpar, in 2021 IEEE International Symposium on Information Theory (ISIT), LINK
- "On conditional Sibson's α-Mutual Information", Amedeo Roberto Esposito, Diyuan Wu, Michael Gastpar, in 2021 IEEE International Symposium on Information Theory (ISIT), LINK
- "Robust Generalization via f-Mutual Information", Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa, in 2020 IEEE International Symposium on Information Theory (ISIT), LINK
- "Robust Generalization via α-Mutual Information", Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa, in 2020 International Zurich Seminar on Information and Communication, LINK
- "Learning and adaptive data analysis via maximal leakage", Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa, in 2019 IEEE Information Theory Workshop (ITW), LINK
- "Strengthened information-theoretic bounds on the generalization error", Ibrahim Issa, Amedeo Roberto Esposito, Michael Gastpar, in 2019 IEEE International Symposium on Information Theory (ISIT), LINK