Posts by Collection

publications

Strengthened information-theoretic bounds on the generalization error

Ibrahim Issa, Amedeo Roberto Esposito, Michael Gastpar

Published in 2019 IEEE International Symposium on Information Theory (ISIT)

Download here

Learning and adaptive data analysis via maximal leakage

Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa

Published in 2019 IEEE Information Theory Workshop (ITW)

Download here

Robust Generalization via α-Mutual Information

Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa

Published in 2020 International Zurich Seminar on Information and Communication (IZS)

Download here

Robust Generalization via f-Mutual Information

Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa

Published in 2020 IEEE International Symposium on Information Theory (ISIT)

Download here

From Generalisation Error to Transportation-cost Inequalities and Back

Amedeo Roberto Esposito, Michael Gastpar

Published in 2022 IEEE International Symposium on Information Theory (ISIT)

Download here

Lower-bounds on the Bayesian Risk in Estimation Procedures via f-Divergences

Adrien Vandenbroucque, Amedeo Roberto Esposito, Michael Gastpar

Published in 2022 IEEE International Symposium on Information Theory (ISIT)

Download here

On Sibson’s α-Mutual Information

Amedeo Roberto Esposito, Adrien Vandenbroucque, Michael Gastpar

Published in 2022 IEEE International Symposium on Information Theory (ISIT)

Download here

Towards a Standard Testing Data Set in Privacy

Amedeo Roberto Esposito

Published in BCS Learning & Development

Download here

On conditional Sibson’s α-Mutual Information

Amedeo Roberto Esposito, Diyuan Wu, Michael Gastpar

Published in 2021 IEEE International Symposium on Information Theory (ISIT)

Download here

Lower-bounds on the Bayesian Risk in Estimation Procedures via Sibson’s α-Mutual Information

Amedeo Roberto Esposito, Michael Gastpar

Published in 2021 IEEE International Symposium on Information Theory (ISIT)

Download here

Generalization Error Bounds Via Rényi-, f-Divergences and Maximal Leakage

Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa

Published in IEEE Transactions on Information Theory, vol. 67, no. 8

Download here

teaching

COM-512, Networks out of control

Graduate course, EPFL, 2018

CS-119, Information, Computation, Communication

Undergraduate course, Life sciences, EPFL, 2018

COM-417, Advanced Probability and Applications

Graduate course, EPFL, 2019

COM-406, Foundations of Data Science

Graduate course, EPFL, 2019

EL-205, Signals and systems

Undergraduate course, EPFL, 2020

CS-119, Information, Computation, Communication

Undergraduate course, Life sciences, EPFL, 2020

COM-102, Advanced information, computation, communication II

Undergraduate course, EPFL, 2021

I was the Head Teaching Assistant for Advanced information, computation, communication II (AICC2) for two years in a row, in spring 2021 and spring 2022. AICC2 is a large undergraduate class (~300 students) that comes with meaningful organizational challenges. Its purpose is to expose first-year bachelor students to applied probability theory, information theory, coding theory, and cryptography. I was in charge of leading the team of teaching assistants (2 PhD students and 10–12 bachelor/master students) and of coordinating the homework, the exam, etc.

Information Theory (for Data Science)

Ph.D. course, ISTA, 2023

I am excited to announce that I am co-teaching, along with Marco Mondelli, the Ph.D. class on Information Theory (for Data Science). The class covers a variety of classical and more modern results with an information-theoretic flavor. We will talk about information measures, compression, prediction, estimation, large deviations (Sanov-style), multi-armed bandits, exponential families, exploration bias, and generalization error.