Technical Program

Paper Detail

Paper Title: Towards a non-stochastic information theory
Paper Identifier: TU2.R9.1
Authors: Anshuka Rangi, Massimo Franceschetti, University of California, San Diego, United States
Session: Channel Models
Location: Pontoise, Level 5
Session Time: Tuesday, 09 July, 11:40 - 13:00
Presentation Time: Tuesday, 09 July, 11:40 - 12:00
Abstract: The $\delta$-mutual information between uncertain variables is introduced as a generalization of Nair's non-stochastic information functional. Several properties of this new quantity are illustrated, and used to prove a channel coding theorem in a non-stochastic setting. Namely, it is shown that the largest $\delta$-mutual information between a metric space and its $\epsilon$-packing equals the $(\epsilon, \delta)$-capacity of the space. This notion of capacity generalizes the Kolmogorov $\epsilon$-capacity to packing sets of overlap at most $\delta$, and is a variation of a previous definition proposed by one of the authors. These results provide a framework for developing a non-stochastic information theory motivated by potential applications in control and learning theories. Compared to previous non-stochastic approaches, the theory admits the possibility of decoding errors as in Shannon's probabilistic setting, while retaining its worst-case non-stochastic character.
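To make the packing notion in the abstract concrete: the classical Kolmogorov $\epsilon$-capacity of a space is the logarithm of the size of its largest $\epsilon$-packing, i.e. the largest set of points whose pairwise distances all exceed $\epsilon$. The sketch below (not from the paper; the function name, the point set, and the greedy strategy are illustrative assumptions) builds such a packing greedily on a finite set of reals under the absolute-difference metric; the greedy count is only a lower bound on the maximal packing size.

```python
import math

def epsilon_packing(points, dist, eps):
    """Greedily select a subset whose pairwise distances all exceed eps
    (a standard epsilon-packing, i.e. an epsilon-distinguishable set)."""
    packing = []
    for p in points:
        if all(dist(p, q) > eps for q in packing):
            packing.append(p)
    return packing

# Illustrative example: points on a line, absolute-difference metric.
pts = [0.0, 0.3, 0.7, 1.0, 1.6, 2.0]
pack = epsilon_packing(pts, lambda a, b: abs(a - b), 0.5)
# The log of the *largest* packing size is the Kolmogorov eps-capacity;
# the greedy packing gives a lower bound on that maximum.
capacity_lb = math.log2(len(pack))
print(pack, capacity_lb)  # -> [0.0, 0.7, 1.6] with log2(3) bits
```

The $(\epsilon, \delta)$-capacity of the abstract relaxes this by allowing the packing sets to overlap by at most $\delta$, which is what admits decoding errors while keeping the worst-case character.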