Technical Program

Paper Detail

Paper Title: Universal Learning of Individual Data
Paper Identifier: TH4.R1.1
Authors: Yaniv Fogel, Meir Feder, Tel Aviv University, Israel
Session: Learning and Regression
Location: Le Théâtre (Parterre), Level -1
Session Time: Thursday, 11 July, 16:40 - 18:00
Presentation Time: Thursday, 11 July, 16:40 - 17:00
Abstract: Universal learning is considered from an information-theoretic point of view, following the universal prediction approach of [1]. The standard supervised "batch" learning problem is considered, where prediction is made on a test sample after the entire training data has been observed, in the individual setting where the features and labels, in both the training and the test data, are specific individual quantities. Prediction loss is naturally measured by the log-loss. The presented results provide a minimax universal learning scheme, termed Predictive Normalized Maximum Likelihood (pNML), that competes with a "genie" (or reference) that knows the true test label. In addition, a pointwise "learnability" measure associated with the pNML, for the specific training and test data, is provided. This measure may also indicate the performance of the commonly used Empirical Risk Minimizer (ERM) learner.
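The pNML idea sketched in the abstract can be illustrated with a toy example. The sketch below is not the paper's construction, only a minimal illustration of the general recipe under a strong simplifying assumption: the hypothesis class is Bernoulli (labels only, features ignored), so the maximum-likelihood fit is just the label frequency. For each candidate test label, the model is refit with that label appended, the refit model's probability of that label is recorded, and the scores are normalized; the log of the normalizer plays the role of the pointwise learnability (regret) measure.

```python
import math

def pnml_bernoulli(train_labels, candidates=(0, 1)):
    """Toy pNML for a Bernoulli hypothesis class (illustration only).

    For each candidate test label y, the 'genie' refits maximum
    likelihood on train + {y} (here: the label frequency) and scores
    y under that refit model; normalizing the scores gives the pNML
    probability assignment, and log of the normalizer is the
    pointwise learnability measure.
    """
    n = len(train_labels)
    s = sum(train_labels)
    scores = {}
    for y in candidates:
        theta = (s + y) / (n + 1)        # ML estimate with y appended
        scores[y] = theta if y == 1 else 1.0 - theta
    norm = sum(scores.values())          # >= 1: each term is genie-optimal
    q = {y: p / norm for y, p in scores.items()}
    regret = math.log(norm)              # pointwise learnability measure
    return q, regret
```

For training labels `[1, 1, 1, 0]`, the candidate scores are 0.8 (for y = 1) and 0.4 (for y = 0), so the pNML assigns probabilities 2/3 and 1/3, with regret log(1.2). A regret near zero would indicate that the training data pins down the test label's probability almost as well as the genie.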