2022-11-28-ji22a.md

File metadata and controls

52 lines (52 loc) · 1.82 KB
---
title: Test Sample Accuracy Scales with Training Sample Density in Neural Networks
abstract: Intuitively, one would expect accuracy of a trained neural network’s prediction on test samples to correlate with how densely the samples are surrounded by seen training samples in representation space. We find that a bound on empirical training error smoothed across linear activation regions scales inversely with training sample density in representation space. Empirically, we verify this bound is a strong predictor of the inaccuracy of the network’s prediction on test samples. For unseen test sets, including those with out-of-distribution samples, ranking test samples by their local region’s error bound and discarding samples with the highest bounds raises prediction accuracy by up to 20% in absolute terms for image classification datasets, on average over thresholds.
video:
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: ji22a
month: 0
tex_title: Test Sample Accuracy Scales with Training Sample Density in Neural Networks
firstpage: 629
lastpage: 646
page: 629-646
order: 629
cycles: false
bibtex_author: Ji, Xu and Pascanu, Razvan and Hjelm, R. Devon and Lakshminarayanan, Balaji and Vedaldi, Andrea
author:
- given: Xu
  family: Ji
- given: Razvan
  family: Pascanu
- given: R. Devon
  family: Hjelm
- given: Balaji
  family: Lakshminarayanan
- given: Andrea
  family: Vedaldi
date: 2022-11-28
address:
container-title: Proceedings of The 1st Conference on Lifelong Learning Agents
volume: '199'
genre: inproceedings
issued:
  date-parts:
  - 2022
  - 11
  - 28
pdf:
extras:
---