Convolutional Neural Networks with Compression Complexity Pooling for Out-of-Distribution Image Detection
Sehun Yu, Dongha Lee, Hwanjo Yu
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2435-2441.
https://doi.org/10.24963/ijcai.2020/337
To reliably detect out-of-distribution images based on already deployed convolutional neural networks, several recent studies on out-of-distribution detection have tried to define effective confidence scores without retraining the model. Although they have shown promising results, most of them need to find optimal hyperparameter values using a few out-of-distribution images, which effectively assumes a specific test distribution and makes them less practical for real-world applications. In this work, we propose a novel out-of-distribution detection method, termed MALCOM, which neither uses any out-of-distribution sample nor retrains the model. Motivated by the observation that global average pooling cannot capture the spatial information of feature maps in convolutional neural networks, our method aims to extract informative sequential patterns from the feature maps. To this end, we introduce a similarity metric that focuses on shared patterns between two sequences, based on the normalized compression distance. In short, MALCOM uses both the global average and the spatial patterns of feature maps to accurately identify out-of-distribution images.
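As a rough illustration of the kind of similarity metric the abstract refers to, the sketch below computes the normalized compression distance (NCD) between two byte sequences. It uses zlib as a stand-in compressor, and the way feature maps would be quantized and serialized into sequences is hypothetical, since the abstract does not specify MALCOM's exact encoding.

```python
import zlib


def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two byte sequences.

    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(.) is the compressed length. Values near 0 indicate the
    sequences share many patterns; values near 1 indicate little overlap.
    """
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)


# Hypothetical usage: two feature-map-like sequences, already quantized
# and serialized to bytes (the real encoding in MALCOM may differ).
a = bytes([1, 2, 3, 4] * 16)
b = bytes([1, 2, 3, 5] * 16)
print(ncd(a, b))  # small value, since the sequences share most patterns
```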
Keywords:
Machine Learning: Deep Learning: Convolutional networks
Uncertainty in AI: Other