TY - JOUR
A1 - Mundt, Martin
A1 - Pliushch, Iuliia
A1 - Majumder, Sagnik
A1 - Hong, Yongwon
A1 - Ramesh, Visvanathan
T1 - Unified probabilistic deep continual learning through generative replay and open set recognition
T2 - Journal of Imaging
N2 - Modern deep neural networks are well known to be brittle in the face of unknown data instances and recognition of the latter remains a challenge. Although it is inevitable for continual-learning systems to encounter such unseen concepts, the corresponding literature appears to nonetheless focus primarily on alleviating catastrophic interference with learned representations. In this work, we introduce a probabilistic approach that connects these perspectives based on variational inference in a single deep autoencoder model. Specifically, we propose to bound the approximate posterior by fitting regions of high density on the basis of correctly classified data points. These bounds are shown to serve a dual purpose: unseen unknown out-of-distribution data can be distinguished from already trained known tasks towards robust application. Simultaneously, to retain already acquired knowledge, a generative replay process can be narrowed to strictly in-distribution samples, in order to significantly alleviate catastrophic interference.
KW - catastrophic forgetting
KW - continual deep learning
KW - deep generative models
KW - open-set recognition
KW - variational inference
Y1 - 2022
UR - http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/82823
UR - https://nbn-resolving.org/urn:nbn:de:hebis:30:3-828237
SN - 2313-433X
N1 - Funding: EU H2020 Project AEROBI ; 687384
N1 - Funding: EU H2020 Project RESIST ; 769066
N1 - Funding: BMBF project AISEL ; 01IS19062
N1 - Additional financial support from Goethe University was instrumental in concluding the research.
VL - 8
IS - 4, art. 93
SP - 1
EP - 34
PB - MDPI
CY - Basel
ER -