Search Results
On Syntactic Forgetting Under Uniform Equivalence
Forgetting in Answer Set Programming (ASP) aims at reducing the language of a logic program without affecting the consequences over the remaining...
Meta-Learning with Less Forgetting on Large-Scale Non-Stationary Task Distributions
The paradigm of machine intelligence is moving from purely supervised learning to a more practical scenario in which many loosely related unlabeled data are...
Reducing Catastrophic Forgetting in Neural Networks via Gaussian Mixture Approximation
Our paper studies continual learning (CL) problems in which data arrives in sequence and the trained models are expected to be capable of utilizing...
Knowledge Learning Without Forgetting for the Detection of Alzheimer’s Disease
Alzheimer’s disease (AD) is an extremely damaging, slow-progressing neurological disease that causes tremendous inconvenience to patients’ lives...
On Robustness of Generative Representations Against Catastrophic Forgetting
Catastrophic forgetting of previously learned knowledge while learning new tasks is a widely observed limitation of contemporary neural networks...
‘Right to Be Forgotten’: Analyzing the Impact of Forgetting Data Using K-NN Algorithm in Data Stream Learning
New international regulations concerning the management of personal data guarantee the ‘Right to Be Forgotten’. One might request to have their data erased...
Utilizing incremental branches on a one-stage object detection framework to avoid catastrophic forgetting
The tremendous success of deep learning on object detection tasks compels researchers to adopt deep learning models for autonomous driving vehicles....
Studying Catastrophic Forgetting in Neural Ranking Models
Several deep neural ranking models have been proposed in the recent IR literature. While their transferability to one target domain held by a dataset...
State Primitive Learning to Overcome Catastrophic Forgetting in Robotics
People can continuously learn a wide range of tasks without catastrophic forgetting. To mimic this capacity for continual learning, current methods...
Forgetting Alternatives
This segment explains how easy (and risky) it is to neglect adding else clauses to conditionals. It also describes how writing conditionals counter...
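The risk this entry describes can be shown with a minimal, hypothetical Python sketch (the function name and thresholds are invented for illustration): a conditional chain with no else clause silently falls through, so the unhandled case returns None instead of a value.

```python
def classify_temperature(celsius):
    """Hypothetical classifier whose else branch was 'forgotten'."""
    if celsius > 30:
        return "hot"
    elif celsius < 10:
        return "cold"
    # No else clause: mild readings (10..30) fall through
    # and the function implicitly returns None.

print(classify_temperature(40))  # handled case
print(classify_temperature(20))  # forgotten case: prints None
```

Adding an explicit else (or a trailing return) makes every path return a meaningful value and makes the omission impossible to miss in review.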
Rough Forgetting
Recent work in the area of Knowledge Representation and Reasoning has focused on modification and optimization of knowledge bases (KB) through the... -
Attaining Class-Level Forgetting in Pretrained Model Using Few Samples
In order to address real-world problems, deep learning models are jointly trained on many classes. However, in the future, some classes may become...
Incremental class learning using variational autoencoders with similarity learning
Catastrophic forgetting in neural networks during incremental learning remains a challenging problem. Previous research investigated catastrophic...
LETHE: Forgetting and Uniform Interpolation for Expressive Description Logics
Uniform interpolation and forgetting describe the task of projecting a given ontology into a user-specified vocabulary, that is, of computing a new...
Continual text classification based on knowledge distillation and class-aware experience replay
Continual text classification aims at constantly classifying the texts from an infinite text stream while preserving stable classification...
Overcomplete-to-sparse representation learning for few-shot class-incremental learning
Few-shot class-incremental learning (FSCIL) aims to continually learn new semantics given a few training samples of new classes. As training examples...
Respecializing swarms by forgetting reinforced thresholds
Response threshold reinforcement is a powerful model for decentralized task allocation and specialization in multiagent swarms. In dynamic...
Forget less, count better: a domain-incremental self-distillation learning benchmark for lifelong crowd counting
Crowd counting has important applications in public safety and pandemic control. A robust and practical crowd counting system has to be capable of...
Towards Long-Term Remembering in Federated Continual Learning
Background: Federated Continual Learning (FCL) involves learning from distributed data on edge devices with incremental knowledge. However, current FCL...
Prompt Based Lifelong Person Re-identification
In the real world, training data for person re-identification (ReID) comes in streams and the domain distribution may be inconsistent, which requires...