Abstract: In class-incremental learning, deep neural networks are prone to catastrophic forgetting, where accuracy on old classes declines substantially as new knowledge is learned. Few-shot class-incremental learning (FSCIL) is a prominent research topic in the machine learning community. It faces two significant challenges: forgetting old-class knowledge and overfitting to the limited samples of new classes.