Dark experience for general continual learning: a strong, simple baseline
Neuroscience-inspired artificial intelligence
The stability-plasticity dilemma: Investigating the continuum from catastrophic forgetting to age-limited learning effects
Connectionist models of recognition memory: constraints imposed by learning and forgetting functions
Gradient episodic memory for continual learning
Efficient lifelong learning with A-GEM
iCaRL: Incremental classifier and representation learning
Learning with pseudo-ensembles
S4L: Self-supervised semi-supervised learning
Regularization with stochastic transformations and perturbations for deep semi-supervised learning
Virtual adversarial training: a regularization method for supervised and semi-supervised learning
Measuring and regularizing networks in function space
Learning fast, learning slow: A general continual learning method based on complementary learning system
An empirical investigation of catastrophic forgetting in gradient-based neural networks
Continual lifelong learning with neural networks: A review
Continual learning through synaptic intelligence
Progress & compress: A scalable framework for continual learning
Progressive neural networks
Towards robust evaluations of continual learning
Learning multiple layers of features from tiny images
Gradient-based learning applied to document recognition
Deep residual learning for image recognition