View in #questions-forum on Slack
@Snowcrasher312: Hi folks! I’ve been doing a bit of background reading on continual learning (CL) and wondering if the following problem fits into the paradigm here. If I have a model I’ve already trained on a large dataset for classification and get new data points with labels that fall within those already learned classes, could I apply some strategies from CL to update the model on just those data points without retraining from scratch (while maintaining accuracy on the prior dataset)? It seems to fit well, but wanted to confirm with you all.
Then, if so, could anyone help point me at some strategies that might be more effective in this setting? I’ve looked at and tested a couple — including A-GEM and EWC — and they seem to work decently well, but not much better than simply fine-tuning on the new data points. Thanks in advance!
@Toshi: This seems to fall under the area of ‘Domain Adaptation’.
@Eden_Belouadah: I agree with Toshi.
To the best of my knowledge, online incremental learning (IL) approaches are what would suit your setting. For example, see the paper “REMIND Your Neural Network to Prevent Catastrophic Forgetting”.
@Snowcrasher312: Thank you both!! I will look into this area.