Pseudo-rehearsal implementation

Hi, I would like to clarify how to implement pseudo-rehearsal. Right now I generate random image samples using torch.rand() or torch.normal(), then pass them through the model to generate pseudo-labels. Finally, I batch the (sample, label) pairs together with the data for the new task and train on them as a single minibatch.
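Here is a minimal sketch of what I am doing (the tiny model, shapes, and the MSE-on-logits loss are just placeholders, not my actual network):

```python
import copy

import torch
import torch.nn as nn

def make_pseudo_batch(old_model, n_samples, input_shape):
    """Generate pseudo-items: random inputs labelled by a frozen old model."""
    old_model.eval()
    with torch.no_grad():
        x = torch.rand(n_samples, *input_shape)  # random pseudo-inputs
        y = old_model(x)                         # soft pseudo-labels (logits)
    return x, y

# Tiny placeholder model, just for illustration
model = nn.Sequential(nn.Flatten(), nn.Linear(4, 3))

# Snapshot the model BEFORE training on the new task and keep it frozen;
# pseudo-labels come from this copy, not from the model being updated.
old_model = copy.deepcopy(model)

px, py = make_pseudo_batch(old_model, n_samples=8, input_shape=(4,))

# During new-task training, mix pseudo-items into each minibatch and
# regress the current model's logits toward the stored pseudo-labels.
pseudo_loss = nn.functional.mse_loss(model(px), py)
```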

Is this the right way to do pseudo-rehearsal? I am asking because my results show that the model is still forgetting previous tasks.

Hi, can you provide some references, e.g. papers that describe this approach?

Hi, I am trying to implement something that is described in this paper:
Catastrophic Forgetting, Rehearsal and Pseudorehearsal