Generator forgetting in Generative Replay

Hi, I just want to check with anyone who has implemented generative replay before: have you observed forgetting when training the generator of the new Scholar? I implemented a DCGAN as my generator, and I observed that the generator is only able to generate the latest set of samples.

I am not sure if I misinterpreted the paper. Is the generator of each new Scholar supposed to be trained from scratch?
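For reference, in standard generative replay (e.g., Shin et al., 2017, "Continual Learning with Deep Generative Replay"), the new Scholar's generator is not trained on the current task's data alone: each batch mixes real samples from the current task with samples replayed by the previous Scholar's frozen generator. If the new generator only ever sees current-task data, it will indeed produce only the latest samples. Below is a minimal sketch of that mixing step, assuming a PyTorch DCGAN; the names (`prev_generator`, `replay_ratio`, `z_dim`) and the latent shape are illustrative assumptions, not taken from your setup or from a specific paper's code.

```python
import torch

def make_training_batch(real_batch, prev_generator, replay_ratio=0.5, z_dim=100):
    """Mix current-task data with samples replayed by the previous Scholar's generator.

    The returned batch is what the new generator/discriminator pair is trained on,
    so the new generator keeps modelling old tasks as well as the new one.
    """
    n_replay = int(replay_ratio * real_batch.size(0))
    if prev_generator is None or n_replay == 0:
        # First task: no previous Scholar, so train on real data only.
        return real_batch
    with torch.no_grad():  # the previous generator stays frozen
        # DCGAN-style generators usually take a (N, z_dim, 1, 1) latent tensor;
        # adjust the shape to match your generator's input.
        z = torch.randn(n_replay, z_dim, 1, 1, device=real_batch.device)
        replayed = prev_generator(z)
    # Replace part of the real batch with replayed samples from old tasks.
    return torch.cat([real_batch[n_replay:], replayed], dim=0)
```

This is only a sketch of the sampling/mixing logic; in practice you would also replay labels via the previous solver when training the new solver, and feed the mixed batch through your usual GAN update.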

Dear @yedanqi, in this paper we explain everything in detail.
I don’t recall the specifics of the paper you mentioned, but incrementally training the generator and the discriminator is known to be quite challenging.

@ggraffieti, the main author of the paper and an expert on continual training of generative models, is your guy! :slight_smile: