No. Neural nets do not have infinite memory; they have a finite storage capacity.
The storage capacity depends on the configuration of the net (its structure, the types of neuronal connections, the number of neurons, etc.).
Take a Hopfield net, for instance: training it on too many exemplars causes interference between the stored patterns and hence "memory loss". (Its storage capacity grows roughly linearly with the number of neurons: under standard Hebbian learning, a net of N neurons can reliably store only about 0.14N random patterns before recall starts to break down.)
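To make that interference concrete, here's a minimal sketch of a Hopfield net with the Hebbian outer-product rule (the pattern counts and the synchronous-update recall loop are illustrative choices, not the only way to do it). It stores an increasing number of random patterns in a 100-neuron net and measures how badly recall degrades once the pattern count passes roughly 0.14N:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # number of neurons

def train(patterns):
    """Hebbian outer-product rule; self-connections zeroed."""
    W = sum(np.outer(p, p) for p in patterns) / N
    np.fill_diagonal(W, 0)
    return W

def recall(W, x, steps=20):
    """Synchronous sign updates, run for a fixed number of steps."""
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1, -1)
    return x

for P in (5, 14, 40):  # well below, near, and above the ~0.14*N limit
    patterns = [rng.choice([-1, 1], size=N) for _ in range(P)]
    W = train(patterns)
    err = np.mean([np.mean(recall(W, p) != p) for p in patterns])
    print(f"{P:3d} stored patterns -> mean recall bit-error {err:.3f}")
```

With 5 patterns recall is essentially perfect; with 40 the stored patterns interfere so much that many bits come back wrong, which is exactly the "memory loss" described above.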
Now, there are architectures that are far better at retaining new information as it's presented, e.g. Adaptive Resonance Theory, but ultimately there is still a memory limit.
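For a flavour of how ART-style incremental learning still hits a hard ceiling, here is a much-simplified ART-1-like sketch (the class name, parameters, and fast-learning rule here are illustrative simplifications, not a faithful ART implementation). It commits a new category node for each sufficiently novel binary input, but only until a fixed pool of nodes is exhausted:

```python
import numpy as np

class ART1Sketch:
    """Simplified ART-1-style clustering of binary vectors (illustrative only)."""

    def __init__(self, max_categories, vigilance=0.75):
        self.max_categories = max_categories  # the hard memory limit
        self.vigilance = vigilance            # strictness of the match test
        self.prototypes = []                  # one boolean prototype per category

    def present(self, x):
        x = np.asarray(x, dtype=bool)
        # Try existing categories, best overlap first.
        order = sorted(range(len(self.prototypes)),
                       key=lambda i: -np.sum(self.prototypes[i] & x))
        for i in order:
            match = np.sum(self.prototypes[i] & x) / max(np.sum(x), 1)
            if match >= self.vigilance:                 # vigilance test passed
                self.prototypes[i] &= x                 # fast learning: intersect
                return i
        if len(self.prototypes) < self.max_categories:  # commit a fresh node
            self.prototypes.append(x.copy())
            return len(self.prototypes) - 1
        return None  # all category nodes used up: the memory limit in action

net = ART1Sketch(max_categories=3, vigilance=0.8)
rng = np.random.default_rng(1)
for k in range(6):
    x = rng.random(20) < 0.5
    print(k, "->", net.present(x))  # once 3 nodes are committed, novel inputs return None
```

New, sufficiently different inputs stop being stored as soon as the category pool is full, so even an architecture built for continual learning remains bounded by its finite resources.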