
As far as I remember, a neural network cannot forget anything. Does this mean that no matter how evolved the network is, it will always give me back the right output if I feed it an input?

And when I say "right output" I mean the precise output, without getting a single bit wrong.

2 Answers


Neural networks only have a finite ability to "remember", called their capacity. If you stop updating the network with new information, then what it has learned would, in theory, last forever. But if you keep adjusting the weights to learn new and different things, it will eventually overwrite, and thus forget, old patterns.
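To make that last point concrete, here is a toy sketch (every name, architecture choice, and number in it is made up purely for illustration): a tiny two-layer network is trained on one target function, then retrained on a conflicting one, and its error on the first task is measured before and after.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: 1 input -> 8 tanh units -> 1 output.
def init():
    return {"W1": rng.normal(0, 0.5, (1, 8)), "b1": np.zeros(8),
            "W2": rng.normal(0, 0.5, (8, 1)), "b2": np.zeros(1)}

def forward(p, x):
    h = np.tanh(x @ p["W1"] + p["b1"])
    return h @ p["W2"] + p["b2"], h

def sgd_step(p, x, y, lr=0.05):
    # One full-batch gradient step on mean squared error.
    yhat, h = forward(p, x)
    err = yhat - y
    dh = (err @ p["W2"].T) * (1.0 - h**2)   # backprop through tanh
    p["W2"] -= lr * h.T @ err / len(x)
    p["b2"] -= lr * err.mean(0)
    p["W1"] -= lr * x.T @ dh / len(x)
    p["b1"] -= lr * dh.mean(0)

def mse(p, x, y):
    return float(np.mean((forward(p, x)[0] - y) ** 2))

x = rng.uniform(-1, 1, (64, 1))
task_a = np.sin(3 * x)       # old pattern
task_b = -np.sin(3 * x)      # new, conflicting pattern

p = init()
for _ in range(2000):
    sgd_step(p, x, task_a)
err_a_before = mse(p, x, task_a)

for _ in range(2000):
    sgd_step(p, x, task_b)   # keep adjusting the weights on new data
err_a_after = mse(p, x, task_a)

# The same weights now encode task B, so error on task A grows.
print(err_a_before, err_a_after)
```

Since the two tasks demand opposite outputs from the same weights, retraining on the second necessarily degrades the first; this is the "forgetting old patterns" effect in miniature.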

  • Some networks have more memory than others, mind you. But yeah, since a neural network stores a finite amount of information, it can only remember finitely many things. (2012-07-12)

No. Neural nets do not have infinite memory; they have a finite storage capacity.

The storage capacity depends on the configuration of the net (structure, neuronal connection types, number of neurons, etc.).

Take for instance a Hopfield net: training it with too many exemplars will result in interference between stored patterns and hence "memory loss". (Its capacity for reliable recall scales roughly linearly with the number of neurons, about 0.14N patterns for N neurons; demanding error-free recall of every stored pattern brings it down to roughly N / (4 ln N).)
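A minimal sketch of this interference effect, assuming the standard Hebbian outer-product learning rule and synchronous updates (the network size, pattern counts, and noise level below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100  # number of neurons (arbitrary)

def train(patterns):
    # Hebbian outer-product rule; no self-connections.
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)
    return W / N

def recall(W, state, steps=20):
    # Synchronous sign updates until (hopefully) a fixed point.
    s = state.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s

def overlap(a, b):
    # 1.0 means perfect recall of pattern b.
    return float(a @ b) / len(a)

patterns = [rng.choice([-1, 1], N) for _ in range(40)]

# Noisy cue: the first pattern with 10 bits flipped.
probe = patterns[0].copy()
idx = rng.choice(N, 10, replace=False)
probe[idx] *= -1

W_few = train(patterns[:3])    # well under capacity (~0.14 * N)
W_many = train(patterns)       # 40 patterns: far over capacity

good = overlap(recall(W_few, probe), patterns[0])
bad = overlap(recall(W_many, probe), patterns[0])
print(good, bad)
```

With only 3 stored patterns the noisy cue falls back into the correct attractor, while at 40 patterns the crosstalk terms swamp the signal and recall of the same pattern breaks down; that is the "memory loss" described above.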

Now, there are some architectures that are far better at retaining new information as it is presented, e.g. Adaptive Resonance Theory networks, but ultimately there is still a memory limit.