Re: Seeding the random number generator
> To put this in context, I'm training a 5x5 SOM, for 400 iterations,
> with a gradually decreasing circular neighbourhood and learning
> rate. 5 runs from different initialisations end up with the same
> weights within 100 iterations. However, the weights of the networks
> still change from iteration to iteration...
> The initial learning rate is 0.9, and it decreases by 1/400th of the
> initial value each iteration (so it reaches 0 at the end of the final
> iteration). The neighbourhood radius starts at 5 and decreases
> similarly, though it never goes below 1.
> Anyway, should this *ever* happen?! It seems very strange to me...
Could happen, especially if you don't have the wrap flag set -- in
that case there could be a unique mapping of your data in the hidden
layer, and the network will learn it! If you do have wrap set, then
there shouldn't be a unique configuration and I'd be puzzled...
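
For reference, here's a rough sketch of the schedule you describe --
this is not your code; it's a plain Python/NumPy illustration with my
own names, assuming a standard SOM update with a hard-cutoff circular
neighbourhood: learning rate decaying linearly from 0.9 to 0 over 400
iterations, radius decaying from 5 but never dropping below 1.

    import numpy as np

    def train_som(data, map_rows=5, map_cols=5, n_iters=400,
                  lr0=0.9, radius0=5.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        dim = data.shape[1]
        # Random initial weights; different seeds give different starts.
        weights = rng.random((map_rows, map_cols, dim))
        # Grid coordinates of each unit, used for the circular neighbourhood.
        grid = np.stack(np.meshgrid(np.arange(map_rows), np.arange(map_cols),
                                    indexing="ij"), axis=-1)

        for t in range(n_iters):
            # Linear decay: lr reaches 0 at the end, radius is floored at 1.
            lr = lr0 * (1.0 - t / n_iters)
            radius = max(1.0, radius0 * (1.0 - t / n_iters))

            x = data[rng.integers(len(data))]            # pick one sample
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)

            # Hard-cutoff circular neighbourhood around the winning unit.
            grid_dist = np.linalg.norm(grid - np.array(bmu), axis=-1)
            mask = (grid_dist <= radius)[..., None]
            weights += lr * mask * (x - weights)

        return weights

With a schedule like this, runs from different seeds converging to the
same weights within 100 iterations is exactly the kind of behaviour
you'd want to check against the mapping/wrap question above.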