genetic algorithms and distributed training
Hi all, I've been learning how to use PDP++ over the last few weeks and am just starting to get a grasp on this very powerful software. I'm wondering if anyone has done work they'd be willing to share on:

1) genetic algorithms, that is, automatically configuring many different kinds of nets, testing them, saving the best ones, and then repeating the process; and

2) distributed training, by which I mean the ability to run an instance of the software on many different computers over a network and automatically coordinate the training across the disparate machines.

(2) is somewhat easier than (1), and admittedly neither is the most difficult thing in the world. I'm just wondering if anyone has already grappled with this, or would be interested in collaborating on it, because I don't like to reinvent the wheel if I don't have to.
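To make (1) concrete, here is a minimal sketch of the loop I have in mind: generate candidate net configurations, score them, keep the best, mutate, and repeat. Everything here is illustrative: the hyperparameter names, the search space, and the `fitness` function (which stands in for "build the net in PDP++, train it, return its test performance") are assumptions, not PDP++ APIs.

```python
import random

# Hypothetical search space: each "genome" is a dict of network
# hyperparameters. These names are illustrative, not PDP++ parameters.
SPACE = {
    "hidden_units": [5, 10, 20, 40],
    "learning_rate": [0.01, 0.05, 0.1, 0.2],
}

def random_genome(rng):
    # One random configuration drawn from the search space.
    return {k: rng.choice(v) for k, v in SPACE.items()}

def mutate(genome, rng):
    # Copy the genome and resample one hyperparameter.
    child = dict(genome)
    key = rng.choice(list(SPACE))
    child[key] = rng.choice(SPACE[key])
    return child

def fitness(genome):
    # Stand-in for "train this net and return test performance".
    # Here: a toy score that peaks at 20 hidden units and lr 0.05.
    return (-abs(genome["hidden_units"] - 20)
            - 100 * abs(genome["learning_rate"] - 0.05))

def evolve(generations=20, pop_size=10, seed=0):
    rng = random.Random(seed)
    pop = [random_genome(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]   # save the best ones
        children = [mutate(rng.choice(survivors), rng)
                    for _ in survivors]    # configure new variants
        pop = survivors + children         # repeat the process
    return max(pop, key=fitness)

best = evolve()
```

In a real setup the fitness call would launch an actual training run, so evaluating a population is embarrassingly parallel, which is where (2) comes in.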
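And for (2), a sketch of the master/worker coordination I mean, using local worker processes as a stand-in for remote machines. The `train_one` function and its toy "error" are assumptions; on a real network you would launch the simulator remotely (e.g. over ssh) instead of in a local process pool.

```python
from multiprocessing import Pool

def train_one(config):
    # Hypothetical stand-in for "launch one training run with this
    # config and collect its result"; the error formula is a toy.
    hidden, lr = config
    error = abs(hidden - 20) + 100 * abs(lr - 0.05)
    return config, error

def coordinate(configs, workers=4):
    # Farm the runs out to worker processes and gather the results.
    with Pool(workers) as pool:
        results = pool.map(train_one, configs)
    # Keep the run with the lowest error.
    return min(results, key=lambda r: r[1])

if __name__ == "__main__":
    grid = [(h, lr) for h in (5, 10, 20) for lr in (0.01, 0.05, 0.1)]
    best_config, best_err = coordinate(grid)
```

The same master/worker shape carries over to real machines; only the launch mechanism changes.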
-Aaron (the Octavio is fake)
P.S. Prof. O'Reilly: cheers to you for making this software open source and for actively supporting the community through the mailing list. It's no doubt a lot of work, but it will likely produce quite a few new neural net researchers.