Re: genetic algorithms and distributed training
I'm a student at Peking University, China, and I
am very happy to work with you. I have been thinking about
distributed training for some time, and have begun to
implement multi-threaded training in pdp++. In my
code, I preserved the original framework and data structures
fairly well. Once it is debugged, which should take a few more
days, I would like to share it with you and other pdp++ users/developers.
I hope it will save you time in developing the distributed training feature.
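For readers curious what multi-threaded training looks like in general, here is a minimal sketch of data-parallel gradient averaging, the usual pattern behind this kind of parallelism: each thread computes the gradient over its own shard of the data, and the shard gradients are averaged into one weight update. This is plain Python with a toy quadratic loss; none of the names come from the pdp++ sources.

```python
import threading

def shard_gradient(w, shard, out, idx):
    # Gradient of the mean squared error (w*x - y)^2 over one data shard.
    g = sum(2 * (w * x - y) * x for x, y in shard) / len(shard)
    out[idx] = g

def parallel_step(w, data, n_threads=4, lr=0.01):
    # Split the batch round-robin across threads.
    shards = [data[i::n_threads] for i in range(n_threads)]
    grads = [0.0] * n_threads
    threads = [threading.Thread(target=shard_gradient, args=(w, s, grads, i))
               for i, s in enumerate(shards)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Size-weighted average of shard gradients equals the full-batch gradient.
    g = sum(gi * len(si) for gi, si in zip(grads, shards)) / len(data)
    return w - lr * g

# Fit a single weight w for y = 3x from noiseless samples.
data = [(x, 3.0 * x) for x in range(1, 9)]
w = 0.0
for _ in range(200):
    w = parallel_step(w, data)
```

Because the shard gradients are re-weighted by shard size, the parallel step computes exactly the same update a single-threaded full-batch step would.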
----- Original Message -----
From: "Octavio Lopez" <email@example.com>
Sent: Saturday, March 03, 2001 11:42 AM
Subject: genetic algorithms and distributed training
> Hi all, I've been learning how to use PDP++ over the
> last few weeks and am just starting to get a grasp on
> this very powerful software. I'm wondering if anyone
> has done some work that they'd be willing to share on:
> 1) genetic algorithms, that is, automatically
> configuring many different kinds of nets, testing them
> out, saving the best ones, and then repeating the
> process; and 2) distributed training. What I mean by
> this is an ability to run an instance of the software
> on many different computers over a network
> automatically and coordinate the training on the
> disparate machines. 2 is somewhat easier than 1, and
> admittedly neither is the most difficult thing in
> the world. I'm just wondering if anyone's already
> grappled with this or would be interested in
> collaborating on it because I don't like to reinvent
> the wheel if I don't have to.
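The loop described in point 1 (configure many nets, test them, save the best, repeat) can be sketched as a simple genetic algorithm. The config fields and the `fitness` function below are placeholders, not PDP++ API; in practice fitness would be the performance of a net actually trained with that configuration.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def random_config():
    # A candidate "net": illustrative hyperparameters only.
    return {"hidden": random.randint(2, 64), "lr": random.uniform(0.001, 0.5)}

def mutate(cfg):
    # Small random perturbation of a surviving configuration.
    child = dict(cfg)
    child["hidden"] = max(2, child["hidden"] + random.randint(-4, 4))
    child["lr"] = min(0.5, max(0.001, child["lr"] * random.uniform(0.8, 1.25)))
    return child

def fitness(cfg):
    # Stand-in score: pretend 16 hidden units and lr near 0.1 are ideal.
    return -abs(cfg["hidden"] - 16) - 10 * abs(cfg["lr"] - 0.1)

population = [random_config() for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]                      # save the best ones
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(15)]   # and repeat the process
best = max(population, key=fitness)
```

Keeping the top survivors unchanged each generation (elitism) guarantees the best score never gets worse; distributing this search, as in point 2, would mean farming the fitness evaluations out to other machines.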
> -Aaron (the Octavio is fake)
> P.S. Prof O'Reilly: cheers to you for making this
> software opensource and supporting the community
> actively through the mailing list. It's no doubt a lot
> of work but will likely result in quite a few new
> neural net researchers.