Thinking as a Hobby


Science and Setbacks

Philip and I have been trying for a while now to get our neural network evolver to solve the basic logical problem of XOR. But trial and error is an inevitable part of programming, engineering, and science.
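For reference, the XOR problem the networks are being evolved to solve is just four input pairs mapped to four target outputs — a mapping no single-layer (linearly separable) model can represent, which is what makes it a standard sanity check. A minimal sketch:

```python
# The four cases of the XOR problem: each pair of binary inputs
# maps to a single target output.
XOR_CASES = [
    ((0, 0), 0),
    ((0, 1), 1),
    ((1, 0), 1),
    ((1, 1), 0),
]

# Sanity check: the targets match Python's bitwise XOR.
for (a, b), target in XOR_CASES:
    assert (a ^ b) == target
```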

Just ask NASA.

We're not trying to scour the Martian landscape, and our budget's quite a bit smaller, but there are always kinks.

Here's a fitness graph we thought showed a successful run. I'd enabled recurrency earlier and forgotten to disable it for this run, so the network wasn't really solving what we thought it was solving. Still, this is what a successful run should look like:



Basically, the network is trying to map four inputs to the four correct outputs. The maximum error for each output is 2, and we square each error, so the total squared error ranges from 0 to 16. Fitness is 16 minus that total: the minimum fitness is zero and the maximum fitness is 16. The blue bars represent the range of fitnesses within a given generation, with the black dot indicating the average fitness for that generation.
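The scoring described above can be sketched in a few lines. The post doesn't say what output range produces a worst-case error of 2 per case; the example below assumes outputs and targets in [-1, 1], which is one common way to get that bound:

```python
def xor_fitness(outputs, targets):
    """Fitness = 16 minus the total squared error over the four XOR cases.

    Per the post: the worst-case error per output is 2, each error is
    squared (so each case contributes at most 4), and with four cases
    fitness ranges from 0 (all wrong) to 16 (all correct).
    """
    total = 0.0
    for out, tgt in zip(outputs, targets):
        err = min(abs(out - tgt), 2.0)  # clamp to the stated max error of 2
        total += err ** 2
    return 16.0 - total

# A perfect network (zero error on every case) scores the maximum:
print(xor_fitness([-1, 1, 1, -1], [-1, 1, 1, -1]))  # 16.0
# A network that gets every output maximally wrong scores the minimum:
print(xor_fitness([1, -1, -1, 1], [-1, 1, 1, -1]))  # 0.0
```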

And here's a graphical representation of speciation within the population. As the networks evolve, if their topologies are different enough, they are put into a new species, which interbreeds within itself but not with other species:



As you can see, there is only one species to begin with, but others begin to emerge as the run progresses.
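The grouping described above can be sketched as follows. The post doesn't name its exact topology-difference measure, so this assumes a simplified NEAT-style compatibility distance (counting mismatched genes plus average weight difference), with genomes represented as dicts from gene id to connection weight — all of that is my assumption, not the authors' implementation:

```python
def distance(g1, g2, c_mismatch=1.0, c_weight=0.4):
    # Genes present in only one genome count as mismatches; shared genes
    # contribute their average weight difference. Coefficients are
    # illustrative, not taken from the post.
    shared = g1.keys() & g2.keys()
    mismatched = len(g1.keys() ^ g2.keys())
    n = max(len(g1), len(g2), 1)  # normalize by the larger genome
    w_diff = (sum(abs(g1[i] - g2[i]) for i in shared) / len(shared)) if shared else 0.0
    return c_mismatch * mismatched / n + c_weight * w_diff

def speciate(genomes, threshold=1.0):
    # Assign each genome to the first species whose representative
    # (the species' first member) is within the threshold; if none is
    # close enough, a new species emerges.
    species = []
    for g in genomes:
        for s in species:
            if distance(g, s[0]) < threshold:
                s.append(g)
                break
        else:
            species.append([g])
    return species
```

With a single founding genome there is one species, and sufficiently different topologies split off into new ones — matching the behavior the graph shows.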

Anyway, we're still at it. At least we've developed a decent array of diagnostic tools to help us troubleshoot, but it's still a fairly complex algorithm, so problems are inherently difficult to track down.



Powered by JournalScape © 2001-2010 JournalScape.com. All rights reserved.
All content rights reserved by the author.