### Sunday, July 25, 2004

Lots of links, part 2. First, one of the classics: Gödel's incompleteness theorem. Perhaps after going through the proof in detail I could come up with more rigorous mathematical arguments for Statistical Metaphysics, although stat meta certainly isn't any one particular formal system (which is also why Gödel-like proofs do not invalidate it; if anything they corroborate it, i.e. we will always need to add interesting new axioms...).

And some more neural network webpages I found, from Jenny Orr and Ben Best. Apparently back-propagation is the standard way to train these networks, although you also don't want to overtrain them, or they won't generalize well (one remedy is adding random variations during training). I wonder whether iterative evolutionary processes alone could be enough to train them: make lots of local mutations on the connection strengths for different members of a 'species', select the best one (round-robin tournament?), and then make more variations around it...

And this talk of competition reminds me that I watched Lance finish off the Tour de France this morning, which was quite inspiring.

Oh, and I met a cool new physics graduate student, Michael Good, last week, who is also interested in the mathematical foundations of reality. Apparently his former advisor David Finkelstein also proposes that all mathematical structures exist, which leads to the natural statistical conclusion that we will always find more complex mathematical generalizations in physics.
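The mutate-and-select idea could be sketched as a simple keep-the-best hill climb over the connection weights. This is just my own minimal illustration, not anything from the linked pages: a made-up 2-2-1 network learning XOR, with arbitrary choices for the mutation size and the number of offspring per generation.

```python
import math
import random

random.seed(0)  # fixed seed so the run is repeatable

# A tiny 2-2-1 feed-forward network, weights stored as a flat list of 9:
# hidden weights (2x2) + hidden biases (2) + output weights (2) + output bias (1).

def forward(w, x1, x2):
    h = []
    for i in range(2):
        s = w[2 * i] * x1 + w[2 * i + 1] * x2 + w[4 + i]
        h.append(math.tanh(s))
    return h[0] * w[6] + h[1] * w[7] + w[8]  # linear output unit

XOR = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def loss(w):
    # Sum of squared errors over the four XOR cases.
    return sum((forward(w, a, b) - t) ** 2 for a, b, t in XOR)

def evolve(generations=2000, offspring=8, sigma=0.3):
    # Start from a random individual, then repeatedly: make a batch of
    # locally mutated children, keep whichever individual scores best,
    # and mutate around it again (a (1+lambda)-style evolution strategy).
    best = [random.gauss(0, 1) for _ in range(9)]
    best_loss = loss(best)
    for _ in range(generations):
        for _ in range(offspring):
            child = [wi + random.gauss(0, sigma) for wi in best]
            child_loss = loss(child)
            if child_loss < best_loss:
                best, best_loss = child, child_loss
    return best, best_loss
```

No gradients needed, which is the appeal; the obvious cost is that it takes far more network evaluations than back-propagation would on the same problem.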