Monday, April 12, 2004

More on the holographic principle and black holes, from a conversation with Tobin Fricke on LiveJournal.

There is a theory going around - the holographic principle - that the amount of information that can be stored in a region scales as the surface area of the region (let's just say a sphere) and NOT the volume. For instance, this meshes with black hole thermodynamics: the temperature is related to the mass (inversely - a black hole's Hawking radiation has an average wavelength about the size of the black hole, so a solar-mass black hole would emit 1.5 kilometer radio waves, i.e. it would be much colder than the current microwave background) and thus to the surface area of the event horizon. Another derivation is given by UNC's own Jack Ng and Hank Van Dam - gr-qc/0403057 - check it out, it's an easy read.

My instinct when presented with something like this is to try and break it - and since the volume grows much faster than the surface area, it seems like you should take a giant volume and cram as much matter (and thus information) into it as possible. The problem is that a black hole's radius grows that much quicker - the event horizon is linear in mass: R = 2M (G = c = 1). Thus very large black holes have very low average densities - indeed, as I found (trivial really), the current intergalactic density gives a black hole radius of about the radius of the visible universe. So to break the volume/area thing you actually want to go to small volumes, or else you'll form black holes immediately.

Let's see: the Earth has a mass of about 10^25 kg, which is ~10^52 protons, and a Schwarzschild radius of about 1 cm, and L_Planck = 10^-33 cm, so there are ~10^66 Planck areas on the horizon of an Earth-mass black hole - no good. Hold on - matter at nuclear density also won't work no matter how small, but taken at Schwarzschild-radius density... OK, 1 kg of neutrons is ~10^27 particles, with R_BH ~ 10^-27 m ≈ 10^8 L_Planck, so there are ~10^16 possible bits on the horizon according to the holographic principle, but 1 bit per neutron would be 10^27 bits! You can beat it at tiny volumes!
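Just to keep myself honest, here's the arithmetic as a quick script. This is strictly order-of-magnitude bookkeeping, not a careful entropy calculation - I'm dropping the 4π in the horizon area and the 1/4 in the Bekenstein-Hawking formula, as in the estimates above, and the function names are just my own labels:

```python
# Rough check of the holographic bound vs. 1 bit per particle.
# Order-of-magnitude only: factors of 4*pi and 1/4 are ignored.

G = 6.67e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 3.0e8            # speed of light, m/s
l_planck = 1.6e-35   # Planck length, m
m_nucleon = 1.67e-27 # nucleon mass, kg

def schwarzschild_radius(mass_kg):
    """R = 2GM/c^2 in meters."""
    return 2 * G * mass_kg / c**2

def horizon_bits(mass_kg):
    """Holographic bound: roughly (R / l_planck)^2 bits on the horizon."""
    return (schwarzschild_radius(mass_kg) / l_planck) ** 2

def particle_bits(mass_kg):
    """Naive count: 1 bit per nucleon."""
    return mass_kg / m_nucleon

# Earth mass: horizon bits vastly exceed 1 bit/nucleon -- the bound holds easily.
m_earth = 6e24
print(f"Earth: R_BH = {schwarzschild_radius(m_earth):.0e} m, "
      f"horizon bits = {horizon_bits(m_earth):.0e}, "
      f"particle bits = {particle_bits(m_earth):.0e}")

# 1 kg of neutrons: 1 bit/neutron now exceeds the horizon-area bound.
print(f"1 kg:  R_BH = {schwarzschild_radius(1.0):.0e} m, "
      f"horizon bits = {horizon_bits(1.0):.0e}, "
      f"particle bits = {particle_bits(1.0):.0e}")
```

The crossover is just the statement that horizon bits scale as M^2 while particle bits scale as M, so at small enough mass the naive particle count wins.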
I'm going to have to look at this more, but the densities here are much higher than nuclear density - not that that would stop a theorist! Thanks for getting me back onto this, Tobin! I'll see if the details pan out... For one thing, it would be a quark-gluon plasma, not neutrons - but I don't think the binding-energy contribution to the mass would increase much; it only gets big when you try to separate the quarks. Ah, but they are fermions - degeneracy pressure would push the temperature and energy way up. Have to check it out.

Oh yeah, and we're not in a black hole because there must be matter beyond our visible horizon at 13.7 billion light years - we can tell because, from measuring the CMB radiation, we know space is very flat. If it were vacuum outside, our visible universe would collapse to a singularity in time T = (pi/2)*(3/(8*pi*G*rho))^(1/2). With rho ~ 10*10^30*10^11*10^11/(10^10*10^16)^3 = 10^-25 kg/m^3, that gives T = 2*10^17 seconds, which, I'll be damned, is about the age of the universe. But then the metric for a collapsing ball of dust is the same as the Friedmann-Robertson-Walker metric, so maybe I shouldn't be surprised. Huh. So the evolution of the universe forward from the big bang really is very similar to a black hole collapse.

Lee Smolin has the idea that collapsing black holes seed new baby universes - then, if the physical constants can change at each singularity, the system will evolve towards universes whose physical constants are most favorable to the formation of lots of new black holes, since these will dominate the counting. That's looking even clearer to me now. I can't wait to talk to Lee on Monday - he's coming to give a colloquium. I need to really go back and look at Martin Bojowald's singularity evolution code...
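The collapse-time estimate is easy to script too. Same crude inputs as above - ~10^11 galaxies of ~10^11 stars at ~10 solar masses each, spread through a (10^26 m)^3 box - so don't trust anything past the leading power of ten:

```python
# Free-fall collapse time of a pressureless dust sphere at the mean
# cosmic density: T = (pi/2) * sqrt(3 / (8 * pi * G * rho)).
import math

G = 6.67e-11  # gravitational constant, m^3 kg^-1 s^-2

# ~10^11 galaxies x ~10^11 stars x ~10 solar masses (10^30 kg each),
# spread over a box ~10^10 light years (~10^26 m) on a side.
rho = (1e11 * 1e11 * 10 * 1e30) / (1e26) ** 3   # ~1e-25 kg/m^3

T = (math.pi / 2) * math.sqrt(3 / (8 * math.pi * G * rho))
print(f"rho = {rho:.0e} kg/m^3, collapse time T = {T:.1e} s "
      f"(~{T / 3.15e7 / 1e9:.0f} billion years)")
```

This lands at ~2*10^17 seconds, i.e. a few billion years - within a factor of two of the 13.7-billion-year age, which is all these round numbers can promise.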
