Sunday, April 03, 2016

New link between quantum computing and black holes may solve the information loss problem

[image source: IBT]

If you leave the city limits of Established Knowledge and pass the Fields of Extrapolation, you enter the Forest of Speculations. As you get deeper into the forest, larger and larger trees impinge on the road, strangely deformed, knotted onto themselves, bent over backwards. They eventually grow so close that they block out the sunlight. It must be somewhere here, just before you cross over from speculation to insanity, that Gia Dvali looks for new ideas and drags them into the sunlight.

Dvali’s newest idea is that every black hole is a quantum computer. And not just any quantum computer, but a quantum computer made of a Bose-Einstein condensate that self-tunes to the quantum critical point. In one sweep, he has combined everything that is cool in physics at the moment.

This link between black holes and Bose-Einstein condensates is based on simple premises. Dvali set out to find some stuff that would share properties with black holes, notably the relation between entropy and mass (BH entropy), the decrease in entropy during evaporation (Page time), and the ability to scramble information quickly (scrambling time). What he found was that certain condensates do exactly this.
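
To get a feeling for the scales involved, here is a small back-of-the-envelope sketch in Python. It uses only the standard semi-classical textbook formulas (Bekenstein-Hawking entropy, Hawking temperature, the fast-scrambling estimate, and the evaporation lifetime as a stand-in for the Page time); it is not Dvali's calculation, just the benchmark numbers any candidate condensate would have to reproduce.

    import math

    # Physical constants in SI units
    G     = 6.674e-11    # Newton's constant
    c     = 2.998e8      # speed of light
    hbar  = 1.055e-34    # reduced Planck constant
    kB    = 1.381e-23    # Boltzmann constant
    M_sun = 1.989e30     # solar mass in kg

    def black_hole_scales(M):
        """Semi-classical benchmarks for a Schwarzschild black hole of mass M (in kg)."""
        r_s = 2 * G * M / c**2                                    # Schwarzschild radius
        area = 4 * math.pi * r_s**2                               # horizon area
        S = area * c**3 / (4 * G * hbar)                          # Bekenstein-Hawking entropy, in units of k_B
        T = hbar * c**3 / (8 * math.pi * G * M * kB)              # Hawking temperature
        t_scramble = hbar / (2 * math.pi * kB * T) * math.log(S)  # fast-scrambling estimate ~ (beta/2pi) ln S
        t_evap = 5120 * math.pi * G**2 * M**3 / (hbar * c**4)     # evaporation lifetime (Page time is roughly half of this)
        return r_s, S, T, t_scramble, t_evap

    r_s, S, T, t_scr, t_evap = black_hole_scales(M_sun)
    print(f"r_s ~ {r_s:.0f} m, S ~ {S:.1e} k_B, T ~ {T:.1e} K")
    print(f"scrambling ~ {t_scr:.1e} s, evaporation ~ {t_evap:.1e} s")

For a solar-mass black hole this gives an entropy of about 10^77 in units of k_B, a Hawking temperature of about 60 nanokelvin, a scrambling time of a few milliseconds, and an evaporation time of about 10^67 years. It is these relations between mass, entropy, and time scales that the condensates reproduce.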

Consequently he went and conjectured that this is more than a coincidence, and that black holes themselves are condensates – condensates of gravitons, whose quantum criticality allows the fast scrambling. The gravitons equip black holes with quantum hair on horizon scale, and hence provide a solution to the black hole information loss problem by first storing information and then slowly leaking it out.

Bose-Einstein condensates, on the other hand, display long-range quantum effects that make them good candidates for quantum computers. The individual qubits that have been proposed for use in these condensates are normally correlated atoms trapped in optical lattices. Based on his analogy with black holes, however, Dvali suggests using a different type of state for information storage, one that would optimize the storage capacity.

I had the opportunity to speak with Immanuel Bloch from the Max Planck Institute for Quantum Optics about Dvali’s idea, and I learned that while it seems possible to create a self-tuned condensate to mimic the black hole, addressing the states that Dvali has identified is difficult and, at least presently, not practical. You can read more about this in my recent Aeon essay.

But really, you may ask, what isn’t a quantum computer? Doesn’t anything that changes in time according to the equations of quantum mechanics process information and compute something? Doesn’t every piece of chalk execute the laws of nature and evaluate its own fate, doing a computation that somehow involves quantum mechanics?

That’s right. But when physicists speak of quantum computers, they mean a particularly powerful collection of entangled states, assemblies that can hold and manipulate much more information than a largely classical state. It’s this property of quantum computers specifically that Dvali claims black holes must also possess. The chalk just won’t do.

If what Dvali says is correct, a real black hole out there in space doesn’t compute anything in particular. It merely stores the information of what fell in and spits it back out again. But a better understanding of how to initialize a state might allow us one day – give it a few hundred years – to make use of nature’s ability to distribute information enormously quickly.

The relevant question is of course, can you test that it’s true?

I first heard of Dvali’s idea at a conference I attended last July. In his talk, Dvali spoke about possible observational evidence for the quantum hair due to modifications of orbits near the black hole. At least that’s my dim recollection almost a year later. He showed some preliminary results, but the paper hasn’t been published and the slides aren’t online. Instead, together with some collaborators, he published a paper arguing that the idea is compatible with the Hawking, Perry, Strominger proposal to solve the black hole information loss problem, which also relies on black hole hair.

Then in November, I heard another talk by Stefan Hofmann, who had also worked on some aspects of the idea that black holes are Bose-Einstein condensates. He told the audience that one might see a modification in the gravitational wave signal of black hole merger ringdowns, which have since indeed been detected. Again though, there is no paper.

So I am tentatively hopeful that we can look for evidence of this idea in the near future, but so far there aren’t any predictions. I have a proposal of my own to add for observational consequences of this approach, which is to look at the scattering cross-section of the graviton condensate with photons in the wavelength regime of the horizon size (i.e. radio waves). I don’t have time to really work on this, but if you’re looking for a one-year project in quantum gravity phenomenology, this one seems interesting.
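
If you want a quick sense of what frequencies "the wavelength regime of the horizon size" corresponds to, here is a rough estimate, reusing the constants from the sketch above and simply equating the photon wavelength with the Schwarzschild diameter (my own crude reading of "horizon size", not a definition taken from any of the papers):

    # Frequency of a photon whose wavelength equals the horizon diameter 2*r_s
    # (an assumption made for illustration, not a definition from the papers).
    def horizon_scale_frequency(M):
        r_s = 2 * G * M / c**2       # Schwarzschild radius
        return c / (2 * r_s)         # frequency of a wave with wavelength = horizon diameter

    for n in (1, 10, 30):            # a few stellar-mass examples, in solar masses
        print(f"{n} M_sun: ~{horizon_scale_frequency(n * M_sun)/1e3:.0f} kHz")

For stellar-mass black holes the relevant wavelengths are kilometers, i.e. frequencies of a few to a few tens of kHz, at the very long-wavelength end of the radio band.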

Dvali’s idea has some loose ends of course. Notably, it isn’t clear how the condensate escapes collapse; at least it isn’t clear to me, nor to anyone I have talked to. The general argument is that for the condensate the semi-classical limit is a bad approximation, and thus the singularity theorems are rather meaningless. That might be so, but it’s too vague for my comfort. The idea also seems superficially similar to the fuzzball proposal, and it would be good to know how the two are related or where they differ.

After these words of caution, let me add that this link between condensed matter, quantum information, and black holes isn’t as crazy as it seems at first. In the last few years, a lot of research has piled up that tightens the connections between these fields. Indeed, a recent paper by Brown et al hypothesizes that black holes are not only the most efficient storage devices but also the fastest computers.

It’s amazing just how much we have learned from a single solution to Einstein’s field equations, and not even a particularly difficult one. “Black hole physics” really should be a research field in its own right.

18 comments:

  1. I seem to recall that some years back a physicist (I think his name was Kenji Hotta) suggested that black holes might be a condensate with the property that the mass and entropy were excluded from the interior and resided only on the surface. Is that possible for Dvali's condensate?

  2. CIP,

    I am not exactly sure what "interior" means in this case, since there is no metric. But roughly speaking I think the answer to your question is no, because Dvali relies on an argument of "densest possible packing", and that packing is in a volume, not on a surface.

  3. Ah.
    That's a cool and funny idea of Dvali's.
    Yes, I agree that there is a possibility of quantum computation in such a Bose-Einstein condensate (and at a black hole there is a temperature of about 60 nanokelvins), which provides suitable conditions for quantum correlations.
    But where would the computation really be going on? There is no metric, as you said in your previous reply.
    That's funny though.
    In 2010 an essay was published with the title "Building up spacetime with quantum entanglement" by Mark Van Raamsdonk (arxiv.org/abs/1005.3035v1), which says that the whole universe might be a quantum computer. That is a much cooler idea than Dvali's: why not consider the whole universe as a quantum computer, with the universe as a closed system, which doesn't need any 0 K temperatures?

  4. Hi Sabine,
    As always, a very nice article. I am not an expert in this field and not familiar with Dvali's work. But from your description, doesn't the model by Dvali et al seem similar to the gravastar (and related incarnations) models by Mazur, Mottola, Chapline, Laughlin etc.?
    Or is it that in Dvali's model there is no "surface", unlike in the models of Mazur et al.?

    I have a different question, related to the non-zero graviton mass mentioned in a different paper. Do all f(R) gravity models predict a non-zero mass for the graviton? That is the claim in arXiv:1603.09551, but this is news to me.
    shantanu

  5. Shantanu,

    Well, it's similar in that it's horizon-sized and stable at that size, but that seems to be where the similarities end. I think it's most similar to the fuzzball idea. In either case though, the mechanism by which the collapse is prevented remains somewhat mysterious to me.

    I've never worked on f(R) gravity, but I doubt that they generally lead to a non-zero graviton mass; that claim seems highly implausible to me, since higher-order corrections to GR should generically exist. It might be the case for specific types of f. Best,

    B.

  6. Neutron star cores modeled as superconducting proton and superfluid neutron condensates uncreated a vast zoo of exotic particles. Why not a black hole graviton condensate, immortal information, and central singularity uncreation? The universe irreversibly ages.

    Consider digital storage manufactured/year and bytes created/second on planet Earth. Might we be locally depleting or gorging some ineffable property of existence that will reveal itself in a most unfortunate way? I embrace burly experimental apostasies re the Fermi paradox.

  7. Hi Bee,
    In your reply to CIP, you mentioned that there is no metric in Dvali's model. Can you do relativity without a metric? I thought the whole of relativity theory is equations for the metric.

  8. Shantanu,

    A non-zero graviton mass is refuted by the recent LIGO detection of gravitational waves, according to which m(graviton) < 10^(-22) eV.

    Ervin

  9. Dr. Bee, a few questions about black-hole computing:

    Physics cites laws about information—federal (cosmic) and state (terrestrial), plus local (quantum)—but who sees to it that the regulations and procedures are followed? For the religious, I suppose, it would be angels.

    Is a galaxy’s central black hole a sort of mainframe computer, connected digitally to others in a web (GWW)? Do the smaller holes have intranets—maybe networked to minis, or even micros as desktops and tablets?

    Could a galaxy’s IT department get more graviton bandwidth if it was needed to prevent bottlenecks?

    Does the main data-storage system occasionally get defragmented for random-access efficiency?

    Are file types assigned: a few for, say, chemicals’ atoms, and other types for human beings’ thoughts and feelings? Would the latter be grouped in directories—with theories, for example, tagged in a range from “great” to “crap”?

    Is file compression (for rarely accessed stuff like philosophy) used to conserve space?

    Who makes the backups? Where are those kept? And is there an additional, offsite location for disaster-safe storage of info?

    How often is the galactic operating system updated? Does it come in long-term-support versions?

    Is the OS backed up as well? I mean, if the computer crashed, how could the data be accessed?

    And there’s a lot of it, if nothing’s ever deleted. Who gets billed for all that “cloud” storage?

    Could hackers from another galaxy, or another universe, infect and alter a black-hole data-storage system—or lock its information and hold it for ransom?

    Black-hole computing sounds like a lot of bother for someone. The “lost forever” concept is appealingly simpler and cleaner—like emptying my machine’s rubbish basket; when I’m done with a file, so is everybody else, once it gets written over.
    But that’s just me.

  10. kashyap,

    You can't do relativity, but you can do thermodynamics, and that tells you something about what the thing does. Having said that, there should be some kind of quantum-metric (with the classical metric being the mean field) but at least in the papers I have seen so far it hasn't become clear to me how that is supposed to work.

    Ervin, I know about the LIGO (and other) limits on the mass of the graviton. But the claim in this paper is that all f(R) gravity models (which are a big cottage industry in the cosmology community) predict a non-zero graviton mass, which means they are all completely ruled out, or at least tightly constrained, if the claim in this paper is correct.

  12. "Dvali’s idea has some lose ends of course."

    Lose ---> loose

    English "lose" is German "verlieren". English "loose" is German "lose", "locker".

  13. Shantanu,

    In my view, it makes no difference which particular gravity model you consider; they are all ruled out by the experimentally set limits on the graviton mass.

    The topic reminds me of the upper bound on the photon mass, which falls near the theoretical limit derived from the uncertainty principle and the estimated age of the Universe.

  14. Ervin,

    You are misunderstanding Shantanu's question. He is asking whether the statement in the paper, that all f(R) models lead to a non-zero graviton mass in the first place, is correct. Besides, they are of course not ruled out; they are constrained to high precision.

  15. Sabine,

    Thanks, your point is well taken.

  16. I worked (2014) on the principle that black holes behave like confined particles, and that the information might be treated the same way in both. In this picture, vacuum flux unlocks them, and can be considered a condensate over large fields, as can large confined particles (black holes). The difference (between particles and black holes) is the flux gradient.

