Gerard ’t Hooft

Some real progress was made. Still today, most philosophers agree that J.S. Bell showed with his theorem "the incompatibility of hidden variables with local realism". But then, most of those authors add: "well, there is a loophole, but it is a very tiny one". I now proved that this loophole is not tiny at all; it is huge. Bell assumes that observers have 'free will' to measure anything they like, and that there is 'statistical independence'. These assumptions are invalid for the real world. More to the point: I constructed a perfect theory with fast variables at high energies that, in spite of being totally classical at its core, can generate almost any desired quantum Hamiltonian for its low-energy behavior.
This opens up the way to find constraints on theories such as the Standard Model to make them deterministic at high energies but apparently quantum mechanical -- just the real thing -- at low energies. Such models form discrete classes, so one of my conclusions is that there may be better ways to explain the values of coupling strengths than those ugly 'anthropic principles'.
- added May 15, 2020.

General Research interests:        

Gauge theories in elementary particle physics.
This was the topic of the 1999 Nobel Prize. An idea was proposed by C.N. Yang and Robert Mills in 1954: they suggested that particles in the sub-atomic world might interact via fields that are similar to, but more general than electricity and magnetism. But, even though the interactions that had been registered in experiments showed some vague resemblance to the Yang-Mills equations, the details seemed to be all wrong. Attempts to perform accurate calculations were frustrated by infinite - hence meaningless - results. Together with my advisor then, and my co-Nobel-laureate now, M. Veltman, I found in 1970 how to renormalize the theory, and, more importantly, we identified the theories for which this works, and what conditions they must fulfil. One must, for instance, have a so-called Higgs-particle. These theories are now called gauge theories.

It was subsequently discovered that, indeed, the observed details of all known forces exactly agree with this picture. First it was found that the so-called weak force, in combination with the more familiar electro-magnetic one, is exactly described by a Yang-Mills theory. In 1973 it was concluded that the strong force, too, is a Yang-Mills theory. I was among the small number of people who were already convinced of this from early 1971. During the later 1970s, all pieces fell into place. Of all simple models describing the fundamental particles, one stood out: the so-called 'Standard Model'.
Gauge theories are the backbone of this Standard Model. But now it also became clear that this is much more than just a model: it is the Standard Theory. Great precision can be reached, though the practical difficulties in some sectors are still substantial, and it would be great if one could devise more powerful calculation techniques.
Also, in spite of all its successes, the Standard Model, as it is formulated at present, shows deficiencies. It cannot be exactly right. Significant refinements are expected soon, now that the new European machine, the Large Hadron Collider (LHC), is fully operational.

More and more data concerning the Higgs particle are being registered at CERN. The most uncertain parameter has always been its mass, which in principle could have been anything between 100 and 1000 GeV. Precision checks obtained from numerous experiments suggested that the most likely mass value should lie between 114 and roughly 200 GeV. Using results first from Fermilab in the USA, and then the ones pouring in from the LHC, the range of possible mass values was rapidly narrowed down. Since July 4, 2012, the value 125.1 GeV is being reported, with a margin of less than 1 GeV. So far, everything still agrees with the simplest version of the Standard Model! Modest modifications of the Standard Model would replace the Higgs with a multitude of particles, which would be more difficult to identify. CERN data taking was interrupted by a long maintenance stop. After that, at higher energies and higher luminosities, more information with much higher precision may be expected.

Quantum gravity and black holes.
For an important discussion of my recent work, see its lecture note format in this link.

The predominant force controlling large scale events in the Universe is the gravitational one. The physical and the mathematical nature of this force were put in an entirely new perspective by Albert Einstein. He noted that gravitation is rooted in geometric properties of space and time themselves. The equations he wrote down for this force show a remarkable resemblance with the gauge forces that control the sub-nuclear world as described in the previous paragraph, but there is one essential difference: if we investigate how individual sub-atomic particles would affect one another gravitationally, we find that the infinities are much worse, and renormalization fails here. Under normal circumstances, the gravitational force between sub-atomic particles is so weak that these difficulties are insignificant, but at extremely tiny distance scales, of the order of 10⁻³³ cm, this force will become strong. We are tempted to believe that, at these tiny distance scales, the fabric of space and time is affected by quantum mechanical phenomena, but exactly how this happens is still very mysterious. One approach to this problem is to ask: under which circumstances is the gravitational force as strong as it can ever be? The answer to this is clear: at the horizon of a black hole.
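For reference, the distance scale quoted above is the Planck length, obtained by combining the three relevant constants of Nature (a standard dimensional-analysis estimate, not tied to any particular model):

```latex
\ell_{\mathrm{P}} \;=\; \sqrt{\frac{\hbar\, G}{c^{3}}} \;\approx\; 1.6 \times 10^{-33}\ \mathrm{cm}.
```

At this scale the gravitational coupling between individual particles becomes of order one, which is why the quantum effects of gravity can no longer be ignored there.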

As I have been emphasizing for more than three decades now, the textbook description of quantum gravity (where the Einstein-Hilbert action is quantized using standard procedures) shows flaws here that run deeper than the generation of infinities: it does not allow a description of a black hole as a single quantum object. This is a direct contradiction, a paradox, a problem shouting for a radical solution, telling us that there is something we are not doing right. For a long time I was convinced that superstring theory, too, is fundamentally faulty in this respect, but two developments forced me to be more cautious here. One: it is now possible to describe at least some members of the black hole family using string theory with multidimensional membranes, called D-branes, added to it. The objects thus obtained are purely quantum mechanical and agree with naive expectations so well that many of my colleagues are convinced that "string theory solves the problem". But why does this happen? How does string theory resolve the paradox? Curiously, string theorists themselves do not quite understand this. I think that important improvements of the theory are necessary. See below.

Conformal invariance. Here comes twist number two: I claim to have found how to put quantum gravity back in line so as to restore quantum mechanics for pure black holes. It does not happen automatically; you need a new symmetry, called local conformal invariance. This symmetry is often used in superstring and supergravity theories, but very often it is broken by what we call "anomalies". These anomalies are usually looked upon as a nuisance, but a fact of life. I now claim that black holes only behave as required in a consistent theory if all conformal anomalies cancel out. This is a very restrictive condition, and, very surprisingly, it also affects the Standard Model itself. All particles are only allowed to interact with gravity and with each other in very special ways. Conformal symmetry must be an exact local symmetry, which is spontaneously broken by the vacuum, exactly as in the Higgs mechanism.

This leads to the prediction that models exist in which all unknown parameters of the Standard Model, such as the fine-structure constant, the proton-electron mass ratio, and in fact all other such parameters, are computable. Up to now these have been freely adjustable parameters of the theory, to be determined by experiment, but not predicted by any theory.

I am not able to compute these numbers today, because the high-energy end of the elementary particle properties is not known. There is one firm prediction: constants of Nature are truly constant. All attempts to detect possible space and time dependence of the Standard Model parameters will give negative results. This is why I am highly interested in precision measurements of possible space-time dependence of constants of Nature, such as those done using a so-called "frequency comb". These are high-precision comparisons between different spectral frequencies in atoms and molecules. They tell us something very special about the world we live in.

The Hierarchy Problem.
An important problem can now be addressed: the hierarchy problem, the question why particle masses are 20 orders of magnitude smaller than the Planck mass, and the cosmological constant smaller by even more than 120 orders of magnitude. Could my theory explain this? I have been studying some intriguing ideas. Could the coefficients that relate to the cosmological constant and the mass terms be due to instantons? These are known for generating exponentially suppressed amplitudes. My present theory allows me to investigate such approaches. I do have a candidate gravitational instanton that could be the culprit here, but the details do not yet work out right.
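To indicate the kind of suppression meant here: in an ordinary gauge theory with coupling g, an instanton contributes with the textbook weight (the gravitational analogue I am after would work similarly, but its action is not yet pinned down):

```latex
A_{\mathrm{inst}} \;\sim\; e^{-S_{\mathrm{inst}}} \;=\; e^{-8\pi^{2}/g^{2}},
```

so that an action of only S ≈ 276 already yields a suppression by 120 orders of magnitude, since e⁻²⁷⁶ ≈ 10⁻¹²⁰. Exponentials of this kind are the natural way to bridge huge hierarchies with numbers of order one.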

Fundamental aspects of quantum physics, and their implications for (super) string theory.
My views on the physical interpretation of quantum theory, and its implications for Big Bang theories of the Universe, are rapidly evolving.
My earlier papers (see for instance Hilbert space in deterministic theories) may seem to be very formal, and moreover, their contents are disputed, even ridiculed, by some of my colleagues. Is there "free will" or "predestination"? What I am really saying is that, at the most basic level of Nature's equations, there is no quantum mechanics, but only classical logic.

But even so, who cares?

Well, if true, this would be a very important piece of information to use in model building. My claim that it has to be true is based on considerations concerning black holes and the limited amount of quantum information they can contain. Now investigators put forward that what I am claiming cannot be true because of the so-called "Bell inequalities": quantum mechanics strongly violates these inequalities, while classical systems cannot do that. At the atomic scale, numerous experiments confirm the quantum mechanical predictions. Can I wiggle myself out of this one?

There are long answers and there is a short one.

Constructing models that are classical at their most basic level, while producing something interesting such as the Standard Model at the TeV scale, is not easy. Therefore, the long answers are complicated, and usually my opponents find weak spots in them, but the short answer is more basic. Here is the core of it:

The states usually considered are called "physical states". Any normalised "quantum" superposition of physical states is again a physical state. Atomic physics and elementary particle physics are all based on considerations of physical states.

Ontological states form an orthonormal subset of the physical states. They form a basis of Hilbert space. Our assumption is that such an ontological basis exists. The Schrödinger equation evolves ontological states into ontological states. Models can be constructed where this happens.

Physical states are quantum superpositions of ontological states. Superpositions of different ontological states are never ontological, but describe probability distributions.

Classical states should be distinguishable by inspecting statistical data of the ontological variables. Therefore, they are ontological states as well.

The universe is in an ontological state. It stays in one, regardless of what Alice and Bob try to do. Therefore, the outcome of any real experiment is a single classical state, never a superposition. We get a probability distribution as the outcome of our calculations if we take a physical state as our initial state, not knowing exactly what the ontological state was. All this, and more, is explained more extensively in my recent paper.
What I sent to the arXiv here is now the second version; various small or even more elaborate corrections are still to be expected.
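As an illustration of the scheme sketched above (my own toy example, not a model taken from the paper): let the ontological states be N basis vectors, and let the evolution law be a permutation of them. The induced evolution operator is then automatically unitary, ontological states evolve into ontological states, and a generic superposition merely carries a probability distribution over them:

```python
import math

# Toy "ontological" model: N basis states evolved by a deterministic
# permutation law. The permutation induces a unitary matrix U on the
# N-dimensional Hilbert space spanned by the ontological basis states.

N = 4
perm = [1, 2, 3, 0]          # state k evolves into state perm[k] (a 4-cycle)

# U[i][j] = 1 if basis state j evolves into basis state i, else 0.
U = [[1.0 if perm[j] == i else 0.0 for j in range(N)] for i in range(N)]

def apply(U, psi):
    """Matrix-vector product: one time step of the evolution."""
    return [sum(U[i][j] * psi[j] for j in range(N)) for i in range(N)]

# An ontological state is a basis vector; it evolves into another basis
# vector, never into a superposition.
e0 = [1.0, 0.0, 0.0, 0.0]
print(apply(U, e0))           # prints [0.0, 1.0, 0.0, 0.0]

# A "physical" superposition evolves unitarily: the total Born-rule
# probability is preserved, but the state is never ontological.
psi = [0.5, 0.5, 0.5, 0.5]    # equal-weight superposition, norm 1
phi = apply(U, psi)
norm = math.sqrt(sum(x * x for x in phi))
print(norm)                   # prints 1.0: permutations are unitary
```

Any deterministic, reversible evolution law on a discrete set of states yields a permutation, and hence a unitary operator, in this way; the nontrivial part, constructing such a law that reproduces a realistic Hamiltonian at low energies, is what the paper is about.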

I found that comments of colleagues, even the most insulting ones, were actually useful to enable me to formulate the details of this theory more precisely. It is a long paper, but you can lift any section out of it and examine that separately. The grand total now forms a very solidly based theory. Yes, assumptions were made, but everything put together makes these assumptions quite plausible. Here is my discussion page.

Earlier research led to the consideration of string theories where target space (that is, real space-time) is a lattice. Adding fermions appears to be easy, so my analysis includes superstrings. It led to a startling result: quantised superstrings can be mapped mathematically onto classical strings living on a lattice. The equations of motion of these strings are straightforward; they are manifestly finite, discrete, and classical.
Caveat: So far, this was done while ignoring string-string interactions. These are complicated, because interaction causes gravity, and gravity causes space-time curvature, which we cannot handle at the moment.
Large amounts of work remain to be done here. I think this subject is fascinating.

More in line with my earlier approach to quantum gravity in connection with black holes: what I should have discovered 30 years ago is that the algebraic relations for the black hole microstates can be solved much more explicitly by expanding everything in spherical harmonics. The result reveals something totally unexpected: the black hole evolution operator is only unitary if, on the black hole horizon, antipodal points are identified, so that the area of the horizon is only half of what is usually written down! This affects the space-time topology of the black hole metric. This had actually been foreseen long ago by Whiting and Sanchez, so I have no priority claim here, but I don't think they saw the implications: black holes are not in a thermally mixed state, as was always claimed by Hawking, but they form pure, entangled states! Hawking particles emitted at one point of the black hole are 100% entangled with the particles emerging at the other side. This resolves important issues, such as the "firewall problem". See arXiv:1605.05119.
But some problems remain: these microstates are not described as Standard Model states, while they should be. We have to figure out how this goes.

To me, Nature is a big jig-saw puzzle, and I see it as my task to try to fit pieces of it together. Read more about it in my book 'Bouwstenen van de Schepping' (Prometheus/Bert Bakker, ISBN 90 351 1327 6) or its English version, 'In Search of the Ultimate Building Blocks' (Cambridge Univ. Press, paperback ISBN 0 521 578833; hardback ISBN 0 521 550831). In 'Planetenbiljart' I describe a personal view of the potential of scientific and technological developments in the future. Which possibilities are there, and are there things that will be impossible forever? Maybe you enjoy SF novels as much as I do, but don't mistake those for predictions of the future. This book appeared in English as Playing with Planets (World Scientific, ISBN 978-981-279-307-2 (hard cover), ISBN 978-981-279-020-0 (pbk), 2008).

July 4, 2012: Important announcement at CERN: a bosonic particle with all the properties expected for the Higgs particle has been detected. This most likely will fill up the last blank spot in the Standard Model. As of December 2012: this still seems to be our Higgs particle! Didn't I tell you it should be there? 

Gravitating misconceptions: a response to claims by a group of self-proclaimed scientists concerning the validity of the theory of General Relativity.


A frequently asked question:
Can Theoretical Physics explain paranormal phenomena? You do not want to hear about it. So do NOT click on
A sober explanation (in English), or Poster op het Skepsis-congres (in Dutch), May 8, 1998.