QR4.5.9 Testing The Theory

Science tests a new theory when it contradicts an old one. In this case, reverse engineering predicts that at the highest frequency, light will collide to become matter, but the standard model predicts that light never collides because photons are bosons that share quantum states. Table 4.1 separates matter particles that collide from bosons, like light, that don't, so:

"Two photons cannot ever collide. In fact light is quantized only when interacting with matter." (Wikipedia, 2019).

However, if matter and light are both network processes, then a collision is an overload, so potentially, light can collide in empty space to form matter. Evidence for this includes that:

1. Confined photons have mass. A free photon is massless, but when confined in a hypothetical 100% reflecting mirror box, it has mass, because as the box accelerates, unequal photon pressure on its reflecting walls creates inertia (van der Mark & 't Hooft, 2011). By the same logic, photons confined at a point, in a standing wave, will have mass.

2. Einstein's equation. That matter equates to energy works both ways, so if nuclear bombs can turn mass into energy, photon energy can become mass. The Breit-Wheeler process describes how pure light can potentially transform into matter (a threshold estimate follows this list).

3. Particle accelerator collisions routinely create new matter. Protons that collide and stay intact produce new matter that didn't exist before. If this matter comes from the collision energy, high-energy photons can do the same.

4. Pair production. High-frequency light near a nucleus produces electron-positron pairs, which can annihilate back into light.

5. Light collides. When photons at the Stanford Linear Accelerator hit an electron beam travelling at almost the speed of light, some were knocked back by electrons with enough energy to collide with the photons behind them, giving matter pairs that a magnetic field pulled apart to detect (Burke et al., 1997).
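
As a rough check on point 2 above (an illustrative calculation, not part of the source argument), the colliding photons must carry at least the rest energy of the pair they create. For two photons meeting head-on with energies E₁ and E₂, pair creation therefore requires

\[ E_1 E_2 \ge (m_e c^2)^2, \qquad m_e c^2 \approx 0.511\ \text{MeV}, \]

so two equal photons each need roughly 0.511 MeV, gamma-ray energies far above visible light.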

Hence, extreme light colliding in a vacuum to give matter is a plausible prediction that can be tested by experiment. If pure light colliding in empty space creates matter, the boson-fermion divide of the standard model falls, as bosons can create fermions. The future of physics then lies in colliding light not matter, using light colliders not particle colliders.

The standard model expected particle collisions to unlock the secrets of the universe, but they didn't. Instead of elementary bits, accelerators found brief flashes of transient particles, but in nature, what can't survive isn't the future. If matter evolved from light, these ephemeral flashes are evolutionary dead-ends that failed because they weren't stable.

That matter is primal is just a theory, and scientists who don’t question their theories are priests. Light is simpler than matter, so it is more likely to be primal. The theory that light evolved into matter, based on reverse engineering, is testable, so if it is wrong, let the facts decide.


QR4.5.8 A Processing Model

Figure 4.19. A processing model

The processing model of Figure 4.19 needs neither particles nor virtual agents to activate them. It is based on one process, that of light, which spreads on the quantum network as a wave. This lets our universe boot up from one photon in one unit of space, both of which increased rapidly, until the expansion of space diluted light enough to stop its creation. The result was a plasma of pure light, with no matter at all, but this highest-frequency light did what ordinary light can't, namely collide. In processing terms, it overloaded the quantum network to create matter as a standing wave.

Extreme light colliding on one axis produced leptons, either an electron or a neutrino, depending on the phase. The same collision on three axes also produced up or down quarks, again depending on phase. In both cases, the overload was mass and its remainder was charge, including the one-third quark charges.

Quarks then combined into protons and neutrons by sharing photons, to give the nucleus around which electrons orbit in atoms. The first atom, hydrogen, was just one proton and one electron, but adding neutrons to the nucleus let the other atoms of the periodic table evolve by nucleosynthesis.

This model differs from the particle model as follows:

1. The links in Figure 4.19 signify an evolution, as light evolved into matter, and matter evolved into higher atoms, while the links in Figure 4.18 are categories not causes.

2. The processing model needs no virtual agents because processing always explores every option.

3. The particle model doesn’t predict anti-matter, but processing can run in reverse to cause it.

4. In Figure 4.18, the neutrino seems pointless, but Figure 4.19 makes it an electron byproduct.

5. The particle model doesn’t explain charge, but in processing terms it is a mass byproduct.

6. The particle model doesn’t explain the one-third charge of quarks, but the processing model does.

The processing model of Figure 4.19 is simpler because it has no virtual agents, and one fundamental process, that of light, produces all matter. Instead of many fundamental particles, many of which do nothing, one process does everything. It is also causal, so it answers questions that a particle model can’t, including:

1. Why does matter have mass and charge? (4.3.2)

2. Why do neutrinos have a tiny but variable mass? (4.3.3)

3. Why does anti-matter exist? (4.3.4)

4. Why don’t we see anti-matter around us today? (4.3.5)

5. Why are quark charges in strange thirds? (4.4.3)

6. Why does the force binding quarks increase with distance? (4.4.4)

7. Why don’t protons decay in empty space? (4.4.6)

8. Why does the energy of mass depend on the speed of light? (4.4.8)

9. How did atomic nuclei evolve? (4.6.1)

10. How did electron shells evolve? (4.6.2)

11. Why does mass vary enormously but charge doesn’t? (4.7.3)

12. Why is the universe charge neutral? (4.7.4)

13. What is dark matter? (4.7.6)

14. What is dark energy? (4.7.7)

Some of the above are covered shortly. If a quantum network defines the pixels of space, nothing is needed to keep point matter entities apart. If the quantum network transfer rate is one node per cycle, the speed of light will be constant. If electrons and neutrinos are phases of the same interaction, they will be brother leptons. If up and down quarks are phases of a three-axis interaction, they will have charges in thirds. If a quantum process creates matter, there must be anti-matter. Quantum processing can explain what inert particles pushed around by forces can't.

It’s time to abandon Newton’s idea that God put the world together, like a clock, from existing bits. If the standard model describes God’s Lego-set, why do many of its supposed fundamentals play no part at all in the world we see? To think that all the bits of our universe were lying around before it began, so it was made as we make things, is to underestimate what happened.

The alternative is that before it began, our universe didn’t exist at all, only a reality that had to create from itself alone. There were no divine shortcuts, as even matter had to be made! This couldn’t occur in one step, so light, being simpler, was made first, and matter followed in due course. Essentially, complex outcomes evolved from a simple process.

The Mandelbrot set illustrates this, as one line of code repeated gives rise to complex forms (Figure 4.20). The complexity of the Mandelbrot set isn’t based on complex bits, but on a simple process that endlessly evolves.

Figure 4.20. Mandelbrot’s set, a. Main, b. Detail
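
To make the "one line of code" claim concrete, here is a minimal sketch (Python, illustrative only, not from the source) in which every point of the plane simply records how many times the single step z → z² + c can be repeated before the value escapes; nothing else is needed to draw a crude version of Figure 4.20.

```python
# Minimal Mandelbrot sketch: one repeated step, z = z*z + c, drawn as ASCII art.
def escape_count(c, max_iter=40):
    """Return how many repeats of z = z*z + c stay bounded (|z| <= 2)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# Sample the complex plane on a coarse grid and print one character per point.
for row in range(24):
    y = 1.2 - row * 0.1          # imaginary part, top to bottom
    line = ""
    for col in range(64):
        x = -2.1 + col * 0.05    # real part, left to right
        line += " .:-=+*#%@"[min(escape_count(complex(x, y)) // 4, 9)]
    print(line)
```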

If the null process of space became light that became matter that became us, then the complexity of our universe came from the simplicity of nothing, or as Douglas Adams put it:

"The world is a thing of utter inordinate complexity and richness and strangeness that is absolutely awesome. I mean the idea that such complexity can arise not only out of such simplicity, but probably absolutely out of nothing, is the most fabulous extraordinary idea. And once you get some kind of inkling of how that might have happened, it's just wonderful." Douglas Adams, quoted by Dawkins in his eulogy for Adams (17 September 2001).

This extraordinary idea, of complexity from simplicity, is exactly what quantum theory describes.


QR4.5.7 Many Particles

Aristotle's idea that matter is a substance suggests that it will break down into basic parts, so smashing it into bits seemed a good way to find them. Physics then spent much of the last century battering matter apart in huge accelerators, to find fundamental particles, defined as those that can't be broken down further.

Yet when pressed on what a particle actually is, the experts retreat to equations that don't describe particles at all. This bait-and-switch, talking about particles but giving wave equations, is now normal. If one points out that the equations describe quantum waves not particles, the answer is that the waves are fictional, so it doesn't matter! Feynman explains how this double-speak began:

"In fact, both objects (electrons and photons) behave somewhat like waves and somewhat like particles. In order to save ourselves from inventing new words such as wavicles, we have chosen to call these objects particles." (Richard Feynman, 1985), p85.

Now imagine if an engineer said "This vehicle has two wheels like a bicycle and an engine like a car, so to avoid inventing a new word like motorcycle, we have chosen to call it a car". Who would accept that? Physicists with particle accelerators see everything as a particle, just as a boy with a hammer sees everything as a nail, but the evidence suggests otherwise, because the supposed particles that battering matter apart revealed are:

1. Ephemeral. The tau of the standard model is actually an energy spike that lasts less than a million-millionth of a second. A lightning bolt is long-lived compared to that, and it isn't a particle, so why is a tau? Shouldn't particles exist longer than that?

2. Transformable. When a neutron decays into a proton and an electron, three fundamental particles become four, so how are they fundamental? Fundamental refers to what forms the base for other things, not what is itself transformed.

3. Massive. The fundamental top quark has the same mass as a gold nucleus of 79 protons and 118 neutrons, but why does the cosmic Lego-set have such a huge building block? It is no surprise that this supposedly fundamental entity plays no part at all in the function of the universe we see.

4. Unstable. If a top quark is fundamental, why does it decay into other particles? Calling a particle that instantly decays fundamental is a strange use of the term. 

Entities that decay and transform into each other aren’t fundamental because what is fundamental shouldn’t decay or transform, and energy events that last less than a millionth of a second aren’t particles because what is substantial should last longer than that. It follows that the fundamental particles of the standard model are neither fundamental nor particles.

To clarify, imagine building a house from bricks that are its fundamental parts. But how can one build a house from bricks that disappear when you pick them up, or from bricks that are bigger than the house itself, or from bricks that fall apart, or from bricks that transform when put together? Of all the matter building blocks of the standard model, only the electron is stable alone, and it adds hardly anything to the mass of an atom.

Figure 4.18. The standard particle model

Figure 4.18 shows the fundamental particles of the standard model, which divide into fermion particles and virtual bosons that make things happen. This, we are told, is the end of the story because particle accelerators can’t break things down any further. How then do particles that exist at a point take up space? Apparently, invisible fields generate virtual particles to keep them apart, and this circular argument can’t be tested because virtual particles are unobservable.

The particle model survives because we are conditioned not to look behind the curtain of physical reality. The Wizard of Oz told Dorothy: "Pay no attention to that man behind the curtain" to distract her from what really orchestrated events, and likewise today's wizards tell us to pay no attention to the quantum waves that quantum theory says orchestrate physical events. Therefore, let us look behind the curtain of physical reality to see what really causes them.


QR4.5.6 The Last Standard Model

Ptolemy’s Model

In the second century, Ptolemy's Almagest let experts predict the movements of the stars for the first time, based on the belief that heavenly bodies, being heavenly, circled the earth in perfect circles, or in circles within circles (epicycles). It wasn't true, but it worked, and its followers made it work for a thousand years. When new stars were found, they expanded the model to explain them, which made it more complex. This medieval standard model explained all the planets and stars until a new one was found. It only fell when Copernicus, Kepler, Galileo, and Newton developed a valid model to replace it.

Scientists now see the medieval standard model as primitive, but it satisfied the experts of the day, as the modern standard model does today, so it is interesting to compare these models, as both are:

1. Descriptive. Both describe what is, but don’t predict what is new. They describe observed patterns, as equations do, but science isn’t based only on description.

2. Based on free parameters. The medieval standard model let experts choose the free parameters of epicycle, eccentric, and equant to fit the facts, and the modern standard model lets them choose the free parameters of field, boson, and charge.

3. After the fact. The medieval standard model defined its epicycles after a new star was found, and the modern standard model bolts on a new field after a new force is found.

4. Barren. The medieval standard model couldn't produce anything new, like Kepler's laws, and the modern standard model is the same, so it can't deduce that matter evolved from light.

5. Ridiculously complex. Medieval astronomers tweaked their model until it became absurdly complex, just as today, the equations of string theory fill pages, even books.

6. Normative. The medieval standard model was the norm of its day, so any criticism of it was seen as an attack on tradition, just as now, any critique of today’s standard model is seen as an attack on physics itself (Smolin, 2006).

7. Invalid. Just as we now know that the planets and stars don’t move in circles around the earth, we may in the future accept that virtual particles are unnecessary agents that probably don’t exist.

When the medieval church pressured Galileo to recant, they didn't ask him to deny that the earth went around the sun. They just asked him to call it a mathematical fiction, not a description of reality. Today, physicists call quantum waves mathematical fictions without a church to make them do so, but that doesn't make it true. What if quantum waves really exist, just as the earth really does go around the sun?

The scientific method has three steps: first it describes patterns, then it finds correlations, and finally it attributes causes (Rosenthal & Rosnow, 1991). The standard model is then a descriptive model that didn't become a causal theory because physicists called quantum theory a fantasy. Ironically, Everett then fantasized about many worlds (Everett, 1957), and Witten built a mathematical castle in the air called M-theory, neither of which led anywhere. The standard model may then be a dead end in the history of science, just as the last standard model was.


QR4.5.5 A Particle Toolbox

The standard model invents virtual particles to explain results after they are found, like a toolbox that can produce any particle, so when anti-matter was discovered, it just added a new particle column, and when family generations were found, it just added new rows. When the muon was discovered, it was so unexpected that Nobel laureate Isidor Rabi quipped "Who ordered that?", but the standard model just filed it as another lepton and carried on. When new facts arrive, the standard model accommodates them in its existing structure, or adds a new room.

Scientific theories should be falsifiable, but how can one falsify a model that absorbs rather than adds knowledge? It proposed gravitons that a long search hasn't found, so was that a fail? It predicted proton decay, but decades of study have pushed the proton's lifetime far beyond the age of the universe, so was that a fail? It expected matter and anti-matter to exist in equal amounts, so is our universe of matter a fail? It expected massless neutrinos, until experiments found they had mass, and penta-quarks, until a two-decade search failed to confirm them, and the list goes on. It expected weakly interacting massive particles (WIMPs) to explain dark matter, but again a long search found nothing. The standard model is like a hydra: when the facts cut off one head, it just grows another. What will it take to falsify a model whose failures are called unsolved problems in physics?

The standard model’s success is its ability to calculate results to many decimal places, but in science, accuracy isn’t validity. An equation that accurately interpolates between known data points isn’t a theory that extrapolates to new points. Equations are judged by accuracy but theories are judged by their predictions, yet today’s physicists, fed on equations not science (Kuhn, 1970), think they are the same. As Georgi said:

"Students should learn the difference between physics and mathematics from the start" (Woit, 2007), p85.

The difference is that theories are based on validity, while equations are based on accuracy. A theory is valid if it is true, and no amount of accuracy can replace that, so if a model can’t predict, it doesn’t matter how accurate it is.
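
The interpolation/extrapolation distinction can be illustrated with a small sketch (Python, illustrative only, not from the source): a high-degree polynomial can pass almost exactly through every known data point of a simple process, yet give wildly wrong values outside the range it was fitted to, which is why accuracy on known data says little about validity.

```python
# Accuracy is not validity: a degree-9 polynomial fits 10 samples of a sine
# wave almost exactly, yet fails badly when asked to extrapolate.
import numpy as np

xs = np.linspace(0.0, 6.0, 10)        # known data points
ys = np.sin(xs)                        # the "facts" being fitted
coeffs = np.polyfit(xs, ys, deg=9)     # interpolating polynomial

inside, outside = 3.1, 10.0            # one point inside the range, one beyond it
print("at x=3.1:", np.polyval(coeffs, inside), "vs true", np.sin(inside))
print("at x=10.0:", np.polyval(coeffs, outside), "vs true", np.sin(outside))
# Inside the data range the fit is accurate; outside it, the polynomial
# diverges, because fitting the past is not the same as predicting the new.
```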

The standard model claims to have predicted top and charm quarks before they were found, but predicting quark generations after finding lepton generations is like predicting the last move in a tic-tac-toe game, inevitable. After all, it didn’t predict family generations in the first place. It also claims to have predicted gluons, weak particles, and the Higgs, but predicting what one invents isn’t prediction. Fitting equations to data then matching their terms to ephemeral flashes in accelerator events is like reading tea-leaves – look hard enough and you’ll find something, as according to Wyszkowski’s Second Law, anything can be made to work if you fiddle with it long enough.

The standard model’s reason why a top quark is 300,000 times heavier than an electron is because it is, so it is no surprise that what baffled physics fifty years ago still baffles it today. Equations don’t have to go beyond the data that made them, but theories do, so where are they? The answer is that only the standard model exists, and it isn’t producing any new knowledge. The last time such a barren model dominated thought so completely was before Newton.


QR4.5.4 A Model That Grows Itself

Occam's razor, not to multiply causes unnecessarily, is the pruning hook of science, but the standard model is the opposite. Particle physics was once just about mass, charge, and spin, but now it has isospin, hypercharge, color, chirality, flavor, and other esoteric features. The standard model today needs sixty-two particles (Note 1), five fields, sixteen charges, and fourteen bosons to work (Table 4.6). If it were a machine, one would have to hand-set over two dozen knobs just right for it to light up, so it isn't preferred today because it is simple.

For this complexity, one might expect completeness, but the standard model can’t explain gravity, proton stability, anti-matter, quark charges, neutrino mass, neutrino spin, family generations, or the dark energy and matter that constitute 95% of the universe.

Its main feature is that with each new finding, it grows, so to explain inflation it needs a hypothetical symmetron field, and to explain neutrino mass it needs another 7-8 arbitrary constants:

To accommodate nonzero neutrino masses we must add new particles, with exotic properties, for which there’s no other motivation or evidence.” (Wilczek, 2008), p168.

Good theories grow knowledge when given data, just as good gardens grow plants when given water. In contrast, new data just makes the standard model bigger, like a sponge that absorbs water but is itself barren. Multiplying causes unnecessarily has produced a model that goes against science, but the scientific landscape around it is stagnant for the same reason, which is that inventing virtual particles to explain equations after the fact is science in reverse. 


Note 1. Two leptons with three generations plus anti-matter variants is 12. Two quarks with three generations plus anti-matter variants and three colors is 36. Plus one photon, eight gluons, three weak bosons, one graviton and the Higgs is another 14. The total is 62.
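
The note's tally can be restated as a single sum:

\[ \underbrace{2\times3\times2}_{\text{leptons}} + \underbrace{2\times3\times2\times3}_{\text{quarks}} + \underbrace{1+8+3+1+1}_{\text{bosons}} = 12 + 36 + 14 = 62. \]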

QR4.5.3 No Unnecessary Agents

The principle of not invoking unnecessary agents is fundamental to science. It is embodied in Occam’s Razor, that if two theories have equal explanatory power, the one that makes fewer assumptions is preferred.

For example, suppose one can see a computer screen but not the hardware and software that run it. The screen changes in bit units, so it could be that unseen bit particles cause that, but the alternative is that the screen changes in bits because that is the basic computer process. If now new effects like color and movement require more particles to be assumed, but a bit process could still cause them, the latter theory is preferred by Occam’s Razor, and indeed it is so.

Likewise, electro-magnetic effects can be explained by assuming virtual photons or by taking the photon to be the basic quantum network process. Either could be true, but more virtual particles are needed to explain effects like nuclear bonding and neutron decay, while a processing theory needs no further assumptions, so it is preferred. Changes in electro-magnetism then occur in photon units for the same reason that computer screens change in bit units. We see a correlation between photons and electro-magnetism, but confusing correlation with causation is a common error of science (Note 1).

Quantum processing, as envisaged here, always runs, so it doesn’t need agents to push it. It also spreads naturally on the network, so an electron that falls to a lower energy orbit doesn’t need a virtual orbit particle to make it so. The forces that the standard model attributes to virtual particles are then explained by processing as follows:

1. Electro-magnetism. The standard model needs virtual photons to explain charge and magnetism, but if a photon is the basic quantum process, no virtual agents are needed to explain electrical and magnetic forces (Chapter 5).

2. The strong effect. The standard model needed a new field that created eight gluons with three charges to explain nuclear bonding, but if quarks bond by sharing photons to achieve stability, again no virtual agents are needed (4.4.4).

3. The weak effect. The standard model needed another field, three new particles, and two new charges to explain neutron decay, and still couldn't explain why protons don't do the same, but if neutron decay is a neutrino effect, protons will only decay in stars, and again no virtual agents are needed (4.4.6).

4. The Higgs. If weak particles don’t exist, the Higgs boson isn’t needed at all. It’s just a flash-in-the-pan accelerator resonance that didn’t survive to affect the evolution of matter, so it’s the ultimate unnecessary virtual agent (4.4.7).

5. Gravity. Every attempt to find gravitons has failed, as gravitational waves aren't particles, but the standard iconography still shows them as real (Figure 4.17). In relativity, gravity alters space and time, but particles that exist in space and time can't do that. Chapter 5 attributes gravity to an electro-magnetic field gradient.

Figure 4.17. The CERN Standard Model

If a processing model explains the forces of physics without virtual particles, they are unnecessary agents. In this theory, all the forces of nature come from one field that causes both electro-magnetism and gravity. This is simpler than many fields with many particles, so it is preferred by Occam's razor.

In contrast, the standard model struggles to explain its own inventions. For example, if the Higgs interacts with some particles to create mass, how do other particles interact? A quark can experience electro-magnetic, strong, weak, and gravity forces at the same time, but how then do virtual photons, gluons, weak particles, and gravitons interact? The standard model doesn’t say. And matter particles imply anti-matter versions, so what happens if a Higgs meets an anti-Higgs? Again, the standard model predicts nothing, so physics is better off without it.

Note 1. The number of ice-creams sold in America correlates with deaths by drowning, so do ice-creams kill? In Europe, the number of stork nests correlates with the number of human babies born, so do storks bring babies? In both cases, X and Y correlate because both are caused by a third factor Z, such as the season, not because they cause each other. Correlation is not causation.


QR4.5.2 Weakening Science

In an old story, a frog put in a pan of hot water jumps out immediately, but if put in tepid water that is slowly heated, it doesn't realize the danger and perishes. It isn't literally true, but it illustrates how a gradual background change can prove fatal if unrecognized. For example, over centuries, the natives of Easter Island cut down the trees their community depended on until it collapsed, but why did they, seemingly irrationally, chop down the last tree? Diamond's theory of creeping normality suggests they didn't see the background change because it was gradual (Diamond, 2005). The same effect could explain the current stagnation of particle physics, except that here the background was science, not the environment.

That Faraday’s electric fields move charges from afar was at first considered fanciful because it was a disembodied force acting at a distance. Newton’s argument that gravity needs a particle agent was:

"That gravity should be innate, inherent and essential to matter, so that one body may act upon another at a distance thro' a vacuum, without the mediation of anything else … is to me so great an absurdity, that I believe no man … can ever fall into it. Gravity must be caused by an agent…" (Oerter, 2006), p17.

Hence, the attraction and repulsion of charges was thought to also need a physical agent.

Maxwell developed his equations of electro-magnetism by imagining physical ball-bearings twisting in vortex tubes, but all later attempts to develop a physical model failed, so it was proposed that field effects were caused by particles the field created, and as those effects occurred in photon units, photons were taken to be the force-carriers of electro-magnetism.

The standard model was born when charge effects were attributed to photons from the electro-magnetic field. They weren't observable like real photons because their effect consumed them, so they were called virtual photons. Particles made the equations work, but no one noticed the background effect on science of assuming a cause that wasn't falsifiable or productive. This was bad science, so the scientific foundation of particle physics became weaker. The strength of science is its ability to explain more, so doing the opposite, assuming what doesn't explain more, made it weaker.

Buoyed by apparent success, the standard model then generalized that all fields work the same way, so it attributed gravity to gravitons that to this day have no physical equivalent. There is no evidence at all that they exist, and they predict nothing new about gravity, so again, particle physics became weaker.

The standard model then proposed that a strong field held together the atomic nucleus by creating virtual gluons with a color charge. It now had a field that created charge, but again gluons added nothing to our knowledge of the nucleus, so the scientific background was weakened further.

Explaining why neutrons decay in empty space was more challenging, as now a field had to produce particles with charge and mass. Some evidence was needed, so billions of accelerator events were examined, and when compatible resonances were found, weak particles were declared to exist. This time, the model predicted that protons decay like neutrons, but they don't, so the science of particle physics became even weaker.

Finally, the standard model had to explain how a field could create mass. Its answer was of course yet another field, with a virtual particle so massive that it needed a billion-dollar accelerator to justify it. All to support Newton's canon that:

“…the forces of Nature are deeply entwined with the elementary particles of Nature.” (Barrow, 2007), p97.

It sounds good, but the elementary particles it refers to are the virtual ones of the standard model. The standard model has pasted field upon field to prove Newton’s belief in particles, so now virtual particles pop out of space to cause every effect. They are said to be everywhere, making everything happen, but what do they add to our knowledge? The answer, honestly, is not much, as they either predict wrongly or add nothing at all.

A new field is the scientific version of a blank check, whose amount can be filled in after it is known, so adding fields to space was a failure of science not a success of physics. It produces what is not even wrong (Woit, 2006), to give what has been called fairy-tale physics (Baggott, 2013). If so, it was created one fairy at a time, by physicists themselves, albeit unknowingly.


QR4.5.1 What is a Field?

According to Feynman:

"A real field is a mathematical function we use for avoiding the idea of action at a distance." (Feynman, Leighton, & Sands, 1977), Vol. II, p15-7.

For example, the electro-magnetic field can be based on Maxwell’s equations, a set of mathematical functions that describe how electro-magnetic waves travel through empty space to create electrical and magnetic effects from afar. Yet they aren’t a theory of how that happens, as they assume a complex dimension that doesn’t physically exist.
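
For reference (the standard textbook form, not from the source), in empty space those functions can be written as

\[ \nabla\cdot\mathbf{E}=0,\quad \nabla\cdot\mathbf{B}=0,\quad \nabla\times\mathbf{E}=-\frac{\partial\mathbf{B}}{\partial t},\quad \nabla\times\mathbf{B}=\mu_0\varepsilon_0\frac{\partial\mathbf{E}}{\partial t}, \]

which combine into wave equations whose speed is \( c = 1/\sqrt{\mu_0\varepsilon_0} \); they state how the fields change from point to point, but, as noted, not why.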

In science, an equation like E=mc² is a mathematical law that relates data facts, usually by an equals sign, while a scientific theory explains those facts. For example, the law of gravity is an equation that relates gravity to the mass of a body like the earth, but it isn't a theory of gravity, so it puzzled even Newton, and he wrote it. Scientific laws describe how observed facts work while scientific theories explain why, so theories aren't laws. For example, the germ theory of disease doesn't need equations to work as a theory. Theories and laws are simply different things.

Maxwell’s equations then describe how light waves travel but don’t explain it in theory terms, and the same is true for quantum electrodynamics (QED), its quantized extension. But when the standard model adds that the equations work because the electro-magnetic field emits photons, this is a theory. The theory that gravity is caused by gravitons, nuclear binding by gluons, and neutron decay by weak particles must then stand as such, apart from the equations it explains.

The difference is that theories should predict new facts but equations just interpolate between the facts they are based on. This ability, to produce new knowledge, is how theories and equations differ, but the theory that virtual photons cause charge effects didn't tell us anything new about charge; it just avoided the idea of action at a distance.

Attempts were then made to develop a mathematical model based on field theory, hoping for a Theory of Everything, or TOE. The result was superstring theory, then Witten’s M-theory, which assumes our space has eight extra dimensions, curled up so we can’t see them. Unfortunately, the result was a theory that could predict anything, and so it predicted nothing, as Woit explained decades ago:

"The possible existence of, say, 10⁵⁰⁰ consistent different vacuum states for superstring theory probably destroys the hope of using the theory to predict anything. If one picks among this large set just those states whose properties agree with present experimental observations, it is likely there still will be such a large number of these that one can get just about whatever value one wants for the results of any new observation." (Woit, 2006), p242.

Even worse, a theory that predicts everything can’t be falsified, which is bad news in science. That a universe of eleven dimensions somehow collapsed into our three-dimensional world is untestable because no experiment can deny it. Good science is both fruitful and falsifiable, but M-theory was neither, so that it led nowhere is no surprise. Thousands of scientific papers were then written on a theory of dubious scientific merit, but how did this happen?

A field that extends across all space adds a degree of freedom to it, so adding a new field equates to a new dimension of space. Based on field theory, gravity adds one dimension, electro-magnetism adds two, the strong force three, and the weak force two. Eight extra dimensions, plus three of space, require M-theory to have eleven dimensions, which interact to make anything possible, so it predicts nothing. The failure of M-theory then lies in the number of fields invented by the standard model. 
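
Written out, the tally the text assigns is

\[ \underbrace{1}_{\text{gravity}} + \underbrace{2}_{\text{electro-magnetism}} + \underbrace{3}_{\text{strong}} + \underbrace{2}_{\text{weak}} = 8, \qquad 8 + 3_{\text{space}} = 11. \]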

Instead of being a theory of everything, M-theory became a theory of nothing, because in science, assuming a fact to explain a fact isn't profitable, just as borrowing $100 to make $100 isn't profitable in business. However, this strategy, of assuming more than is explained, has a long history in particle physics.


QR4.5 Fields Upon Fields

Newton believed that particles cause all the forces of nature because only matter can push matter, so how the earth's gravity kept the moon in orbit puzzled him. The earth doesn't touch the moon yet pulls it from a distance, so, he reasoned, particles must cause this because nothing else can.

Figure 4.17. The CERN Standard Model

The standard model's solution to this puzzle was that Newton was right, because modern fields create particles that exert forces. They can't be seen, as their action destroys them, but the equations imply them. Photon particles from an electro-magnetic field then cause electrical and magnetic forces, gluon particles from a strong field cause nuclear forces, W and Z particles from a weak field cause neutron decay, and a Higgs particle from a Higgs field gives mass to the W and Z particles.

The standard model of particles (Figure 4.17) was accepted because accelerator energy spikes supported its force-carrying particles and the equations worked, so gravity was attributed to graviton particles by analogy, with no evidence. It portrays a universe of fields upon fields, each producing different force particles, but what exactly is a field?


QR4.5.1 What is a Field?

QR4.5.2 Weakening Science

QR4.5.3 No Unnecessary Agents

QR4.5.4 A Model That Grows Itself

QR4.5.5 A Particle Toolbox

QR4.5.6 The Last Standard Model

QR4.5.7 Many Particles

QR4.5.8 A Processing Model

QR4.5.9 Testing The Theory
