QR4.5.9 Testing The Theory

Science tests theories by comparing what they predict with what actually happens. In this case, one theory predicts that at the highest frequency, light collides to create matter, while the other claims that light never ever collides: 

Two photons cannot ever collide. In fact light is quantized only when interacting with matter.” (Wikipedia, 2019).

In the standard model, light doesn’t collide because photons are bosons that share quantum states. In a processing model, photons are the core network process, so at a two-point wavelength they will overload the network, i.e. collide. The evidence that light can collide in space to produce matter includes:

1. Confined photons have mass. A free photon is massless, but in a hypothetical 100% reflecting mirror box it has mass, because as the box accelerates, unequal photon pressure on its walls creates inertia (van der Mark & 't Hooft, 2011). By this logic, photons confined at a point, in a standing quantum wave, will have mass.

2. Einstein’s equation. Einstein’s equation works both ways, so if mass can turn into energy in nuclear bombs, photon energy can become mass, as the Breit-Wheeler process allows (a rough threshold estimate follows this list).

3. Particle accelerator collisions create new matter. Protons that collide and stay intact produce new matter that didn’t exist before based on the collision energy, so high-energy photons could do the same.

4. Pair production. High-frequency light near a nucleus gives electron-positron pairs that can later annihilate back into light.

5. Light collides. At the Stanford Linear Accelerator, laser photons were fired at an electron beam travelling at almost the speed of light, and some electrons knocked a photon back with enough energy to collide with the laser photons behind it, giving matter pairs that a magnetic field pulled apart for detection (Burke et al., 1997).
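As a rough check of point 2, the threshold for turning light into the lightest matter pair follows directly from Einstein’s equation. This is a back-of-envelope estimate using the standard electron rest energy of 0.511 MeV, not a figure from this text:

E = 2mₑc² ≈ 2 × 0.511 MeV ≈ 1.02 MeV

So two photons colliding head-on must carry a combined energy of about 1 MeV, which puts the “extreme light” of this theory in the gamma-ray range.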

That extreme light colliding in a vacuum can give matter is a plausible prediction that can be tested by experiment. And if pure light alone can create matter, the boson-fermion divide of the standard model falls, as bosons can become fermions. The future of physics then lies in colliding light not matter, using light colliders not particle colliders.

The standard model expected particle collisions to unlock the secrets of the universe but they didn’t. Instead of permanent particles, accelerators found only transient flashes, and in nature what doesn’t survive isn’t the future. Hence, these ephemeral flashes are just evolutionary dead-ends that failed to lead anywhere because they weren’t stable.

That everything came from matter is just a theory, and scientists who don’t question their theories are priests. Light is the simplest thing, so that everything came from light is more likely. This theory is testable, so let the facts decide whether it is right or wrong.


QR4.5.8 One Process

Figure 4.19. A processing model

The processing model of Figure 4.19 begins with one process, a simple circle that gives the null result of space. The first event then separated this process to start our universe as a plasma of pure light, with no matter at all. Most of this extreme light then diluted to ordinary light by the expansion of space, but some collided to create matter. A one-axis collision gave electrons or neutrinos, based on phase, while a three-axis collision gave up or down quarks, also based on phase. In both cases, the net process that repeated was mass, and the processing that didn’t run was charge, including the one-third quark charges.

Quarks then combined into protons or neutrons by sharing photons, to give the atomic nuclei around which electrons orbit. The first atom, hydrogen, was just a proton and an electron, but neutrons allowed higher atoms to evolve through nucleosynthesis. Both light and matter then evolved from one quantum process running on a network.

Unlike the standard particle model (Figure 4.18), a processing model (Figure 4.19) explains:

1. The evolution of matter. Matter evolved, first from light then into higher atoms, rather than being fundamental.

2. The forces of nature. All forces come from quantum waves on a network, rather than virtual agents.

3. Anti-matter. Processing implies anti-processing, while particles have no natural inverse.

4. Space. Space is a network null process, rather than nothing at all.

5. Neutrinos. The neutrino is an electron byproduct, rather than a pointless particle.

6. Charge. Charge is a mass byproduct, rather than an arbitrary property.

7. Quarks. The one-third charges of quarks are expected, rather than unexpected.

The processing model of Figure 4.19 has no virtual agents, and only one quantum process underlies everything, including space. It also explains what a particle model can’t, including:

1. Why does matter have mass and charge? (4.3.2)

2. Why do neutrinos have a tiny but variable mass? (4.3.3)

3. Why does anti-matter exist? (4.3.4)

4. Why isn’t anti-matter around today? (4.3.5)

5. Why are quark charges in strange thirds? (4.4.3)

6. Why does the force binding quarks increase with distance? (4.4.4)

7. Why don’t protons decay in empty space? (4.4.6)

8. Why does the energy of matter depend on the speed of light? (4.4.8)

9. How did atomic nuclei evolve? (4.6.1)

10. How did electron shells evolve? (4.6.2)

11. Why do charges add simply but mass doesn’t? (4.7.3)

12. Why is the universe charge neutral? (4.7.4)

13. What is dark matter? (4.7.6)

14. What is dark energy? (4.7.7)

These explanations assume only that the waves of quantum theory are processing waves on a network. If a quantum network defines space, it will then keep point matter entities apart. If the network transfer rate is one point per cycle, the speed of light will be constant. If electrons and neutrinos are phases of the same interaction, they will be brother leptons. If up and down quarks are phases of a three-axis interaction, they will have one-third charges. If a process creates matter, there must be anti-matter. One process can then explain what many particles can’t.
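The transfer-rate claim above can be illustrated with a toy simulation. This is only a sketch of the chapter’s premise, not established physics, and the names used (propagate, ring_size) are invented for the example; it assumes a one-dimensional ring of network nodes that each pass a signal to the next node once per processing cycle.

# Toy sketch: each node hands the signal to its neighbour once per cycle,
# so the signal always advances one node per cycle, wherever it starts.
def propagate(ring_size, start, cycles):
    position = start
    for _ in range(cycles):
        position = (position + 1) % ring_size   # one point per cycle
    return position

for start in (0, 10, 37):
    end = propagate(ring_size=100, start=start, cycles=25)
    print(start, "->", end, ((end - start) % 100) / 25, "nodes per cycle")

Whatever the starting node, the measured speed is 1.0 node per cycle, which is the sense in which a fixed network transfer rate would make the speed of light constant.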

The Newtonian idea that God made our world like a clock from existing bits now struggles. If the standard model is God’s Lego-set, why do higher generation leptons and quarks play no part at all in the physics we see? If all the bits that make our universe were lying around before it began, where did they come from?

The alternative is that before our universe began, it didn’t exist at all. There were no divine shortcuts as everything had to be made! This wasn’t possible in one step so light, being simpler, came first and matter followed. Essentially, complex outcomes evolved from a simple process.

The Mandelbrot set illustrates this, as one line of code repeated produces infinite complexity (Figure 4.20), based not on complex bits but on a simple process that endlessly evolves.

Figure 4.20. Mandelbrot’s set, a. Main, b. Detail
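A minimal sketch of the iteration behind Figure 4.20 is given below. The single repeated step is z = z² + c; the function name in_set, the grid size, the 50-iteration limit, and the text rendering are arbitrary choices made for this example, not anything from the text.

# Minimal Mandelbrot sketch: repeat z = z*z + c and keep the points that never escape.
def in_set(c, max_iter=50):
    z = 0
    for _ in range(max_iter):
        z = z * z + c               # the one repeated step
        if abs(z) > 2:              # once |z| exceeds 2, the point escapes
            return False
    return True

for row in range(21):               # crude text rendering of the main set
    y = 1.2 - 0.12 * row
    print("".join("#" if in_set(complex(-2.2 + 0.05 * col, y)) else " " for col in range(64)))

Every “#” marks a point whose repeated step never escapes; the infinitely detailed boundary of Figure 4.20 comes entirely from repeating that one step.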

If the null process of space became light that became matter that became us, our complex universe came from simplicity, or as Douglas Adams put it, nothing:

The world is a thing of utter inordinate complexity and richness and strangeness that is absolutely awesome. I mean the idea that such complexity can arise not only out of such simplicity, but probably absolutely out of nothing, is the most fabulous extraordinary idea. And once you get some kind of inkling of how that might have happened, it’s just wonderful.” Douglas Adams, quoted by Dawkins in his eulogy for Adams (17 September 2001).

Quantum theory’s description of how physical complexity comes from quantum simplicity supports this extraordinary idea, but how can it be tested?


QR4.5.7 Many Particles

If matter exists, it should break down into basic bits that smashing it apart will reveal, so for almost a century particle physics has collided matter in accelerators to find what doesn’t break down further, which it calls elementary particles (see Figure 4.18).

Yet when pressed on what these particles actually are, experts retreat to equations that don’t describe particles at all. This bait-and-switch, talking about particles but giving wave equations, is now normal. The equations describe quantum waves, not particles, but we are told the waves aren’t real, so it doesn’t matter! Feynman explains how this double-speak began:

In fact, both objects (electrons and photons) behave somewhat like waves and somewhat like particles. In order to save ourselves from inventing new words such as wavicles, we have chosen to call these objects particles.” (Richard Feynman, 1985), p85.

But imagine if an engineer said “This vehicle has two wheels like a bicycle and an engine like a car, so to avoid inventing a new word like motorcycle, we have chosen to call it a car”. Who would accept that? Physicists with accelerators see everything as a particle, just as a boy with a hammer sees everything as a nail, but the evidence suggests otherwise because what was found was:

1. Ephemeral. The tau particle of the standard model is actually an energy spike that lasts less than a million-millionth of a second. A lightning bolt is long-lived compared to that, and it isn’t a particle, so why is a tau? Shouldn’t particles live longer than that?

2. Transformable. When a neutron decays into a proton, an electron, and an anti-neutrino, three elementary particles become five, so how are they basic elements if they transform?

3. Massive. The top quark has the same mass as a gold nucleus of 79 protons and 118 neutrons, but why does the cosmic Lego-set need such a big building block? It is no surprise that this so-called elementary particle plays no part in the function of the universe we see.

4. Unstable. If a top quark is elementary, why does it instantly decay? Calling what decays elementary is a strange use of the term. 

Entities that decay and transform into each other aren’t elementary because the basic elements of everything shouldn’t do that, and energy events that last less than a millionth of a second aren’t particles because particles should last longer than that. It follows that the elementary particles of the standard model are neither elementary nor particles.

Calling them building blocks is no better: imagine building a house from bricks that exist only for an instant, decay into other bricks, or transform when combined. Of all these building blocks, only the electron is stable alone, and it adds hardly anything to the mass of an atom.

Figure 4.18. The standard particle model

In Figure 4.18, the particles of the standard model divide into fermions and virtual bosons that cause forces. This, we are told, is the end of the story because accelerators can’t break matter down further, but how do particles that exist at a point take up space? Apparently, virtual particles from fields keep them apart, but this theory can’t be tested because virtual particles are unobservable.

The particle model satisfies neither logic nor science, but survives because we don’t look behind the curtain of physical events. Just as the Wizard of Oz told Dorothy to “Pay no attention to that man behind the curtain”, when that man was the real cause of events, today’s wizards tell us to ignore the waves of quantum theory that actually create all physical events.


QR4.5.6 The Last Standard Model

Ptolemy’s Model

In the second century, Ptolemy’s Almagest let experts predict the movements of the stars for the first time, based on the belief that heavenly bodies, being heavenly, circled the earth in perfect circles, or in circles within circles (epicycles). It wasn’t true, but it worked, and its followers made it work for a thousand years. When new stars were found, they expanded the model to explain them, which made it more complex. This medieval standard model explained every known planet and star, and was simply extended whenever a new one was found. It only fell when Copernicus, Kepler, Galileo, and Newton developed a valid model to replace it.

Scientists now see the medieval standard model as primitive, but it satisfied the experts of the day, as the modern standard model does today, so it is interesting to compare these models, as both are:

1. Descriptive. Both describe what is, but don’t predict what is new. They describe observed patterns, as equations do, but science isn’t based only on description.

2. Based on free parameters. The medieval standard model let experts choose the free parameters of epicycle, eccentric, and equant to fit the facts, and the modern standard model lets them choose the free parameters of field, boson, and charge.

3. After the fact. The medieval standard model defined its epicycles after a new star was found, and the modern standard model bolts on a new field after a new force is found.

4. Barren. The medieval standard model couldn’t produce anything new, like Kepler’s laws, and the modern standard model is the same, so it can’t deduce that matter evolved from light.

5. Ridiculously complex. Medieval astronomers tweaked their model until it became absurdly complex, just as today the equations of string theory fill pages, even books.

6. Normative. The medieval standard model was the norm of its day, so any criticism of it was seen as an attack on tradition, just as now, any critique of today’s standard model is seen as an attack on physics itself (Smolin, 2006).

7. Invalid. Just as we now know that the planets and stars don’t move in circles around the earth, we may in the future accept that virtual particles are unnecessary agents that probably don’t exist.

When the medieval church pressured Galileo to recant, they didn’t ask him to deny that the earth went around the sun. They just asked him to call it a mathematical fiction, not a description of reality. Today, physicists call quantum waves mathematical fictions with no church to make them do so, but that doesn’t make it true. What if quantum waves really exist, just as the earth really does go around the sun?

The scientific method has three steps: first it describes patterns, then it finds correlations, and finally it attributes causes (Rosenthal & Rosnow, 1991). The standard model is then a descriptive model that never became a causal theory, because physicists dismissed quantum waves as a fantasy. Ironically, Everett then fantasized about many worlds (Everett, 1957), and Witten built a mathematical castle in the air called M-theory, neither of which led anywhere. The standard model may then be a dead end in the history of science, just as the last standard model was.


QR4.5.5 A Particle Toolbox

The standard model invents virtual particles to explain results after they are found, like a toolbox that can produce any particle, so when anti-matter was discovered, it just added a new particle column, and when family generations were found, it just added new rows. When the muon was discovered, at first named the mu meson, it was so unexpected that Nobel laureate Isidor Rabi quipped “Who ordered that?”, but the standard model just absorbed it and carried on. When new facts arrive, the standard model accommodates them in its existing structure, or adds a new room.

Scientific theories should be falsifiable, but how can one falsify a model that absorbs rather than adds knowledge? It proposed gravitons that a long search hasn’t found, so was that a fail? It predicted proton decay, but twenty years of study pushed the proton’s lifetime far beyond the age of the universe, so was that a fail? It expected matter and anti-matter to exist in equal amounts, so is our universe of matter a fail? It expected massless neutrinos, until experiments found they had mass, and penta-quarks and strange quarks, until a two-decade search found neither, and the list goes on. It expected weakly interacting massive particles (WIMPs) to explain dark matter, but again a long search found nothing. The standard model is like a hydra, as when the facts cut off one head, it just grows another. What will it take to falsify a model whose failures are called unsolved problems in physics?

The standard model’s success is its ability to calculate results to many decimal places, but in science, accuracy isn’t validity. An equation that accurately interpolates between known data points isn’t a theory that extrapolates to new points. Equations are judged by accuracy but theories are judged by their predictions, yet today’s physicists, fed on equations not science (Kuhn, 1970), think they are the same. As Georgi said:

Students should learn the difference between physics and mathematics from the start” (Woit, 2007), p85.

The difference is that theories are based on validity, while equations are based on accuracy. A theory is valid if it is true, and no amount of accuracy can replace that, so if a model can’t predict, it doesn’t matter how accurate it is.

The standard model claims to have predicted top and charm quarks before they were found, but predicting quark generations after finding lepton generations is like predicting the last move in a tic-tac-toe game, inevitable. After all, it didn’t predict family generations in the first place. It also claims to have predicted gluons, weak particles, and the Higgs, but predicting what one invents isn’t prediction. Fitting equations to data then matching their terms to ephemeral flashes in accelerator events is like reading tea-leaves – look hard enough and you’ll find something, as according to Wyszkowski’s Second Law, anything can be made to work if you fiddle with it long enough.

The standard model’s reason why a top quark is 300,000 times heavier than an electron is because it is, so it is no surprise that what baffled physics fifty years ago still baffles it today. Equations don’t have to go beyond the data that made them, but theories do, so where are they? The answer is that only the standard model exists, and it isn’t producing any new knowledge. The last time such a barren model dominated thought so completely was before Newton.


QR4.5.4 A Model That Grows Itself

Occam’s razor, not to multiply causes unnecessarily, is the pruning hook of science, but the standard model is the opposite. Particle physics was once just about mass, charge, and spin, but now it has isospin, hypercharge, color, chirality, flavor, and other esoteric features. The standard model today needs sixty-two particles (Note 1), five fields, sixteen charges, and fourteen bosons to work (Table 4.6). If it was a machine, one would have to hand-set over two dozen knobs just right for it to light up, so it isn’t preferred today because it is simple.

For this complexity, one might expect completeness, but the standard model can’t explain gravity, proton stability, anti-matter, quark charges, neutrino mass, neutrino spin, family generations, or the dark energy and matter that constitute 95% of the universe.

Its main feature is that with each new finding, it grows, so to explain inflation it needs a hypothetical symmetron field, and to explain neutrino mass it needs another 7-8 arbitrary constants:

To accommodate nonzero neutrino masses we must add new particles, with exotic properties, for which there’s no other motivation or evidence.” (Wilczek, 2008), p168.

Good theories grow knowledge when given data, just as good gardens grow plants when given water. In contrast, new data just makes the standard model bigger, like a sponge that absorbs water but is itself barren. Multiplying causes unnecessarily has produced a model that goes against science, but the scientific landscape around it is stagnant for the same reason, which is that inventing virtual particles to explain equations after the fact is science in reverse. 


Note 1. Two leptons with three generations plus anti-matter variants is 12. Two quarks with three generations plus anti-matter variants and three colors is 36. Plus one photon, eight gluons, three weak bosons, one graviton and the Higgs is another 14. The total is 62.

QR4.5.3 No Unnecessary Agents

The principle of not invoking unnecessary agents is fundamental to science. It is embodied in Occam’s Razor, that if two theories have equal explanatory power, the one that makes fewer assumptions is preferred.

For example, suppose one can see a computer screen but not the hardware and software that run it. The screen changes in bit units, so it could be that unseen bit particles cause that, but the alternative is that the screen changes in bits because that is the basic computer process. If now new effects like color and movement require more particles to be assumed, but a bit process could still cause them, the latter theory is preferred by Occam’s Razor, and indeed it is so.

Likewise, electro-magnetic effects can be explained by assuming virtual photons or by taking the photon to be the basic quantum network process. Either could be true, but more virtual particles are needed to explain effects like nuclear bonding and neutron decay, while a processing theory needs no further assumptions, so it is preferred. Changes in electro-magnetism then occur in photon units for the same reason that computer screens change in bit units. We see a correlation between photons and electro-magnetism, but confusing correlation with causation is a common error of science (Note 1).

Quantum processing, as envisaged here, always runs, so it doesn’t need agents to push it. It also spreads naturally on the network, so an electron that falls to a lower energy orbit doesn’t need a virtual orbit particle to make it so. The forces that the standard model attributes to virtual particles are then explained by processing as follows:

1. Electro-magnetism. The standard model needs virtual photons to explain charge and magnetism, but if a photon is the basic quantum process, no virtual agents are needed to explain electrical and magnetic forces (Chapter 5).

2. The strong effect. The standard model needed a new field that created eight gluons with three charges to explain nuclear bonding, but if quarks bond by sharing photons to achieve stability, again no virtual agents are needed (4.4.4).

3. The weak effect. The standard model needed another field, three new particles, and two new charges to explain neutron decay, and still couldn’t explain why protons don’t do the same, but if neutron decay is a neutrino effect, protons will only decay in stars, and again no virtual agents are needed (4.4.6).

4. The Higgs. If weak particles don’t exist, the Higgs boson isn’t needed at all. It’s just a flash-in-the-pan accelerator resonance that didn’t survive to affect the evolution of matter, so it’s the ultimate unnecessary virtual agent (4.4.7).

5. Gravity. Every attempt to find gravitons has failed, as gravitational waves aren’t particles, but the standard iconography still shows gravitons as real (Figure 4.17). In relativity, gravity alters space and time, but particles that exist in space and time can’t do that. Chapter 5 attributes gravity to an electro-magnetic field gradient.

Figure 4.17. The CERN Standard Model

If a processing model explains the forces of physics without virtual particles, they are unnecessary agents. In this theory, all the forces of nature come from one field that causes both electro-magnetism and gravity. This is simpler than many fields with many particles, so it is preferred by Occam’s razor.

In contrast, the standard model struggles to explain its own inventions. For example, if the Higgs interacts with some particles to create mass, how do other particles interact? A quark can experience electro-magnetic, strong, weak, and gravity forces at the same time, but how then do virtual photons, gluons, weak particles, and gravitons interact? The standard model doesn’t say. And matter particles imply anti-matter versions, so what happens if a Higgs meets an anti-Higgs? Again, the standard model predicts nothing, so physics is better off without it.

Note 1. The number of ice-creams sold in America correlates with deaths by drowning, so do ice-creams kill? In Europe, the number of stork nests correlates with the number of human babies born, so do storks bring babies? In both cases, X and Y correlate because both are caused by a third agent Z, namely the weather, not because they cause each other. Correlation is not causation.
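A short simulation with invented numbers makes the point concrete. Nothing below is real data; X and Y never influence each other in the code, yet they correlate strongly because both depend on Z.

import random

random.seed(1)
z = [random.uniform(0, 35) for _ in range(365)]       # Z: daily temperature
x = [10 + 3 * t + random.gauss(0, 5) for t in z]      # X: ice-creams sold
y = [0.1 * t + random.gauss(0, 1) for t in z]         # Y: drownings

def correlation(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((i - ma) * (j - mb) for i, j in zip(a, b))
    sa = sum((i - ma) ** 2 for i in a) ** 0.5
    sb = sum((j - mb) ** 2 for j in b) ** 0.5
    return cov / (sa * sb)

print(round(correlation(x, y), 2))                    # close to 1, yet X never causes Y

Conditioning on Z, by comparing only days of equal temperature, would make the correlation vanish, which is how a confounder is exposed.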


QR4.5.2 Weakening Science

In an old story, a frog put in a pan of hot water jumps out immediately, but if put in tepid water that is slowly heated, it doesn’t realize the danger and perishes. It isn’t literally true, but it illustrates how a gradual background change can prove fatal if unrecognized. For example, over centuries the natives of Easter Island cut down the trees their community depended on until it collapsed, but why did they, seemingly irrationally, chop down the last tree? Diamond’s theory of creeping normality suggests they didn’t see the background change because it was gradual (Diamond, 2005). The same effect could explain the current stagnation of particle physics, except that here the background was science, not the environment.

That Faraday’s electric fields move charges from afar was at first considered fanciful because it was a disembodied force acting at a distance. Newton’s argument that gravity needs a particle agent was:

That gravity should be innate, inherent and essential to matter, so that one body may act upon another at a distance thro’ a vacuum, without the mediation of anything else … is to me so great an absurdity, that I believe no man … can ever fall into it. Gravity must be caused by an agent…” (Oerter, 2006), p17.

Hence, the attraction and repulsion of charges was thought to also need a physical agent.

Maxwell developed his equations of electro-magnetism by imagining physical ball-bearings twisting in vortex tubes, but all later attempts to develop a physical model failed. It was then proposed that field effects were caused by created particles, and as they occurred in photon units, photons were taken to be the force-carriers of electro-magnetism.

The standard model was born when charge effects were attributed to photons from the electro-magnetic field. They weren’t observable like real photons because their effect consumed them, so they were called virtual photons. Particles made the equations work but no-one noticed the background effect on science, of assuming a cause that wasn’t falsifiable or productive. This was bad science so the scientific foundation of particle physics became weaker. The strength of science is its ability to explain more, so doing the opposite, assuming what doesn’t explain more, made it weaker.

Buoyed by apparent success, the standard model then generalized that all fields work the same way, so it attributed gravity to gravitons that to this day have never been found. There is no evidence at all that they exist, and they predict nothing new about gravity, so again, particle physics became weaker.

The standard model then proposed that a strong field held together the atomic nucleus by creating virtual gluons with a color charge. It now had a field that created charge but again, gluons added nothing to our knowledge of the nucleus, so again, the scientific background was weakened further.

Explaining why neutrons decay in empty space was more challenging, as now a field had to produce particles with charge and mass. Some evidence was needed, so billions of accelerator events were examined and when compatible resonances were found, weak particles were declared to exist. This time, it predicted that protons decay like neutrons, but they don’t, so the science of particle physics became even weaker.

Finally, the standard model had to explain how a field could create mass. Its answer was of course yet another field, with a virtual particle so massive that it needed a billion-dollar accelerator to justify it. All to support Newton’s canon that:

“…the forces of Nature are deeply entwined with the elementary particles of Nature.” (Barrow, 2007), p97.

It sounds good, but the elementary particles it refers to are the virtual ones of the standard model. The standard model has pasted field upon field to prove Newton’s belief in particles, so now virtual particles pop out of space to cause every effect. They are said to be everywhere, making everything happen, but what do they add to our knowledge? The answer, honestly, is not much, as they either predict wrongly or add nothing at all.

A new field is the scientific version of a blank check, whose amount can be filled in after it is known, so adding fields to space was a failure of science, not a success of physics. It produces what is not even wrong (Woit, 2006), to give what has been called fairy-tale physics (Baggott, 2013). If so, it was created one fairy at a time, by physicists themselves, albeit unknowingly.


QR4.5.1 What is a Field?

According to Feynman:

A real field is a mathematical function we use for avoiding the idea of action at a distance.” (Feynman, Leighton, & Sands, 1977), Vol. II, p15-7.

For example, the electro-magnetic field can be based on Maxwell’s equations, a set of mathematical functions that describe how electro-magnetic waves travel through empty space to create electrical and magnetic effects from afar. Yet they aren’t a theory of how that happens, as they assume a complex dimension that doesn’t physically exist.

In science, an equation like E=mc² is a mathematical law that relates data facts, usually by an equals sign, while a scientific theory explains those facts. For example, the law of gravity is an equation that relates gravity to the mass of a body like the earth, but it isn’t a theory of gravity, so it puzzled even Newton, and he wrote it. Scientific laws describe how observed facts work while scientific theories explain why, so theories aren’t laws. For example, a germ theory of disease doesn’t need equations to work as a theory. In general, theories are theories and laws are laws, so they are always different. 
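To make the distinction concrete, Newton’s law of gravitation in its standard textbook form is just such a relation between measured quantities (a standard result, not a formula from this text):

F = Gm₁m₂/r²

It lets anyone compute the attraction between two masses to great accuracy, yet it says nothing about why masses attract, which is what a theory of gravity would have to explain.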

Maxwell’s equations then describe how light waves travel but don’t explain it in theory terms, and the same is true for quantum electrodynamics (QED), its quantized extension. But when the standard model adds that the equations work because the electro-magnetic field emits photons, this is a theory. The theory that gravity is caused by gravitons, nuclear binding by gluons, and neutron decay by weak particles must then stand as such, apart from the equations it explains.

The difference is that theories should predict new facts but equations just interpolate between the facts they are based on. This ability, to produce new knowledge, is how theories and equations differ, but the theory that virtual photons cause charge effects didn’t tell us anything new about charge, it just avoided the idea of action at a distance. 

Attempts were then made to develop a mathematical model based on field theory, hoping for a Theory of Everything, or TOE. The result was superstring theory, then Witten’s M-theory, which assumes our space has eight extra dimensions, curled up so we can’t see them. Unfortunately, the result was a theory that could predict anything, and so it predicted nothing, as Woit explained decades ago:

The possible existence of, say, 10⁵⁰⁰ consistent different vacuum states for superstring theory probably destroys the hope of using the theory to predict anything. If one picks among this large set just those states whose properties agree with present experimental observations, it is likely there still will be such a large number of these that one can get just about whatever value one wants for the results of any new observation. (Woit, 2006), p242.

Even worse, a theory that predicts everything can’t be falsified, which is bad news in science. That a universe of eleven dimensions somehow collapsed into our three-dimensional world is untestable because no experiment can deny it. Good science is both fruitful and falsifiable, but M-theory was neither, so that it led nowhere is no surprise. Thousands of scientific papers were then written on a theory of dubious scientific merit, but how did this happen?

A field that extends across all space adds a degree of freedom to it, so adding a new field equates to a new dimension of space. Based on field theory, gravity adds one dimension, electro-magnetism adds two, the strong force three, and the weak force two. Eight extra dimensions, plus three of space, require M-theory to have eleven dimensions, which interact to make anything possible, so it predicts nothing. The failure of M-theory then lies in the number of fields invented by the standard model. 

Instead of being a theory of everything, M-theory became a theory of nothing, because in science, assuming a fact to explain a fact isn’t profitable, just as borrowing $100 to make $100 isn’t profitable in business. However, this strategy of assuming more than is explained has a long history in particle physics.


QR4.5 Fields Upon Fields

Newton believed that particles cause all the forces of nature because only matter can push matter, so how the earth’s gravity kept the moon in orbit puzzled him. The earth doesn’t touch the moon but it pulls it from a distance, so particles must cause it because nothing else can.

Figure 4.17. The CERN Standard Model

The standard model solution to this puzzle was that Newton was right because modern fields create particles that exert forces. They can’t be seen, as their action destroys them, but the equations imply them. Photon particles from an electro-magnetic field then cause electrical and magnetic forces, gluon particles from a strong field cause nuclear forces, W and Z particles from a weak field cause neutron decay, and a Higgs particle from a field gives mass to W/Z particles.

The standard model of particles (Figure 4.17) was accepted because accelerator energy spikes were taken to confirm its force-carrying particles and the equations worked, so gravity was attributed to graviton particles by analogy, with no evidence. It portrays a universe of fields upon fields, each producing different force particles, but what exactly is a field?


QR4.5.1 What is a Field?

QR4.5.2 Weakening Science

QR4.5.3 No Unnecessary Agents

QR4.5.4 A Model That Grows Itself

QR4.5.5 A Particle Toolbox

QR4.5.6 The Last Standard Model

QR4.5.7 Many Particles

QR4.5.8 One Process

QR4.5.9 Testing The Theory
