QR4.5.9 Testing The Theory

In science, a new theory is tested when it predicts something that contradicts the old theory. Quantum realism predicts that light, and light alone, collided to create matter. In contrast, the standard model holds that light is made of photon particles that don’t collide, because they are bosons that can occupy the same quantum state. Table 4.1 is based on a distinction between matter particles (fermions) and force particles (bosons), where fermions collide and bosons don’t. If matter collides by a basic substantiality that light does not have, then:

Two photons cannot ever collide. In fact light is quantized only when interacting with matter.” (Wikipedia, 2019).

In contrast, quantum realism predicts that extreme light in empty space will collide to form matter. Evidence supporting this includes the following:

1. Photons confined have mass. A free photon is massless, but if confined in a hypothetical 100% reflecting mirror box it has a rest mass, because as the box accelerates, unequal photon pressure on its reflecting walls creates inertia (van der Mark & ’t Hooft, 2011). By the same logic, photons entangled in a node will have mass.

2. Einstein’s formula. That matter is energy works both ways, so if nuclear bombs can turn mass into energy, photon energy can create mass. The Breit-Wheeler process describes how high-energy photons can create matter (the threshold condition is written out after this list).

3. Particle accelerator collisions routinely create new matter. Protons that collide and stay intact give new matter that didn’t exist before. If this matter comes from the collision energy, high-energy photons can do the same.

4. Pair production. High-frequency light near a nucleus gives electrons and positrons that annihilate back into space.

5. Light collides. When laser photons at the Stanford Linear Accelerator were fired at an electron beam travelling at almost the speed of light, some electrons knocked a photon back with enough energy to hit the laser photons behind it, giving matter pairs that a magnetic field pulled apart to detect (Burke et al., 1997).
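As an aside on point 2, the standard threshold for the Breit-Wheeler process (textbook physics, not stated in the text) follows from Einstein’s formula: two head-on photons of energies E₁ and E₂ can only create an electron-positron pair if

```latex
% Breit-Wheeler threshold for two head-on photons (m_e = electron mass):
E_1 \, E_2 \;\ge\; \left(m_e c^2\right)^2 \approx (0.511\,\mathrm{MeV})^2
```

which is why only “extreme” light can collide to form matter; ordinary light falls far below this threshold.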

That extreme light alone colliding in a vacuum gives matter is a prediction that no experiment has yet tested.

If beams of pure light can collide in pure space to create matter, the boson-fermion distinction of the standard model is challenged, as bosons can then create fermions. If matter evolved from light, the future of physics lies in colliding light, not matter, so physics should build light colliders rather than particle colliders. Recent experiments support the idea that matter can arise from light, although the colliding light came from intense photon bursts created by high-energy particle collisions rather than directly from lasers.

The standard model expected the short-lived energy flashes of its accelerators to unlock the secrets of the universe, but that didn’t happen, and quantum realism says it never will. If matter evolved, our billion-dollar accelerators are just finding transient evolutionary dead-ends that led nowhere, because in evolution, what doesn’t survive doesn’t change the future. The standard model assumes that matter came first, but in quantum realism, light was the first existence.

Physical realism is just a theory, and scientists who don’t question their theories are priests. Last century, it was the only game in town, but today quantum realism is the rational alternative: space is network null processing, time is its processing cycles, light is the basic quantum process, and matter is entangled light rebooting. This theory, based on reverse engineering, is testable, so if it is wrong, let the facts decide.


QR4.5.8 A Quantum Processing Model

Figure 4.19. A quantum processing model

A quantum processing model (Figure 4.19) has no virtual bosons to make things happen because dynamic processing on a network, like an ever-flowing river, actively finds stable states. The first event created a plasma of extreme light that diluted to ordinary light as space expanded, and this light collided with itself to give matter as a standing quantum wave. Extreme light overloading one dimension gave electron or neutrino leptons, depending on phase, and extreme light overloading a plane gave semi-stable up or down quarks, again depending on phase. In both cases, the repeating overload caused mass and the repeating remainder caused charge, including the strange one-third charges of quarks.

The only fundamental process in this model is a circle of quantum processing that in one node outputs “nothing”, so in quantum realism, space is null processing.

Distributing this circle gives the sine wave of light, and the entire electromagnetic spectrum is one process shared to a greater or lesser degree, so in quantum realism, light is space distributed.
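As a rough sketch of the claim above (my own illustration, assuming only the text’s idea that one circular process, shared across nodes at different phases, appears as a sine wave):

```python
# A rough sketch of "distributing a circle gives a sine wave": each node runs
# the same circular process at a different phase, and the value seen along one
# axis traces a sine wave across the nodes. The node count is arbitrary.
import math

nodes = 24
for k in range(nodes):
    phase = 2 * math.pi * k / nodes      # each node's share of the one circle
    value = math.sin(phase)              # the circular process seen on one axis
    bar = " " * int(round(10 * (1 + value)))
    print(f"node {k:2d}: {bar}*")
```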

Up and down quarks achieve stability by photon-sharing in a proton or neutron triangle, and protons, neutrons and electrons then evolved into stable atoms that in time gave us. Matter entities have anti-matter versions with the same mass but opposite charge because processing can run in reverse. The lines in Figure 4.18 show similarities between supposed fundamentals, but in Figure 4.19 they signify a dynamic evolution.

Figure 4.19 is simpler because one fundamental quantum process gives space, light and matter and it answers questions that the standard model of particles struggles with, including:

1. How do matter and charge relate? (4.3.2)

2. Why do neutrinos have a tiny but variable mass? (4.3.3)

3. Why does anti-matter with the same mass but opposite charge exist? (4.3.4)

4. Where did the anti-matter go? (4.3.5)

5. Why are quark charges in strange thirds? (4.4.3)

6. Why does the force binding quarks increase with distance? (4.4.4)

7. Why don’t protons decay in empty space? (4.4.6)

8. Why does the energy of mass depend on the speed of light? (4.4.8)

9. How did atomic nuclei evolve? (4.6.1)

10. How did electron shells evolve? (4.6.2)

11. Why does mass vary enormously but charge doesn’t? (4.7.3)

12. Why is the universe charge neutral? (4.7.4)

13. What is dark matter? (4.7.6)

14. What is dark energy? (4.7.7)

Some of the above are covered shortly. If a quantum network defines the pixels of space, nothing is needed to keep point matter entities apart. If the quantum network transfer rate is one node per cycle, the speed of light will be a constant. If electrons and neutrinos are phases of the same interaction, they will be brother leptons. If up and down quarks are phases of a three-axis interaction, there will be charges in thirds. If a quantum process creates matter, there must be anti-matter. Quantum processing explains more than inert particles pushed around by forces.

It’s time to abandon Newton’s idea that God put the world together like a clock, from existing bits. The standard model doesn’t describe God’s Lego-set because most of its “fundamental particles” play no part at all in the world we see.

If only quantum reality existed initially, it had to create physical reality from itself, with no divine shortcuts, because there were no basic bits of matter just lying around from which a universe could be made! Given itself alone, it had to create an observer-observed universe by providing the observer and the observed from itself. This couldn’t occur in one step, so our universe was booted up from a single photon, not made from preexisting bits. After that, it was complexity evolving from simplicity. The Mandelbrot set illustrates how a simple process can give endless complexity, as one line of code repeated gives rise to endless forms (Figure 4.20). There is no end to the Mandelbrot set not because it was “built” from complex bits but because it is an endlessly dynamic interaction.

Figure 4.20. Mandelbrot’s set, a. Main, b. Detail
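The point above, that one line of code repeated gives rise to endless forms, can be made concrete with a minimal sketch of the Mandelbrot iteration z → z·z + c (my own illustration; the grid size and iteration cap are arbitrary choices, not from the text):

```python
# A minimal sketch of the Mandelbrot iteration: one rule, z -> z*z + c,
# repeated at every point c of the complex plane. The grid resolution and
# the iteration cap are arbitrary illustration choices.
def mandelbrot_escape(c: complex, max_iter: int = 100) -> int:
    """Return how many iterations of z -> z*z + c it takes for |z| to exceed 2."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c          # the single repeated rule
        if abs(z) > 2:         # once |z| > 2 the orbit diverges
            return n
    return max_iter            # never escaped: treated as inside the set

# Crude text rendering of the set's main region (compare Figure 4.20a).
for im in range(20, -21, -2):
    row = ""
    for re in range(-40, 21):
        c = complex(re / 20, im / 20)
        row += "#" if mandelbrot_escape(c) == 100 else " "
    print(row)
```

Everything in the picture comes from repeating that one rule; nothing complex is “built” in.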

Quantum realism describes an essential simplicity hidden by complex outputs. If the null process we call space became light, then light became matter and matter became us, so nothing became everything. As Douglas Adams says:

The world is a thing of utter inordinate complexity and richness and strangeness that is absolutely awesome. I mean the idea that such complexity can arise not only out of such simplicity, but probably absolutely out of nothing, is the most fabulous extraordinary idea. And once you get some kind of inkling of how that might have happened, it’s just wonderful.” Douglas Adams, quoted by Dawkins in his eulogy for Adams (17 September 2001)

The best argument against physical realism is the ridiculous complexity of the models needed to describe it. Quantum realism derives physical complexity from quantum simplicity.


QR4.5.7 The Particle Model

Aristotle’s ancient idea of a matter substance implies that it can be broken down into fundamental particles and battering matter into bits seemed the best way to do that. Physics spent much of last century and billions of dollars smashing matter apart to find fundamental particles, defined as what can’t be broken down further.

But when pressed on what a particle actually is, physicists retreat to wave equations that don’t describe particles at all. This bait-and-switch, talking about a particle but giving a meaningless wave equation, is now the physics norm. If one points out that the equations describe waves not particles, they reply it doesn’t matter because the equations are fictional! Feynman explains how this double-speak began:

In fact, both objects (electrons and photons) behave somewhat like waves and somewhat like particles. In order to save ourselves from inventing new words such as wavicles, we have chosen to call these objects particles.” (Feynman, 1985), p85.

Imagine if an engineer said “This vehicle has two wheels like a bicycle and an engine like a car so to avoid inventing a new word like motorcycle we have chosen to call it a car”. A boy with a hammer thinks everything is a nail and likewise physicists with particle accelerators think everything is a particle but it isn’t always so. What physics found by battering matter apart turned out to be neither fundamental nor particles, because it was:

1. Ephemeral. A lightning bolt is long-lived compared to what physics today calls a particle, e.g. a tau is an energy spike lasting less than a million-millionth of a second. We don’t call a lightning bolt a particle, so why call a tau a particle?

2. Classifiable. The standard model classifies a tiny electron, a massive tau and a positron as leptons but what can be classified isn’t fundamental. Classifying requires common properties that imply something else more fundamental. “Fundamental” in physics today just means that which can’t be further smashed apart by high speed protons.

3. Massive. The “fundamental” top quark has the same mass as a gold nucleus of 79 protons and 118 neutrons. It is 75,000 times heavier than an up quark, so why does the cosmic Lego-set have such a huge building block? Not surprisingly, this fundamental entity plays no part at all in the function of the universe we see.

4. Unstable. If a top quark is fundamental, why does it instantly decay into other particles? When a neutron decays into a proton and an electron, three fundamental particles become four, which is a strange use of the term fundamental.

Entities that decay and transform into each other aren’t fundamental because what is fundamental isn’t subject to decay or transformation, and energy events that last less than a millionth of a second aren’t particles because what is substantial should last longer than that. A brief eddy in a stream isn’t a particle, so why is a brief quantum eddy a particle? It follows that the fundamental particles of the standard model are neither fundamental nor particles but rather quantum events.

Figure 4.18. The standard particle model

The standard particle model (Figure 4.18) describes fundamental particles that are classifiable and virtual bosons that come from nowhere to make things happen. This, we are told, is the end of the story because particle accelerators can’t break point particles down any further. How then does a particle that exists at a point take up space? Apparently, it creates invisible fields that generate virtual particles to keep particles apart. It is a wonderfully circular argument that can’t be tested because the agents involved are unobservable.

The particle model survives because physicists are conditioned not to look behind the curtain of physical reality. The Wizard of Oz told Dorothy: “Pay no attention to that man behind the curtain” to distract her from what really orchestrates events, and likewise the wizards of physics ask us to pay no attention to the quantum waves that quantum theory says create physical reality. Quantum realism looks behind the curtain to see that quantum processes cause physical events.


QR4.5.6 The Last Standard Model

In the second century, Ptolemy’s Almagest let experts predict the movements of the stars for the first time based on the idea that heavenly bodies, being heavenly, moved around the earth in perfect circles, or circles within circles (epicycles). It wasn’t true but it worked, and Ptolemy’s followers made it work for centuries. As new stars were found, they altered the model, making it more complex. This ancient standard model only fell when Copernicus, Kepler, Galileo, and Newton developed a valid causal model to replace it. The standard model of physics and the standard model of Ptolemy have a lot in common, as both are:

1. Descriptive. They both describe what is but can’t successfully predict new things. They identify patterns, ideally as equations, but description is the first step of science not the last, which is a causal theory that truly predicts.

2. Parameterized. Ptolemy’s model let experts choose the free parameters of epicycle, eccentric and equant to fit the facts, and the standard model of today lets experts choose the free parameters of field, bosons and charge.

3. After the fact. Ptolemy’s model defined its epicycles after a new star was found, just as today’s standard model bolts on a new field after a new force is found.

4. Barren. Descriptive models only interpolate, so the Ptolemaic model would never have deduced Kepler’s laws. Likewise, today’s standard model will never deduce that matter is made of extreme light.

5. Complex. Medieval astronomers tweaked Ptolemy’s model until it became absurdly complex, just as standard model equations today fill pages, and those of its string theory offspring fill books.

6. Normative. The Ptolemaic model was the norm of its day, so any critique of it was an attack on tradition. Likewise today, any standard model critique is seen as an attack on physics itself (Smolin, 2006).

7. Wrong. Ptolemy’s model sometimes worked, even though planets don’t move in circles around the earth. Likewise the standard model sometimes works, even though virtual particles don’t exist.

When the medieval church pressured Galileo to recant, they didn’t ask him to deny that the earth went around the sun. They asked him to call it a mathematical fiction rather than a reality description. Likewise today, quantum reality is called a mathematical fiction, but what if it really does exist, just as the earth really does go around the sun?

The research method of science has three steps: first it describes patterns, then it finds correlations, and finally it attributes causes (Rosenthal & Rosnow, 1991). This suggests that the standard model is a descriptive model that failed to evolve into a causal theory. The reason proposed here is that it denies that what quantum theory describes is real. This led Everett to fantasize about many worlds (Everett, 1957), and Witten to go it alone with string theory mathematics, neither of which led anywhere. The standard model, as a naive description based on acausal equations that lead nowhere, is essentially a scientific dead end in the history of physics, just as the last standard model was.


QR4.5.5 A Particle Toolbox

The standard model is essentially a theoretical device that invents virtual particles based on fields to explain results after the fact, like a particle toolbox. For example, when anti-matter was discovered, it just added a new column, and when family generations came along, it added new rows. When mesons were found, they were so unexpected that the Nobel laureate Isidor Rabi quipped “Who ordered that?”, but the standard model just called them bosons that carried no force, and carried on. When new facts arrive, the standard model fits them into its existing structure, or adds a new room.

Science is based on falsifiability, but how does one falsify a model that absorbs rather than generates knowledge? It proposed gravitons that a long search hasn’t found, so was that a fail? It predicted proton decay, but twenty years of study have only pushed the lower limit on the proton’s lifetime beyond that of the universe, so was that a fail? It expected matter and anti-matter to be equal, so is the fact that our universe is only matter a fail? It also expected massless neutrinos, until experiments found they had mass, and penta-quarks and strange quarks, until a two-decade search found neither, and the list goes on. It predicted that weakly interacting massive particles (WIMPs) would explain dark matter, but again a long search found nothing. The standard model is like a hydra: when the facts cut off one head, it just grows another. Indeed, it is unclear what it would take to falsify a model whose failures are called “unsolved problems in physics”.

The standard model’s claim to fame is that its equations can calculate results to many decimal places, but in science, accuracy isn’t validity. An equation that accurately interpolates between known data points isn’t the same as a theory that extrapolates to new points. Equations are judged by accuracy but theories are judged by their ability to predict, yet today’s physicists, fed on equations not science (Kuhn, 1970), think they are the same, so as Georgi says:

Students should learn the difference between physics and mathematics from the start.” (Woit, 2007), p85.

The difference is that theories aren’t equations, because they are based on validity not accuracy. A theory is valid if it represents what is true, and if it isn’t valid, it doesn’t matter how reliable it is. Hence, if the standard model isn’t valid because it can’t predict, it doesn’t matter how accurate it is.

When it comes to prediction, the standard model’s success is dubious. It claims to have predicted top and charm quarks before they were found, but predicting three quark generations after finding three generations of leptons is like predicting the last move in a tic-tac-toe game, inevitable. It also claims to have predicted gluons, W bosons, and the Higgs, but predicting invented agents isn’t prediction. Fitting equations to data then matching their terms to ephemeral resonances in billions of accelerator collisions is the research version of tea-leaf reading – look hard enough and you’ll find something. It illustrates Wyszkowski’s Second Law, that anything can be made to work if you fiddle with it long enough.

The standard model’s answer to why a top quark is 300,000 times heavier than an electron is “because it is”. What baffled physics fifty years ago still baffles it today because equations can’t go beyond the data set that created them, only theories can. The last time such a barren model dominated thought so completely was before Newton.
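To make the interpolation/extrapolation distinction above concrete, here is a small sketch (my own illustration, not from the text): an equation fitted to known data points reproduces them accurately, yet fails badly outside the range it was fitted to.

```python
# A small illustration of the accuracy-vs-prediction point: a polynomial fitted
# to known data interpolates accurately but extrapolates poorly, because a
# fitted equation cannot go beyond the data set that created it.
import numpy as np

x_known = np.linspace(0, 5, 6)            # the "known" data points
y_known = np.sin(x_known)                 # the underlying behaviour being fitted

poly = np.poly1d(np.polyfit(x_known, y_known, 5))  # fit a 5th-order polynomial

print("Interpolation error at x=2.5:", abs(poly(2.5) - np.sin(2.5)))    # small
print("Extrapolation error at x=10 :", abs(poly(10.0) - np.sin(10.0)))  # enormous
```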


QR4.5.4 A Model That Absorbs Data

Occam’s razor, not to multiply causes unnecessarily, is the pruning hook of science, but the standard model did the opposite. Physics started with a simple theory of mass, charge, and spin, but now it has isospin, hypercharge, color, chirality, flavor, and other esoteric features. The standard model today needs sixty-two fundamental particles (Note 1), five invisible fields, sixteen charges and fourteen bosons to work (Table 4.6). If it was a machine, one would have to hand-set over two dozen knobs just right for it to light up, so if this model is preferred today, it isn’t because of its simplicity.

For this level of complexity, one might expect completeness, but the standard model is far from that. It can’t explain gravity, proton stability, anti-matter, quark charges, neutrino mass, neutrino spin, family generations, randomness, or why inflation stopped. Nor does it say anything about the dark energy or dark matter that constitute 95% of the universe. Its main feature is that with each new result, it grows, so explaining inflation needs a hypothetical symmetron field, and explaining the mass of neutrinos needs another 7-8 arbitrary constants:

To accommodate nonzero neutrino masses we must add new particles, with exotic properties, for which there’s no other motivation or evidence.” (Wilczek, 2008), p168.

Good theories grow new data, as a garden grows plants, but the standard model absorbs new data to grow itself. It is the magic castle that spawned the fairy-like virtual particles that led modern physics nowhere, because multiplying causes unnecessarily is against science.


Note 1. Two leptons with three generations plus anti-matter variants is 12. Two quarks with three generations plus anti-matter variants and three colors is 36. One photon, eight gluons, three weak bosons, one graviton and the Higgs is another 14. The total is 62.
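The tally in Note 1 can be checked with trivial arithmetic (just the note’s own numbers, rearranged):

```python
# A trivial check of the particle count in Note 1.
leptons = 2 * 3 * 2          # 2 leptons x 3 generations x matter/anti-matter = 12
quarks  = 2 * 3 * 2 * 3      # 2 quarks x 3 generations x matter/anti-matter x 3 colors = 36
bosons  = 1 + 8 + 3 + 1 + 1  # photon + gluons + weak bosons + graviton + Higgs = 14
print(leptons + quarks + bosons)  # 62
```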

QR4.5.3 Virtual Particles Aren’t Needed

Suppose one could see a computer screen but not the hardware and software that created it. If the changes on the screen occur in bit units, does that mean that unseen bit particles create them? In fact, the screen changes in bit units because the bit is the basic unit of the processing behind the screen. Likewise, electro-magnetic effects on the screen of our space can occur in photon units because the photon is the basic unit of the processing that changes the screen.

In this model, quantum processes create physical effects and the photon is the basic network process, so changes in electro-magnetism occur in photon units for the same reason that computer screens change in bit units. What looks like photon particles acting could then be a basic process instead. The correlation we see between photons and electro-magnetism doesn’t imply causation, and confusing correlation with causation is the oldest error in science (Note 1).

Processing that explores every option doesn’t need agents to push it, so an electron that falls to a lower energy orbit doesn’t need an orbit boson to make it so. The forces that physics attributes to imaginary particles can be explained by quantum processing as follows:

1. Electromagnetism. The standard model needs virtual photons to explain electro-magnetism, but a processing model just lets a photon be the basic quantum process. No virtual agents are then needed to explain electrical and magnetic effects (Chapter 5).

2. The strong effect. The standard model needed a new field that created eight gluons with three charges to explain how the nucleus holds together, but a processing model just lets quarks share photons to achieve stability. Again, no virtual agents are needed (4.4.4).

3. The weak effect. The standard model needed another field, three more bosons, and two new charges to explain how neutrons decay, and still couldn’t explain why protons don’t decay in empty space. In contrast, a processing model just lets neutron decay be a neutrino effect, and predicts that protons only decay in stars. Again, no virtual agents are needed (4.4.6).

4. The Higgs. If W-bosons don’t exist, the Higgs boson isn’t needed at all. It is just another species in the already over-flowing menagerie of particles that had no role in the evolution of matter, so yet another virtual particle that adds nothing new to our knowledge is avoided (4.4.7).

5. Gravity. Every attempt to find gravitons has failed but standard model iconographies still display them as if they were real (Figure 4.17). But if gravity alters space and time, as Einstein says, how can particles that exist in space and time do that? In contrast, Chapter 5 attributes gravity to a processing gradient.

Figure 4.17. The CERN Standard Model

A processing model explains what the standard model does without virtual particles, so they are unnecessary agents that needn’t exist at all. And even if they did, the standard model tells us nothing about them. For example, if the Higgs acts on weak bosons to give mass, how do other bosons interact? A quark can be subject to electromagnetic, strong, weak, Higgs, and gravity forces, so what will happen if a virtual photon, gluon, weak boson, Higgs, and graviton appear at the same time? The standard model doesn’t say. Also matter bosons imply anti-matter versions, so what happens if a Higgs meets an anti-Higgs? Again the standard model doesn’t predict anything, so physics is better off without it.

Note 1. The number of ice-creams sold in America correlates with deaths by drowning, so do ice-creams kill? In Europe, the number of stork nests correlates with the number of human babies born, so do storks bring babies? In both cases, X and Y correlate because both are caused by a third agent Z, namely the weather, not because they cause each other. Correlation is not causation.
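A small simulation (my own illustration, with made-up numbers) of the point in Note 1: two variables driven by a common cause correlate strongly even though neither causes the other.

```python
# X and Y correlate only because both depend on a common cause Z, not because
# either causes the other; removing Z's influence removes the correlation.
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=10_000)            # common cause, e.g. warm weather
x = 2.0 * z + rng.normal(size=10_000)  # e.g. ice-cream sales
y = 1.5 * z + rng.normal(size=10_000)  # e.g. drownings

print("corr(X, Y)            =", round(np.corrcoef(x, y)[0, 1], 2))  # about 0.7
print("corr after removing Z =", round(np.corrcoef(x - 2.0 * z, y - 1.5 * z)[0, 1], 2))  # about 0.0
```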


QR4.5.2 The Frog in the Pan

In the apocryphal story, a frog dropped in a pan of hot water jumps out immediately but if put in tepid water that is slowly heated, by the time it realizes the danger it is too weak to jump out and perishes. It is now proposed that the standard model did something similar to the science of physics last century.

Faraday’s proposal that electric fields affected charges from afar was initially considered fanciful because it was a disembodied force acting at a distance, and Newton had already rejected this for gravity:

That gravity should be innate, inherent and essential to matter, so that one body may act upon another at a distance thro’ a vacuum, without the mediation of anything else … is to me so great an absurdity, that I believe no man … can ever fall into it. Gravity must be caused by an agent…” (Oerter, 2006), p17.

Likewise, charge effects had to be caused by a physical agent. Maxwell then developed the equations of electro-magnetism by imagining physical ball-bearings twisting in vortex tubes, but later attempts to develop this into a physical model failed. To resolve this quandary, field theory proposed that field-created particles moved charges in an electric field, and as Maxwell’s equations acted in photon units, photons were taken to be the force-carriers of electro-magnetism.

The standard model was then born when it attributed electro-magnetic effects to photons from a Faraday field. Unlike real photons, these photons couldn’t be observed, as they are consumed by the effect they cause, so they were called virtual photons. This worked nicely, so no-one worried that a cause had been assumed after an effect, but since this cause predicted nothing new, the pseudo-science temperature had just gone up a notch.

The strength of science is based on its power to predict physical events not its equations, because equations aren’t theories. Theories increase knowledge but equations just summarize what we know already, so equation results aren’t new predictions. And since virtual photons didn’t predict anything else that was new, the science of physics was made weaker.  

Even so, the standard model then generalized that since photons are bosons, all fields act by boson agents, so gravity had to work by gravitons that to this day have no physical equivalent. There is no evidence at all that such particles have ever existed, but the standard model says they cause gravity. And since they predict nothing new about gravity, again the pseudo-science temperature went up.

Buoyed by its acceptance, the standard model then proposed that protons were held together in the atomic nucleus by a strong field that created virtual gluons with a color charge, so now it had a field that created charge. Yet again, this added nothing to our knowledge of the nucleus, so again the pseudo-science temperature increased.

Explaining why neutrons decay in empty space was more of a challenge, as it needed a weak field to produce virtual agents with charge and mass, unlike virtual photons that have neither mass nor charge. Some evidence was needed, so resonances from billions of transient accelerator events were examined, and when compatible ones were found, W-bosons were declared to exist. The equations then implied that protons decay like neutrons, but they don’t, so this failure to predict weakened the science of physics yet again.

Finally, the standard model had to explain how a field could create mass, so its answer was yet another field, this time with a virtual particle so massive that only a billion-dollar accelerator could justify it. All this, to support Newton’s canon that:

“…the forces of Nature are deeply entwined with the elementary particles of Nature.” (Barrow, 2007), p97.

This statement sounds good but doesn’t mention that the elementary particles it refers to are virtual! Physics has pasted field upon field to prove Newton’s belief, until now virtual particles pop in and out of space to cause every effect. They are said to be everywhere, making everything happen, but what have they added to the science of physics? The answer, honestly, is not much, as they either predict wrongly or not at all.

Virtual particles are the scientific version of a blank check, whose amount can be filled in after it is known. Every new virtual cause the standard model invented weakened physics until, like the frog in the pan, it is now in danger of dying as a science. Yet the modern age of fairy-tale physics (Baggott, 2013) was created one fairy at a time, by the scientists themselves.


QR4.5.1 Going Nowhere

The standard model explains how forces act at a distance by invoking virtual particles that come from fields, so the earth’s field of gravity holds the moon in orbit by graviton particles, and the strong field holds the nucleus together by gluon particles, where according to Feynman:

A real field is a mathematical function we use for avoiding the idea of action at a distance.” (Feynman, Leighton, & Sands, 1977), Vol. II, p15-7.

The electromagnetic field is then a mathematical function used to describe how charge acts from afar, but an equation isn’t a theory. Equations summarize known data but don’t predict as theories do, as the new values they produce are expectations not predictions. Field theory is only a valid science if it predicts something new that wasn’t known before, but it didn’t do that. That virtual photons cause charge effects didn’t add knowledge about charge, but it avoided the idea of action at a distance, so it was accepted. It explained the equation but didn’t advance science because it didn’t predict new facts, just as borrowing $10 to make a $10 profit isn’t an advance. Science advances when theories explain more than they assume, not when every new fact needs new assumptions. This tactic of assuming new virtual causes for new forces explains why field theory is currently going nowhere, as will be seen.

By analogy, if I say that your tooth was taken by a tooth fairy, maybe a sock fairy took my sock, and perhaps a spoon fairy took the lost spoon, but where does it end? These causes lead nowhere because they don’t predict anything new. Likewise, a theory based on gravitons and gluons that predict nothing new is also going nowhere.

The best example of this failure to predict is string theory, which tried to explain all physics by field mathematics. The result was so many possible architectures that anything was possible, but no-one could say which one applied. It was mathematically impressive but scientifically useless, so it led nowhere. Based on field theory, it assumed that our space has eight additional dimensions, but they interacted in so many ways that the result was meaningless. According to Woit, string theory is pseudo-science because:

The possible existence of, say, 10⁵⁰⁰ consistent different vacuum states for superstring theory probably destroys the hope of using the theory to predict anything. If one picks among this large set just those states whose properties agree with present experimental observations, it is likely there still will be such a large number of these that one can get just about whatever value one wants for the results of any new observation.” (Woit, 2006), p242.

The basic problem is that inventing a field across all space adds what mathematics calls a degree of freedom to it, so adding many fields is like adding many dimensions to space. Based on current field theory, gravity then adds one dimension, electromagnetism adds two, the strong force adds three, and the weak force two. These eight extra dimensions, plus the three of space, are why string theory has to assume eleven dimensions. Its failure to predict illustrates why inventing new fields to explain new facts isn’t leading anywhere either.
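Written out, the tally in the paragraph above (simply restating its numbers) is:

```latex
% Dimension count implied by the paragraph above (requires amsmath for \underbrace).
\underbrace{3}_{\text{space}} + \underbrace{1}_{\text{gravity}} + \underbrace{2}_{\text{EM}}
+ \underbrace{3}_{\text{strong}} + \underbrace{2}_{\text{weak}} = 11
```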

That a universe of eleven dimensions somehow collapsed into our three-dimensional world is an untestable theory, like the multiverse, that only exists to support our preconceptions. It led nowhere because the goal of science is to explain outcomes not equations, but how did it come to this?


QR4.5 Fields Upon Fields

Physics spent much of last century trying to prove Newton’s idea that particles cause all the forces in nature. How gravity acted at a distance puzzled Newton, because no particles seemed to cause it. The standard model solved this problem by proposing that fields create virtual particles called bosons that exert forces. First the electro-magnetic field had photon bosons, then the weak field had W bosons, then the strong field had gluon bosons, and finally the Higgs field had a Higgs boson. By analogy, the standard model attributes the force of gravity to graviton bosons, despite no evidence at all. As a result, our universe is seen as fields upon fields, each creating different virtual particles that cause different forces. This section presents the alternate view that only one field, the quantum field, does all the above.

QR4.5.1 Going Nowhere

QR4.5.2 The Frog in the Pan

QR4.5.3 Virtual Particles Aren’t Needed

QR4.5.4 A Model That Absorbs Data

QR4.5.5 A Particle Toolbox

QR4.5.6 The Last Standard Model

QR4.5.7 The Particle Model

QR4.5.8 A Quantum Processing Model

QR4.5.9 Testing The Theory
