QR4.5.9 Testing The Theory

Science tests a new theory when it contradicts an old one. In this case, reverse engineering predicts that at the highest frequency, light will collide to become matter, but the standard model predicts that light never collides, because photons are bosons that share quantum states. Table 4.1 separates matter particles, which collide, from bosons like light, which don’t, so:

“Two photons cannot ever collide. In fact light is quantized only when interacting with matter.” (Wikipedia, 2019).

However, if matter and light are both network processes, then a collision is an overload, so potentially, light can collide in empty space to form matter. Evidence for this includes that:

1. Confined photons have mass. A free photon is massless, but when confined in a hypothetical 100% reflecting mirror box, it has mass, because as the box accelerates, unequal photon pressure on its reflecting walls creates inertia (van der Mark & ’t Hooft, 2011). By the same logic, photons confined at a point, in a standing wave, will have mass.

2. Einstein’s equation. The equivalence of matter and energy works both ways, so if nuclear bombs can turn mass into energy, photon energy can become mass. The Breit-Wheeler process describes how pure light can potentially transform into matter (a worked threshold follows this list).

3. Particle accelerator collisions routinely create new matter. Protons that collide and stay intact produce new matter that didn’t exist before. If this matter comes from the collision energy, high energy photons can do the same.

4. Pair production. High-frequency light near a nucleus gives electron and positron pairs that can later annihilate back into light.

5. Light collides. When laser photons at the Stanford Linear Accelerator were fired at an electron beam moving at almost the speed of light, some electrons knocked a photon back with enough energy to collide with the laser photons behind it, giving matter pairs that a magnetic field pulled apart to detect (Burke et al., 1997).
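To make the threshold behind points 2 and 4 concrete, here is a minimal sketch using textbook relativistic kinematics, not anything specific to this chapter. For two head-on photons of energies $E_1$ and $E_2$ to become an electron-positron pair, the centre-of-momentum energy must cover two electron rest masses:

$$ 4E_1E_2 \ge (2m_ec^2)^2 \quad\Rightarrow\quad E_1E_2 \ge (m_ec^2)^2 \approx (0.511\ \text{MeV})^2 $$

so two equal head-on photons each need at least 0.511 MeV, which is gamma-ray light, and the confined light of point 1 has the effective mass $m = E/c^2$ that Einstein’s equation implies.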

Hence, extreme light colliding in a vacuum to give matter is a plausible prediction that can be tested by experiment. If pure light colliding in empty space creates matter, the boson-fermion divide of the standard model falls, as bosons can create fermions. The future of physics then lies in colliding light not matter, using light colliders not particle colliders.

The standard model expected particle collisions to unlock the secrets of the universe, but they didn’t. Instead of elementary bits, accelerators found only the brief flashes of transient particles, but in nature, what can’t survive isn’t the future. If matter evolved from light, these ephemeral flashes are evolutionary dead-ends that failed because they weren’t stable.

That matter is primal is just a theory, and scientists who don’t question their theories are priests. Light is simpler than matter, so it is more likely to be primal. The theory that light evolved into matter, based on reverse engineering, is testable, so if it is wrong, let the facts decide.

QR4.5.8 A Processing Model

Figure 4.19. A processing model

The processing model of Figure 4.19 needs neither particles nor virtual agents to activate them. It is based on one process, that of light, which spreads on the quantum network as a wave. This lets our universe boot up from one photon in one unit of space, both of which increased rapidly, until the expansion of space diluted light enough to stop its creation. The result was a plasma of pure light, with no matter at all, but this highest-frequency light did what ordinary light can’t, namely collide. In processing terms, it overloaded the quantum network to create matter as a standing wave.

Extreme light colliding on one axis produced leptons, either an electron or neutrino, depending on the phase. The same collision on three axes also gave up or down quarks, again depending on phase. In both cases, the overload was mass and its remainder was charge, including the one-third quark charges.

Quarks then combined into protons and neutrons by sharing photons, to give the nucleus around which electrons orbit in atoms. The first atom, hydrogen, was just one proton and one electron, but adding neutrons to the nucleus let other atoms of the periodic table evolve, based on nucleosynthesis.

This model differs from the particle model as follows:

1. The links in Figure 4.19 signify an evolution, as light evolved into matter, and matter evolved into higher atoms, while the links in Figure 4.18 are categories not causes.

2. The processing model needs no virtual agents because processing always explores every option.

3. The particle model doesn’t predict anti-matter, but processing can run in reverse to cause it.

4. In Figure 4.18, the neutrino seems pointless, but Figure 4.19 makes it an electron byproduct.

5. The particle model doesn’t explain charge, but in processing terms it is a mass byproduct.

6. The particle model doesn’t explain the one-third charge of quarks, but the processing model does.

The processing model of Figure 4.19 is simpler because it has no virtual agents, and one fundamental process, that of light, produces all matter. Instead of many fundamental particles, many of which do nothing, one process does everything. It is also causal, so it answers questions that a particle model can’t, including:

1. Why does matter have mass and charge? (4.3.2)

2. Why do neutrinos have a tiny but variable mass? (4.3.3)

3. Why does anti-matter exist? (4.3.4)

4. Why don’t we see anti-matter around us today? (4.3.5)

5. Why are quark charges in strange thirds? (4.4.3)

6. Why does the force binding quarks increase with distance? (4.4.4)

7. Why don’t protons decay in empty space? (4.4.6)

8. Why does the energy of mass depend on the speed of light? (4.4.8)

9. How did atomic nuclei evolve? (4.6.1)

10. How did electron shells evolve? (4.6.2)

11. Why does mass vary enormously but charge doesn’t? (4.7.3)

12. Why is the universe charge neutral? (4.7.4)

13. What is dark matter? (4.7.6)

14. What is dark energy? (4.7.7)

Some of the above are covered shortly. If a quantum network defines the pixels of space, nothing is needed to keep point matter entities apart. If the quantum network transfer rate is one node per cycle, the speed of light will be constant. If electrons and neutrinos are phases of the same interaction, they will be brother leptons. If up and down quarks are phases of a three-axis interaction, they will have charges in thirds. If a quantum process creates matter, there must be anti-matter. Quantum processing can explain what inert particles pushed around by forces can’t.

It’s time to abandon Newton’s idea that God put the world together, like a clock, from existing bits. If the standard model describes God’s Lego-set, why do many of its supposed fundamentals play no part at all in the world we see? To think that all the bits of our universe were lying around before it began, so it was made as we make things, is to underestimate what happened.

The alternative is that before it began, our universe didn’t exist at all, only a reality that had to create from itself alone. There were no divine shortcuts, as even matter had to be made! This couldn’t occur in one step, so light, being simpler, was made first, and matter followed in due course. Essentially, complex outcomes evolved from a simple process.

The Mandelbrot set illustrates this, as one line of code repeated gives rise to complex forms (Figure 4.20). The complexity of the Mandelbrot set isn’t based on complex bits, but on a simple process that endlessly evolves.

Figure 4.20. Mandelbrot’s set, a. Main, b. Detail
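The claim that one repeated line of code generates endless complexity can be made concrete. Below is a minimal, hypothetical Python sketch, not from the text itself: the whole figure comes from iterating the single rule z → z² + c and asking whether the result stays bounded.

```python
# Minimal Mandelbrot sketch: one rule, z -> z*z + c, repeated.
# A point c is in the set if the iteration never escapes.
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| exceeds 2, it escapes to infinity
            return False
    return True

# Crude ASCII render of the main cardioid and bulbs
for y in range(21):
    print("".join("*" if in_mandelbrot(complex(-2 + x * 0.05, -1 + y * 0.1))
                  else " " for x in range(61)))
```

Everything in Figure 4.20, at every magnification, follows from that one repeated rule.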

If the null process of space became light that became matter that became us, then the complexity of our universe came from the simplicity of nothing, or as Douglas Adams put it:

“The world is a thing of utter inordinate complexity and richness and strangeness that is absolutely awesome. I mean the idea that such complexity can arise not only out of such simplicity, but probably absolutely out of nothing, is the most fabulous extraordinary idea. And once you get some kind of inkling of how that might have happened, it’s just wonderful.” Douglas Adams, quoted by Dawkins in his eulogy for Adams (17 September 2001).

This extraordinary idea, of complexity from simplicity, is exactly what quantum theory describes.

QR4.5.7 The Particle Model

Aristotle’s concept of matter as a substance implied that it would break down into basic parts. Battering matter into bits seemed the best way to do that, so physics spent much of last century smashing matter apart to find fundamental particles that can’t be broken down further.

Yet when pressed on what a particle actually is, the experts retreat to equations that don’t describe particles at all. This bait-and-switch, talking about a particle but giving a wave equation, is now the norm. If one points out that the equations describe waves not particles, the answer is it doesn’t matter because they are fictional! Feynman explains how this double-speak began:

“In fact, both objects (electrons and photons) behave somewhat like waves and somewhat like particles. In order to save ourselves from inventing new words such as wavicles, we have chosen to call these objects particles.” (Feynman, 1985), p85.

But imagine if an engineer said “This vehicle has two wheels like a bicycle and an engine like a car, so to avoid inventing a new word like motorcycle, we have chosen to call it a car”. A boy with a hammer sees everything as a nail, and likewise experts with particle accelerators think everything is a particle, but the evidence suggests otherwise, because the particles that battering matter apart revealed were:

1. Ephemeral. The tau particle of the standard model is actually an energy spike lasting about a third of a million-millionth of a second (3 × 10⁻¹³ seconds). A lightning bolt is long-lived compared to that, and we don’t call it a particle, so why is a tau a particle? Isn’t a particle supposed to subsist?

2. Transformable. When a neutron decays into a proton and an electron, three fundamental particles become four, so how are they fundamental? Fundamental refers to what forms the base for other things, not what is itself transformed.

3. Massive. The fundamental top quark has roughly the same mass as a gold nucleus of 79 protons and 118 neutrons, but why does the cosmic Lego-set have such a huge building block? It is no surprise that this supposedly fundamental entity plays no part at all in the function of the universe we see.

4. Unstable. If a top quark is fundamental, why does it decay into other particles? Calling a particle that instantly decays fundamental is a strange use of the term. 

Entities that decay and transform into each other aren’t fundamental because what is fundamental shouldn’t decay or transform, and energy events that last less than a millionth of a second aren’t particles because what is substantial should last longer than that. It follows that the fundamental particles of the standard model are neither fundamental nor particles.

To clarify, imagine building a house from bricks that are its fundamental parts. But how can one build a house from bricks that disappear when you pick them up, or from bricks that are bigger than the house itself, or from bricks that fall apart, or from bricks that transform when put together? Of all the matter building blocks of the standard model, only the electron is stable alone, and it adds hardly anything to the mass of an atom.

Figure 4.18. The standard particle model

Figure 4.18 shows the fundamental particles of the standard model, which divide into fermion particles and virtual bosons that make things happen. This, we are told, is the end of the story because particle accelerators can’t break things down any further. How then do particles that exist at a point take up space? Apparently, invisible fields generate virtual particles to keep them apart, and this circular argument can’t be tested because virtual particles are unobservable.

The particle model survives because we are conditioned not to look behind the curtain of physical reality. The Wizard of Oz told Dorothy to “Pay no attention to that man behind the curtain” to distract her from what really orchestrated events, and likewise today’s wizards tell us to pay no attention to the quantum waves that quantum theory says orchestrate physical events. Therefore, let us look behind the curtain of physical reality to see what really causes them.

QR4.5.6 The Last Standard Model

In the second century, Ptolemy’s Almagest let experts predict the movements of the stars for the first time, based on the belief that heavenly bodies, being heavenly, moved around the earth in perfect circles, or in circles within circles (epicycles). It wasn’t true, but it worked, and its followers made it work for over a thousand years. As new stars were found, the model was upgraded to explain them, which increased its complexity. This first standard model explained every star movement, until a new one was seen, and it only fell when Copernicus, Kepler, Galileo, and Newton developed a causal model to replace it. Today there is a second standard model, and it is very like the first, as both are:

1. Descriptive. Both describe what is, but fail to predict what is new. They are based on observed patterns, ideally equations, but description is only the first step of science, not the last.

2. Parameterized. Ptolemy’s model let experts choose the free parameters of epicycle, eccentric, and equant to fit the facts, and today’s standard model lets experts choose the free parameters of field, bosons, and charge.

3. After the fact. Ptolemy’s model defined its epicycles after a new star was found, just as today’s standard model bolts on a new field after a new force is found.

4. Barren. Descriptive models only interpolate, so the Ptolemaic model would never have deduced Kepler’s laws. Likewise, today’s standard model will never deduce that matter is made of extreme light.

5. Complex. Medieval astronomers tweaked Ptolemy’s model until it became absurdly complex, just as today, standard model equations fill pages and those of its string theory offspring fill books.

6. Normative. The Ptolemaic model was the norm of its day, so any critique of it was an attack on tradition. Likewise today, any standard model critique is seen as an attack on physics itself (Smolin, 2006).

7. Wrong. Ptolemy’s model sometimes worked, even though planets don’t move in circles around the earth. Likewise our standard model sometimes works, even though virtual particles don’t exist.

When the medieval church pressured Galileo to recant, they didn’t ask him to deny that the earth went around the sun. They just asked him to call it a mathematical fiction not a reality description. Likewise today, quantum theory is called a mathematical fiction, but what if it describes reality, just as the earth really does go around the sun?

The research method of science has three steps: first it describes patterns, then it finds correlations, and finally it attributes causes (Rosenthal & Rosnow, 1991). This suggests that the standard model is a descriptive model that didn’t evolve into a causal theory because physicists gave up on quantum theory. Instead, Everett fantasized about many worlds (Everett, 1957) and Witten built a mathematical castle in the air called M-theory, neither of which led anywhere. The standard model, as a description based on equations that lead nowhere, is essentially a dead end in the history of science, just as the last standard model was.

QR4.5.5 A Particle Toolbox

The standard model is essentially a theoretical device that invents virtual particles based on fields to explain results after the fact, like a particle toolbox. For example, when anti-matter was discovered, it just added a new column, and when family generations came along, it added new rows. When mesons were found, they were so unexpected that the Nobel laureate Isidor Rabi quipped “Who ordered that?”, but the standard model just called them bosons that carried no force, and carried on. When new facts arrive, the standard model fits them into its existing structure, or adds a new room.

Science is based on falsifiability, but how does one falsify a model that absorbs rather than generates knowledge? It proposed gravitons that a long search hasn’t found, so was that a fail? It predicted proton decay, but decades of study have pushed the proton’s lifetime beyond the age of the universe, so was that a fail? It expected matter and anti-matter to be equal, so is the fact that our universe is only matter a fail? It also expected massless neutrinos, until experiments found they had mass, and penta-quarks and strange matter, until long searches found neither, and the list goes on. It predicted that weakly interacting massive particles (WIMPs) would explain dark matter, but again a long search found nothing. The standard model is like a hydra, as when the facts cut off one head, it just grows another. Indeed, it is unclear what it would take to falsify a model whose failures are called “unsolved problems in physics”.

The standard model’s claim to fame is that its equations can calculate results to many decimal places, but in science, accuracy isn’t validity. That an equation can accurately interpolate between known data points isn’t the same as a theory that extrapolates to new points. Equations are judged by accuracy, but theories are judged by their ability to predict. Today’s physicists, fed on equations not science (Kuhn, 1970), think they are the same, so as Georgi says:

“Students should learn the difference between physics and mathematics from the start.” (Woit, 2007), p85.

The difference is that theories aren’t equations, because they are based on validity not accuracy. A theory is valid if it represents what is true, and if it isn’t valid, it doesn’t matter how reliable it is. Hence, if the standard model isn’t valid because it can’t predict, it doesn’t matter how accurate it is.

When it comes to prediction, the standard model’s success is dubious. It claims to have predicted top and charm quarks before they were found, but predicting three quark generations after finding three generations of leptons is like predicting the last move in a tic-tac-toe game, inevitable. It also claims to have predicted gluons, W bosons, and the Higgs, but predicting invented agents isn’t prediction. Fitting equations to data then matching their terms to ephemeral resonances in billions of accelerator collisions is the research version of tea-leaf reading – look hard enough and you’ll find something. It illustrates Wyszkowski’s Second Law, that anything can be made to work if you fiddle with it long enough.

The standard model’s answer to why a top quark is 300,000 times heavier than an electron is “because it is”. What baffled physics fifty years ago still baffles it today because equations can’t go beyond the data set that created them, only theories can. The last time such a barren model dominated thought so completely was before Newton.
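As a rough check of that ratio, using commonly quoted masses of about 173 GeV for the top quark and 0.511 MeV for the electron:

$$ \frac{m_t}{m_e} \approx \frac{173 \times 10^9\ \text{eV}}{0.511 \times 10^6\ \text{eV}} \approx 3.4 \times 10^5 $$

a few hundred thousand, as stated, with no reason given for the number.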

QR4.5.4 A Model That Absorbs Data

Occam’s razor, not to multiply causes unnecessarily, is the pruning hook of science, but the standard model did the opposite. Physics started with a simple theory of mass, charge, and spin, but now it has isospin, hypercharge, color, chirality, flavor, and other esoteric features. The standard model today needs sixty-two fundamental particles (Note 1), five invisible fields, sixteen charges, and fourteen bosons to work (Table 4.6). If it were a machine, one would have to hand-set over two dozen knobs just right for it to light up, so if this model is preferred today, it isn’t because of its simplicity.

For this level of complexity, one might expect completeness, but the standard model is far from that. It can’t explain gravity, proton stability, anti-matter, quark charges, neutrino mass, neutrino spin, family generations, randomness, or why inflation stopped. Nor does it say anything about the dark energy or dark matter that constitute 95% of the universe. Its main feature is that with each new result it grows, so explaining inflation needs a hypothetical symmetron field, and explaining the mass of neutrinos needs another seven or eight arbitrary constants:

“To accommodate nonzero neutrino masses we must add new particles, with exotic properties, for which there’s no other motivation or evidence.” (Wilczek, 2008), p168.

Good theories grow new data, as a garden grows plants, but the standard model absorbs new data to grow itself. It is the magic castle that spawned the fairy-like virtual particles that led modern physics nowhere, because multiplying causes unnecessarily is against science.

Note 1. Two leptons with three generations plus anti-matter variants is 12. Two quarks with three generations plus anti-matter variants and three colors is 36. Plus one photon, eight gluons, three weak bosons, one graviton and the Higgs is another 14. The total is 62.
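The note’s tally can be checked mechanically. A trivial Python sketch of the same counting:

```python
# Reproduces Note 1's own counting of standard-model particles.
leptons = 2 * 3 * 2          # 2 leptons x 3 generations x (matter + anti-matter) = 12
quarks = 2 * 3 * 2 * 3       # 2 quarks x 3 generations x anti-matter x 3 colors = 36
bosons = 1 + 8 + 3 + 1 + 1   # photon + gluons + weak bosons + graviton + Higgs = 14
print(leptons + quarks + bosons)  # prints 62
```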

QR4.5.3 Virtual Particles Aren’t Needed

Suppose one could see a computer screen but not the hardware and software that created it. If the changes on the screen occur in bit units, does that mean that unseen bit particles create them? In fact, the screen changes in bit units because the bit is the basic unit of the processing behind the screen. Likewise, electro-magnetic effects on the screen of our space can occur in photon units because the photon is the basic unit of the processing that changes the screen.

In this model, quantum processes create physical effects and the photon is the basic network process, so changes in electro-magnetism occur in photon units for the same reason that computer screens change in bit units. What looks like photon particles acting could then be a basic process instead. The correlation we see between photons and electro-magnetism doesn’t imply causation, and confusing correlation with causation is the oldest error in science (Note 1).

Processing that explores every option doesn’t need agents to push it, so an electron that falls to a lower energy orbit doesn’t need an orbit boson to make it so. The forces that physics attributes to imaginary particles can be explained by quantum processing as follows:

1. Electromagnetism. The standard model needs virtual photons to explain electro-magnetism, but a processing model just lets a photon be the basic quantum process. No virtual agents are then needed to explain electrical and magnetic effects (Chapter 5).

2. The strong effect. The standard model needed a new field that created eight gluons with three charges to explain how the nucleus holds together, but a processing model just lets quarks share photons to achieve stability. Again, no virtual agents are needed (4.4.4).

3. The weak effect. The standard model needed another field, three more bosons, and two new charges to explain how neutrons decay, and still couldn’t explain why protons don’t decay in empty space. In contrast, a processing model just lets neutron decay be a neutrino effect, and predicts that protons only decay in stars. Again, no virtual agents are needed (4.4.6).

4. The Higgs. If W-bosons don’t exist, the Higgs boson isn’t needed at all. It is just another species in the already over-flowing menagerie of particles that had no role in the evolution of matter, so yet another virtual particle that adds nothing new to our knowledge is avoided (4.4.7).

5. Gravity. Every attempt to find gravitons has failed but standard model iconographies still display them as if they were real (Figure 4.17). But if gravity alters space and time, as Einstein says, how can particles that exist in space and time do that? In contrast, Chapter 5 attributes gravity to a processing gradient.

Figure 4.17. The CERN Standard Model

A processing model explains what the standard model does without virtual particles, so they are unnecessary agents that needn’t exist at all. And even if they did, the standard model tells us nothing about them. For example, if the Higgs acts on weak bosons to give mass, how do other bosons interact? A quark can be subject to electromagnetic, strong, weak, Higgs, and gravity forces, so what will happen if a virtual photon, gluon, weak boson, Higgs, and graviton appear at the same time? The standard model doesn’t say. Also matter bosons imply anti-matter versions, so what happens if a Higgs meets an anti-Higgs? Again the standard model doesn’t predict anything, so physics is better off without it.

Note 1. The number of ice-creams sold in America correlates with deaths by drowning, so do ice-creams kill? In Europe, the number of stork nests correlates with the number of human babies born, so do storks bring babies? In both cases, X and Y correlate because both are caused by a third agent Z, namely the weather, not because they cause each other. Correlation is not causation.
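A toy simulation, with entirely made-up numbers, shows how a common cause manufactures correlation. Here a hypothetical temperature Z drives both X (ice-cream sales) and Y (drownings), which never influence each other, yet they correlate strongly:

```python
import random

random.seed(0)
temps = [random.uniform(0, 35) for _ in range(1000)]         # Z: the weather
ice_cream = [2.0 * z + random.gauss(0, 5) for z in temps]    # X, caused only by Z
drownings = [0.1 * z + random.gauss(0, 0.5) for z in temps]  # Y, caused only by Z

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(corr(ice_cream, drownings))  # strongly positive, yet no causal link
```

The printed correlation is high (around 0.9), although by construction neither X nor Y affects the other.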

QR4.5.2 The Frog in the Pan

In the apocryphal story, a frog dropped in a pan of hot water jumps out immediately but if put in tepid water that is slowly heated, by the time it realizes the danger it is too weak to jump out and perishes. It is now proposed that the standard model did something similar to the science of physics last century.

Faraday’s proposal that electric fields affected charges from afar was initially considered fanciful because it was a disembodied force acting at a distance, and Newton had already rejected this for gravity:

“That gravity should be innate, inherent and essential to matter, so that one body may act upon another at a distance thro’ a vacuum, without the mediation of anything else … is to me so great an absurdity, that I believe no man … can ever fall into it. Gravity must be caused by an agent…” (Oerter, 2006), p17.

Likewise, charge effects had to be caused by a physical agent. Maxwell then developed the equations of electro-magnetism by imagining physical ball-bearings twisting in vortex tubes, but later attempts to develop this into a physical model failed. To resolve this quandary, field theory proposed that field-created particles moved charges in an electric field, and as Maxwell’s equations acted in photon units, photons were taken to be the force-carriers of electro-magnetism.

The standard model was then born, when it attributed electro-magnetic effects to photons from a Faraday field. Unlike real photons, these photons couldn’t be observed, as they are consumed by the effect they cause, so they were called virtual photons. This worked nicely, so no-one worried that a cause had been assumed after an effect, but since this cause predicted nothing new, the pseudo-science temperature had just gone up a notch.

The strength of science is based on its power to predict physical events not its equations, because equations aren’t theories. Theories increase knowledge but equations just summarize what we know already, so equation results aren’t new predictions. And since virtual photons didn’t predict anything else that was new, the science of physics was made weaker.  

Even so, the standard model then generalized that since photons are bosons, all fields act by boson agents, so gravity had to work by gravitons that to this day have no physical equivalent. There is no evidence at all that such particles have ever existed, but the standard model says they cause gravity. And since they predict nothing new about gravity, again the pseudo-science temperature went up.

Buoyed by its acceptance, the standard model then proposed that protons were held together in the atomic nucleus by a strong field that created virtual gluons with a color charge, so now it had a field that created charge. Yet again, this added nothing to our knowledge of the nucleus, so again the pseudo-science temperature increased.

Explaining why neutrons decay in empty space was more of a challenge, as it needed a weak field to produce virtual agents with charge and mass, unlike virtual photons that have neither mass nor charge. Some evidence was needed, so resonances from billions of transient accelerator events were examined, and when compatible ones were found, W-bosons were declared to exist. The equations then implied that protons decay like neutrons, but they don’t, so this failure to predict weakened the science of physics yet again.

Finally, the standard model had to explain how a field could create mass, so its answer was yet another field, this time with a virtual particle so massive that only a billion-dollar accelerator could justify it. All to support Newton’s canon that:

“…the forces of Nature are deeply entwined with the elementary particles of Nature.” (Barrow, 2007), p97.

This statement sounds good but doesn’t mention that the elementary particles it refers to are virtual! Physics has pasted field upon field to prove Newton’s belief, until now virtual particles pop in and out of space to cause every effect. They are said to be everywhere, making everything happen, but what have they added to the science of physics? The answer, honestly, is not much, as they either predict wrongly or not at all.

Virtual particles are the scientific version of a blank check, whose amount can be filled in after it is known. Every new virtual cause the standard model invented weakened physics until, like the frog in the pan, it is now in danger of dying as a science. Yet the modern age of fairy-tale physics (Baggott, 2013) was created one fairy at a time, by the scientists themselves.

QR4.5.1 Going Nowhere

The standard model explains how forces act at a distance by invoking virtual particles that come from fields, so the earth’s field of gravity holds the moon in orbit by graviton particles, and the strong field holds the nucleus together by gluon particles, where according to Feynman:

“A real field is a mathematical function we use for avoiding the idea of action at a distance.” (Feynman, Leighton, & Sands, 1977), Vol. II, p15-7.

The electromagnetic field is then a mathematical function used to describe how charge acts from afar, but an equation isn’t a theory. Equations summarize known data but don’t predict as theories do, as the new values they produce are expectations, not predictions. Field theory is only valid science if it predicts something new, that wasn’t known before, but it didn’t do that. That virtual photons cause charge effects didn’t add knowledge about charge, but it avoided the idea of action at a distance, so it was accepted. It explained the equation but didn’t advance science because it didn’t predict new facts, just as borrowing $10 to make a $10 profit isn’t an advance. Science advances when theories explain more than they assume, not when every new fact needs new assumptions. This tactic of assuming new virtual causes for new forces explains why field theory is currently going nowhere, as will be seen.

By analogy, if I say that your tooth was taken by a tooth fairy, maybe a sock fairy took my sock, and perhaps a spoon fairy took the lost spoon, but where does it end? These causes lead nowhere because they don’t predict anything new. Likewise, a theory based on gravitons and gluons that predict nothing new is also going nowhere.

The best example of this failure to predict is string theory, which tried to explain all physics by field mathematics. The result was so many possible architectures that anything was possible, but no-one could say which one applied. It was mathematically impressive but scientifically useless, so it led nowhere. Based on field theory, it assumed that our space has eight additional dimensions, but they interacted in so many ways that the result was meaningless. According to Woit, string theory is pseudo-science because:

“The possible existence of, say, 10^500 consistent different vacuum states for superstring theory probably destroys the hope of using the theory to predict anything. If one picks among this large set just those states whose properties agree with present experimental observations, it is likely there still will be such a large number of these that one can get just about whatever value one wants for the results of any new observation.” (Woit, 2006), p242.

The basic problem is that inventing a field across all space adds what mathematics calls a degree of freedom to it, so adding many fields is like adding many dimensions to space. Based on current field theory, gravity then adds one dimension, electromagnetism adds two, the strong force adds three, and the weak force two. These eight extra dimensions, plus the three of space, are why string theory has to assume eleven dimensions. Its failure to predict illustrates why inventing new fields to explain new facts isn’t leading anywhere either.
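On the chapter’s own counting, the arithmetic is simply:

$$ \underbrace{1}_{\text{gravity}} + \underbrace{2}_{\text{EM}} + \underbrace{3}_{\text{strong}} + \underbrace{2}_{\text{weak}} = 8 \text{ extra dimensions}, \qquad 8 + 3 = 11. $$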

That a universe of eleven dimensions somehow collapsed into our three-dimensional world is an untestable theory, like the multiverse, that only exists to support our preconceptions. It led nowhere because the goal of science is to explain outcomes, not equations, but how did it come to this?

QR4.5 Fields Upon Fields

Physics spent much of last century trying to prove Newton’s idea that particles cause all the forces in nature. How gravity acted at a distance puzzled Newton, because no particles seemed to cause it. The standard model solved this problem by proposing that fields create virtual particles called bosons that exert forces. First the electro-magnetic field had photon bosons, then the strong field had gluon bosons, then the weak field had W bosons, and finally the Higgs field had a Higgs boson. By analogy, the standard model attributes the force of gravity to graviton bosons, despite no evidence at all. As a result, our universe is seen as fields upon fields, each creating different virtual particles that cause different forces. This section presents the alternate view that only one field, the quantum field, does all the above.

QR4.5.1 Going Nowhere

QR4.5.2 The Frog in the Pan

QR4.5.3 Virtual Particles Aren’t Needed

QR4.5.4 A Model That Absorbs Data

QR4.5.5 A Particle Toolbox

QR4.5.6 The Last Standard Model

QR4.5.7 The Particle Model

QR4.5.8 A Processing Model

QR4.5.9 Testing The Theory
