QR4.5.9 Testing The Theory

Science tests theories by comparing what they predict with what actually happens. In this case, the processing model predicts that light at the highest frequency will collide to create matter, while the particle model predicts that light can never collide:

“Two photons cannot ever collide. In fact light is quantized only when interacting with matter.” (Wikipedia, 2019).

The standard model doesn’t let light collide because photons are bosons that share quantum states. However, if photons are the core process of a network, at a two-point wavelength they will overload it, i.e. collide. The evidence supporting the idea that light can collide to produce matter includes:

1. Confined photons have mass. A free photon is massless, but in a hypothetical 100% reflecting mirror box it has mass, because as the box accelerates, unequal photon pressure on its walls creates inertia (van der Mark & ’t Hooft, 2011). By this logic, photons confined at a point, in a standing quantum wave, will have mass.

2. Einstein’s equation. Einstein’s equation E=mc² works both ways, so if mass can turn into energy in nuclear bombs, photon energy can become mass, as the Breit-Wheeler process allows (see the threshold estimate after this list).

3. Particle accelerator collisions create matter. Protons that collide and stay intact produce new matter that didn’t exist before, drawn from the collision energy, so high-energy photons could do the same.

4. Pair production. High-frequency light near a nucleus produces electron-positron pairs, which can annihilate back into space.

5. Light collides. When laser photons at the Stanford Linear Accelerator were fired at an electron beam moving at almost the speed of light, some electrons knocked a photon back with enough energy to hit the laser photons behind it, producing matter pairs that a magnetic field pulled apart for detection (Burke et al., 1997).
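As a cross-check on items 2 and 5, the standard Breit-Wheeler threshold follows directly from Einstein’s equation. For two photons of energies E₁ and E₂ colliding head-on, pair creation γγ → e⁺e⁻ requires their invariant mass to reach two electron rest masses:

$$s = 4E_1E_2 \ge (2m_ec^2)^2 \;\Rightarrow\; E_1E_2 \ge (m_ec^2)^2 \approx (0.511\ \text{MeV})^2$$

So in principle, two gamma-ray photons of about 0.511 MeV each, meeting head-on, carry enough energy to become an electron-positron pair; at glancing angles or lower energies the threshold is not met.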

That extreme light colliding in a vacuum produces matter is a plausible prediction that can be tested by experiment. And if light can create matter, the boson-fermion divide of the standard model falls because bosons can produce fermions. 

Physicists expected accelerator collisions to unlock the secrets of our universe, but they didn’t. They found transient flashes, not permanent particles. In nature, what doesn’t survive isn’t the future, so these ephemera are evolutionary dead-ends that led nowhere because they weren’t stable. If matter evolved from light, the future of physics lies in colliding light, using light colliders rather than matter colliders.

That matter causes everything is just a theory, and scientists who don’t question their theories are priests. Light is simpler than matter, so that matter came from high-frequency light is a reasonable theory that can be tested by experiment. Let the facts then decide whether it is right or wrong.


QR4.5.8 One Process

Figure 4.19. A processing model

The model of Figure 4.19 is based on one process, a simple circle that gives the null result of space. A first event then separated this process to begin our universe as a plasma of pure light, with no matter at all. This extreme light was diluted by the expansion of space, but some of it collided to make the first matter. A one-axis collision gave electrons or neutrinos, based on phase, while a three-axis collision gave up or down quarks, also based on phase. In both cases, the net processing that repeated was mass and what didn’t run was charge.

Quarks then combined into protons or neutrons by sharing photons, to give nuclei that attracted electrons. The first atom, hydrogen, was just a proton and an electron, but neutrons allowed higher atoms to form by nucleosynthesis. Space, light, and matter then all come from the same quantum process.

Unlike the standard particle model (Figure 4.18), this model (Figure 4.19) explains:

1. The origin of matter. Matter evolved, first from light, then into higher atoms, so it isn’t fundamental.

2. The forces of nature. All forces come from quantum waves on a network, not particle agents.

3. Anti-matter. Processing implies anti-processing, while particles have no natural inverse.

4. Space. Space is a network null process, not nothing at all.

5. Neutrinos. The neutrino is an electron byproduct, not a pointless particle.

6. Charge. Charge is a mass byproduct, not an arbitrary property.

7. Quarks. Quarks are a three-way electron collision.

The model of Figure 4.19 has no virtual agents and the same process underlies light, matter, and space. It also answers questions that a particle model can’t, such as:

1. Why does matter have mass and charge? (4.3.2)

2. Why do neutrinos have a tiny but variable mass? (4.3.3)

3. Why does anti-matter exist? (4.3.4)

4. Why isn’t anti-matter around today? (4.3.5)

5. Why are quark charges in strange thirds? (4.4.3)

6. Why does the nuclear force increase with distance? (4.4.4)

7. Why don’t protons decay in empty space? (4.4.6)

8. Why does the energy of matter depend on the speed of light? (4.4.8)

9. Why do atomic nuclei need neutrons? (4.6.1)

10. How did electron shells evolve? (4.6.2)

11. Why do charges add simply but mass doesn’t? (4.7.3)

12. Why is the universe charge neutral? (4.7.4)

13. What is dark matter? (4.7.6)

14. What is dark energy? (4.7.7)

These answers require only that quantum waves are processing waves on a network. A quantum network that defines space then keeps point matter entities apart naturally. A network transfer rate of one point per cycle then makes the speed of light constant. Electrons and neutrinos are then brother leptons because they are phases of the same interaction. Up and down quarks as phases of a three-axis interaction will then have one-third charges. Finally, a process that creates matter allows an anti-process to create anti-matter. One process that evolved can explain what many particles can’t.
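The speed-of-light claim above can be sketched as follows (the symbols ℓ and τ are illustrative, not from the text): if the network passes processing to at most one adjacent point per cycle, with point spacing ℓ and cycle time τ, the maximum propagation speed is

$$c = \frac{\ell}{\tau}$$

which is set by the network itself rather than by any source or observer, so every observer measures the same speed of light.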

The idea that God made our world like a clock, from existing parts, doesn’t work. If the particles of the standard model are a universal Lego-set, why are higher electrons and quarks irrelevant to the world we see? And if these bits were lying around before our universe began, where did they come from?

The simpler alternative is that before our universe began, it didn’t exist at all, so instead of divine shortcuts, everything had to be made, including our space and time! This wasn’t possible in one step so light, being simpler, evolved before matter did.

The Mandelbrot set illustrates how a simple process can evolve complexity, as one line of code repeated produces an endless variety of outcomes (Figure 4.20).

Figure 4.20. Mandelbrot’s set, a. Main, b. Detail
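To make that concrete, here is a minimal sketch in Python of the one repeated rule behind Figure 4.20, z → z² + c (the grid size and iteration limit are illustrative choices, not from the text):

```python
# One repeated rule, z -> z*z + c, generates the Mandelbrot set:
# points c whose orbit stays bounded are in the set, the rest escape.
def mandelbrot(c: complex, max_iter: int = 100) -> int:
    """Iterations for z -> z*z + c to escape |z| > 2, else max_iter."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c              # the single repeated rule
        if abs(z) > 2:             # orbit escaped: c is outside the set
            return n
    return max_iter                # orbit stayed bounded: c is in the set

# Coarse ASCII rendering of the region [-2, 1] x [-1.25, 1.25].
for row in range(24):
    im = 1.25 - row * 2.5 / 23
    print("".join("#" if mandelbrot(complex(-2 + col * 3 / 75, im)) == 100
                  else " " for col in range(76)))
```

The endless boundary detail of Figure 4.20 comes entirely from repeating that one line, which is the sense in which a trivially simple process can generate unbounded complexity.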

And if space became light that became matter that became us, our entire universe could have come from what we call nothing:

“The world is a thing of utter inordinate complexity and richness and strangeness that is absolutely awesome. I mean the idea that such complexity can arise not only out of such simplicity, but probably absolutely out of nothing, is the most fabulous extraordinary idea. And once you get some kind of inkling of how that might have happened, it’s just wonderful.” Douglas Adams, quoted by Dawkins in his eulogy for Adams (17 September, 2001).

That physical complexity comes from quantum simplicity supports this idea, but how can it be tested?


QR4.5.7 Many Particles

If matter exists, it should break down into basic bits, so for over a century particle physics has been colliding matter to find elementary particles that don’t break down further (Figure 4.18).

Yet when pressed on what these particles actually are, experts retreat to equations that don’t describe particles at all. This bait-and-switch, talking about particles but giving wave equations, is now normal. The equations describe quantum waves, not particles, but since the waves are said to be imaginary, it doesn’t matter! Feynman explains how this double-speak began:

“In fact, both objects (electrons and photons) behave somewhat like waves and somewhat like particles. In order to save ourselves from inventing new words such as wavicles, we have chosen to call these objects particles.” (Feynman, 1985), p85.

But imagine if we said “This vehicle has two wheels like a bicycle and an engine like a car, so to avoid inventing a new word like motorcycle, we have chosen to call it a car”. Who would accept that? Physicists with accelerators seem to see everything as a particle, just as a boy with a hammer sees everything as a nail. However, what was actually found was:

1. Ephemeral. The standard model tau particle is actually an energy spike lasting less than a million-millionth of a second. A lightning bolt is long-lived compared to that, and it isn’t a particle, so why is a tau? Particles should live longer than that!

2. Transformable. When a neutron decays into a proton and an electron, three elementary particles become four, so how then are they elementary? 

3. Massive. A top quark has the same mass as a gold nucleus of 79 protons and 118 neutrons, but why have a cosmic Lego-set with a huge building block that plays no part in the universe we see? 

4. Unstable. Top quarks also instantly decay, but calling what decays elementary is a strange use of the term. 

Entities that decay and transform into each other aren’t elementary because what is elementary shouldn’t do that. Equally, energy events that last less than a millionth of a second aren’t particles because particles should last longer than that. It follows that the elementary particles of the standard model are neither elementary nor particles.

Calling them building blocks doesn’t work either, as how can one build a house from bricks that only exist for a moment, or instantly decay, or transform into other bricks? And why have a building block that is bigger than most houses? Of all these building blocks, only the electron is stable alone, and it adds hardly anything to the mass of an atom.

Figure 4.18. The standard particle model

In Figure 4.18, the standard model divides its particles into fermions that cause matter and bosons that cause forces. This, we are told, is the end of the story, because accelerators can’t break matter down further, but how do particles that exist at a point take up space? Apparently, virtual particles from fields keep them apart, but this theory can’t be tested because virtual particles can’t be observed.

This particle model satisfies neither logic nor science, but survives if we don’t look behind the curtain of physical reality. The Wizard of Oz told Dorothy to “pay no attention to that man behind the curtain” to distract her from the real cause of events, and today’s wizards tell us to ignore the quantum waves that actually cause physical events.


QR4.5.6 The Last Standard Model

Ptolemy’s Model

In the second century, Ptolemy’s Almagest let experts predict the movements of the stars for the first time, based on the belief that heavenly bodies, being heavenly, circled the earth in perfect circles, or in circles within circles (epicycles). It wasn’t true, but it worked, and its followers made it work for a thousand years. When new stars were found, they expanded the model to explain them, which made it more complex. This medieval standard model explained all the planets and stars, until a new one was found, and only fell when Copernicus, Kepler, Galileo, and Newton developed a causal model to replace it.

Scientists now see Ptolemy’s model as primitive, but it satisfied the experts of the day, just as the standard model does today, so it is interesting that both models are:

1. Descriptive. Both describe what is but don’t predict anything new. They summarize patterns, as equations do, but science isn’t just about description.

2. Based on free parameters. The medieval standard model let experts choose the free parameters of epicycle, eccentric, and equant to fit the facts, and the modern standard model lets them choose the free parameters of field, boson, and charge.

3. After the fact. The medieval standard model defined its epicycles after a new star was found, and the modern standard model bolts on a new field after a new force is found.

4. Barren. The medieval standard model couldn’t produce anything new, like Kepler’s laws, and the modern standard model is the same, so it can’t deduce that matter evolved from light.

5. Absurdly complex. Medieval astronomers tweaked their model until it became absurdly complex, just as today, the equations of string theory fill pages, even books.

6. Normative. The medieval standard model was the norm of its day, so any criticism of it was seen as an attack on tradition, just as now, any critique of today’s standard model is seen as an attack on physics itself (Smolin, 2006).

7. Invalid. We now know that the planets and stars don’t move in circles around the earth, and it may also be true that virtual particles are unnecessary agents that don’t exist.

When the medieval church pressured Galileo to recant, they didn’t ask him to deny that the earth went around the sun. They only asked him to call it a mathematical fiction, not a description of reality. Today, physicists call quantum theory a mathematical fiction of their own accord, but what if quantum waves really exist, just as the earth really does go around the sun?

The scientific method has three steps: first it describes patterns, then it finds correlations, and finally it attributes causes (Rosenthal & Rosnow, 1991). The standard model is then a descriptive model that didn’t become a causal theory because physicists rejected quantum causes. Ironically, Everett then fantasized about many worlds (Everett, 1957), and Witten built M-theory, his mathematical castle in the air, neither of which led anywhere. The current standard model is then a dead end in the history of science, just as the last standard model was.


QR4.5.5 A Particle Toolbox

A model that invents virtual particles to explain results after they are found is just a toolbox that produces particle causes, so when anti-matter was discovered, it just added a new column, and when family generations were found, it just added new rows. When the muon, then called the mu-meson, was discovered, it was so unexpected that Nobel laureate Isidor Rabi quipped “Who ordered that?”, but the standard model just added a new generation and carried on. When new facts arrive, the standard model accommodates them in its existing structure or adds a new room.

Scientific theories should be falsifiable, but how can one falsify a model that absorbs rather than adds knowledge? It proposed gravitons that a long search hasn’t found, so was that a fail? It predicted proton decay, but twenty years of study pushed the proton’s lifetime far beyond the age of the universe, so was that a fail? It expected matter and anti-matter to exist in equal amounts, so is our universe of matter a fail? It expected massless neutrinos, until experiments found they had mass, and penta-quarks and strange matter, until a two-decade search found neither, and the list goes on. It expected weakly interacting massive particles (WIMPs) to explain dark matter, but again a long search found nothing. The standard model is like a hydra: when the facts cut off one head, it just grows another. What will it take to falsify a model whose failures are called unsolved problems in physics?

The standard model’s claim to fame is that it can calculate results to many decimal places, but in science, accuracy isn’t validity. An equation that accurately interpolates between known data points isn’t a theory that extrapolates to new points. Theories are judged by their predictions, not their accuracy, yet today’s physicists, fed on equations not science (Kuhn, 1970), confuse the two, for as Georgi said:

“Students should learn the difference between physics and mathematics from the start.” (Woit, 2007), p85.

The difference is that physics needs valid theories not accurate equations. A theory is valid if it is true, so if a model can’t predict, it doesn’t matter how accurate it is.

The standard model claims to have predicted top and charm quarks before they were found, but predicting quark generations after finding lepton generations is like predicting the last move in a tic-tac-toe game: inevitable. After all, it didn’t predict family generations in the first place. It also claims to have predicted gluons, weak particles, and the Higgs, but a theory predicting what it invents isn’t predicting. Fitting equations to data, then matching their terms to ephemeral flashes in accelerator events, is like reading tea-leaves: if you look hard enough, you will find something. According to Wyszkowski’s Second Law, anything can be made to work if you fiddle with it long enough.

For example, why is a top quark 300,000 times heavier than an electron? The standard model answer is that it just is, so no wonder what baffled physics fifty years ago still baffles it today. Equations summarize the data that made them but theories should do more, so where are they? Currently, only the standard model exists, and it isn’t producing any new knowledge. The last time such a barren model dominated thought so completely was before Newton.


QR4.5.4 A Model That Grows Itself

Occam’s razor, not to multiply causes unnecessarily, is the pruning hook of science, but the standard model ignores it. Once, physics was just about mass, charge, and spin, but now it has isospin, hypercharge, color, chirality, flavor, and other esoteric features. The current standard model needs sixty-two particles (Note 1), five fields, sixteen charges, and fourteen bosons to work (Table 4.6). If it were a machine, one would have to hand-set over two dozen knobs just right for it to light up, so it isn’t preferred today because it is simple.

For this complexity one might expect completeness, but the standard model can’t explain gravity, proton stability, anti-matter, quark charges, neutrino mass, neutrino spin, family generations, or the dark energy and matter that constitute 95% of the universe.

Its main feature is that with each new finding it grows, so to explain inflation it needs a hypothetical symmetron field, and to explain neutrino mass it needs another seven or eight arbitrary constants:

“To accommodate nonzero neutrino masses we must add new particles, with exotic properties, for which there’s no other motivation or evidence.” (Wilczek, 2008), p168.

Good theories grow knowledge from data as good gardens grow plants from water, but the standard model is like a sponge that absorbs water to make itself bigger yet remains barren. A model that explains new facts by growing itself, not knowledge, goes against science. Particle physics is now stagnant because inventing virtual particles to explain equations after the fact is science in reverse. Physics then needs a real theory based on quantum mechanics, not a particle toolbox.


Note 1. Two leptons with three generations plus anti-matter variants is 12. Two quarks with three generations plus anti-matter variants and three colors is 36. Plus one photon, eight gluons, three weak bosons, one graviton and the Higgs is another 14. The total is 62.
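Restating that count as a single sum:

$$\underbrace{2\times3\times2}_{\text{leptons}}+\underbrace{2\times3\times2\times3}_{\text{quarks}}+\underbrace{1+8+3+1+1}_{\text{bosons}}=12+36+14=62$$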

QR4.5.3 No Unnecessary Agents

The principle of not invoking unnecessary agents is fundamental to science. It is embodied in Occam’s Razor: if two theories have equal explanatory power, prefer the one that makes fewer assumptions. For example, suppose one can see a computer screen but not the hardware and software that run it. The screen changes in bit units, so unseen bit particles could cause that, but the screen might also change in bits because that is the basic computer process. Now if new effects like color and movement require assuming more particles, while a bit process could still cause them, science prefers the latter theory by Occam’s Razor, and in this case we know the latter is true.

Likewise, electro-magnetic effects could be caused by virtual photons, or because the photon is a basic process. Either could be true, but more virtual particles are needed to explain nuclear bonding and neutron decay, while a processing theory needs no further assumptions, so it is preferred. Electro-magnetism then occurs in photon units for the same reason that computer screens change in bit units. There is a correlation between photons and electro-magnetism, but in science, correlation is not causation (Note 1).

The quantum processes envisaged here always run and spread naturally on the network, so they don’t need agents to push them. For example, when an electron falls to a lower energy orbit, it doesn’t need an orbit particle to make it so. The force particles of the standard model are then explained by processing as follows:

1. Electro-magnetism. The standard model needs virtual photons to explain charge and magnetism but if a photon is the basic quantum process, no virtual agents are needed to explain electro-magnetic effects (Chapter 5).

2. The strong effect. The standard model needed a new field that created eight gluons with three charges to explain nuclear bonding, but if quarks bond by sharing photons to achieve stability, again no virtual agents are needed (4.4.4).

3. The weak effect. The standard model needed another field, three new particles, and two new charges to explain neutron decay, and still couldn’t explain why protons don’t do the same, but if neutron decay is a neutrino effect, protons will only decay in stars, and again no virtual agents are needed (4.4.6).

4. The Higgs. If weak particles don’t exist, the Higgs boson isn’t needed at all. It’s just a flash-in-the-pan accelerator resonance that didn’t survive to affect the evolution of matter, so it’s the ultimate unnecessary virtual agent (4.4.7).

5. Gravity. Every attempt to find gravitons has failed, as gravitational waves aren’t particles, but the standard iconography still shows gravitons as real (Figure 4.17). In relativity, gravity alters space and time, but particles that exist in space and time can’t do that. Chapter 5 attributes gravity to an electro-magnetic field gradient.

Figure 4.17. The CERN Standard Model

If a processing model explains the forces of physics without virtual particles, they are unnecessary agents. That all the forces of nature come from one field, including electro-magnetism and gravity, is simpler than assuming many fields with many particles, so Occam’s razor prefers it. 

In contrast, if the Higgs interacts with particles to create mass, how do the other particles interact? A quark can experience electro-magnetic, strong, weak, and gravity forces at the same time, so how do virtual photons, gluons, weak particles, and gravitons affect each other? The standard model doesn’t say. And matter particles have anti-matter versions, so what happens if a Higgs meets an anti-Higgs? Again, the standard model predicts nothing, so physics is better off without it.

Note 1. The number of ice-creams sold in America correlates with deaths by drowning, so do ice-creams kill? In Europe, the number of stork nests correlates with the number of human babies born, so do storks bring babies? In both cases, X and Y correlate because both are caused by a third agent Z, the weather in one case and rural living in the other, not because they cause each other. Correlation is not causation.


QR4.5.2 Weakening Science

In a well-known story, a frog put in a pan of hot water jumps out immediately, but if put in tepid water that is slowly heated, it doesn’t realize the danger and perishes. The story isn’t literally true, but it illustrates how a gradually changing environment can prove fatal if unrecognized. For example, over centuries the natives of Easter Island cut down the trees on their island until their community collapsed, but why did they, seemingly irrationally, chop down the last tree? The theory of creeping normality suggests they didn’t see their environment degrade because the change was gradual (Diamond, 2005). It is argued here that particle physics is now stagnant because it did the same thing to its scientific environment.

That Faraday’s electric fields move charges from afar was at first considered fanciful because it was a disembodied force acting at a distance. Newton’s argument that gravity needs a particle agent was:

“That gravity should be innate, inherent and essential to matter, so that one body may act upon another at a distance thro’ a vacuum, without the mediation of anything else … is to me so great an absurdity, that I believe no man … can ever fall into it. Gravity must be caused by an agent…” (Oerter, 2006), p17.

Hence, the attraction and repulsion of charges was thought to also need a physical agent.

Maxwell developed his equations of electro-magnetism by imagining physical ball-bearings twisting in vortex tubes, but later attempts to develop a physical model from this failed. It was then proposed that electro-magnetic effects occurred in photon units because photons were the force-carriers of electro-magnetism.

The standard model began when charge effects were attributed to photons from the electro-magnetic field. They weren’t observable like real photons, because their effect consumed them, so they were called virtual photons. They justified the equations, so no-one noticed the effect on the scientific environment of assuming a cause that wasn’t falsifiable or productive. The strength of science is its ability to explain, so assuming a cause that explained nothing more weakened it. This was bad science, and the scientific environment of particle physics was degraded.

Buoyed by its acceptance, the standard model then argued that all fields are the same, so gravity was caused by gravitons that to this day have no observed basis. There is no evidence at all that they exist, and they predict nothing new about gravity, so again particle physics became scientifically weaker.

The standard model then proposed that a strong field held the atomic nucleus together by creating gluons with a color charge, so it now had a field that created charge. Again, gluons added nothing to our knowledge of the nucleus, so the scientific environment was degraded further.

Explaining why neutrons decay in space was more challenging, as now a field had to produce particles with charge and mass. Some evidence was needed, so billions of accelerator events were examined, and when compatible resonances were found, weak particles were declared to exist. This predicted that protons would decay like neutrons, but they don’t, so the science of particle physics weakened again.

Finally, the standard model had to explain how a field could create mass. Its answer was of course yet another field, with a virtual particle so massive that it needed a billion-dollar accelerator to justify it. All to support Newton’s canon that:

“…the forces of Nature are deeply entwined with the elementary particles of Nature.” (Barrow, 2007), p97.

It sounds good, but the elementary particles it refers to are virtual, not real. The standard model has pasted field upon field to prove Newton’s belief in particles, so now virtual particles pop out of space to cause every effect. They are said to make everything happen, but what did they add to knowledge? The brutal fact is that they either predict wrongly or add nothing at all.

A new field is the scientific version of a blank check whose amount is filled in after it is known, so adding fields to space was a failure of science, not a success of physics. Theories that aren’t even wrong (Woit, 2006) have produced what some call fairy-tale physics (Baggott, 2013), created one fairy at a time by the physicists themselves.


QR4.5.1 What is a Field?

According to Feynman:

“A real field is a mathematical function we use for avoiding the idea of action at a distance.” (Feynman, Leighton, & Sands, 1977), Vol. II, p15-7.

For example, the electro-magnetic field based on Maxwell’s equations is a mathematical function that describes how electro-magnetic waves travel through empty space to create electrical and magnetic effects. Yet it isn’t a theory of how that happens because those equations require a complex plane that doesn’t exist physically, so it is a law of physics not a theory of physics.

In science, an equation like E=mc² is a mathematical law that relates data facts, usually by an equals sign, while a scientific theory explains those facts. For example, the law of gravity is an equation that relates the force of gravity to mass and distance, but it isn’t a theory of gravity, so it puzzled even Newton, who wrote it. Scientific laws specify how observed facts work while scientific theories explain why, so they aren’t the same. For example, the germ theory of disease has no equations but still works as a theory. In general, theories are theories and laws are laws, and the two are always different.

Maxwell’s equations then describe how light travels but don’t explain it in theory terms, and the same is true for quantum electrodynamics (QED), its quantized extension. But when the standard model adds that the equations work because the electro-magnetic field emits photons, this is a theory. The theory that gravitons cause gravity, gluons cause nuclear bonds, and weak particles cause neutron decay must then stand on its own, apart from the equations it explains.

Theories survive by predicting new facts, while equations just have to interpolate between the facts they are based on. This ability to produce new knowledge is how theories and equations differ, but the theory that virtual photons cause charge effects didn’t reveal anything new about charge; it just avoided the idea of action at a distance.

Attempts to develop field theory mathematics into a Theory of Everything led to superstring theory, then Witten’s M-theory, which assumes our space has eight extra dimensions, curled up so we can’t see them. Unfortunately, the result was equations that can predict anything, and so predict nothing, as Woit explained decades ago:

“The possible existence of, say, 10⁵⁰⁰ consistent different vacuum states for superstring theory probably destroys the hope of using the theory to predict anything. If one picks among this large set just those states whose properties agree with present experimental observations, it is likely there still will be such a large number of these that one can get just about whatever value one wants for the results of any new observation.” (Woit, 2006), p242.

M-theory can predict whatever you want, so it can’t be falsified, which is bad news in science. That a universe of eleven dimensions somehow collapsed into our three-dimensional world is untestable because no experiment can deny it. Good science is both fruitful and falsifiable but M-theory was neither, so that it led nowhere is no surprise, yet thousands of scientific papers were written on it!

A field that extends across all space adds a degree of freedom to it, so adding a field to space equates to adding a dimension to it. Based on field theory, gravity adds one dimension, electro-magnetism adds two, the strong force three, and the weak force two. Eight extra dimensions, plus three of space, require M-theory to have eleven dimensions, which interact to make anything possible, so M-theory failed because the standard model invented fields. M-theory became a theory of nothing because in science, assuming a fact to explain a fact isn’t profitable, just as borrowing $100 to make $100 isn’t a profit in business. Yet this strategy of explaining by assuming has a long history in particle physics.
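Restating that dimension count as arithmetic:

$$\underbrace{1}_{\text{gravity}}+\underbrace{2}_{\text{electro-magnetism}}+\underbrace{3}_{\text{strong}}+\underbrace{2}_{\text{weak}}=8,\qquad 8+\underbrace{3}_{\text{space}}=11$$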


QR4.5 Fields Upon Fields

Newton believed that only matter can push matter, so particles cause all the forces of nature, but how the earth’s gravity kept the moon in orbit puzzled him. The earth doesn’t touch the moon, yet it pulls it from a distance, so how can particles do that?

Figure 4.17. The CERN Standard Model

The standard model answer is that fields create particles that exert forces. They can’t be seen, as their action destroys them, but the equations imply them. Photons from the electro-magnetic field then cause electrical and magnetic forces, gluons from a strong field cause nuclear forces, weak particles from a weak field cause neutron decay, and a Higgs particle creates their mass.

Force-carrying particles (Figure 4.17) were accepted because the equations worked and accelerator energy spikes made them possible, but gravitons are said to cause gravity based on no evidence at all. The standard model describes our universe as fields upon fields, each producing different force particles, so what exactly is a field? 


QR4.5.1 What is a Field?

QR4.5.2 Weakening Science

QR4.5.3 No Unnecessary Agents

QR4.5.4 A Model That Grows Itself

QR4.5.5 A Particle Toolbox

QR4.5.6 The Last Standard Model

QR4.5.7 Many Particles

QR4.5.8 One Process

QR4.5.9 Testing The Theory
