QR4.5.9 Testing The Theory

In science, a new theory is tested when it predicts something that contradicts the old theory. Quantum realism predicts that light, and light alone, collided to create matter. In contrast, the standard model holds that light is made of photon particles that never collide because, as bosons, they can occupy the same quantum state. Table 4.1 is based on a distinction between matter particles (fermions) and force particles (bosons), where fermions collide and bosons don’t. If matter collides because of a basic substantiality that light lacks, then:

“Two photons cannot ever collide. In fact light is quantized only when interacting with matter.” (Wikipedia, 2019)

In contrast, quantum realism predicts that extreme light in empty space will collide to form matter. Evidence to support this includes that:

1. Photons confined have mass. A free photon is massless, but if confined in a hypothetical 100% reflecting mirror box it has a rest mass, because as the box accelerates, unequal photon pressure on its reflecting walls creates inertia (van der Mark & ’t Hooft, 2011). By the same logic, photons entangled in a node will have mass.

2. Einstein’s formula. That matter is energy works both ways: if nuclear bombs can turn mass into energy, then photon energy can create mass. The Breit-Wheeler process describes how high-energy photons can create matter.

3. Particle accelerator collisions routinely create new matter. Protons that collide and stay intact yield new matter that didn’t exist before. If this matter comes from the collision energy, high-energy photons can do the same.

4. Pair production. High-frequency light near a nucleus gives electron and positron pairs that annihilate back into space.

5. Light collides. When laser photons at the Stanford Linear Accelerator were fired into an electron beam travelling at almost the speed of light, some electrons knocked a photon back with enough energy to hit a following laser photon, giving matter pairs that a magnetic field pulled apart for detection (Burke et al., 1997).

That extreme light alone, colliding in a vacuum, gives matter is a prediction that no experiment has yet tested.
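For scale (standard relativity, not specific to quantum realism), two head-on photons can only create an electron-positron pair if

\[
E_1 E_2 \ge (m_e c^2)^2 \approx (0.511\ \text{MeV})^2,
\]

so two equal photons must each carry at least 0.511 MeV, a gamma-ray energy hundreds of thousands of times that of visible light.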

If beams of pure light can collide in pure space to create matter, the boson-fermion distinction of the standard model is challenged, as bosons can then create fermions. If matter evolved from light, the future of physics lies in colliding light, not matter, so physics should build light colliders rather than particle colliders. Recent experiments support the idea that matter can arise from light, although the colliding light came from intense photon bursts created by high-energy particle collisions rather than directly from lasers.

The standard model expected the short-lived energy flashes of its accelerators to unlock the secrets of the universe, but that didn’t happen, and quantum realism says it never will. If matter evolved, our billion-dollar accelerators are just finding transient evolutionary dead-ends, because in evolution, what doesn’t survive doesn’t change the future. The standard model assumes that matter came first, but in quantum realism, light was the first existence.

Physical realism is just a theory, and scientists who don’t question their theories are priests. Last century it was the only game in town, but today quantum realism is the rational alternative: space is null network processing, time is its processing cycles, light is the basic quantum process, and matter is entangled light rebooting. This theory, based on reverse engineering, is testable, so if it is wrong, let the facts decide.


QR4.5.8 A Quantum Processing Model

Figure 4.19. A quantum processing model

A quantum processing model (Figure 4.19) has no virtual bosons to make things happen because dynamic processing on a network, like an ever-flowing river, actively finds stable states. The first event created a plasma of extreme light that diluted to ordinary light as space expanded, and that light collided with itself to give matter as a standing quantum wave. Extreme light overloading one dimension gave electron or neutrino leptons, depending on phase, and extreme light overloading a plane gave semi-stable up or down quarks, again depending on phase. In both cases, the repeating overload caused mass and the repeating remainder caused charge, including the strange one-third charges of quarks.

The only fundamental process in this model is a circle of quantum processing that in one node outputs “nothing”, so in quantum realism, space is null processing.

Distributing this circle gives the sine wave of light, so the entire electromagnetic spectrum is one process more or less shared, and in quantum realism, light is space distributed.
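The circle-to-sine relation used here is standard mathematics, not a new assumption: a point moving uniformly around a circle, read along one axis, traces a sine wave, as Euler’s formula states:

\[
e^{i\omega t} = \cos\omega t + i\sin\omega t,
\]

so one circular process, observed along a single dimension, already has the sinusoidal form of a light wave.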

Up and down quarks achieve stability by photon-sharing in a proton or neutron triangle, and protons, neutrons and electrons then evolved into stable atoms that in time gave us. Matter entities have anti-matter versions with the same mass but opposite charge because processing can run in reverse. The lines in Figure 4.18 are similarities between supposed fundamentals, but in Figure 4.19 they signify a dynamic evolution.

Figure 4.19 is simpler because one fundamental quantum process gives space, light and matter, and it answers questions that the standard model of particles struggles with, including:

1. How do matter and charge relate? (4.3.2)

2. Why do neutrinos have a tiny but variable mass? (4.3.3)

3. Why does anti-matter with the same mass but opposite charge exist? (4.3.4)

4. Where did the anti-matter go? (4.3.5)

5. Why are quark charges in strange thirds? (4.4.3)

6. Why does the force binding quarks increase with distance? (4.4.4)

7. Why don’t protons decay in empty space? (4.4.6)

8. Why does the energy of mass depend on the speed of light? (4.4.8)

9. How did atomic nuclei evolve? (4.6.1)

10. How did electron shells evolve? (4.6.2)

11. Why does mass vary enormously but charge doesn’t? (4.7.3)

12. Why is the universe charge neutral? (4.7.4)

13. What is dark matter? (4.7.6)

14. What is dark energy? (4.7.7)

Some of the above are covered shortly. If a quantum network defines the pixels of space, nothing is needed to keep point matter entities apart. If the quantum network transfer rate is one node per cycle, the speed of light will be a constant. If electrons and neutrinos are phases of the same interaction, they will be brother leptons. If up and down quarks are phases of a three-axis interaction, there will be charges in thirds. If a quantum process creates matter, there must be anti-matter. Quantum processing explains more than inert particles pushed around by forces.
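To illustrate the one-node-per-cycle claim with numbers (the Planck-scale values are an assumption added here for illustration; the text does not fix the node spacing or cycle time), if each node were one Planck length across and each cycle one Planck time long, the transfer rate would be

\[
c = \frac{\ell_P}{t_P} = \frac{1.616\times10^{-35}\ \text{m}}{5.391\times10^{-44}\ \text{s}} \approx 3.0\times10^{8}\ \text{m/s},
\]

a constant set by the network itself, not by whatever moves on it.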

It’s time to abandon Newton’s idea that God put the world together like a clock, from existing bits. The standard model doesn’t describe God’s Lego-set because most of its “fundamental particles” play no part at all in the world we see.

If only quantum reality existed initially, it had to create physical reality from itself, with no divine shortcuts, because there were no basic bits of matter just lying around from which a universe could be made! Given itself alone, it had to create an observer-observed universe by providing both the observer and the observed from itself. This couldn’t occur in one step, so our universe was booted up from a single photon, not made from pre-existing bits. After that, it was complexity evolving from simplicity. The Mandelbrot set illustrates how a simple process can give endless complexity, as one line of code repeated gives rise to endless forms (Figure 4.20). There is no end to the Mandelbrot set not because it was “built” from complex bits but because it is an endlessly dynamic interaction.

Figure 4.20. Mandelbrot’s set, a. Main, b. Detail
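As a minimal sketch of that “one line of code” (the grid and resolution below are arbitrary choices for a quick text rendering), the entire set follows from repeating the rule z → z² + c:

```python
# One repeated rule, z -> z*z + c, generates the endless forms of the
# Mandelbrot set (Figure 4.20). '#' marks points whose orbits stay bounded.

def escapes(c: complex, max_iter: int = 50) -> bool:
    """Return True if the orbit of 0 under z -> z*z + c escapes to infinity."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| > 2, the orbit provably diverges
            return True
    return False

for row in range(24):
    y = 1.2 - row * 0.1
    print("".join("." if escapes(complex(-2.1 + col * 0.05, y)) else "#"
                  for col in range(64)))
```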

Quantum realism describes an essential simplicity hidden by complex outputs. If the null process we call space became light, then light became matter and matter became us, so nothing became everything. As Douglas Adams says:

“The world is a thing of utter inordinate complexity and richness and strangeness that is absolutely awesome. I mean the idea that such complexity can arise not only out of such simplicity, but probably absolutely out of nothing, is the most fabulous extraordinary idea. And once you get some kind of inkling of how that might have happened, it’s just wonderful.” Douglas Adams, quoted by Richard Dawkins in his eulogy for Adams (17 September 2001)

The best argument against physical realism is the ridiculous complexity of the models needed to describe it. Quantum realism derives physical complexity from quantum simplicity.


QR4.5.7 The Particle Model

Aristotle’s ancient idea of a matter substance implies that it can be broken down into fundamental particles, and battering matter into bits seemed the best way to do that. Physics spent much of last century, and billions of dollars, smashing matter apart to find fundamental particles, defined as what can’t be broken down further.

But when pressed on what a particle actually is, physicists retreat to wave equations that don’t describe particles at all. This bait-and-switch, talking about a particle but giving a meaningless wave equation, is now the physics norm. If one points out that the equations describe waves not particles, the reply is that it doesn’t matter because the equations are fictional! Feynman explains how this double-speak began:

“In fact, both objects (electrons and photons) behave somewhat like waves and somewhat like particles. In order to save ourselves from inventing new words such as wavicles, we have chosen to call these objects particles.” (Feynman, 1985) p85

Imagine if an engineer said “This vehicle has two wheels like a bicycle and an engine like a car, so to avoid inventing a new word like motorcycle we have chosen to call it a car”. A boy with a hammer thinks everything is a nail, and likewise physicists with particle accelerators think everything is a particle, but it isn’t always so. What physics found by battering matter apart turned out to be neither fundamental nor particles, because it was:

1. Ephemeral. A lightning bolt is long-lived compared to what physics today calls a particle, e.g. a tau is an energy spike lasting less than a million millionth of a second. We don’t call a lightning bolt a particle, so why call a tau a particle?

2. Classifiable. The standard model classifies a tiny electron, a massive tau and a positron as leptons, but what can be classified isn’t fundamental, as classifying requires common properties that imply something more fundamental. “Fundamental” in physics today just means that which can’t be further smashed apart by high-speed protons.

3. Massive. The “fundamental” top quark has about the same mass as a gold nucleus of 79 protons and 118 neutrons. It is some 75,000 times heavier than an up quark (see the rough arithmetic after this list), so why does the cosmic Lego-set have such a huge building block? Not surprisingly, this fundamental entity plays no part at all in the function of the universe we see.

4. Unstable. If a top quark is fundamental, why does it instantly decay into other particles? When a neutron decays into a proton and an electron, three fundamental particles become four, which is a strange use of the term fundamental.
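The mass claims in point 3 can be checked against standard values (approximate figures from published particle data, used here only as a sanity check):

\[
m_{\text{top}} \approx 173\ \text{GeV}/c^2, \qquad
m_{\text{gold nucleus}} \approx 197 \times 0.93\ \text{GeV}/c^2 \approx 183\ \text{GeV}/c^2,
\]
\[
\frac{m_{\text{top}}}{m_{\text{up}}} \approx \frac{173{,}000\ \text{MeV}}{2.2\ \text{MeV}} \approx 7.9\times10^{4},
\]

so one “fundamental” quark does weigh roughly as much as an entire gold nucleus.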

Entities that decay and transform into each other aren’t fundamental because what is fundamental isn’t subject to decay or transformation, and energy events that last less than a millionth of a second aren’t particles because what is substantial should last longer than that. A brief eddy in a stream isn’t a particle, so why is a brief quantum eddy a particle? It follows that the fundamental particles of the standard model are neither fundamental nor particles but rather quantum events.

Figure 4.18. The standard particle model

The standard particle model (Figure 4.18) describes fundamental particles that are classifiable and virtual bosons that come from nowhere to make things happen. This, we are told, is the end of the story because particle accelerators can’t break point particles down any further. How then does a particle that exists at a point take up space? Apparently, it creates invisible fields that generate virtual particles to keep particles apart. It is a wonderfully circular argument that can’t be tested because the agents involved are unobservable.

The particle model survives because physicists are conditioned not to look behind the curtain of physical reality. The Wizard of Oz told Dorothy to “Pay no attention to that man behind the curtain” to distract her from what really orchestrated events, and likewise the wizards of physics ask us to pay no attention to the quantum waves that quantum theory says create physical reality. Quantum realism looks behind the curtain to see that quantum processes cause physical events.


QR4.5.6 The Last Standard Model

In the second century, Ptolemy’s Almagest let experts predict the movements of the stars for the first time, based on the idea that heavenly bodies, being heavenly, moved around the earth in perfect circles, or circles within circles (epicycles). It wasn’t true but it worked, and Ptolemy’s followers made it work for centuries. As new observations arrived, they altered the model, making it more complex and themselves more expert. This ancient “standard model” only fell when Copernicus, Kepler, Galileo and Newton developed a valid causal model to replace it. The standard model of physics and the standard model of Ptolemy have a lot in common, as both are:

1. Descriptive. They both describe what is but fail to successfully predict new things. Descriptive models identify patterns, ideally in the form of equations, but this is the first step of science not the last. The end goal of science is a causal model that truly predicts.

2. Parameterized. Ptolemy’s model let experts choose the free parameters of epicycle, eccentric and equant to fit the facts and the standard model of today lets experts choose the free parameters of field, bosons and charge.

3. Retrospective. Ptolemy’s model defined its epicycles after a new star was found, just as today’s standard model bolts on a new field after a new force is found.

4. Barren. Descriptive models only interpolate so the Ptolemaic model would never have deduced Kepler’s laws and likewise today’s standard model will never deduce that matter is made of extreme light.

5. Complex. Medieval astronomers tweaked Ptolemy’s model until it became absurdly complex just as the equations of today’s standard model fill pages and those of its string theory offspring fill books.

6. Normative. The Ptolemaic model was the norm of its day, so any critique of it was an attack on the establishment, and likewise today any standard model critique is seen as an attack on physics itself (Smolin, 2006).

7. Wrong. Ptolemy’s model mostly worked, even though planets don’t move in circles around the earth, and likewise the standard model mostly works, even though virtual particles don’t exist.

When the medieval church pressured Galileo to recant, it didn’t ask him to deny that the earth went around the sun but to call it a mathematical fiction, not a description of reality. Today, physicists volunteer the same about quantum theory, that it is just a mathematical fiction, but what if quantum reality really does exist, just as the earth really does go around the sun?

In research methodology, after describing patterns comes finding correlations and finally attributing causes (Rosenthal & Rosnow, 1991). The standard model is a descriptive model that failed to evolve into a causal theory because physics denies the existence of what quantum theory describes, for as Bohr put it:

“There is no quantum world. There is only an abstract quantum mechanical description.” (Newton, p244)

The denial of meaning at Copenhagen led Everett to fantasize about many worlds (Everett, 1957) and Witten to go it alone with string theory mathematics, neither of which led anywhere. The choice to prefer equations over meaning halted the scientific growth of physics, as physics abandoned science when it abandoned meaningful causes. To fill the gap, it had to invent magical particles that pop out of empty space to cause the equations. The standard model, a naive descriptive paradigm ruled by acausal equations, is essentially a scientific dead end in the history of physics.


QR4.5.5 A Particle Toolbox

The standard model is a particle toolbox that generates new particles to explain results after the fact. When anti-matter was discovered, it just added new columns, and when family generations came along, it added new rows. When mesons were found, someone asked “Who ordered that?”, until the standard model called them bosons that carried no force! When new facts arrive, the standard model accommodates them in its existing structure or builds a new wing.

It is hard to fault a model that absorbs rather than generates knowledge. It includes gravitons that a long search hasn’t found, so was that a fail? It predicted proton decay, but twenty years of study have pushed the lower bound on the proton lifetime far beyond the age of the universe, so was that a fail? It sees matter and anti-matter as symmetric, so is the fact that our universe is only matter a fail? It expected massless neutrinos until experiments gave them mass, and penta-quarks and strange quarks until a two-decade search found neither, and the list goes on. Today it “predicts” that weakly interacting massive particles (WIMPs) will explain dark matter, but again a long search has found nothing. The standard model is like a hydra: when the facts cut off one “head”, it just grows another. Indeed, it is unclear what exactly it would take to falsify a model whose failures are called “unsolved problems in physics”.

The standard model’s claim to fame is that its associated equations calculate results to many decimal places, but in science, accuracy isn’t validity. An equation that accurately interpolates between a known set of data points isn’t the same as a theory that extrapolates to new points. Equations are judged on accuracy, but theories are judged on their ability to predict. An equation isn’t a theory, but today generations of physicists, fed on equations not science (Kuhn, 1970), think they are the same, so as Georgi says:

“Students should learn the difference between physics and mathematics from the start” (Woit, 2007) p85.

Equations aren’t theories because theories should predict new things, not just accurately calculate known situations. If a theory isn’t valid, i.e. doesn’t represent what is true, it doesn’t matter how reliable it is. The virtual particles of the standard model aren’t valid because ultimately they don’t represent anything that can be verified at all. If the standard model isn’t valid, it doesn’t matter how accurate it is.

When it comes to prediction, the standard model’s success is dubious. It claims to have predicted top and charm quarks before they were found, but to “predict” a third quark generation after finding three generations of leptons and two of quarks is like predicting the last move in a tic-tac-toe game. It also claims it predicted gluons, W bosons and the Higgs, but inventing agents based on data-fitted equations isn’t prediction. Fitting equations to data, then matching their terms to ephemeral resonances in billions of accelerator collisions, is the research version of tea-leaf reading: look hard enough and you’ll find something. The standard model illustrates Wyszkowski’s Second Law, that anything can be made to work if you fiddle with it long enough.

The standard model describes the data we know but doesn’t create new knowledge. Its answer to why a top quark is 300,000 times heavier than an electron is “because it is”. What baffled physics fifty years ago still baffles it today because equations can’t go beyond the data set that created them, only theories can. The last time such a barren model dominated thought so completely was before Newton.


QR4.5.4 The Standard Model Feeds on Data

Occam’s razor, not to multiply causes unnecessarily, is the pruning hook of science, but the standard model has done just that. Physical realism began as a simple theory of mass, charge and spin, but today it has isospin, hypercharge, color, chirality, flavor and other esoteric features. The standard model now needs sixty-two fundamental particles (Note 1), five invisible fields, sixteen charges and fourteen bosons to work (Table 4.6). If it were a machine, one would have to hand-set over two dozen knobs just right for it to light up. If physical realism is preferred today, it isn’t due to its simplicity.

One might expect completeness at this level of complexity, but the standard model is unable to explain gravity, proton stability, anti-matter, quark charges, neutrino mass, neutrino spin, family generations, quantum randomness or why inflation occurred. Nor can it explain dark energy or dark matter, i.e. most of the universe. And with each new result it grows, so inflation needs a hypothetical symmetron field to explain it and neutrino mass needs another 7-8 arbitrary constants:

“To accommodate nonzero neutrino masses we must add new particles, with exotic properties, for which there’s no other motivation or evidence.” (Wilczek, 2008) p168

The standard model doesn’t explain new data, it feeds on it, because it expands itself when it meets new facts. It is a toolbox for inventing new fields and particles rather than a theory that successfully predicts.


Note 1. Two leptons with three generations plus anti-matter variants is 12. Two quarks with three generations, anti-matter variants and three colors is 36. One photon, eight gluons, three weak bosons, one graviton and the Higgs add another 14. The total is 62.

QR4.5.3 There Are No Virtual Particles

Suppose one could see a computer screen but had no access to the hardware and software that created it. If screen changes were seen to occur in bit units, would that mean that virtual “bit particles” created them? A better conclusion is that the screen changes in bit units because the bit is the basic unit of the processing that creates the screen. Likewise, the assumption of physics that virtual photons cause electromagnetic effects is premature.

If quantum processing creates physical effects, changes in electromagnetism occur in photon units because the photon is the basic network operation, so all changes just look like photon effects. The quantum network changes in photon units for the same reason that a computer screen changes in bit units. The link between photons and electromagnetism is correlation not causation and mixing these up is the oldest error in science.
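To make the analogy concrete, here is a toy sketch (the frames and helper function are illustrative inventions, not from the text): whatever high-level process changes a digital screen, an observer who sees only the screen finds that every change is an exact whole number of bit flips, yet no “bit particle” carried the change.

```python
# Toy screen: every update, whatever "process" caused it, reaches an
# observer of the screen as a whole number of bit flips. The quantization
# reflects the medium (bits), not particles carrying the change.

def bit_flips(before: list[int], after: list[int]) -> int:
    """Count the bits that differ between two frame buffers."""
    return sum(bin(a ^ b).count("1") for a, b in zip(before, after))

frame0 = [0b00000000, 0b00000000]  # blank two-byte "screen"
frame1 = [0b00011000, 0b00000000]  # some process draws a dot
frame2 = [0b00011000, 0b00111100]  # another process draws a dash

print(bit_flips(frame0, frame1))   # 2 -- change arrives in bit units
print(bit_flips(frame1, frame2))   # 4 -- whatever the cause was
```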

Dynamic processing that tries every option doesn’t need agents to push it, so an electron can fall to a lower energy orbit without needing an “orbit boson” to make it so. The forces that physics attributes to imaginary particles are produced by quantum processing as follows:

1. Electromagnetism. Where the standard model sees virtual photons, quantum realism sees a quantum network re-allocating its basic operation, so no virtual photons are needed to explain electromagnetism (Chapter 5).

2. The strong effect. The standard model needed a new field, three new charges and eight gluons to explain how quarks bind in a nucleus. In quantum realism, quarks share photons to achieve stability and the color charge is the axis orientation needed for a stable result. Again, no magical gluon agents are needed (4.4.4).

3. The weak effect. The standard model needed another field, three more bosons and two new charges to explain how neutrons decay but still couldn’t explain why protons don’t decay. In quantum realism, neutron decay is a neutrino effect whose reverse is an electron effect only possible in stars, so protons are stable in empty space. Weak bosons are again unnecessary and thus imaginary agents (4.4.6).

4. The Higgs. If weak bosons don’t exist, the Higgs boson isn’t needed at all. CERN added yet another species to its already overflowing menagerie of “particles”, one that had no role at all in the evolution of matter. Adding another virtual particle to the standard model house of cards didn’t add anything new to our knowledge (4.4.7).

5. Gravity. Every attempt to find gravitons has failed, but standard model iconographies still display the graviton as if it were real (Figure 4.17). If gravity alters space and time, how can particles that exist in space and time do that? Something else is needed, and Chapter 5 attributes gravity to a grid processing gradient.

Figure 4.17. The CERN Standard Model

Finally, if the Higgs can interact with weak bosons to give mass, how do other bosons interact? A quark can be subject to electromagnetic, strong, weak, Higgs and gravity forces, so what happens if a virtual photon, gluon, weak boson, Higgs and graviton appear at the same time? That virtual bosons interact only to make our equations work isn’t satisfactory. And as matter bosons imply anti-matter versions, what happens if a Higgs meets an anti-Higgs? The standard model, being an ad hoc model, doesn’t predict anything.

The standard model invents virtual particles for effects that quantum realism derives from a core quantum process. Why invent many virtual particles to explain what one quantum process can? In quantum realism, virtual particles are unnecessary because quantum processing can explain their effects so there are no virtual particles.


QR4.5.2 The Frog in the Pan

In an apocryphal story, a frog dropped in a pan of boiling water jumps out immediately but if put in tepid water that is slowly brought to the boil, by the time it realizes the danger it is too weak to jump out and perishes. It is now proposed that something similar happened to the standard model last century.

When Faraday first proposed that an invisible field around an electric charge made it attract and repel other charges at a distance, it was considered fanciful until the equations worked. Today, fields explain every force in physics, but a field is a disembodied force that acts at a distance, and Newton, centuries earlier, had issues with this:

“That gravity should be innate, inherent and essential to matter, so that one body may act upon another at a distance thro’ a vacuum, without the mediation of anything else … is to me so great an absurdity, that I believe no man … can ever fall into it. Gravity must be caused by an agent…” (Oerter, 2006) p17

Maxwell created the equations of electromagnetism by imagining physical ball-bearings twisting in vortex tubes, but all attempts to develop this into a physical model failed. Driven by the belief that something physical had to make iron filings move in a magnetic field, field theory came up with the idea that the field created force-carrier particles to do its bidding. Since electromagnetism acted in photon units and Einstein had shown that photons were particles, the idea that virtual photons caused electromagnetic effects worked nicely.

The standard model was born when Maxwell’s equations were explained by virtual photons from a Faraday field. Unlike real photons, virtual photons couldn’t be observed, as they came into existence, caused an effect, then were consumed by the act. Science doesn’t normally accept agents that can’t be observed, but physicists could see them in the equations. This seemed a small price to pay to carry on calculating, but the pseudoscience temperature had just gone up a notch.

As photons are bosons, field theory generalized that all fields use boson agents, so gravity had to work by gravitons that to this day have no real-world equivalent. There is no evidence at all that such a particle has ever existed, but naming them made them exist in the minds of physicists. To assume a thing exists without evidence because a theory needs it contradicts science, so again the pseudoscience temperature rose a notch.

When the strong field was used to explain how protons bind in a nucleus, virtual photons with no mass or charge were joined by virtual gluons with color charge, so now an invisible field could create charge. Since gluons by definition could never be directly observed, again the pseudoscience temperature rose a notch.

When the weak field was used to explain neutron decay, it needed weak bosons with charge and mass, so now there was a field that could create mass. Things were heating up, so this virtual particle had at least to be shown to exist, but when a match was found among billions of particle accelerator events it was declared “proven”, although the established scientific method for proving causality was ignored. Adding an invisible cause of mass again raised the pseudoscience temperature.

Finally, to let virtual particles create mass it was necessary to invent yet another field, this time with a virtual particle so massive it needed a billion-dollar accelerator to find it. All this, to support the physical realism canon that:

“…the forces of Nature are deeply entwined with the elementary particles of Nature.” (Barrow, 2007) p97

Physics has pasted field upon field to prove Newton’s belief, until now virtual particles popping in and out of space cause every effect. They are said to be everywhere, making everything happen, despite no direct evidence that they cause anything at all. They are magical because an invisible field creates them and the effect absorbs them, so by definition they can never be verified. Virtual particles are the scientific version of a blank check, and once physics accepted unverifiable causes it couldn’t stop. Each new virtual “cause” weakened physics scientifically until, like the frog in the slowly heating pan, it is now in danger of dying as a science. The age of fairy-tale physics had arrived (Baggott, 2013).


QR4.5.1 Many Fields, Many Options

Fields are common in physics today: the earth holds the moon in orbit by a gravitational field that exerts a force on matter particles at every point in space, an electric field exerts a force on charged particles at every point in space, and so on for other fields, where according to Feynman:

“A real field is a mathematical function we use for avoiding the idea of action at a distance.” (Feynman, Leighton, & Sands, 1977) Vol. II, p15-7

Emboldened by the success of Faraday’s electromagnetic field, physics explained the forces it found by inventing fields that added what mathematics calls degrees of freedom to space. For example, the force of gravity acting at every point in space added one degree of freedom, the electromagnetic field caused electrical and magnetic forces at every point so it added two degrees of freedom, and so on.

It was then realized that adding a degree of freedom to each point of space in effect adds a dimension to it, so adding many fields is like adding many dimensions to space. Current physics has fields for gravity, electromagnetism, and the strong and weak forces, where gravity adds one dimension, electromagnetism adds two, the strong force adds three and the weak force two. These eight extra dimensions plus the three of space are why string theory needs eleven dimensions to work.
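The bookkeeping behind this count can be written out explicitly:

\[
\underbrace{3}_{\text{space}} \;+\; \underbrace{1}_{\text{gravity}} + \underbrace{2}_{\text{EM}} + \underbrace{3}_{\text{strong}} + \underbrace{2}_{\text{weak}} \;=\; 11.
\]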

But when one adds dimensions to space, the mathematics soon gets out of control because they interact. String theory’s attempt to explain physics mathematically using many fields gave so many possible architectures, over 10^500 at least, that the result didn’t predict anything, hence many scientists today don’t consider it a useful approach. In effect, many fields or many dimensions give so many options that the result is meaningless.

That a universe of eleven dimensions somehow collapsed into ours is a far-fetched idea, akin to the multiverse story. The standard model tactic of inventing new fields to explain new forces is failing because it predicts nothing.


QR4.5 Fields Upon Fields

Physics spent much of last century trying to prove Newton’s idea that particles cause all the forces in nature. In order to explain forces like magnetism and gravity that act at a distance, with no particles in sight, they argued that fields exert forces by creating unobservable virtual particles. The resulting fields upon fields made physics what it is today.

QR4.5.1 Many Fields, Many Options

QR4.5.2 The Frog in the Pan

QR4.5.3 There Are No Virtual Particles

QR4.5.4 The Standard Model Feeds on Data

QR4.5.5 A Particle Toolbox

QR4.5.6 The Last Standard Model

QR4.5.7 The Particle Model

QR4.5.8 A Quantum Processing Model

QR4.5.9 Testing The Theory
