Young and Dirac - The Prophets of New Physics - Claus Birkholz - E-Book

Description

A critical review of the »Standard« Models. A characteristic of New Physics is its hierarchic organisation in powers of 8 dimensions (Matryoshka principle) and its split into 2 channels. According to Bell, this enables the coexistence of causality with entanglement, shows how visible matter with its non-valence parts condensed out of dark matter, explains quark confinement and the asymptotic flatness of Eternal Inflation. The world formula unifies all forces of nature into a Grand Unified Theory, and this GUT is combined with quantum gravity into a Theory of Everything (ToE). By reproducing the correct value of the fine-structure constant, weak interactions are shown to be a dipole effect. A novel segregation between micro- and macrocosm explains the measuring process and the irreversibility of time. It demonstrates the logic gaps in Einstein's General Relativity by quantising his curvilinear geometry (including virtual states, dark energy, etc.), thus generating a consistent black-hole physics without singularities. For all that, A. Young and P. Dirac had provided the mathematical basics, while classical physics and Einstein had gone on isolating themselves in self-made deadlocks.

The e-book can be read in Legimi apps or in any app that supports the following formats:

EPUB
MOBI

Page count: 199

Year of publication: 2019




Claus Birkholz

Young and Dirac

The Prophets of New Physics

Are we prepared to face the new facts?

Copyright: © 2019 Claus Birkholz

Approved: 2019-08-05

Proofreading: Ole Jürgens and Ralph Hickok – www.textcelsior.de

Layout: Erik Kinting – www.buchlektorat.net

Publisher: tredition GmbH, Hamburg

978-3-7497-2476-5 (Paperback)

978-3-7497-2477-2 (Hardcover)

978-3-7497-2478-9 (e-Book)

Also available in German:

978-3-7497-2473-4 (Paperback)

978-3-7497-2474-1 (Hardcover)

978-3-7497-2475-8 (e-Book)

All rights reserved. No part of this publication may be reproduced, translated, distributed, or transmitted in any form or by any means, including photocopying, recording, or other electronic or mechanical methods, without the prior written permission of the publisher and author.

Contents

The Theory

1. Free Will and Reproducibility

4. Historical Background

2. Finiteness and Atomism

3. Faster than Light

5. Quantum Gravity

6. Generators and Metric

7. Macrocosm vs. Microcosm

8. Dirac’s Legacy

9. For Specialists Only: Mathematical Supplement

10. Cosmology and Particle Physics

11. The Cosmic Hyperboloid

12. Cosmic Inflation

13. The Matryoshka Principle

14. Dimensions

15. The Technical Base of Philosophy

16. What Is Time, What Space?

17. Enigmatic Time

18. Demystified Measuring Process

19. Event Horizon

20. Life in a Black Hole

21. The Causal Gap

22. The Mater-Mundi Principle

23. Quark Confinement

24. Unified Field Theories

25. Range Horizons

26. How Particles Condense Out of Dark Matter

27. Salty Universes

28. System of Natural Units

29. Coordinate Systems

30. Charges

31. The Chiral Forces of Nature

32. The Geometry of Forces

33. Repulsion and Attraction

34. Leptons

35. Excited States

36. Shell Models

37. Flavours

38. Pauli’s Exclusion Principle

39. The Spectrum of Stable Particles

40. The Parity Problem of Neutrinos

41. Masslessness

42. Resonances

43. Numbers and Quanta

44. Problems Still Open in QG

The Story

II - 1. First Contact

II - 2. Starting Work

II - 3. Assistance

II - 4. Late Success

II - 5. Life Goes On

II - 6. “Science”

Footnotes

The Theory

1. Free Will and Reproducibility

Electrodynamics once paved the way to Special Relativity. Einstein's merit was to introduce the curvilinear metric into his General Relativity (GR). Cosmologists, meanwhile, have had to accept that GR might instead have been a flash in the pan (cosmic inflation decoupled from relativity, no dark energy, no dark matter, etc.). The true pioneers might have been the mathematician A. Young and the fundamental theorist Dirac, both of them busy at Cambridge.

Schrödinger stands for classical quantum mechanics, and not for Quantum Gravity, which denotes the unification of Planck's quanta with Einstein's General Relativity. For insiders, the battle was more about Bell's “hidden variables”. In 1935, Einstein, Podolsky, and Rosen had postulated them in order:

to overcome the problem of a “collapsing wave equation” in the measuring process of quantum theories, and

to override Einstein's “spooky action at a distance”, which is known as “entanglement”.

Entanglement denotes the situation that a coupled system preserves its quantum coupling without any time delay, irrespective of distance, thus overriding the speed of light. That clearly violates the limitations expected from causality. Einstein's idea had been that quantum theory might be incomplete; hidden variables could be an escape strategy. In 1964, however, the Northern Irish physicist Bell published his no-go theorems, telling us that hidden variables are not the solution to those problems.

Since that time, Bell’s no-go theorems have conquered fundamental physics. Every theoretician proudly claims the non-existence of hidden variables in quantum systems quite generally. That opinion spread like a plague. Everybody who still dared to raise an objection quickly experienced the worldwide power of the big scientific lobby: Nobody took him seriously any more. Thus, even Einstein became marginalised post hoc.

The irony, however, was that Einstein turned out to be correct, after all! For, Bell himself admitted in a 1985 BBC television report that his no-go theorems were crucially based on his tacit assumption that there existed in nature something like free will. Without free will, however, his theorems faded to nothing.

Bell called the result thus corrected and replacing his no-go theorems an “absolute determinism” or, shorter, a “superdeterminism”. According to that superdeterminism, everything should be uniquely predestined and unchangeable for all times: There should be some general consistency condition embracing the entire world without any exception.

This meant an open declaration of war on our western civilization as it has grown over thousands of years. Just think of our judicial system and its sanctions against crime. Provided everything has been predetermined already, the accused would be innocent: the culprit would be the superdeterministic combination of events our forefathers once had declared to be a crime! This argument, however, overlooks that those sanctions irrevocably are part of that superdeterminism, as well.

As a result, nobody took Bell's 1985 insights seriously. Instead, his outdated no-go theorems have continued flourishing until today. This is supported by the purely technical fact that Bell's BBC interview is hardly suitable for quotation in an official journal. In our present world, remarkably, a subjective rumour once hastily fixed by the official opinion leaders outweighs any later objective correction. Hence, up to now, almost nobody has dared the loss of face required to apply superdeterminism to particle physics or to cosmology.

Theoretical physics is defined as the mapping of (parts of) nature into mathematics. We only perceive what our senses are telling us. But they might tell us nonsense, as well. Serious physicists, hence, only accept what can be reproduced unambiguously. The main trait is the reproducibility of physical results. This is its distinction from religion, which works with irreproducible “miracles”.

Thus, it is amusing that the hypothesis of something like a “free will” has been able to stay upright in physics for so long, although its implications clearly are not reproducible.

4. Historical Background

Before 1900, physics was still a hotchpotch of individual disciplines, all more or less independent of each other. In the course of the 20th century, a melting process started. Even chemistry turned out just to be a combination of quantum mechanics with thermodynamics. Biology and medicine, however, still resisted any unification.

Physics then came to be regarded as an outcome of the variational principle, which is intimately related to the Lagrangian formalism. Both had been developed in the 18th century. The variational principle had become the highlight of treating mechanical problems. In mathematics, the Lagrangian formalism is based on the non-discrete, continuous “functional analysis of many variables”. Crucial is their property of subordinating all that might happen under just one single parameter.

In physics, time is usually chosen as this single parameter. Even those notorious “string models” admit just one “time-like” dimension, while all remaining dimensions are demanded to be “space-like”. We shall observe, however, that this restriction will turn out to be too stringent for physics.

The 20th century spectacularly started with Planck's introduction of discrete “quanta” replacing continuous structures, followed by Einstein's relativity theories. Both models, relativity and quantum theory, rapidly developed into complete “field theories”. Yet even quantum theories still used that giant machinery of continuous functional analysis for dealing with the discrete problems of quanta, cf. Schrödinger's method.

Thus, the world of physicists proved not yet to be mature for QG: The requirements of the variational principle and of the Lagrangian formalism prevented the unification of quantum theory with Einstein's General Relativity. The main obstacle had been the not yet understood duality between the dynamic and the reaction channel.

The preceding chapter has been dedicated to stripping that unjustified restriction off fundamental physics by redefining physics to be based on what mathematicians would denote as a model of generators. (Their “complex Lie algebra” is the common denominator of both channels. And a generator is represented by a square matrix. More about this later.)

In n dimensions, the n diagonal entries of a matrix are commeasurable. This is the microscopic view of physics. The macroscopic view, however, is an application of the law of large numbers, resorting to superpositions. We shall observe that all n×n entries of a matrix representing a generator can be made approximately commeasurable by applying appropriate statistics.

Such results derive from the mathematical discipline of “group theory”. Spin is a notion of group theory, too. Einstein did not like group theory. Hence, his General Relativity does not incorporate spin; for GR, spin is an “alien”. In group theory, however, spin is one of its fundamental properties. This might be another obstacle that has so far prevented both theories from being united successfully.

The main notion in group theory, however, is “irreducibility”, telling us which combinations of quanta belong together in order to build up a particle, e.g., and which do not. Like spin, this notion, important for group theory, was not used by Einstein in his GR.

On the other hand, that “irreducibility” is the basic notion allowing us to write down the “world formula”. Hence, Einstein never was able to write it down. For, the invariants of group theory are just defined by that irreducibility; they are called “Casimir operators” there (to be presented later). The world formula, valid for every Casimir existing, must read:

2. Finiteness and Atomism

Another feature of physics is its atomistic nature, detected by Planck when he was working on his black-body radiation law in 1900. Even before, ancient Greek philosophers had already speculated about it. This atomism should be evident to every layman, indeed. For, nobody can count up to infinity. In physics, hence, everything must stay finite in order that we are able to keep a survey of it and can describe it in a unique way. Without a unique description, however, reproducibility can hardly be checked!

Finiteness, when extended to systems of real numbers, teaches us that fundamental physics only admits rational numbers because irrational numbers need an infinite number of (non-repeating) decimal digits. A finite set of elements, however, can be separated and counted. This yields the above atomistic structure in terms of “quanta”.

Classical physics denies that atomism. Classical physics is assumed to be continuous. For continuous systems, the infinitesimal calculus was invented. The mechanistic view of our world has been thriving with it for centuries. And people are trying to keep it upright still today. Schrödinger's continuous wave equation exemplifies the resistance with which disciples of that mechanistic view of our world still face Planck's discrete quantum view today.

Now, a continuous description might also be interpreted as the limiting case of a superposition of discrete features. This is the wave aspect of statistics. But don’t turn a blind eye to the fact that smoothed statistics are the result of a limiting process, which tacitly includes the extrapolation towards an infinite number of elements! That extrapolation means indirectly taking into consideration additional elements that are not present there from the beginning.

Those “hidden variables”, of course, are unphysical, ambiguous, pure fantasy. You could choose them however you like. They are what Bell’s no-go theorems are excluding for combining causality with entanglement. Their inclusion, however, is the source for a macroscopic extension of a basically microscopic world.

Let us keep in mind: A macroscopic description contains more parameters than are experimentally measured! A quantum theory describes the microscopic situation, where there is just 1 state and 1 (“diagonal”) measurable direction. The macroscopic view, however, is less exact: The result of measuring some state A – say, at a position z – might be some state B – say, at a position z’. Provided the difference z’–z is negligibly small with respect to the absolute value of z, its measuring result could approximately be equal to z, and B, then, could “approximately” equal A. Without having quantised spacetime, hence, we have to expect that an entire spectrum of values B will be (mis)interpreted to be exactly equal to A – with all its curious implications for a theory to be accepted then or not.

According to Bell’s words, those microscopic deviations are “hidden” with respect to the macroscopic world – disguising the fact that the final state B is not exactly equal to the initial state A. Thus, the macroscopic world is working with approximations manifesting themselves in terms of (reducible or even irreducible) superpositions of a great number of states almost coinciding.

This new definition of what is “macroscopic” according to Bell's superdeterminism is the most important feature of New Physics; its importance cannot be overestimated. Planck's summation of a finite number of discrete quanta, replacing their continuous integration, is the key to that problem. By his method, the singularities inherent in classical potentials of Yukawa or Coulomb type are removed; singular models of the classical literature unexpectedly become finite and simple. In their discrete forms, Yukawa or even Coulomb potentials might result in a non-singular way.

3. Faster than Light

According to Pythagoras, a (squared) distance is measured by adding the squares of its components. In 2 dimensions that distance defines the radius of a circle (or, when stretched, the principal axes of an ellipse), in more dimensions a sphere (or an ellipsoid).

Now, Einstein demonstrated that, in order correctly to describe Maxwell's electrodynamics, the time direction has to be multiplied by the imaginary unit – a factor not present in his 3 space directions. When inserted into Pythagoras' formula, this squared imaginary unit switches the positive sign of the squared time component to a negative sign. The result is Special Relativity.

This sign switch in the time direction transforms Pythagoras’ circle (or sphere) to a hyperbola (or hyperboloid, respectively). Contrary to a circle or an ellipse, a hyperbola, however, has 2 separate branches that do not touch each other. This transformation from a compact circle or ellipse to a non-compact hyperbola, triggered by that sign switch, thus:

rips the original circle or ellipse into 2 pieces,

stretches and squeezes the rest.
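The sign switch can be spelled out in standard notation (a sketch in the conventional symbols s, x, c, t, not necessarily the author's):

```latex
% Pythagoras in 2 dimensions: s^2 = x^2 + t^2, a circle of radius s.
% Multiplying the time direction by the imaginary unit i switches
% the sign of the squared time component:
\[
  s^2 \;=\; x^2 + (ict)^2 \;=\; x^2 - c^2 t^2 .
\]
% For fixed s^2 > 0, the first form describes a compact circle,
% the second a non-compact hyperbola with 2 disconnected branches.
```

For a fixed positive squared distance, the Euclidean form describes a compact circle, while the sign-switched form describes a hyperbola with 2 separate branches, i.e., the original figure “ripped into 2 pieces”.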

In physics, the (negative) density gradient of a point distribution designates a “force”. On the homogeneous surface of a sphere, where all positions are equivalent, the point concentration should be distributed equally. A point-by-point transformation of this sphere to a hyperboloid – which is a mere exercise for a mathematician – will yield a density distribution there that is definitely non-uniform.

On the hyperboloid, hence, (geometrical) forces will be created that are absent from the original sphere! And those forces will become extreme (velocity of light) at those locations where our original figure is torn to pieces.
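A minimal sketch of how such geometrical forces arise (my parametrisation, not taken from the book): map the unit circle, parametrised by the angle t, point by point onto the unit hyperbola.

```latex
% Circle: (x, y) = (cos t, sin t); its point density is uniform in t.
% Point-by-point image on the hyperbola: (x, y) = (cosh t, sinh t).
% The arc length covered per unit of the parameter t is then
\[
  \frac{ds}{dt} \;=\; \sqrt{\sinh^2 t + \cosh^2 t}\,,
\]
% which grows without bound as t increases: the image points thin out
% towards the asymptotes, i.e., the light-like directions.
```

An originally uniform point density thus becomes non-uniform on the hyperbola, and its (negative) gradient is the geometrical “force” described above, becoming extreme towards the torn-open ends.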

When keeping some of the coordinates fixed while letting the remaining ones change, the total set of points, depending on those fixed values, will be sliced into sections that are orthogonal to each other. But each of those coordinate systems will slice the complete set of points in a different way (cf. the red vs. the green way in the original figure).

In classical physics, both slicing schemes would be related by an r-number formula (giving the same quantised real “Lie algebra” – whatever this might be); in Quantum Gravity (QG), however, the relation will be given by a c-number formula (giving the same “complex Lie algebra”). Thus the point is: QG is distinguishing the compact, “closed” representation by the “reaction channel” from its (formally) non-compact, i.e., “open” representation by the “dynamic channel”.

In both cases, the “points” represented by these “channels” may be designated as (the expectation values of) “generators” because we are treating Quantum Gravity here, contrary to classical physics, as a thoroughly quantised model; quantum mechanics and Einstein’s General Relativity are just limiting models reflecting classical physics.

Because both channels describe the same set of points, the finiteness constraint of our closed reaction channel is automatically transferred to the dynamic channel, as well. Its infinite, asymptotic nature, thus, proves to be cut off somewhere. In physics this means: That “pseudo-open” dynamic channel will have to be represented by finite-dimensional representations, as well!

Classical physics works with infinite representations, instead. Many of its technical problems arise from those unphysical, infinite singularities that are neither needed nor observable. Quantum Gravity avoids them from the start. Clearly, it is hard work trying to persuade the elder generation to abandon the technical prejudices they have cultivated and expanded for so long. Their international lobby is preventing QG from gaining a proper foothold in science – in spite of its breath-taking experimental success.

Dynamics, hence, is bounded. Contrary to the case of classical physics, there are no singularities. This does not mean that our universe is bounded by some rim we could knock at. A better picture would be that our point distribution is thinning out more and more towards some imaginary limit. And somewhere we are passing its last point without observing that there will be no more point behind.

Probability amplitudes are summed up according to Pythagoras. The “conservation of probability” postulated by physicists, hence, is a property of the reaction channel. Likewise, entanglement is working in the reaction channel. On the other hand, dynamics is a manifestation of the dynamic channel, and causality is a property defined by dynamics. In classical physics, both channels are identified with each other. Bell’s no-go theorem is based on this identification.

In the above sketch, however, point x cannot simultaneously march into the red, vertical direction and into the green, horizontal direction. Although the full red domain is equal to its full green counterpart, those individual slices denoting probability conservation in the reaction channel and lack of dynamical motion in the dynamic channel are not identical – as classical physics tacitly assumes. But this contradiction is exactly the source of Bell’s no-go theorems.

In QG, both channels can be expanded into each other! Thus, Bell’s contradiction disappears: causality and entanglement are both true side by side. Only, both channels are not commeasurable with each other! (Just compare it with the spin components.)

5. Quantum Gravity

Let me briefly summarise what we have found already to be of importance for Quantum Gravity:

Reproducibility needs Bell's superdeterminism. This prevents a free will.

Finiteness yields an atomistic world with no (non-recoverable) singularities.

A complex Lie algebra yields the duality between the 2 channels, providing the coexistence of causality and entanglement.

Geometrical forces are the result of relating the secondary dynamic channel to the primary reaction channel.

Statistics, by the “law of large numbers”, is adding the macroscopic view of physics to its primary microscopic view.

Further implications are:

Triggered by the gradient of probability, motion, then, means hopping from one fixed-time slice to the next one. (In polar coordinates these would be time shells.)

Our 2 channels are not commeasurable with each other. Compare this case with Wheeler–DeWitt's oversimplified attempt not to untie the Gordian knot of time but to cut it by brute force: They completely ban any time-dependence from theory. For, they had correctly found that an exact measurement of time trivially would have prevented time from varying in our slicing scheme with respect to time.

Because of their tacit identification of both channels, however, their model was doomed to end in disaster: Functional analysis is not a good guide for handling the discrete quantum structure of physics! What they had been doing was throwing the baby out with the bath water.

In order to unify Planck’s world of quanta with Einstein’s General Relativity, the objective of Quantum Gravity is it to describe elementary particles and our cosmos by the same set of equations; only the values of their numerical constants should differ. By them, a Quantum Gravity necessarily will contain external parameters, in addition, which itself can not predict in advance. Our universe, hence, turns out to be some subordinate partial subsystem which – howsoever – will be embedded in some higher system fixing those parameters, a system which, therefore, might possibly obey different axioms.

6. Generators and Metric

Attention: For a deeper understanding of physics, regrettably, this chapter is somewhat mathematical. Nevertheless, readers who are not sure what all this might mean should not give up.

Einstein already used tensors in fundamental physics. His General Relativity is a model based on tensors. Within a continuous framework, the mathematical discipline of tensors is differential geometry; in a discrete one it is group theory. Tensors are multiple vectors, i.e., they are based on (Kronecker) products of vectors in some linear superposition such that the final tensor cannot usually be split into just two vectors as factors.

Hence, a tensor is fitted with several vector labels. Group theory classifies them according to symmetry classes (Young tableaux). In a physical interpretation, vectors come in 2 types: a contravariant vector (lower label) serves as an input, a covariant vector (upper label) as the output of some reaction (or vice versa).

(But observe the ambiguity of the notion “contravariant”! The apparently paradoxical formulation reads: a label identified as covariant in some fixed situation is, in some relative situation, designated as contravariant with respect to a certain different vector that is itself identified as contravariant.)

Tensors also might simultaneously be fitted with labels of both types; for example:
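As a hedged illustration (my notation, not necessarily the author's): a tensor carrying one label of each type, built as a linear superposition of Kronecker products of vectors, in general cannot be factored back into a single pair of vectors.

```latex
% One upper (covariant, output) and one lower (contravariant, input)
% label, in the convention stated above:
\[
  T^{\,i}{}_{j} \;=\; \sum_{k} c_k \, u_k^{\,i} \, v_{k,j}
\]
% With more than one term in the sum, T generally does not split
% into a single product u^i v_j.
```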

Only, by its “2nd quantisation”, particle physics does not respect the clean gap between input and output. As a direct consequence of this mathematical inconsistency, all attempts towards a “unification of field theories” have been in vain since Dirac in the 1930s; they are still stagnating today. The reason is that this formalism undermines the conservation of quanta as individually conserved entities:

The vacuum stopped being empty; particles came to be created in pairs out of nothing and disappeared there again! The two powerful obstacles on the trail towards constructing a Quantum Gravity, hence, are 2nd quantisation on the part of particle physics, and the ignorance of irreducibility on the part of General Relativity.

More generally, a transformation T usually is represented by an exponential expression whose exponent splits off, as factors, the imaginary unit together with some angle t as its current parameter:

T(t) = exp(i t g)

The true heart g of the transformation is its generator. In a quantised system, we are inclined to express the current angle t in terms of a “number of quanta” and to refer it to that number t’ which would result in an equal distribution on the entire periphery of the circle.

Macroscopically, that quotient t/t’, then, will usually become extremely small – functional theorists would call it infinitesimal. Squares and higher powers of t in the exponent will hence become negligible and may be cancelled. In fact, its “expansion according to Taylor” will end after its linear term already:

T(t) = exp(i t g) ≈ 1 + i t g

In physics, this linear approximation is a metric