Description

     Philosophy would not be worth a penny if it contradicted the natural sciences, and the same holds true for the sciences if they were in conflict with nature. The claim of the natural sciences is to map nature. For consistency, hence, the author here outlines the last frontiers that fundamental physics actually has at its disposal.

     This interface of physics with philosophy is based on cognition gained from Quantum Gravity, the consistent unification of Einstein's General Relativity with Planck's quanta and Gell-Mann's quark model. The quantization of Einstein's curvilinear space-time gives rise to a completely novel level of a "hidden" structure far below the level of quarks and leptons, which, in accordance with Bell's no-go theorems, leads to the statement that our world in its interior should be "absolutely" deterministic: the postulate that a free will exists strictly contradicts the rules of nature - provided there are any.

     All those great puzzles recent research confronts us with - from dark matter, dark energy, cosmic inflation, and the composition of the cosmological constant down to quark confinement and the value of the fine-structure constant - are solved; the interior of a black hole is calculated. Nature is going to release its ultimate secrets.

     The layman who delves into these depths of present-day cognition is well advised to allow some leisure time for absorbing all the exciting facts nature offers to the interested reader.


Publication year: 2016




Claus Birkholz

Cognition based on Quantum Gravity

The Interface of Physics towards Philosophy

BookRix GmbH & Co. KG, 81371 Munich

Determinism

Philosophy starts where the natural sciences end. And the natural sciences, last but not least, are reducible to physics. Here, I want to outline the latest frontiers of fundamental physics, which form its interface with philosophy. Technical details of that borderline of current research can be looked up in my e-book “ToE; New Physics explaining our world by Quantum Gravity. World’s first Textbook on QG” (2016) (see www.q-grav.com).

Now, experimental physics is based on observation. When writing down those observations, experimentalists create and collect data. Theoretical physics is based on those data; it tries to relate them by “models”. The ordering of data is an early stage of this; it, too, will be model-dependent. The language of those relation logics, nowadays, is mathematics. Briefly: observation yields data; models relate the data; mathematics formulates the models.

Mathematics, however, is not physics! It is much more comprehensive than just covering the requirements of the model chosen: the output data of an experiment are a finite set of single r-numbers, and such an r-number, usually, will be a decimal (or similar) number containing a finite number of digits.

Since Leibniz et al. (inventors of the infinitesimal calculus some 300 years ago), physicists have been used to constructing their models predominantly in terms of continuous numbers. A continuous number, however, is “not countable”: its decimal expansion needs an infinite number of non-repeating digits.

But nobody is able to count up to infinity! Continuous numbers, hence, are “unphysical”: they cannot be verified by measurement; they always carry an inherent experimental imprecision. By this countability aspect, two measured values either have to be equal or to be well separated from each other. Mathematical limits nesting numbers by infinitesimal methods, thus, are unphysical, too.

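As a minimal sketch of this countability argument (in Python; the device and its resolution are hypothetical, not taken from the text): if every reading is recorded with a fixed, finite number of digits, two recorded values are either identical or separated by at least one unit in the last recorded place; the continuum in between is never observed.

```python
from decimal import Decimal, ROUND_HALF_EVEN

# A hypothetical measuring device recording 4 digits after the decimal point.
RESOLUTION = Decimal("0.0001")

def record(reading: float) -> Decimal:
    """Round a raw reading to the finite resolution of the device."""
    return Decimal(repr(reading)).quantize(RESOLUTION, rounding=ROUND_HALF_EVEN)

a = record(1.00004999)   # -> Decimal('1.0000')
b = record(1.00005001)   # -> Decimal('1.0001')

# Recorded values are either equal or at least one resolution step apart:
assert a == b or abs(a - b) >= RESOLUTION
print(a, b, b - a)       # 1.0000 1.0001 0.0001
```
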
By applying functional analysis, classical physics – Einstein’s General Relativity included – ignores this countability restriction. It was Planck who recovered the countability principle in 1900. His name for the property that nature shows up in separate steps was “quantization”.

Quantization means the existence of some great but finite number of “quanta” setting up nature. As nature, apparently, shows up in a continuous way, this “quantization” effect immediately leads us to a postulate which is one of the basics of Quantum Gravity: below the apparent continuum there must be a “hidden”, discrete structure, far below the level of quarks and leptons.

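A back-of-the-envelope illustration of why a quantized quantity can nevertheless look continuous (my own numerical example, not from the book): the light emitted by an ordinary 1 mW laser pointer consists of a finite number of photons per second, each of energy E = hc/λ, but that number is so enormous that the beam appears as a perfectly smooth flow of energy.

```python
# Planck's constant and the speed of light (SI units)
h = 6.626e-34      # J*s
c = 2.998e8        # m/s

wavelength = 633e-9        # m, a red helium-neon laser line
power = 1e-3               # W, a 1 mW laser pointer

photon_energy = h * c / wavelength          # energy of one quantum, ~3.1e-19 J
photons_per_second = power / photon_energy  # ~3e15 quanta every second

# One quantum more or less changes the emitted energy by a relative amount of
# about 1/photons_per_second -- far below any realistic detector resolution --
# so the quantized beam looks like a continuous flow.
print(f"{photon_energy:.3e} J per photon")
print(f"{photons_per_second:.3e} photons per second")
print(f"relative size of one step: {1.0 / photons_per_second:.1e}")
```
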
However, Bell’s inequalities of 1964 forbid “hidden variables”. From that time on, the world of fundamental physics was paralysed by this “no-go theorem”. Bell’s escape proposal of a “super-determinism” (1985) did not really reach the public then. For, his “super-determinism” strictly excludes the existence of a “free will”, which is necessary in order to prove his no-go theorem. – But that fits: in its interior, our world is “absolutely” deterministic; a free will does not exist.

For lawyers, this absolute determinism does not decriminalize lawbreakers, however. For, even the logics of sanctions characterizing criminal justice will result from determinism.

Absolute determinism corresponds well to the older formulation of this basic law of physics: nothing originates from nothing; there is no effect without a cause.

For, provided a “free will” existed, decisions would originate from nothing! Hence, both statements above convey the same fact. Traditional quantum theories permanently violate this law; especially, quantum field theories do so – and not only by their Copenhagen interpretation of the measuring process!

 

The Copenhagen Interpretation

By the Copenhagen interpretation, a measuring process is not defined by physics but purely by abstract mathematics; its dependence on a measuring device is simply ignored.

The typical example is that of a beam of electrons which is split by magnets into two diverging beams according to the respective spin directions “up” and “down” of their electrons. An incoming electron with some different direction of polarization will thus be rotated by the measuring device either into the “up” or into the “down” position, because these are the only output channels admitted by construction.

Mathematicians, now, consider the spin directions only, not the beam directions. In their way of description, the incoming electron is “projected” onto one of the two outgoing channels “up” or “down”. A “projection”, however, is a singular procedure delivering some “partial electron”; a projection does “not conserve probability”. “Partial electrons”, however, do not exist in nature!

Hence, that description is unphysical. In order to correct it, people tacitly let another unphysical procedure follow: a “renormalization”, in order to complete their “partial” electron again. Altogether, they are thus offering us a sequence of two unphysical processes.

On the other hand, the intervention of the measuring device manifests itself in the two diverging output beams, which nobody can deny. The Copenhagen interpretation of a measuring process obscures the influence of the device, which physically is simply “rotating” the spin direction.
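
As a sketch of the arithmetic behind this contrast (standard two-component spinor algebra; the numerical setup is my own, not the author's): projecting a polarized spin state onto the “up” channel yields a vector of norm less than one, which must then be renormalized by hand, whereas a rotation is unitary and conserves the norm at every step.

```python
import numpy as np

theta = np.pi / 3                        # polarization angle of the incoming electron
# Spin-1/2 state polarized at angle theta to the z-axis (standard spinor form):
psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])   # components (up, down)

# --- "projection + renormalization" description ------------------------------
P_up = np.array([[1.0, 0.0], [0.0, 0.0]])    # projector onto the "up" channel
partial = P_up @ psi                          # the "partial electron"
print(np.linalg.norm(partial) ** 2)           # cos^2(theta/2) < 1: norm not conserved
renormalized = partial / np.linalg.norm(partial)   # second step: renormalize by hand
print(np.linalg.norm(renormalized) ** 2)      # 1.0 again

# --- "rotation" description: a unitary operation conserves the norm ----------
def rotation(alpha: float) -> np.ndarray:
    """Spin rotation about the y-axis by angle alpha (a unitary 2x2 matrix)."""
    return np.array([[np.cos(alpha / 2), -np.sin(alpha / 2)],
                     [np.sin(alpha / 2),  np.cos(alpha / 2)]])

rotated = rotation(-theta) @ psi              # rotate the state into the "up" direction
print(np.linalg.norm(rotated) ** 2)           # 1.0: probability conserved throughout
print(np.round(rotated, 12))                  # -> [1. 0.], i.e. the "up" state
```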

The statement of the Copenhagen interpretation, briefly, might be cast into the form:

There is an interaction by some device; let us ignore it!

That interaction is changing the wave equation.

Afterwards, the wave equation looks different, of course.

Copenhagen, now, is pretending not to understand why.

Copenhagen invents an escape strategy beyond physics.

As a result, the output of an incomplete, i.e., of a defective, wrong calculation is reinterpreted as allegedly not following “deterministic” logics! Its advocates, however, are not in the habit of explaining to us how such an “interaction without interaction”, which they are demanding, could work physically. On the contrary, by starting from assumptions not satisfied by nature, those advocates have discredited the reputation of physics with confused claims.

Their “collapsing wave functions” then turned into the initial impulse for the monstrous misuse of mathematics that typically characterizes modern fundamental physics.

New Physics corrected this error by attributing a “rotation” to that case, triggered by the device. The measuring device transfers an electron into some labile state such that the slightest deviation of the real world from its idealized form will sensitively give rise to some effect controlled by “probability”. Compare it with a pencil standing on its tip: into which direction will it fall?

The result, of course, is deterministic, not random – although, superficially, it will look so. In our above case, the choice is restricted to “up” and “down” only.
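
A minimal numerical sketch of this pencil picture (my own toy model, not taken from the book): the dynamics is strictly deterministic, yet an imperceptibly small initial tilt decides into which of the two admitted channels the pencil falls, so the outcome looks random to anyone who cannot resolve that tilt.

```python
import math

def falling_direction(initial_tilt: float, g: float = 9.81, length: float = 0.1) -> str:
    """Deterministically integrate an idealized pencil balancing on its tip.

    theta is the tilt from the vertical; theta'' = (g / length) * sin(theta).
    The returned channel ("up" / "down" standing for the two admitted outcomes)
    is fixed entirely by the sign of the tiny initial tilt.
    """
    theta, omega, dt = initial_tilt, 0.0, 1e-4
    while abs(theta) < math.pi / 2:          # integrate until the pencil lies flat
        omega += (g / length) * math.sin(theta) * dt
        theta += omega * dt
    return "up" if theta > 0 else "down"

# Two initial tilts differing by one part in 10^12 radian -- far below anything
# measurable -- lead to opposite, but perfectly deterministic, outcomes:
print(falling_direction(+5e-13))   # up
print(falling_direction(-5e-13))   # down
```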

Then, there is no objection left against a totally deterministic world on the level of its quanta.

Why Does Time Pass?

How fast is time running? Why doesn’t it just stand still? What is a world good for where nothing “happens”?

These questions demonstrate that some essential aspect of nature is still missing from the concept developed up to here. This “something” is statistics, i.e., the averaged properties of some multitude of states.

Let us imagine special compounds of our “quanta” arranged in parameter space so as to represent points on the surface of a sphere, all of them scattered at equal distances from one another. For visualization, let us rather discuss it in 2 dimensions only; then that “sphere” will be a circle.

When crunching that circle into an ellipse, the equidistant points on the original circle will assume an unequal distribution, with a higher density at one part of the ellipse and a lower density at some other part of it: as a result, some density gradient will evolve.
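
A small numerical sketch of this crunching step (the construction is mine, following the description above): points spaced equally along a circle are mapped onto an ellipse by shrinking one axis; their spacing is then no longer uniform, i.e., a density gradient appears along the curve.

```python
import numpy as np

n = 360
phi = np.linspace(0.0, 2 * np.pi, n, endpoint=False)

# Equidistant points on the unit circle ...
circle = np.column_stack([np.cos(phi), np.sin(phi)])
# ... "crunched" into an ellipse by shrinking one axis:
ellipse = circle * np.array([1.0, 0.3])

def spacing_along(points: np.ndarray) -> np.ndarray:
    """Distance from each point to its neighbour along the curve."""
    return np.linalg.norm(np.roll(points, -1, axis=0) - points, axis=1)

print(spacing_along(circle).std())      # ~0: uniform spacing, no density gradient
d = spacing_along(ellipse)
print(d.min(), d.max())                 # spacing now varies along the curve:
# points crowd together (high density) near the two vertices of the long axis
# and thin out (low density) near the ends of the short axis -- a density gradient.
print(phi[d.argmin()], phi[d.argmax()]) # near 0 or pi  vs.  near pi/2 or 3*pi/2
```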

Now, let us imagine that plane ellipse warped around a vertical cylinder such that the two vertices of the longer axis of the ellipse are almost touching each other. And let us focus on the close neighbourhood of those two, now neighbouring, vertices and “forget” about the rest of the ellipse.

Then, that selected part will take on a shape reminding us of a hyperbola with its two branches, one from the left, one from the right. The density distribution of the above points will be maximal about the vertices, minimal on the opposite parts of the cylinder.

The density gradient, hence, will point towards those vertices everywhere. Compare this with our original circle, where no gradient had been present. This gradient, thus, is an additional property only observable in the crunched version of our circle! Physics, hence, works in 2 frame types of representation, opposed to each other. They are standardized

to a “reaction channel” without distortion, and

to a “dynamic channel” distorted hyperbolically.

In the sense of thermodynamics – provided we do not bother about the far side of our cylinder in the dynamic case –

the “reaction channel” is representing a “closed” system,

the “dynamic channel” an “open” system.

A “density” gradient is a notion which, originally, had been derived from statistics. Statistics introduces “emergent” parameters – “density”, e.g. – which do not make sense for an individual, single point. (Just as a reminder: in thermodynamics, the best-known emergent parameters are “temperature” and “entropy”: a single point has neither a temperature nor an entropy!)

An “emergent” parameter, usually, needs the law of great numbers in order to become an object of statistics, i.e., in order to become measurable at all.
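
As a quick numerical reminder of that law-of-great-numbers point (standard statistics; the example is mine, not the author's): an averaged, “emergent” quantity such as a mean energy only becomes a sharp, measurable number when very many constituents contribute; its relative fluctuation falls off roughly like 1/sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(1)

# "Energies" of individual constituents, drawn from a fixed distribution.
# The emergent parameter is their mean: meaningless for a single point,
# it becomes sharp only when N is large.
for N in (10, 1_000, 100_000):
    samples = rng.exponential(scale=1.0, size=(100, N))   # 100 repeated "measurements"
    means = samples.mean(axis=1)
    print(N, "relative fluctuation of the mean:", round(means.std() / means.mean(), 4))
# The printed fluctuation shrinks by roughly a factor of 10 for every factor
# of 100 in N, i.e. like 1/sqrt(N).
```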

Time is a generator of the dynamic channel, and not of the reaction channel. Its measure, thus, will depend on the law of great numbers! The implication is that there should exist something like a “density gradient”, which, by probability considerations, is triggering time to pass instead of standing still:

When considering the density gradient within a small circle drawn around some point, the geometric centre of all points inside will be shifted somewhat towards the higher density, i.e., probability will locate its “weighted” centre at a shifted position. By repeating this consideration with a new circle drawn around that shifted centre, we apparently observe some motion of the circle following the density gradient. The direction of running time is the direction of that gradient.

For a constant “density”, i.e., when that drift vanishes, a state of equilibrium will have been reached and time will stop running.
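
A toy simulation of this drift (my own construction, following the description above): sample a cloud of points whose density grows towards one side and is constant beyond some line, compute the weighted centre of all points inside a small circle, move the circle to that centre, and repeat. The circle drifts up the density gradient and comes to rest once the density it sees is uniform.

```python
import numpy as np

rng = np.random.default_rng(0)

# A cloud of points in the plane whose density grows with x for x < 1
# and is constant (equilibrium) for x >= 1.
N = 300_000
pts = rng.uniform([0.0, -1.0], [3.0, 1.0], size=(N, 2))
pts = pts[rng.random(N) < np.minimum(pts[:, 0], 1.0)]    # acceptance ~ density

def weighted_centre(points: np.ndarray, centre: np.ndarray, radius: float = 0.3):
    """Centre of mass of all points inside a circle around `centre`."""
    inside = np.linalg.norm(points - centre, axis=1) < radius
    return points[inside].mean(axis=0)

centre = np.array([0.4, 0.0])
trajectory = [centre]
for _ in range(40):
    centre = weighted_centre(pts, centre)
    trajectory.append(centre)

drifts = np.linalg.norm(np.diff(trajectory, axis=0), axis=1)
print("first drifts:", drifts[:3].round(4))   # clearly non-zero: the circle moves
print("last drifts: ", drifts[-3:].round(4))  # ~0: equilibrium, the motion stops
print("final centre:", centre.round(3))       # well inside the constant-density region
```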

(In the pop-science visualization of the ellipse warped around a cylinder, this will happen at the positions of the two vertices, i.e., at the reversal points of the curve in front (maximum), and again at the positions opposite to the two vertices, i.e., at the back side of the cylinder (minimum).)

Later, when returning to this point, we shall physically identify the minimum position with a big-bang position, and the maximum position with the event horizon of a black hole.