"Three Fundamental Limitations of Modern Science"

Part 1

A Limitation in Modern Physics

by

Du Won Kang

May 25th, 2010

The Epoch Times

Currently, there is a sense that modern science will continue to advance indefinitely and will eventually discover a complete and consistent theory of the universe. However, even as modern science has made great advances, it has also been discovering its limitations. Some of the greatest discoveries of modern science are discoveries of its own limits.

In fields of modern Western culture deeply tied to the development of modern science, fundamental limitations have been discovered again and again, at different times and by different people. These limitations constrain the scope of what modern science can achieve.

In three key areas, the essential problems share a common feature: paradox. At the heart of modern physics, the uncertainty principle persists even in theories more advanced than quantum mechanics. In formal logic, the best-known tool for modeling human reasoning quickly falls into paradox. Even in philosophy, which still plays an essential role in advancing sciences such as physics, dualism and paradox are inescapable in rational deductions about the nature of the universe.

Of course, modern science will continue to make advances, in physics as well as in softer sciences such as biology. However, certain fundamental issues cannot be resolved with more time and research. These limitations may suggest that the starting point of modern science, which forces the universe into a box, is seriously flawed.

In this multipart series, we begin with physics.

The Beginning of the End of Classical Mechanics

When a piece of matter is heated, it starts to glow, gets red hot, and at higher temperatures becomes white. For a long time, known laws of radiation and heat failed to account for this common phenomenon. German physicist Dr. Max Planck, who is considered the founder of the quantum theory, struggled to provide a physical interpretation of the phenomenon at the atomic level.

Finally, after some intense work in 1900, Planck reluctantly concluded that a radiating atom can emit only discrete quanta of energy. He was reluctant about this conclusion because it went against the well-established laws of classical physics, which do not restrict energy to fixed, discrete levels.
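In modern notation (a standard textbook form, not a quotation from Planck), the hypothesis says that each quantum of radiation carries an energy proportional to its frequency:

\[ E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \text{J·s}, \]

so an atom radiating at frequency ν can exchange energy only in whole multiples of hν, never in arbitrary fractions.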

Later, Planck’s conclusion about the quanta of energy became an important foundation of quantum theory, and it was only the beginning of the conflicts between quantum theory and the more intuitive classical theory of Newton. Classical mechanics is closely related to our everyday experience of the world. Atoms and subatomic particles, however, seem to have mysterious characteristics that are very different from our ordinary experience.

The Rise of Quantum Mechanics

Persistent anomalies and accumulating experimental data that contradicted classical mechanics forced physicists to make a radical departure from the classical physics of Newton and venture down a long and winding road toward quantum mechanics.

Another German physicist, Dr. Werner Heisenberg, who discovered the uncertainty principle, said in his book “Physics and Philosophy: The Revolution in Modern Science,” “I remember discussions with Bohr which went through many hours till very late at night and ended almost in despair; and when at the end of the discussion I went alone for a walk in the neighboring park I repeated to myself again and again the question: ‘Can nature possibly be as absurd as it seemed to us in these atomic experiments?’”

Nevertheless, in spite of its conceptual difficulties, quantum mechanics has become one of the most successful formalisms in modern science. In principle, quantum mechanics can describe a myriad of physical phenomena and chemical properties of matter to incredible accuracy. And its applications have greatly influenced the development of our modern, technological society.

Dr. Michio Kaku, a professor of theoretical physics at City College of New York, in his book “Beyond Einstein: The Cosmic Quest for the Theory of the Universe” wrote: “The consequences of quantum mechanics are all around us. Without quantum mechanics, a plethora of familiar objects, such as television, lasers, computers, and radio, would be impossible. The Schrödinger wave equation, for example, explains many previously known but puzzling facts, such as conductivity. This result eventually led to the invention of the transistor. Modern electronics and computer technology would be impossible without the transistor, which in turn is the result of a purely quantum mechanical phenomenon.”
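For reference, the wave equation Kaku mentions has the standard time-dependent form found in quantum mechanics textbooks:

\[ i\hbar\,\frac{\partial}{\partial t}\,\psi(x,t) = \hat{H}\,\psi(x,t), \]

where ψ is the wave function describing the state of the system and Ĥ is the Hamiltonian (energy) operator.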

The enormous success of quantum mechanics comes from a formalism that accurately describes a myriad of microscopic phenomena, but it is also in that microcosm that quantum mechanics meets its fundamental limitations.

The Uncertainty Principle

A central feature of quantum mechanics is Heisenberg’s uncertainty principle. According to this principle, it is impossible to measure both the position and the momentum of an atomic or subatomic particle precisely at the same time. The more accurately the position is measured, the less accurately the momentum can be measured, and vice versa. If the position were measured with absolute accuracy, the momentum would become completely unknown, and vice versa.
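In its standard quantitative form (the textbook statement of the principle), the product of the two uncertainties is bounded from below:

\[ \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}, \]

where Δx and Δp are the uncertainties in position and momentum and ħ is the reduced Planck constant. Shrinking one factor necessarily inflates the other.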

Although Heisenberg introduced the uncertainty principle in 1927, it is just as relevant today. The inability to measure accurately both the position and momentum of microscopic things is not due to any limitation of current technology. According to many physicists, it is an inherent limitation, one that cannot be resolved by any future advance in technology.

In “Beyond Einstein: The Cosmic Quest for the Theory of the Universe,” Kaku wrote, “The Uncertainty Principle makes it impossible to predict the precise behavior of individual atoms, let alone the universe.”

And, according to Dr. Brian Greene of Columbia University, one of the world’s leading string theorists, future advances in string theory will have to incorporate the uncertainty principle in order to become a complete theory that accounts for observable quantum phenomena. Greene explains in his book “The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory” that the uncertainty principle is not just an issue of disruptions caused by measuring techniques:

“Even without ‘direct hits’ from an experimenter’s disruptive photon, the electron’s velocity severely and unpredictably changes from one moment to the next. [...] Even in the most quiescent setting imaginable, such as an empty region of space, the uncertainty principle tells us that from a microscopic vantage point there is a tremendous amount of activity.”

Heisenberg believed that the uncertainty principle arises from the wave-particle duality of atoms and subatomic particles. This duality is not just embedded in the mathematical scheme of quantum mechanics; it can also be inferred from simple experiments, which seem to demonstrate that atomic and subatomic things have characteristics of both a particle and a wave.

A particle occupies a small region of space and can collide with other particles, like a solid object. A wave, on the other hand, is spread out in space and can pass through other waves. Particle and wave thus appear to be opposite and conflicting notions.

How can something be a particle and a wave at the same time? If a single electron is treated as either a particle or a wave but not both, the result is an incomplete explanation of the observed phenomena. If, on the other hand, the particle and wave aspects are combined into one complete theory of the observed phenomena, contradictions result.

According to Heisenberg, attempts to describe atomic events in terms of classical physics lead to contradictions because those microscopic things are not like ordinary objects of our everyday experience.

In Newtonian mechanics, every object has a definite position and momentum at any given time, and the object follows only a single path of motion. In other words, the motion of matter is fully deterministic: there is only one possible future outcome.

When the position and momentum of an object are known, then its motion can be predicted with precise mathematical calculations. Newtonian mechanics has been very successful in describing and predicting planetary motions in the heavens as well as events on Earth. However, it fails to describe the phenomena of atomic and subatomic events.
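A minimal sketch of this determinism, in standard textbook form (not drawn from any source quoted here): Newton’s second law is a differential equation whose solution is fixed once the initial conditions are given.

\[ m\,\frac{d^{2}x}{dt^{2}} = F(x), \qquad x(0) = x_0,\quad p(0) = p_0. \]

For well-behaved forces F, the initial position x₀ and momentum p₀ determine a unique trajectory x(t) for all later times.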

In contrast to the classical physics of Newton, according to Heisenberg, atomic events are, like the concept of potentiality in the philosophy of Aristotle, “a strange kind of physical reality just in the middle between possibility and reality.” In quantum mechanics, atomic and subatomic events are described in terms of probabilities, or tendencies.
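This probabilistic description has a precise mathematical form, the Born rule (a textbook standard, named after Max Born): the wave function assigns only probabilities to possible outcomes,

\[ P(x) = |\psi(x)|^{2}, \]

the probability density of finding the particle at position x. The theory predicts the statistics of many measurements, not the outcome of any single one.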

Quantum mechanics introduced the concept of indeterminacy into the foundation of modern physics. This was a huge leap from the classical mechanics of Newton, which had dominated physics for centuries, and it was also a radical departure from the theory of relativity. Einstein rejected this interpretation of quantum mechanics on precisely this point of indeterminacy, suggesting in a letter to physicist Dr. Max Born that God does not play dice.

In “Physics and Philosophy: The Revolution in Modern Science,” Heisenberg wrote, “The change in the concept of reality manifesting itself in quantum theory is not simply a continuation of the past; it seems to be a real break in the structure of modern science.”

Issues of Interpreting the New Physics

Although quantum mechanics has been very successful, we must remember that it only describes and predicts observable physical phenomena; it does not describe the inner reality of physical matter. In fact, as the theory advanced, different and conflicting interpretations of quantum mechanics developed, even among eminent physicists.

One of the earliest interpretations of quantum mechanics is the Copenhagen interpretation, led by Danish physicist Dr. Niels Bohr. This interpretation states that “there is no deep reality” and that atoms, electrons, and photons do not exist like objects in our everyday experience. According to this interpretation, a phenomenon fully comes into existence only when it is observed. Bohr once described it this way: “There is no quantum world. There is only an abstract quantum description.”

On the other hand, Einstein was a “realist”: he believed that quantum mechanics is simply incomplete and that a hidden deterministic reality behind quantum phenomena may yet be discovered. Although Einstein held a minority view among physicists, other eminent physicists who made great contributions to the development of quantum mechanics were realists as well.

Planck believed in an objective world that is independent of any observer and adamantly opposed the indeterministic worldview of Heisenberg, Bohr, and Born. Dr. Louis de Broglie, best known for his discovery of the wave nature of electrons, was aligned with the statistical interpretation but, after struggling with it for many years, finally settled on a realist position. Dr. Erwin Schrödinger, who developed wave mechanics, was also a realist, and he devoted much of his later life to opposing the statistical interpretation of the quantum theory that he had done so much to create.

About a decade after Einstein’s passing, Irish physicist Dr. John Stewart Bell demonstrated that the realist position requires certain influences to travel faster than the speed of light in order to account for observable quantum phenomena. Since this contradicts the foundation of the well-established theory of relativity, many physicists reject the realist position.
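Bell’s argument can be stated as an inequality (given here in the CHSH form, a standard refinement of Bell’s 1964 result, supplied for illustration). For any local hidden-variable theory, the correlations E between measurements made with settings a, a′ on one particle and b, b′ on another must satisfy

\[ \left| E(a,b) - E(a,b') + E(a',b) + E(a',b') \right| \;\le\; 2, \]

whereas quantum mechanics predicts values as large as 2√2 for suitably entangled particles, and experiments side with quantum mechanics.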

In 1957, Dr. Hugh Everett III introduced the many-worlds interpretation, which seems to resolve the quantum measurement problem. In the many-worlds interpretation, parallel universes are created for the different possible outcomes of each act of measurement. For example, when a coin is tossed, although we observe only one outcome, the other possible outcomes are supposed to occur in parallel universes that are instantly created. This interpretation is considered absurd by some notable physicists and philosophers.

These are only a small sample of attempts to give a complete interpretation of quantum mechanics. There are many interpretations. Dr. Nick Herbert compared eight of them (including the ones mentioned above) and wrote in his book “Quantum Reality: Beyond The New Physics”: “An astonishing feature of these eight quantum realities, however, is that they are experimentally indistinguishable. For all presently conceivable experiments, each of these realities predicts exactly the same observable phenomena [...] All of them without exception are preposterous.”

Part 2

A Limitation in Formal Logic

May 28th, 2010

Some of the greatest thinkers wanted to determine the nature of mathematical reasoning in order to improve their understanding of the notion of “proof” in mathematics. To that end, they attempted to codify the thought process of human reasoning as it applies to mathematics. They surmised that logic and mathematics are interrelated, and that mathematics might be a branch of logic, or vice versa. They thought that the logical deductive method of geometry might be employed for mathematics, so that all true statements of a system could be derived from a small set of axioms.

"The axiomatic development of geometry made a powerful impression upon thinkers throughout the ages; for the relatively small number of axioms carry the whole weight of the inexhaustibly numerous propositions derivable from them,” Philosopher Dr. Ernest Nagel and mathematician Dr. James R. Newman wrote in their book Gödel's Proof. “The axiomatic form of geometry appeared to many generations of outstanding thinkers as the model of scientific knowledge at its best."

Persistent Contradictions in Logic

However, inherent paradoxes were known to exist in logic, and a variety of paradoxes were also discovered in set theory, such as Russell’s paradox. These paradoxes all have two things in common: self-reference and contradiction. A simple and well-known example is the liar paradox, as in “I always lie.” From such a statement it follows that if I am lying, then I am telling the truth; and if I am telling the truth, then I am lying. The statement can be neither true nor false. It simply does not make sense. From the discovery of paradoxes in set theory, mathematicians suspected that there might be serious imperfections in other branches of mathematics.
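The structure of the paradox can be captured in a single line of propositional form: the liar is a sentence L that asserts its own falsity,

\[ L \;\leftrightarrow\; \neg L, \]

which admits no consistent assignment of true or false.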

In his book “Gödel, Escher, Bach: An Eternal Golden Braid,” Dr. Douglas Hofstadter, professor of cognitive science at Indiana University in Bloomington, wrote, “These types of issues in the foundations of mathematics were responsible for the high interest in codifying human reasoning methods which was present in the early part of [the 20th century]. Mathematicians and philosophers had begun to have serious doubts about whether even the most concrete of theories, such as the study of whole numbers (number theory), were built on solid foundations. If paradoxes could pop up so easily in set theory - a theory whose basic concept, that of a set, is surely very intuitively appealing - then might they not also exist in other branches of mathematics?”

Logicians and mathematicians tried to work around these issues. One of the most famous of these efforts was undertaken by Alfred North Whitehead and Bertrand Russell in their mammoth work “Principia Mathematica.” They realized that all paradoxes involve self-reference and contradiction, and they devised a hierarchical system to disallow both. Principia Mathematica had two basic goals: to provide a complete formal method for deriving all of mathematics from a finite set of axioms, and to be consistent, with no paradoxes.

At the time, it was unclear whether Russell and Whitehead had really achieved their goals. A lot was at stake: the very foundation of logic and mathematics seemed to be on shaky ground. A great effort, involving leading mathematicians of the world, was mounted to verify the work of Russell and Whitehead.

Hofstadter wrote in “Gödel, Escher, Bach”: “[German mathematician Dr. David Hilbert] set before the world community of mathematicians (and metamathematicians) this challenge: to demonstrate rigorously - perhaps following the very methods outlined by Russell and Whitehead - that the system defined in Principia Mathematica was both consistent (contradiction-free), and complete (i.e. that every true statement of number theory could be derived within the framework drawn up in [Principia Mathematica]).”

Gödel's Incompleteness Theorem

In 1931, the hope in that great effort was destroyed by Austrian mathematician and logician Dr. Kurt Gödel with the publication of his paper “On Formally Undecidable Propositions of Principia Mathematica and Related Systems.” Gödel demonstrated an inherent limitation not just in Principia Mathematica, but in any conceivable axiomatic formal system that attempts to model the power of arithmetic. Arithmetic, the theory of whole numbers with addition and multiplication, is the oldest and most basic part of mathematics, and it has great practical importance.

Gödel proved that any axiomatic formal system that attempts to model arithmetic cannot be both complete and consistent at the same time. This result is known as Gödel’s Incompleteness Theorem. Only two possibilities remain for such a formal system:

(1) If the formal system is complete, then it cannot be consistent, and the system will contain a contradiction analogous to the liar paradox.

(2) If the formal system is consistent, then it cannot be complete, and there will be true statements of the system that the system cannot prove.
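The core of the proof is a sentence that refers to its own provability. Schematically (a standard modern rendering, not Gödel’s original notation), Gödel constructed, for a formal system F, a sentence G such that

\[ G \;\leftrightarrow\; \neg\,\mathrm{Prov}_{F}(\ulcorner G \urcorner), \]

that is, G asserts of itself that it is not provable in F. If F is consistent, then F cannot prove G; but that is exactly what G says, so G is true yet unprovable within F.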

For very simple formal systems, the limitation does not exist. Ironically, as a formal system becomes more powerful, at least powerful enough to model arithmetic, the limitation of Gödel’s Incompleteness Theorem becomes unavoidable.

Some scientists say that Gödel’s proof has little importance in actual practice. However, English mathematical physicist Dr. Roger Penrose has pointed out that another theorem, Goodstein’s theorem, is effectively a Gödel theorem: it demonstrates the limitation of mathematical induction in proving certain mathematical truths. Mathematical induction is a purely deductive method that can prove an infinite series of cases in finitely many steps of deduction.
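As a concrete illustration of mathematical induction (a standard textbook example, not drawn from Penrose): to prove that 1 + 2 + ... + n = n(n+1)/2 for every whole number n, one verifies a base case and an inductive step,

\[ 1 = \frac{1 \cdot 2}{2}, \qquad \sum_{k=1}^{n} k = \frac{n(n+1)}{2} \;\Rightarrow\; \sum_{k=1}^{n+1} k = \frac{(n+1)(n+2)}{2}, \]

two finite deductions that settle infinitely many cases at once. Goodstein’s theorem is a true statement about whole numbers that this kind of induction, as formalized in ordinary arithmetic, cannot reach.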

Inherent Limitation of Formal Deductive Methods

There was a deeper motivation behind Gödel’s efforts, beyond the issues of Principia Mathematica and other more practical formal methods. Like other great mathematicians and logicians of his time, Gödel wanted a better understanding of basic questions about mathematics and logic: What is mathematical truth, and what does it mean to prove it? These questions remain largely unresolved. Part of the answer came with the discovery that some true statements in mathematical systems cannot be proved by formal deductive methods. An important revelation of Gödel’s achievement is that the notion of proof is weaker than the notion of truth.

Gödel's proof seems to demonstrate that the human mind can understand certain truths that axiomatic formal systems can never prove. From this, some scientists and philosophers claim that the human mind can never be fully mechanized.

Although Gödel’s Incompleteness Theorem is not well known to the public, it is regarded by scientists and philosophers as one of the greatest discoveries of modern times. The profound importance of Gödel’s work was recognized many years after its publication, as mentioned in “Gödel’s Proof”: “Gödel was at last recognized by his peers and presented with the first Albert Einstein Award in 1951 for achievement in the natural sciences - the highest honor of its kind in the United States. The award committee, which included Albert Einstein and J. Robert Oppenheimer, described his work as ‘one of the greatest contributions to the sciences in recent times.’”

Gödel's Incompleteness Theorem [Wikipedia]

Kurt Gödel [Stanford Encyclopedia of Philosophy]

Heisenberg’s uncertainty principle [Wikipedia]

Hilbert's program [Wikipedia]



