15.4.13





Dr. Kelley Ross:
The Fortunes of Materialism

It is a curious time in the history of metaphysics. Few areas of 20th Century philosophy have been as shunned and neglected as metaphysics. Entire schools of philosophy, principally Logical Positivism and the "ordinary language" approach founded by Ludwig Wittgenstein, have specifically rejected the possibility or meaningfulness of metaphysics as a discipline; academic philosophers tend to assume that the success of science replaces or discredits the philosophical treatment of metaphysical questions; and popular culture confuses metaphysics with spiritualism or the occult.... The combination of these factors has resulted in a sort of naive default metaphysics among the educated secular elite, including most academic philosophers. And that default metaphysics is materialism. One sees this in the prevalence of the "new atheism" in popular culture, whose appeal rests largely on naturalistic and materialistic views of reality, and in what otherwise seem to be the generally sensible views of philosophers like John Searle.

Yet the irony of this situation is that the philosophical foundations of materialism, in a proper metaphysics, are in worse shape now than they have ever been. The "new atheists," and other commentators, like Stephen Hawking, whose scientific ideas begin to lead them into philosophical reflection, are philosophically very naive and seem to have only the shallowest familiarity or understanding of the history of philosophy, let alone of metaphysics proper. Metaphysical materialism should not have survived the progression of ideas through Berkeley, Hume, and Kant; but perhaps because this outcome was muddled in the treatment of the German Idealists (like Hegel), a simple version of Democritean Atomism returned in the 19th century among those more impressed with the progress of science than with the jargon and obscurantism of Idealism. That is when materialism gained some traction as the default metaphysics of generally sensible, critical, conscientious, and empirical opinion. Even Marx's denial that his "materialism" was an ontological theory (rather than a claim about the material economic conditions of production) was ignored and forgotten in the general impression that, after all, it must be just that.

The revival of Hegelian Idealism in the early 20th century, by people like F.H. Bradley (1846–1924) and Josiah Royce (1855-1916), failed to impress in comparison to what were beginning to be remarkable and even astounding continuing advances in science. Such neo-Idealism collapsed into the Positivistic desire to be subservient to science (which we see dramatically in the career of someone like Bertrand Russell, who was a student at the time of Bradley's greatest influence in England); and the attendant recoil from metaphysics allowed the naive 19th century materialism to insensibly be taken up as self-evidently true. Yet it was science itself, as we shall see, that would decisively explode, again, Democritean materialism. The response of philosophy was curiously to draw encouragement for epistemological skepticism and nihilism rather than to reevaluate the metaphysical issues. In the long run, science would not be well served by this approach, as confidence in the objectivity of science itself would be eroded, and the notion would become current that scientific knowledge was merely the self-interested agreement of socially powerful and politically suspect race, class, and gender interest groups. The collapse of the foundation of materialism thus did not drag down materialism, which continued as default metaphysics, but was displaced into the undermining of science itself, upon which its cognitive confidence presumably rested. So we have the ironic result of the modern materialistic Nihilist, who believes no more in science than he does in God.

Part of the naive confidence vested in modern materialism is the sense that it is just obvious. We see matter. This is the Dr. Johnson school of metaphysics -- Samuel Johnson (1709–1784) kicked a stone to refute Bishop Berkeley (definitely a common sense kind of guy). However, this does no more than bespeak the naivety and ignorance of the view. Dr. Johnson did not understand that his refutation was irrelevant to Berkeley's argument. We do not see matter, and its existence has never been obvious.

The truth of this is evident from the first example of a clear ontological materialism, which was the Atomistic theory of Leucippus and Democritus. Reality, in their view, consisted of Atoms and the Void, i.e. particles of matter together with empty space, and atoms were the ultimately small bits of stuff that could not be divided down into smaller particles: they were atomos, "uncuttable" (where we see the -tom, "cut," or -ektom, "cut out," part in much surgical terminology, e.g. "hysterectomy," to cut out the uterus). Atoms differed from each other only in shape and size. Nothing about this theory was visible to the naked eye. Atoms were too small to be seen, much less inspected; and the variety of the world, which other Greek philosophers had explained with different kinds of "stuff," vanished in the uniformity of whatever it was that constituted the content of atoms. Popular presentations of Atomism neglect that question altogether. What was in the atoms? To answer that, anyone would need to admit that Atomism was based, not on the method of observation and experiment in modern science (which is sometimes carelessly credited to Democritus), but on the metaphysics of Parmenides. Thus, atoms consisted of no more and no less than "Being," the existence of whose ontological opposite, "Not Being," was rejected by Parmenides as self-contradictory.

Atomism altered Parmenides merely by holding that "Not Being" exists as much as "Being." Not Being, in turn, derived its meaning from the concept of empty space. "Being" could thus be broken up into atoms, allowing them to have different shapes and sizes and to move around in space. This would then explain the apparent differences among visible substances and the reality of change in the world. The elegance and simplicity of this theory continued to appeal to certain philosophers, such as Epicurus and the Roman Lucretius; but the paradox of empty space being nothing and yet something did not gain any great popularity or consensus in ancient or mediaeval philosophy. Instead, space itself was identified with matter, a connection still explicitly upheld as late as René Descartes and Baruch Spinoza. If space itself is matter, then there clearly cannot be empty space. Although most people, as well as philosophers, are now comfortable with the idea of empty space, facets of the issue are not at all absent from the consideration of space in modern science.

Now, a definition of matter and of materialism may be in order. Matter is a substance (substantia), i.e. a durable, separable, and identical entity, that possesses only the characteristics of an "externalist" ontology, i.e. that it underlies (substans) the existence of concrete, empirical, or phenomenal objects in nature, that it is intrinsically subject only to natural laws and causal interactions, and that consciousness and purposive determinations are not inherent in it. Descartes, who identified matter with space, could simply define it as the substance whose essence is spatial extension, allowing that there were other substances in reality (i.e. souls and God). Materialism is a stronger doctrine, that matter is the only substance, with conscious phenomena as no more than epiphenomena of matter, if they are not to be dismissed as illusory (as is done by Behaviorists and certain Materialists). Materialism is thus a reductionistic theory in fundamental metaphysics, or ontology, which is the study of "Being qua Being" or the identification of the óntos ónta, the "beingly beings" -- the genuinely existing or fundamentally real things. Materialism is a false doctrine; but "matter" is a legitimate concept, in terms of an external and causal, and so scientific or naturalistic, understanding of the world (see "Ontological Undecidability").

The connection of space and matter can also be found in Aristotle, as construed through his own theory of "form" and "matter," where form is the actuality of things and matter is the potential or power for change. In these terms, matter again is invisible, since anything available for perception will be actual and an example of "form." Even the four elements represent a certain level of actuality. "Pure" matter or "prime matter" in the theory is invisible and unavailable for direct inspection precisely because nothing about it is actual. Although it is perhaps not required by the theory, Aristotle then identified matter with space. This means that empty space would be identical with prime matter and so doesn't exist for the same reason that prime matter doesn't actually exist. On the other side of the divide, Aristotle thinks there are beings that are pure form. This began with God, but Aristotle then allowed that there were pure form "intelligences" that were responsible for the motion of the planets. In the Middle Ages, the "intelligences" were identified as angels, and St. Thomas then introduced the view that human souls could also exist independently as pure forms, which Aristotle had not believed. But whatever version we prefer, beings of pure form, beings free of matter, are without spatial extension. This gives us an unambiguous answer to the famous Scholastic question of how many angels will fit (dance?) on the head of a pin, which is, "All of them." They don't take up any space and so can be fit into any space, however small. All of this may seem, and may be, very dated as metaphysics, but elements of it persist in the metaphysics of Leibniz, which becomes, most significantly, part of modern debates about space and matter.

Indeed, as we move from the physics of Descartes to that of Newton, the possibility of (actual) empty space reemerges in modern science. This is not only because Newton believes that motion is possible in empty space, which Aristotle had denied (until contradicted and refuted by John Philoponus), but, more importantly, because Newton believes that gravity is transmitted (or something) as action at a distance. Descartes and earlier physics had always held that forces could be transmitted only by contact between bodies -- a sensible view when there can be no empty space. But Newtonian gravity leaps across empty space, for which action Newton had no better explanation than that it was the Will of God. Nevertheless, the reality of space for Newton, and belief in its absolute structure and existence, was sharply denied by Leibniz. This led to the epic Clarke-Leibniz Debate, one of the most important events in the history of modern philosophy. Samuel Clarke's defense of Newton worked best in terms of what was needed and implied in physics, e.g. that the rotation of a body can be detected from internal observation (i.e. in relation to space itself), without reference, as Leibniz required, to external bodies. Leibniz's objections to Newton were mainly metaphysical, including the argument from sufficient reason of spatial counterparts (q.v.), which was refuted by Kant with the point that right- and left-handed mirror opposites do physically differ, which Leibniz denied, specifically in spatial terms and not in any other physical metric.

In the 19th century, for some reason there were physicists, like Ernst Mach (1838-1916), who wanted to get around Clarke's rotation argument against Leibniz. This was never more than an ad hoc special pleading until Einstein introduced his theory of Special Relativity in 1905. Then, absolute position in space could no longer be determined, and even size became Relative, in the direction of motion, because of space and time dilation. It was soon widely believed that Einstein's theory meant that Newton was wrong about space and that Leibniz was correct, and I have so far never seen this questioned in popular or even more specialized treatments of the history and philosophy of science. But Einstein had suggested or proven nothing of the sort -- indeed, rotation as acceleration falls entirely outside the purview of Special Relativity -- and this is what makes me wonder about how well even academic philosophers have understood the metaphysical issues involved (or even Special Relativity itself). For if Leibniz was correct about space, in just the way that he understood it, this means that space doesn't exist. In fact, Einstein's theory did not affect the status of Clarke's rotation argument at all. But really, did Ernst Mach and all the subsequent scientists and philosophers really mean to say that space doesn't exist? Is this part of science now, that space doesn't exist? I don't think so. But if space does exist, and we are not to accept Leibniz's conclusion at face value, then what kind of metaphysic of space are we left with? Well, no significant effort has been made to deal with that, largely thanks to the lack of clarity about what Leibniz's theory was.
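
To make the "space and time dilation" just mentioned concrete, here is a minimal numerical sketch (added for illustration, not part of the original essay) of the standard Lorentz factor from Special Relativity; the choice of 80% of the speed of light is arbitrary.

```python
import math

def lorentz_gamma(v, c=299_792_458.0):
    """Lorentz factor for a body moving at speed v (in m/s)."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

c = 299_792_458.0   # speed of light, m/s
v = 0.8 * c         # an arbitrary illustrative speed
gamma = lorentz_gamma(v)

proper_length = 1.0  # meters, measured in the object's own rest frame
proper_time = 1.0    # seconds, elapsed on the moving clock

print(f"gamma = {gamma:.3f}")
print(f"length in the direction of motion: {proper_length / gamma:.3f} m")
print(f"time elapsed for a stationary observer: {proper_time * gamma:.3f} s")
```

At that speed the factor is about 1.67, so a meter stick measures 0.6 m in the direction of motion and the moving clock runs slow by the same factor; but nothing in this kinematics touches Clarke's rotation argument, which concerns acceleration.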

The irony of this situation is that space emerges in Einstein, particularly with his theory of General Relativity in 1915, as a significant feature of the physics. Einstein's approach to gravity is to replace the concept of "forces," and particularly Newton's appeal to forces that act at a distance, with geometry. The Earth goes around the Sun, not because it is pulled into an orbit through the invisible attraction of gravity, but because it is following the equivalent of a straight line, a "geodesic," in curved space-time. Matter causes a deformation in space and time. The Earth, with its Newtonian velocity, just follows the path laid out for it. One is not left with the impression here that space doesn't exist. Quite the opposite. The physicist E. C. G. Sudarshan (b.1931), who introduced the theory of tachyons (particles that travel faster than the velocity of light), told a seminar I was taking at the University of Texas in the 1970's that Einstein's field equations looked to him like those for fluid mechanics. Space-time flows like water.

Indeed, since space is subject to curvature, in the Non-Euclidean geometries explored in the 19th century, its reality becomes absolutely essential to Einstein's treatment. And this becomes more acute when we realize that this is Einstein's answer to Newton's doctrine of action-at-a-distance: it is now space that mediates the force of gravity, but not space as the old contact-matter of Cartesian physics. Matter, whatever it is, moves within the geometrical structure of empty, but existing, space. This is a rather extraordinary idea, but no one would ever mistake it for a Leibnizian theory that space does not exist. Yet academic philosophers managed to make it through the 20th Century without a clear recognition that the supposed Einsteinian refutation of Newton's theory of space is a fairy tale, one incommensurable with the larger nature, requirements, and implications of Einstein's own theory.

Meanwhile, other events had been occurring. Ernest Rutherford announced in 1911 that atoms were mostly empty space. He (or his graduate students) had bombarded gold foil with alpha particles (subsequently discovered to consist of two protons and two neutrons), most of which sailed right through the foil as though it had not been there. To this astonishing result was added the equally curious datum that occasionally one of the particles would be scattered back almost directly towards its source, as though it had rebounded off something all but impenetrable. So Rutherford concluded that most of the mass of an atom was concentrated in something very small at its center, the "nucleus." Indeed, the nucleus is typically something like a hundred thousand times smaller than the whole atom -- roughly the ratio of a Fermi (fm, a femtometer, 10⁻¹⁵ m) to an Angstrom (Å, 10⁻¹⁰ m) -- a reality that is seldom indicated in popular presentations, in which nucleus and atom are often the relative sizes of golf balls and grapefruits. Instead, if the diameter of an atom were about 100 meters, slightly longer than an American football field, then the diameter of the nucleus would be an impressive one millimeter, something that would be quite invisible on the field and practically undetectable even by the players during a football game.
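
For readers who want the arithmetic behind the football-field comparison, here is a small sketch (added here as an illustration) using the Fermi and Angstrom scales just cited.

```python
fermi = 1e-15     # meters, a femtometer: the typical nuclear scale
angstrom = 1e-10  # meters: the typical atomic scale

ratio = angstrom / fermi                 # about 100,000 to 1
print(f"atom-to-nucleus ratio: {ratio:.0e}")

# Scale the atom up to the 100-meter "football field" of the text:
scaled_atom = 100.0                      # meters
scaled_nucleus = scaled_atom / ratio
print(f"scaled nucleus diameter: {scaled_nucleus * 1000:.1f} mm")
```

With the atom blown up to 100 meters, the nucleus comes out at one millimeter, exactly the figure given above.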

Meanwhile, the rest of the empty atom was somehow filled with orbiting electrons, whose negative charge bound them to the positively charged nucleus. There was immediately an obvious problem with this. If charged particles orbit the nucleus because of electrostatic attraction, they are accelerated; and accelerated charges emit radiation and lose energy. The atoms would almost instantaneously collapse. Rutherford had no solution to this problem. The solution, when it came, which it did quickly enough, has nevertheless never subsequently figured in popular ideas about the atom. We still see images of electrons orbiting the nucleus (an image constantly seen on the popular television show, The Big Bang Theory), and it is a popular pastime for people to imagine that our solar system is somehow an atom in the matter of a larger universe (as recounted in classic form by Donald Sutherland in Animal House [1978]). That's impossible.

Part of the solution to the problem of the atoms came in 1913, with Niels Bohr's theory of the quantized atom. Bohr proposed that electrons lodge at certain specific energy levels, "orbitals," in the atom. Each level can only hold a certain number of electrons, and when electrons jump from one level to another, they emit the very specific wavelengths of radiation that can be seen in the spectra of stars and of heated elements. This explained beautifully the mystery of things like the spectrum of hydrogen; but it also left unexplained why such orbitals existed or what electrons were really doing in them.
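
A worked example may help here. The sketch below (added for illustration, using the standard Rydberg formula rather than anything specific to this essay) computes the visible Balmer lines of hydrogen, the wavelengths emitted when electrons jump down to the second level.

```python
R_H = 1.0967758e7   # Rydberg constant for hydrogen, in 1/m

def wavelength_nm(n_upper, n_lower):
    """Wavelength of the photon emitted when an electron drops
    from level n_upper to level n_lower in hydrogen (Rydberg formula)."""
    inv_lambda = R_H * (1.0 / n_lower**2 - 1.0 / n_upper**2)
    return 1e9 / inv_lambda   # convert meters to nanometers

# Balmer series: jumps down to n = 2 give the visible hydrogen lines
for n in range(3, 7):
    print(f"n = {n} -> 2 : {wavelength_nm(n, 2):.1f} nm")
```

The first line comes out near 656 nm, the familiar red hydrogen-alpha line seen in stellar spectra.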

An answer to this came from Louis de Broglie, who suggested in 1923 that electrons could be understood to have the characteristics of waves, which meant that in the atom they were standing waves (as opposed to traveling waves, which move), admitting only whole and half wavelengths, like waves "in a box." This is the only physical explanation that has ever been given for the quantum states of electrons in the atom, yet its nature has come to be neglected because of Max Born's addition to the theory, that the square of the wave function (which of course turns all negatives into positives) would give a probability distribution for where the electrons as particles will be found when the "box" is broken and the wave function collapses by an act of observation.
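
As a rough illustration of de Broglie's idea (added here, not drawn from the original essay), the sketch below computes the wavelength λ = h / (m·v) for an electron moving at about the speed it would have in the hydrogen ground state. The result, about 3.3 Angstroms, matches the circumference of the lowest Bohr orbit, which is precisely the standing-wave condition: one whole wavelength fits around the orbit.

```python
h = 6.62607015e-34      # Planck's constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg

def de_broglie_wavelength(mass, speed):
    """Non-relativistic de Broglie wavelength: lambda = h / (m * v)."""
    return h / (mass * speed)

# Roughly the speed of an electron in the hydrogen ground state (~c/137),
# chosen only to show that the wavelength comes out at atomic size.
v = 2.19e6   # m/s
lam = de_broglie_wavelength(m_e, v)
print(f"wavelength: {lam * 1e10:.2f} Angstrom")   # about 3.3 Angstrom
```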

This interaction of waves and particles in retrospect is a bit curious given another side of Einstein's work in 1905. Einstein's analysis of the Photoelectric Effect, for which he received his only Nobel Prize, treated light as consisting of particles, in line with Newton's thinking, rather than as waves, as had been established in the 19th century. Richard Feynman, with his own Nobel Prize in Physics, put it very bluntly: "the wave theory collapsed." But it didn't. It got expanded. De Broglie had analyzed electrons as waves and had calculated their wavelength. This ended up agreeing with experiment. Bohr eventually posited a principle, "Complementarity," of the "wave-particle duality": if you haven't observed things yet, light and electrons behave as waves; but once you have observed, or are even able to infer the positions of particles, then the wave function collapses and light and electrons behave as particles, i.e. have discrete locations (subject to the Uncertainty Principle of Heisenberg, in which position and momentum cannot both be known with arbitrary precision).
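
To see why the Photoelectric Effect forced the particle picture, here is a small illustrative calculation (added here; the 2.3 eV work function is roughly that of sodium and is only an example): whether an electron is ejected from a metal depends on the frequency of the light, through E = h·f, and not on its intensity.

```python
h = 6.62607015e-34    # Planck's constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electron-volt

def photon_energy_eV(wavelength_nm):
    """Energy of a single photon of the given wavelength, in eV."""
    return h * c / (wavelength_nm * 1e-9) / eV

work_function = 2.3   # eV, roughly the value for sodium (illustrative)

for wl in (650, 550, 450, 350):   # red light through ultraviolet
    E = photon_energy_eV(wl)
    print(f"{wl} nm photon: {E:.2f} eV -> ejects electron: {E > work_function}")
```

Red photons, however many of them there are, fall below the threshold; bluer photons, even a few of them, do not.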

As Oliver Hardy (of Laurel and Hardy) always used to say, "This is another fine mess that you've gotten me into." There is no popularly accepted system of metaphysics that can accommodate the wave-particle duality. Bohr himself seems to have endorsed an Idealist or Anti-Realist position, that "nothing exists until it is observed" -- this is the "Copenhagen Interpretation" (because of the Danish nationality of Bohr). To the extent that this remains popular, it is the Anti-Realist version that conforms better with the trendy Nihilism of "Post-Modern" academic thought. But Einstein was a Realist, and he hated it. And even though Einstein was long dismissed as, in effect, an old fuddy-duddy, the absence of a proper metaphysics for the system, even at a time when metaphysics was in poor repute, meant that the problem was bound to gnaw on people's minds.

Nevertheless, there was a system of metaphysics that fit the wave-particle duality perfectly, which was Kant's Transcendental Idealism, which contained a similar duality of observed (i.e. synthesized into perception and consciousness) vs. unobserved (not synthesized). Most of the early participants in these matters knew Kant and even had the language to read him in German. Bohr's position is sometimes said to be Kantian, but this only shows that anyone who says so is relatively unfamiliar with Kant's thought and doesn't understand the place of "empirical realism" in it. I get the impression that Einstein and Kurt Gödel, on their daily walks to the Institute for Advanced Study in Princeton, sometimes discussed such things. Yet Kantian metaphysics has never been put forward by prominent physicists as an interpretation of quantum mechanics. This puzzles me. I suppose it is largely due to the difficulties in interpreting Kant's system and its lack of popularity in contemporary philosophy. This leaves behind a continuing scrimmage between Realists, Anti-Realists, and Positivists (who don't want to explain anything) in Physics and Philosophy of Science, with sometimes tortured, bizarre, or irrelevant proposals to avoid the wave-particle duality and collapse the metaphysics into one or another form.

Meanwhile, the form that a particle like the electron will take when it is a particle was specified by Paul Dirac. The electron will be a point particle, having no extension in space. The reason for this was simple. If an electron is extended, and is charged, then the charge presumably will be spread over the surface of the particle. However, negative charges at different spatial locations will repel each other, with a force that increases with proximity. Thus, while atoms might have collapsed with orbiting electrons, an extended electron will instantaneously explode as its parts repel each other. Collapsing atoms and simultaneously exploding electrons make for a nice image to associate with popular representations of matter.
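
The "exploding electron" worry can be given rough numbers. The sketch below (an added illustration, using the textbook formula for the electrostatic self-energy of a charge spread over a spherical shell, U = e²/(8πε₀R)) shows that the mutual repulsion energy of the electron's parts grows without limit as the assumed radius shrinks; the radii chosen are arbitrary.

```python
import math

e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
eV = 1.602176634e-19     # joules per electron-volt

def shell_self_energy_eV(radius_m):
    """Electrostatic self-energy of the electron's charge spread over a
    spherical shell of the given radius: U = e^2 / (8 * pi * eps0 * R)."""
    return e**2 / (8 * math.pi * eps0 * radius_m) / eV

# Illustrative radii: an atomic radius, a nuclear radius, and smaller still
for r in (1e-10, 1e-15, 1e-18):
    print(f"R = {r:.0e} m : self-energy ~ {shell_self_energy_eV(r):.3g} eV")
```

The energy climbs from a few eV at atomic size to hundreds of MeV at sub-nuclear size and diverges as R goes to zero, which is the tension the point-particle postulate sidesteps (while famously introducing divergences of its own).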

This is easily remedied with the postulate of the point particle. However, we now may notice an attendant curiosity. Atoms are not merely mostly empty space, they are entirely empty space. Perhaps that was not evident for a while, since the actual size of protons and neutrons could be measured; but in the 1960's Murray Gell-Mann proposed that these particles consisted of smaller ones, quarks. The quarks are -- you guessed it -- point particles. But now, if atoms, and so all matter, are entirely empty space, this produces, not a Democritean world of Atoms and Void, but something exactly the opposite of Parmenides: A world where absolutely everything looks like it is fundamentally Not-Being. The currently popular systems of "String Theory," in which particles are extended in one dimension, do not alter this point, since one-dimensional objects do not fill space. The worry that strings, being extended, might be subject to the original problem with extended particles is avoided by the circumstance that charge is not inherent in the string but is an epiphenomenon of the oscillation of the string in a higher dimension of space.

If matter, in the form of fundamental particles (quarks and leptons), does not fill space, what does? The answer to that is easy: Fields fill space. OK, but what then is a "field"? The answer to that one is not easy, for Modern Physics contains two different and exclusive explanations of what a field is. One we have seen already. A field in Einstein's approach is a deformation of space-time. String theory in recent physics itself includes an extension of Einstein's theory by additional dimensions of space (typically six) in order to accommodate all the forces of nature besides gravity. Thus, as previously, a theory that follows Einstein posits space as an actual thing that mediates the interactions of all the forces of nature.

For a while, however, Einstein's use of space was not the most popular approach to fields in physics. The other approach was an artifact of quantum mechanics. It was noticed that Heisenberg's Uncertainty Principle, in which the product of the uncertainties in position and momentum is on the order of Planck's Constant, could be rewritten in terms of energy and time. Indeed, the units of Planck's Constant can be stated as "Joule-seconds" (J*s), which are units of energy and time. The result of this was the theory of virtual particles, where energy can be borrowed from nothing as long as it is "returned" within a length of time set by Planck's Constant divided by that energy. The larger the energy, and so the more massive the particle, the shorter the time. Photons and gravitons, which have no rest mass and so no rest energy, can therefore exist permanently as "virtual particles."
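
As an illustration of how the borrowed-energy picture sets the reach of a force (an added sketch in the spirit of Yukawa's classic estimate, not anything stated in the original text): a virtual particle of rest energy E can exist for about Δt ≈ ħ/E and so travel at most roughly c·Δt.

```python
hbar = 1.054571817e-34  # reduced Planck's constant, J*s
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # joules per electron-volt

def virtual_range_m(rest_energy_MeV):
    """Rough range of a force mediated by a virtual particle of the given
    rest energy: the energy can be 'borrowed' for dt ~ hbar / E, during
    which the particle can travel at most about c * dt."""
    E = rest_energy_MeV * 1e6 * eV
    dt = hbar / E
    return c * dt

print(f"pion    (~140 MeV): range ~ {virtual_range_m(140):.2e} m")
print(f"W boson (~80 GeV):  range ~ {virtual_range_m(80_000):.2e} m")
```

A massless mediator makes the borrowed energy arbitrarily small and the range unlimited, which is the sense in which photons and gravitons give infinite reach, while the massive mediators of the nuclear forces confine them to roughly nuclear distances.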

Virtual particles were then used to explain the interaction of the forces of nature. Electromagnetism and gravity, mediated by massless photons and gravitons, would be infinite in reach, while other forces (the Strong and the Weak) would be mediated by particles with mass and so have a finite range. But since they don't have "real" energy, virtual particles cannot be observed or detected as such, unless somehow real energy is transmitted to them. These strange entities thus became the basis of the entire theory of fields in quantum mechanics.

In the Feynman diagram at right, we see an interaction of real particles and real energy, where an electron-positron pair (which are anti-particles) annihilate each other and the resulting (real) energy (of a gamma ray, γ, an energetic photon) gives rise to two quarks, a d-quark and an anti-d-quark. It is the convention in Feynman diagrams to show anti-particles as real particles traveling backwards in time, which is why the arrows for the positron and anti-d-quark look like they are going backwards.


In the next Feynman diagram we see something a little different. Here two normal electrons scatter off each other -- both being equally and negatively charged, they repel each other. Their interaction is mediated by the exchange of a virtual photon. Feynman realized that virtual particles themselves could split, loop, and interact, complicating the diagram and implying a potentially infinite nest of interactions. Indeed, his original mathematics for virtual exchanges produced infinite values, which clearly did not match the observed scattering of particles. His genius was to discover mathematical techniques ("renormalization") that enabled him to calculate values for successive splits and loops and eliminate the infinities. The precision of his calculations was astonishing and opened the possibility that even physical constants, like the Gravitational Constant, might be calculated (rather than inferred from observation) in such ways. This proved the value of the quantum understanding of forces and fields using virtual particles.


In this way Physics went merrily along, confident that Einstein's geometry was silly and that quantum theories would easily deal with all the forces of nature, despite the oddness of all the virtual particle business. Then there was a problem. The math didn't work for a quantum theory of gravity. It didn't work, and it didn't work; and physicists went back to Einstein and began working on theories that introduced extra dimensions to accommodate additional forces of nature. This is what is most popular now, although String Theories are coming under criticism for being arbitrary and not making testable predictions. Other physicists continue hopefully with quantum theories of gravity. But the old days of quick and dramatic progress in all this seem to be gone, unless and until a new genius pops up with something entirely new and unexpected.

For the rest of us, the choice between Einsteinian space-time (despite the Leibnizian result that space doesn't exist) and quantum virtual particles is a curious one. In the former case, we are effectively back in a Parmenidean (/Eleatic) universe, where Being is an extended plenum (of space) and matter and energy (and fields) are wave phenomena within it, while with the latter, the phenomena of nature are either equally (Kantian) or exclusively (Copenhagen) the contents of consciousness, once an observation has been made. None of these are remotely comparable to a good solid Democritean Atomism or 19th Century Materialism. Because of this, we get books with provocative titles like The Matter Myth: Dramatic Discoveries that Challenge Our Understanding of Physical Reality [Paul Davies & John Gribbin, 1992, Simon & Schuster, 2007]. Yet the curiosities and problems of matter in modern Physics seem to have barely registered on popular culture, academic philosophers, or even physicists, whose background in philosophy or metaphysics is, of course, spotty, minimal, and inaccurate. In debates about space, they should at least have a good knowledge of Leibniz, if not Kant; but this does not seem to be the case -- and patently false statements continue to be made about Kant's philosophy of geometry by people who really should know better.

The fortunes of materialism are thus not promising, and both popular and academic culture should pay a little more attention to what happened between Berkeley, Hume, and Kant. Of course, that will not happen spontaneously; and it will require some treatment, as on these pages, that catches the attention of the gatekeepers of popular and academic discourse.



Copyright (c) 2012, 2013 Kelley L. Ross, Ph.D. All Rights Reserved