The Heisenberg Uncertainty Principle is wrong? More Certainty On Uncertainty’s Quantum Mechanical Role

October 6, 2012

As a determinist, I find the Heisenberg Uncertainty Principle of great interest. I have always intuitively believed there to be hidden variable theories, or something similar, which make more sense of quantum mechanics than indeterministic interpretations do. Recently, research has come forth to support such a position. Here is the latest:

ScienceDaily (Oct. 4, 2012) — Researchers are presenting findings at the Frontiers in Optics 2012 meeting that observation need not disturb systems as much as once thought, severing the act of measurement from the Heisenberg Uncertainty Principle.

Scientists who study the ultra-small world of atoms know it is impossible to make certain simultaneous measurements — for example, finding out both the location and momentum of an electron — with an arbitrarily high level of precision. Because measurements disturb the system, increased certainty in the first measurement leads to increased uncertainty in the second. The mathematics of this unintuitive concept — a hallmark of quantum mechanics — were first formulated by the famous physicist Werner Heisenberg at the beginning of the 20th century and became known as the Heisenberg Uncertainty Principle.

Heisenberg and other scientists later generalized the equations to capture an intrinsic uncertainty in the properties of quantum systems, regardless of measurements, but the uncertainty principle is sometimes still loosely applied to Heisenberg’s original measurement-disturbance relationship. Now researchers from the University of Toronto have gathered the most direct experimental evidence that Heisenberg’s original formulation is wrong.

The results were published online in the journal Physical Review Letters last month, and the researchers will present their findings for the first time at the Optical Society's (OSA) Annual Meeting, Frontiers in Optics (FiO), taking place in Rochester, N.Y., Oct. 14-18.

The Toronto team set up an apparatus to measure the polarization of a pair of entangled photons. The different polarization states of a photon, like the location and momentum of an electron, are what are called complementary physical properties, meaning they are subject to the generalized Heisenberg uncertainty relationship. The researchers’ main goal was to quantify how much the act of measuring the polarization disturbed the photons, which they did by observing the light particles both before and after the measurement. However, if the “before shot” disturbed the system, the “after shot” would be tainted.

The researchers found a way around this quantum mechanical Catch-22 by using techniques from quantum measurement theory to sneak non-disruptive peeks of the photons before their polarization was measured. “If you interact very weakly with your quantum particle, you won’t disturb it very much,” explained Lee Rozema, a Ph.D. candidate in quantum optics research at the University of Toronto, and lead author of the study. Weak interactions, however, can be like grainy photographs: they yield very little information about the particle. “If you take just a single measurement, there will be a lot of noise in that measurement,” said Rozema. “But if you repeat the measurement many, many times, you can build up statistics and can look at the average.”
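Rozema's point about averaging many weak measurements can be sketched numerically. This is only a toy model (the "true" value, the noise level, and the sample count below are made-up illustrative numbers, not anything from the Toronto apparatus): a single noisy readout tells you almost nothing, but the mean of many converges on the underlying value.

```python
import random

random.seed(1)

TRUE_VALUE = 0.7   # hypothetical underlying quantity we want to know
NOISE = 5.0        # weak interaction -> each single readout is very noisy

def weak_readout():
    # one weak measurement: the true value buried in large Gaussian noise
    return TRUE_VALUE + random.gauss(0.0, NOISE)

one_shot = weak_readout()  # essentially useless on its own
average = sum(weak_readout() for _ in range(200_000)) / 200_000

print(one_shot, average)   # the average lands very close to 0.7
```

The standard error of the mean shrinks as 1/sqrt(N), which is why "many, many" repetitions are needed when each interaction is deliberately weak.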

By comparing thousands of “before” and “after” views of the photons, the researchers revealed that their precise measurements disturbed the system much less than predicted by the original Heisenberg formula. The team’s results provide the first direct experimental evidence that a new measurement-disturbance relationship, mathematically computed by physicist Masanao Ozawa, at Nagoya University in Japan, in 2003, is more accurate.
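For reference, the two relations at issue are usually written with epsilon for the measurement error on position q, eta for the disturbance to momentum p, and sigma for a state's intrinsic spread. As commonly stated (following Ozawa's papers rather than the press release):

```latex
% Heisenberg's original measurement-disturbance relation,
% the one the experiment finds violated:
\epsilon(q)\,\eta(p) \geq \frac{\hbar}{2}

% Ozawa's 2003 relation, consistent with the data:
\epsilon(q)\,\eta(p) + \epsilon(q)\,\sigma(p) + \sigma(q)\,\eta(p) \geq \frac{\hbar}{2}
```

Note that the intrinsic spread relation, $\sigma(q)\,\sigma(p) \geq \hbar/2$, is untouched by the experiment; only the error-disturbance version is revised.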

“Precision quantum measurement is becoming a very important topic, especially in fields like quantum cryptography where we rely on the fact that measurement disturbs the system in order to transmit information securely,” said Rozema. “In essence, our experiment shows that we are able to make more precise measurements and give less disturbance than we had previously thought.”

The above post is reprinted from materials provided by the Optical Society of America. Note: Materials may be edited for content and length.

Optical Society of America. “More certainty on uncertainty’s quantum mechanical role.” ScienceDaily. ScienceDaily, 4 October 2012. <www.sciencedaily.com/releases/2012/10/121004121638.htm>.

  • im-skeptical

    As a determinist, you believe that radioactive decay is not random – we simply don’t know enough about the states of subatomic particles within an atom to determine precisely when the event will occur. Is that correct?

    • To think that it is an ex nihilo causality is, to me, incoherent. The fact is, at present, decay supports neither determinism nor indeterminism. We simply do not know enough. To punt to this being an example of uncaused causation is special pleading. And if the Many Worlds interpretation is actually the orthodox understanding (which is deterministic, I believe), there is good reason to believe that decay would be deterministic.

      If we have ranges for decay and probabilistic outcomes, then you have already confined total randomness to a range, which would appear to be deterministic rather than truly indeterministic.
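The point about randomness confined to a range can be illustrated with a toy decay simulation (the per-step decay probability and atom count are arbitrary, illustrative numbers, not tied to any real isotope): each individual decay looks random, yet the ensemble hugs its expected curve so tightly that the aggregate behavior is effectively determined.

```python
import random

random.seed(42)

P_DECAY = 0.5        # hypothetical per-step decay probability (one half-life per step)
N_ATOMS = 100_000

atoms = N_ATOMS
survivors = []
for step in range(3):
    # each remaining atom decays independently with probability P_DECAY
    atoms = sum(1 for _ in range(atoms) if random.random() > P_DECAY)
    survivors.append(atoms)

# no one can say which atom decays when, but the counts track
# N * (1/2)**k to within a fraction of a percent
print(survivors)
```

The relative fluctuation of a binomial count scales as 1/sqrt(N), so for macroscopic N the "random" ensemble is, for all practical purposes, on a fixed course.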

  • JJH

    Umm, I don’t think the authors of the article were as clear as they could have been when discussing the “uncertainties” in quantum measurements. There are really two (really more, but we’re only concerned with two here) different uncertainties we’re looking at, and the authors sort of conflate the two. There are uncertainties introduced by physical measurements (e.g. whacking a particle with a photon), and then there is the theoretical minimum uncertainty in a quantum system (the heart of Heisenberg’s principle). The physicists in the article have overcome a big obstacle for the first uncertainty (Yeah Team!), but they did nothing that would call into question the theoretical limit. And thank goodness for that. If we were to ever find out that the limit ΔE × Δt ≥ ħ/2 didn’t hold, the entire standard model of physics would fall apart (it’s necessary for particles to carry the fundamental forces).

    However, I am also a determinist when it comes to human behavior, and the uncertainty principle and the statistical nature of QM in general don’t bother me in the least. The human mind is such a large and complex system that the statistical nature of QM collapses into a quite determinable system. Vic Stenger did some back-of-the-envelope calculations on it; I’ll try to find a link.

    • im-skeptical

      I agree with you regarding human behavior. However, if events at a quantum level are not strictly deterministic, the implication is that the universe does not play out along a fixed course, as many materialists believe.

      • JJH

        I’m also a strict materialist. Quantum indeterminacy doesn’t affect my viewpoint on that either. It’s still about particles and how they act; whether they act statistically or by simple deterministic cause and effect is just a matter of the system you’re looking at (now, at the boundary, that’s fascinating). But it’s still particles.

    • JJH

      I couldn’t find any direct links, but here is a quote from Stenger’s “Quantum Gods”

      “In The Unconscious Quantum I presented a criterion for determining whether a system must be described by quantum mechanics. If the product of a typical mass (m), speed (v), and distance (d) for the particles of the system is on the order of Planck’s constant (h) or less, then you cannot use classical mechanics to describe it but must use quantum mechanics. Applying the criterion to the brain, I took the typical mass of a neural transmitter molecule (m = 10 to the minus 22 kilogram), its speed based on thermal motion (v = 10 meters per second), and the distance across the synapse (d = 10 to the minus 9th meter) and found mvd = 1700h, more than three orders of magnitude too large for quantum effects to be necessarily present. This makes it very unlikely that quantum mechanics plays any direct role in normal thought processing.”
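Stenger's arithmetic is easy to check. A minimal sketch, plugging in exactly the values quoted above:

```python
# Stenger's back-of-the-envelope criterion: if m*v*d is much larger than
# Planck's constant h, classical mechanics suffices for the system.
H = 6.626e-34   # Planck's constant, J*s

m = 1e-22       # neurotransmitter molecule mass, kg (value quoted above)
v = 10.0        # thermal speed, m/s (value quoted above)
d = 1e-9        # distance across the synapse, m (value quoted above)

ratio = (m * v * d) / H
print(round(ratio))   # ~1500, the same order as Stenger's quoted "1700h"
```

Roughly three orders of magnitude above h, which is the basis for his conclusion that quantum effects need not be invoked for synaptic events.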

    • I think there is some confusion over observer effect vs HUP, here, no?

      • JJH

        Yes, I think that is exactly it.

        And in my previous replies I may have been a little too hard on the authors (I was reminded it’s frequently the editors). But either way it is still a point of frustration. Popular science writing tries so hard to turn every story into a “groundbreaking discovery” that it frequently overstates the case the scientists are making, and there are consequences to that. As just a run-of-the-mill skeptic, I frequently have to deal with, “well, yesterday science said this and today science says that, so it’s just as subjective as anything else.” But when you actually go and read the findings of the paper, you see words like “maybe, possibly, tentatively, requires further research, etc.” A good example would be the CERN report of particles traveling faster than c: the authors of that study were very tentative, almost to the point of saying, “We must have messed something up, someone please help us find it.” But the popular science reporting put it as if relativity was on the edge of collapse.

        I’m no philosopher, but I am a consumer of philosophy (the way I analyze data being a key example). One place where I think the philosophy of science has failed miserably is in having a coherent philosophy of “Presenting Science to the Public.”

        Perhaps a new field of philosophy?

        • I am as guilty as the authors for representing it here. You have some good points.

          As wiki says:

          “The uncertainty principle has been frequently confused with the observer effect, evidently even by its originator, Werner Heisenberg.[2] The uncertainty principle in its standard form actually describes how precisely we may measure the position and momentum of a particle at the same time — if we increase the precision in measuring one quantity, we are forced to lose precision in measuring the other.[3] An alternative version of the uncertainty principle, more in the spirit of an observer effect, fully accounts for the disturbance the observer has on a system and the error incurred although this is not how the term “uncertainty principle” is most commonly used in practice.[4]”

          • JJH

            Well, here we disagree. I wouldn’t say that you are as guilty as the authors/editors of the article. You ask a philosophical question based on the article’s presentation of the findings (from a reputable popular science publication), and they are the ones that present those findings.

            Either they didn’t realize that reasonable people could easily misinterpret their reporting that way, or they intentionally presented the data that way to increase readership. Either way, the community of science writers is doing a disservice to the public understanding of science.

            I have no idea of how to fix this, nor could I give an empirically based argument for why it should be fixed (it could be argued that sensationalism gets people interested, and that is what is required to get people to accept empirical facts).

            But just from a “boots on the ground” perspective; I would love to see a popular science headline use the words that scientists use (e.g. might, possibly, requires further study, etc.).

  • I can’t believe I spelled Heisenberg wrong in the title and went away for three days and no one mentioned it!

    tsk. Bad me.

  • If matter has an inherent statistical fluctuation, then the thing you use to measure also fluctuates, and that makes it impossible to measure accurately.

    But this measurement inaccuracy is a separate thing from the inherent fluctuation in the measured object.

    But of course, on a more fundamental level, there may be hidden variables, but these must somehow behave otherwise than the theory of relativity tells us.

    • I often wonder about the idea that QM is not deterministic but probabilistic. Surely, if there is a given probability that something will happen, it is responding to causal pressures which manifest in a probability. For example, if, when I play tennis against Roger Federer, I lose 99% of the time, this is because of underlying causal behaviour and laws. You can’t have something consistently happening 99% of the time if it is random.

      Maybe I am not understanding stochastics here. But surely at the heart of Monte Carlo-style probabilities is the inability to truly randomly generate anything...?
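The Monte Carlo point can be made concrete: the "randomness" in a standard Monte Carlo estimate is a deterministic pseudo-random stream, so the same seed reproduces the same "random" answer bit for bit. A small sketch (a pure illustration of pseudo-randomness, not a claim about physical randomness; the seed and sample count are arbitrary):

```python
import random

def monte_carlo_pi(seed, n=100_000):
    # estimate pi by sampling points in the unit square; all of the
    # "randomness" comes from a deterministic pseudo-random generator
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n

a = monte_carlo_pi(2012)
b = monte_carlo_pi(2012)
print(a, a == b)   # same seed -> identical "random" result every run
```

The probabilities here are perfectly lawlike in aggregate even though the underlying stream is fully determined, which is exactly the distinction the comment is reaching for.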

  • The uncertainty of measurements is not any fundamental property of the system measured, but a consequence of the fact that the measurement apparatus itself behaves according to probabilistic principles of the same sort as the measured system. On a deeper level, there are probably hidden variables and structures. The discoveries of these will surely take away the paradoxes, but I am not so sure that the probabilistic nature will be taken away.

  • Spiros Koutandos

    I happen to be doing research on the hidden variables of quantum mechanics, and I am of the opinion that polarization plays a crucial role, but there is some form of pressure as well. You may find my writings at http://www.gsjournal.net “The general science journal”
    Spiros Koutandos

    • Thanks for commenting, Spiros. Can you expand a little? Does this play into the uncertainty or deterministic framework?

      • Spiros Koutandos

        It is deterministic. I am also trying to find some formula for the volume the system occupies, so we should have something pressing the volume, making our calculations of a thermodynamic nature. Please feel free to communicate with me at skoutandos1@gmail.com

        • Do you have a layperson’s account of what you are working on? Can I repost it here?

          • Spiros Koutandos

            What I have been doing consists mainly of attempts to calculate, on the grounds of the formalism given to us by quantum mechanics, using total time differentials, or to discuss polarization in terms of Heisenberg shells: what is included within the lobes surrounded by areas of constant psi (wavefunction value). It is only math, but I would be very happy if someone looked at some of it or gave me an idea of how to proceed, because I am stuck right now. I have published a book, “A search for the hidden variables in quantum mechanics” (Lambert editions), which can be seen at Amazon.

  • Spiros Koutandos

    Heisenberg’s uncertainty relationship is correct, but we lack some physical insight into it. I believe that the very volume is affected by the presence of mass, and this destroys our measurements.

  • A system might have an exact state at each moment, and yet it might not be totally determined what the state will be in the next moment.