This is brilliantly described. So many physics teachers I had did such a poor job and in 12 minutes you finally got me to have a good intuition of decoherence. Thank you!
@Ron Maimon She is literally a theoretical physicist who works in the field of quantum gravity research, which means she has a strong formal education in quantum mechanics, and you call her incompetent to talk about an aspect of her own field?
@@things_leftunsaid Yes, I read what Ron Maimon says about himself: "I have no PhD, I am almost entirely self taught. I like physics, but I think the professionals are, for the most part, completely incompetent. I have a lot of my own personal theories about physics which I like to spread online. I am unemployed and not by choice. Despite this, I consider myself to be the next Isaac Newton." kew1beans.wordpress.com/2014/10/20/167/ I think I don't need to comment on this further. I understand that this is the era of so-called "disintermediation", but luckily this approach doesn't work in physics.
Ron Maimon Dear Dr Maimon. I am seriously considering that you are right on this point, because your reputation is frightfully great. But please do go easy on Sabine, even while she does not go easy on the physics community. Please make your point in a constructive way, no need to be overly polite, just friendly would do. I personally am very happy with female physicists, which probably paints me as a sexist, however it was terrible to see that in Utrecht in the late seventies there were like 3 women and 80 men in the freshmen physics course. I do get that being wrong and aggressive at the same time is an unfortunate combination, but to quote a famous saying attributed to a then obscure Jewish rabbi, let a person clean of sin throw the first stone. Please tell us what Sabine should have said, Sir.
Wow! If Sabine had dropped this a year ago, I would have been lost. However, she has provided so many great videos building up to this that I could gain a real familiarity with this important open question in physics. Thank you very much, Sabine!
Not to burst the bubble but the familiarity is just an illusion - this topic is much more complicated than this, and unfortunately this video gives a totally false sense of understanding.
Stephen No bubble to burst. I have obviously chosen my words carefully. If you possess a working mastery of this topic, I would be happy to watch a video of yours.
Does this mean that a wave function is decoherence and is described by a probability; furthermore, that coherence occurs after wave function collapse and with value 1?
I've never seen decoherence explained as simply and beautifully as you did it. This concept was introduced to me in a Quantum Statistics course in the context of ensembles, a very different kind of approach.
Finally somebody explained this. A friend of mine had already told me decoherence didn't solve the problem of measurement, but I got lost in his explanation... now I can go to him and tell him that decoherence half solves the problem :). Thanks Sabine.
Great video. You should write a book on quantum mechanics. From elementary to the advanced level and covering decoherence and interpretation matters in detail. You did a great job of explaining an advanced topic in such easy terms and using simple math, whereas many established texts with their full math machinery fail to do so and only confuse the student along the way without giving the essence of the matter and the big picture.
Such clarity, thank you so much for demystifying not only this word, but also why the use of density matrix is mandatory here, and in such a short time. Many thanks for your work on this channel.
The survey mentioned at the beginning of the video was biased based on the list of options given to the respondents. If it had instead given a reasonably comprehensive list of philosophical interpretations of Quantum Mechanics, and asked them to pick their favorite, or to pick "other", then the percentage who would have picked "decoherence" would be much smaller.
@@SabineHossenfelder Why not make one? It could be good research material for a new book on outstanding problems in quantum physics. The poll could simply include the question "Which interpretation of QM most successfully solves the measurement problem?" with "None of the above" as a possible answer.
Nitpicks/clarifications: 7:31 - The animation actually illustrates e^(i*theta) with theta decreasing from pi/2 to -3*pi/2. Having theta range from 0 to 2*pi would have the vector starting/ending parallel to the R axis and rotating anticlockwise. 9:12 - I believe she's saying that for one of the coefficients to be zero means that one or more prime-diagonal elements must also be zero, which they plainly are not in the wave function.
I have a request. In your upcoming videos about the measurement problem, please shed some light on how such measurements are irreversible and might have something to do with the direction of time. I don’t know much about this but would love to know what our current stage of knowledge is about this issue. Thank you for your amazing videos.
@@SabineHossenfelder What if I have my own interpretation of QM? Which laws and experiments do I have to follow to verify it mathematically? If you are interested in details: my interpretation says that time flows in all directions at once, and only when a particle interacts with something does it get a vector of time directed in relation to its partner, and all such vectors get biased towards each other, which makes gravity appear. Also, the measurement problem is solved in my interpretation; it is just a generic act of a system getting a determined state from hidden parameters and mechanics which we cannot observe but are able to guess. Everything in my interpretation relies on smart guesses, like causal dynamics but much more deterministic, with no statistical data, only rigid and solid structures with zero randomness; only chaos is allowed. Also, in a system time can flow in two opposite directions at once and then suddenly turn in only one direction.
Yes, good question. I have always wondered why both classical and quantum mechanics seem to insist upon time reversibility. It seems that only the 2nd law of thermodynamics actually implies a "direction" to time. You would have thought that somewhere in QM, or whatever might "replace" it (in the sense that GR "replaced" Newtonian mechanics), there would be something that is asymmetric with respect to time. Is it actually "merely" statistics that says entropy increases with time? At least outside a situation like here on Earth, where the sun's output supplies the energy to temporarily drive entropy seemingly backward, locally?
@@timbeaton5045 As I understand it, it's simply that no experiment has ever found evidence of time asymmetry, so it's not in the theory, and the latter hangs together nicely without it. However, the weak force, at least, seems to suffer from broken symmetry. The conserved symmetry there is the compound CPT, IIRC, so even there, there's no actual time asymmetry.
@@moses777exodus In collapse models decoherence isn't necessary. You can just postulate that for big systems superpositions do not apply, since the wave function collapses.
When you showed me a Danish, I instantly thought: "It's been a long time since I had a nice cup of coffee and a pastry in a nice coffee shop." But then I remembered why...
Great explanation, thank you. Please let me add: a quantum state can be pure (isolated system) or mixed; pure states are described by a single wave function but mixed states are not. Density matrices allow us to describe both. A density matrix is the sum of ket-bra products of one wave function (pure state) or several wave functions (mixed state), multiplied by probability factors that arise from entanglement between your quantum system and the outside world: decoherence.
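In case a concrete example helps anyone, here is a minimal numpy sketch (my own illustration, not from the video) of that distinction: a pure state is a single ket-bra product, while a mixed state is a probability-weighted sum of them. The diagonals (measurement probabilities) can match exactly, yet the purity Tr(rho^2) tells the two apart:

```python
import numpy as np

# Pure state: equal superposition |psi> = (|0> + |1>)/sqrt(2),
# described by a single ket-bra product |psi><psi|
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# Mixed state: classical 50/50 mixture of |0> and |1>,
# a probability-weighted sum of two ket-bra products
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
rho_mixed = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

# The diagonals (measurement probabilities) are identical...
print(np.allclose(np.diag(rho_pure), np.diag(rho_mixed)))  # True

# ...but the off-diagonal coherences differ, which the purity
# Tr(rho^2) detects: 1 for a pure state, 1/2 for this mixed one.
print(np.isclose(np.trace(rho_pure @ rho_pure).real, 1.0))   # True
print(np.isclose(np.trace(rho_mixed @ rho_mixed).real, 0.5))  # True
```

Decoherence, in this picture, is exactly the process that kills the off-diagonal entries of rho_pure until it looks like rho_mixed.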
This is a fantastic video. It’s the first time I have seen decoherence clearly and accurately explained for an amateur like me. I had a vague understanding of the measurement problem’s relationship with decoherence, but now I really understand. Thank you!
The transition between explaining science to the act of advertising at 11:50 is hilarious. A phase transition indeed. As always, a wonderful exposition.
Great as usual! Some of your videos bring back the feeling I had in the early '80s when I read "The Dancing Wu Li Masters". Unfortunately this is one of the videos where I'm at a loss because of my lack of math. But that doesn't matter. The important thing is that you are more successful than most scientists at explaining quantum mechanics at a popular level that common people can actually understand. Keep doing it!!
Leonard Susskind will give you the math you need. Check his series on entanglement, not the whole thing to the end, but the beginning few lectures where he explains the math.
Finally some simplified maths to explain how physicists explain the phenomena they observe in experiments. I know it's not very layman friendly, but as someone with an electrical engineering education, it makes it much more approachable and understandable for me. Now I understand why decoherence does not fully resolve the measurement problem. Thank you for another quality video, Sabine; you have an excellent channel on your hands and I really enjoy the way you present these topics. Been binge watching your videos since yesterday and I have to say your videos are some of the best scientific ones on RU-vid, and I've seen a lot of them. Very clear and concise, with diction and pacing that are easy to keep up with, a non-distracting style of editing, straight to the point, and a tone that is serious but not too serious. Keep it up, really looking forward to future content.
My master's thesis is about environmental decoherence; it took me one full year of reading the vague literature available, and I still understood only 80% of the concepts in this video. This video in just 12 minutes cleared up all my conceptual doubts.
An excellent video that frames the decoherence problem well. Thank you. That so many physicists don't understand there is a problem shows that apparently simple effects require detailed study. A problem must be fully understood if there is any hope of a solution. Perhaps this is one reason fundamental physics discoveries have stalled. Note: the ket and bra symbols are touching, so they look like a big X, which is visually confusing at first.
I'm so happy to find videos that delve into the mathematics behind quantum mechanics, so I'm not limited to a simplified explanation in layman's terms. As someone close to getting their bachelor's degree in mechanical engineering, I am able to understand the mathematics she's talking about and it's a much more "coherent" explanation than some weird metaphor. I think this video gave me some insight into *how* and *why* quantum tunneling is a problem for the tiniest transistors now. I would like it if she could talk about transistors and the R&D being done to counteract that problem. I read a short article about it and couldn't really understand what they're trying to do.
I stopped eating squid when I realized they had simian level problem solving abilities. It seems now that my love of pastries has come to an end, Danish are obviously intelligent enough to participate in surveys.
I’m a physicist MS. You do a very good job breaking down the topics and illustrating the key points. Please keep up the good work. Also, not to be too forward, but that dress works very well for you. Goes well with your eyes. I hope you have nice day.
Hi Sabine. I'm not a physicist, but I thought about this back in university and came up with an intuition that I was satisfied with at the time. I would like to briefly run it by you, if you don't mind. Given that 1) measuring a system changes its state, 2) measuring it multiple times yields the same results, and 3) the possible outcomes consist only of eigenstates of the measurement operator, the simplest mathematical process that could exhibit this behaviour is fixed-point iteration. I.e., take any state and repeatedly apply the measurement operator to it until it converges (to an eigenstate, by definition), with probability proportional to how similar the initial state is to the outcome. Since the act of measuring something "once" isn't well defined or distinguishable from the myriad of continuous processes it consists of, it would make sense that what we finally observe is the converged state after repeated measurements.
Wow! What a succinct way of conveying this. I'd never even heard of a real mechanism for decoherence, and being able to peek at a bit of the math was very helpful. Fascinating!
8:59 An alternative explanation, for those who are familiar with linear algebra, is that the ket-bra product must be a rank 1 matrix and the diagonal matrix with no zero entries in the diagonal has full rank. Therefore, the density matrix that suffered decoherence must be a sum of at least n ket-bra products, where n is the number of states. I was a bit confused by the fact that averaging -- which is sort of a superposition -- could destroy the ket-bra structure of the density matrix. The fact is that the averaging is done to the density matrices directly, and not to the coefficients of the states -- which would then correspond to a true superposition.
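The rank argument is easy to check numerically; here is a small numpy sketch (my own illustration, not from the video) for a 3-state system:

```python
import numpy as np

# A ket-bra product |psi><psi| is always a rank-1 matrix
psi = np.array([1.0, 1.0j, 2.0]) / np.sqrt(6)  # normalized state
ketbra = np.outer(psi, psi.conj())
print(np.linalg.matrix_rank(ketbra))  # 1

# A decohered density matrix: diagonal with no zero entries,
# hence full rank, so it cannot be a single ket-bra product
rho = np.diag([0.5, 0.3, 0.2])
print(np.linalg.matrix_rank(rho))  # 3
```

So the decohered matrix needs a sum of at least 3 ket-bra products here, matching the comment's "at least n" claim.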
Maybe you can explain this to me. The sum of two wave functions is a wave function, but decoherence is because the sum of a bunch of wave functions is a density matrix that isn't a wave function. I don't understand the difference.
Best physics educator! Thanks very much. The scope and level of your videos are perfect. You hit the key ideas and treat them in a serious and appropriate fashion while also making quantum mechanics comprehensible.
Wonderful that some in the scientific community, like Sabine Hossenfelder, are pointing out and deconstructing the flawed foundational paradigms. Dispassionate examination of a passionate, flawed scientific paradigm!
What a beautiful explanation. RU-vid has done some bad harm to jobs and careers (including my own) but one really good thing it has done is to bring the world's great explainers out of their closets.
Really clever video showing the effect of considering the complex conjugate and phase, whilst leaving open the question of WHY a wavefunction collapses into a single observed result (with classical probability). Also, that Euler dude was a genius.
After watching this video it occurred to me that the simplest model of decoherence is to identify the uncertainty principle with classical Brownian motion for all objects heavier than the Planck mass. Reinhold Fuerth wrote a paper comparing the two. Roger Penrose advocates looking at the Planck mass as a boundary between the micro- and macroscopic worlds, but I feel we might have guessed it anyway; I've just put two and two together. This model could inspire a computer simulation, though it does not explain everything. It's one to explore, and it gives us something to get going with. My thanks to Dr Hossenfelder.
Decoherence can't really solve the measurement problem. This is because there are many which-way-path experiments, usually versions of the double-slit experiment, that do not disturb the particles. In such cases, they usually use entanglement as a way of detecting which-way information, and whether or not you get an interference pattern depends upon the preservation of that information. Does decoherence occur in these cases? Yes. Can decoherence itself explain it? The answer is no, because these experiments are specifically designed to prevent the problem of disturbing the particles.
I do enjoy that you are one of the few science communicators who i) presents this extremely important idea (and presents it well!) and even more so that ii) you are one of the not-too-many physicists who take conceptual & methodological problems - and in general philosophy of science seriously (I was very glad about the references wrt Popper and Feyerabend during the recent two discussions - would have loved to throw Lakatos, Duhem-Quine, Sneed and Suppes in there). Thank you for both of those things - they are very much appreciated :)
Let me try to explain it a bit more simply. Decoherence is like the waves on a lake. After the initial waves created by an event, we see the waves spread out. This goes on, and the waves become smaller and more complex, invisible among all the other small waves on the surface. This does not solve the measurement problem. The measurement can be seen as the top of the wave reaching a certain amount of energy (a quantum). At a place on the lake, we may detect a wave top or not, which we interpret as a particle. And when we detect it, we influence the energy of the system. Because our measurement influences the energy state, it creates a new wave equation. So essentially a measurement is an energy transfer: some energy is moved from the observed system to the measurement system.
One must bear in mind that these kinds of problems are on the borderline between physics and philosophy, so much of this reduces to logic exercises. The main issue with the interpretations of quantum mechanics is that, depending on the premises (or axioms, from the mathematical point of view) you make, you will end up with different outcomes. What actually matters in physics is whether these differences in interpretation are relevant to the experimental observations. Sabine sticks to the orthodox Copenhagen interpretation, which is sufficient to understand many day-to-day quantum phenomena, but which is well known to be "incomplete" (as opposed to what she has stated elsewhere to be "wrong"). In the orthodox Copenhagen interpretation the measurement problem is unsolvable, and it is necessary to advance beyond it. The conclusion is that, to date, there isn't a theory that solves the measurement problem satisfactorily, but there is some work in progress. For a better insight into this, a good reference for the general public, with references to the technical literature, is Lee Smolin's "Einstein's Unfinished Revolution: The Search for What Lies Beyond the Quantum." While no previous knowledge of quantum mechanics is necessary to read the book, the technical literature is for specialists.
There is one question remaining about decoherence that I would like to understand. While it explains the absence of interference, and entanglement explains the single outcome, there is a third phenomenon you could call "probableness" or "typicality". Everettians are often criticised on the grounds that it doesn't make sense that everything possible does happen, while they shrug it off with "why not". Now let's look at decoherence more closely. If you take very small contributions to the wave function and very improbable outcomes into account, you can't gloss over some things anymore. The nature of decoherence is statistical (rare interference effects are still allowed), and the averaging represents a physical process (entanglement). What I wonder is: could this explain the "typicality" of observations even in many-worlds? Might "improbable worlds" actually be very short-lived, constantly created and eradicated by sensitivity to interference? Particularly if the total number of basis states is actually finite (which I believe is an open question), fast scrambling over a long time could cause all states to be populated (more precisely, start with a basis state, *fix the basis*, and evolve the system over a long time, and the resulting state will have mostly nonzero coordinates), so there is always a small amount of interference, like a background noise. That noise would only allow reasonably probable/typical worlds to look classical and would roast the others to a quantum soup.
This is not how decoherence was explained to me by Everettians. All the steps are roughly the same, but at the end, the detector and/or you also have a phase, and that's what selects the concrete outcome: whichever basis vector is coherent with you. My issue with this is that there are only 360 degrees of phase, so if each interaction is going to shift the phase in some way, you ought to be as likely to rejoin a disparate branch as you are to split into a new one. This is a very important question to me right now, as I'm trying to decide if I buy the many-worlds interpretation. I've always felt that the measurement problem is a 100% indicator of something being terribly wrong, so thanks for diving into it instead of just hiding behind Copenhagen.
Matt O'Dowd over at Spacetime once described phase as 'the angle of a wave's centre of mass off the horizontal axis, corresponding to the wave of a given frequency'. But this lesson today brings alive that previously gleefully dry proposition.
What a great and concise explanation of decoherence issues. The 'measurement problem' may require decades, if not centuries, of further research. I am somewhat under the impression that events at the scale of the Planck length may play a decisive role, requiring some substantial extension of current quantum mechanics. The intricacies of the measurement problem may be in a similar ballpark to issues of particle creation and annihilation. These events are currently handled via creation and annihilation operators, yet what 'really happens' when particles are created or annihilated is way beyond the range of current physics...
Amazing video! As I see it, the problem of decoherence is a problem for all fundamentally classical systems. In nature there are no "fundamentally" quantum systems. All objects that have ever existed or will exist in the future are fundamentally classical and obey the laws of physics. Thus if we say that something is happening because it is subject to decoherence, then something else must be happening because it's not subject to decoherence; namely, our universe itself. The question is whether any process that we might think of as decoherence can happen to the universe itself. I give you two examples with opposite answers. One answer is that the universe cannot experience decoherence. It must be fundamentally classical, and thus it will never be subject to any laws other than those of physics. If this is the case, then there are only two ways for something to happen in our universe: either because some fundamental process results in it happening, or because a sentient being causes it. If the first case is true, then things must happen because they are part of a fundamental process. If there were no sentient beings or any other classical objects to decohere something (which is impossible), then nothing would ever happen in our universe. If the second case is true, then there is a third way for things to happen. Sentient beings must not be fundamentally classical systems. I don't mean this in the sense that they are made of different stuff than atoms, but rather that their internal workings obey a different set of laws from those which describe fundamental physics. If things happen because sentient beings cause them to happen, then the universe is not a fundamentally classical system. Thus any process that we might imagine happens subjectively must also objectively be true. I am curious about your take. Thanks!
That's cool, but it doesn't solve the problem. Decoherence makes for longer recurrence timescales, but it can't produce irreversibility, which is the fundamental requirement for a measurement.
Statistics erases phase information. The density matrix is an attempt to re-establish phase after statistical wave functions have already erased it. Then decoherence is used to erase phase a second time. The moment one chooses a PDF (probability density function) to describe a solution to the wave equation, phase is lost. The fix? Choose a better basis function. Follow Maxwell's lead and select measurable quantities (E and H) as basis functions; thus, phase is preserved. How does a series of single photons form a diffraction pattern? It's all about the phase. In fact, the photon's E and H fields wrap around both slits and, depending on phase, exit according to the diffraction pattern.
These videos continue to fascinate me and, obviously, many others. At 8:43 you say "but the terminology is not the interesting bit". To me the terminology is almost always central, because it links current knowledge to a previous historic era, even if it's dumb or opaque. For example, the use of the term "random" has seen many ups and downs; there have been periods in which no one believed in it at all. Laplace thought that we had only our own ignorance to blame when we did not know something. Thank you once again for these short videos. They are quite wonderful.
Her image is flipped at various points in the video. Not sure why this is. Look at her microphone position and the curl in her hair. Not a highbrow comment, but sometimes the details are important. At least I find these little things interesting. I must say, I love watching these videos. Even though much of it is beyond my knowledge, I really do enjoy the insights.
Wow what a good explanation! I'm not a physicist so this is something I actually wondered about -- whether the measurement problem was a real thing or just scientists still stuck using bad metaphors to communicate quantum mechanics, and it was actually just solved by decoherence. Now I understand clearly that I don't understand just as much as everyone else doesn't understand :D
Immensely appreciate your videos. I have a suggestion. I find reading captions along with listening to your narration makes my learning experience more efficient and satisfying... stickier, I suppose. However, when equations are presented at the bottom of the screen, the equations and captions mask each other and I can't make sense of either (a type of superposition?). I'm thinking that if you stood off center (say, to the left when viewing the video) and edited in the equations in the open space from the top down to about your waist or elbow when your arms are by your side, this collision could be avoided in most instances. Kind of a picky, embarrassing request, but perhaps others feel similarly -- john
This is awesome. Thanks for these videos. Just a small nitpick: I think the complex phase angle is measured from the +ve real axis and in counterclockwise direction (there is an animation that shows it from the complex axis and clockwise direction).
This is amazing. Thank you! A question: if you put an imaginary term in the main diagonal (ie e^itheta for term 1) what happens then? What does this actually mean for the probability density of your system (eg, spin up/down)?
It doesn't make a difference, because the random kicks really only change the relative phase. You can therefore just choose to put the phase entirely into the second factor. You may have read this somewhere as the statement that the global phase of the wave function is unobservable: it's the same statement.
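The global-versus-relative-phase point can be checked in a few lines of numpy (an illustration I added, not from the video): multiplying the whole state by e^(i*theta) leaves the density matrix, and hence every prediction, unchanged, while attaching the phase to only one component changes the off-diagonal elements.

```python
import numpy as np

theta = 0.7  # an arbitrary phase angle
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Global phase: multiply the entire state by e^{i theta}.
# The ket-bra product cancels the phase against its conjugate.
psi_global = np.exp(1j * theta) * psi
rho_global = np.outer(psi_global, psi_global.conj())
print(np.allclose(rho, rho_global))  # True: global phase is unobservable

# Relative phase: attach e^{i theta} to only one component.
# Now the off-diagonal elements of the density matrix change.
psi_rel = np.array([1.0, np.exp(1j * theta)]) / np.sqrt(2)
rho_rel = np.outer(psi_rel, psi_rel.conj())
print(np.allclose(rho, rho_rel))  # False: relative phase is physical
```

This is the same cancellation that happens symbolically when the e^{i theta} factor meets its complex conjugate in |psi><psi|.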
@@SabineHossenfelder Ahhh. That makes sense that's it's all relative rather than absolute. Thanks so much for explaining that point. I wish your videos had existed when I was but a wee physics student. I'd probably still be doing physics rather than AI for my job now :)
It partially solves the problem. The other part is remembering that observers are physical too and the propagation of the decohered state through the environment includes the observer itself.
@@unclebirdman I'm only half joking. It seems admitting that your consciousness actually splits into 2 versions every time you observe a quantum effect, without you ever being able to sense it, is hard to grasp, judging by the relatively small number of physicists who support the many-worlds interpretation (17%, I think, according to one poll Sabine refers to in another video). Another difficulty that's often mentioned is the fact that the splitting is not conceptually different whether the probability is 50/50 or 10/90. This can be solved if you admit that the particles and the observers decohere into a potentially infinite number of states, splitting 10/90 in the second case. If you consider this, the actual scale of parallel universes generated every second in the universe is mind-boggling, and this maybe scares away some physicists. One last thought: this problem of the mind-boggling expansion of the universe through interactions between quantum systems, with world splitting when decoherence happens, seems to me akin to the behaviour of quantum computers, where, every time you add a qubit, you bring a new dimension to your computing space.
I guess it depends on how you are defining the measurement problem. If you're speaking about it in terms of wave function collapse then decoherence does "solve it" because wave function collapse is not required to tell you the particle will end up in a definite state. Although it does not tell you why you got one state instead of another so I guess you could say it just kicks the can down the road.
Or you got both, but the 2 versions of you that each got one of the 2 different states are decoherent; they cannot interfere with each other, so you are not aware that your wave function taken as a whole did get both states. In the case of Sabine, she is also incapable of admitting that she could be only a small part of the wave function of all the Sabines that are experiencing the whole density matrix; the one making the video is only experiencing one result, so she says there is still a measurement problem... maybe it's an ego problem :)
Many thanks, Sabine! If only my grad school instructors had been a tenth as clear. Really, as a huge benefit to Physics Personkind, please write up all your quantum presentations in book form. I look forward to it on my shelf beside 'Lost in Math.' Suggested tentative title: 'Found in Math'! Just like John Milton with 'Paradise Lost', then 'Paradise Regained.' Like Hollywood, I love a happy ending!
At least you're lucid enough to understand that this topic is much more complicated than the false feeling of understanding that this video gives. It's a shame Sabine doesn't make this point much clearer in the video.
The single outcome is mandated by entanglement, which follows from the basic interaction model of the theory in a much more obvious way than decoherence. I guess for a physicist, it's common to omit mentioning something they have learned to always assume. Likewise, I never mention to my colleagues that by "integer" I actually mean a limited-size number represented as a sequence of bits with a certain predetermined length, not the mathematical concept of an integer. It is good to remind yourself and others of the assumptions you make, because challenging them is sometimes the only path to progress. Non-Euclidean geometry, as probably many readers know, had its origin with some mathematicians who challenged Euclid's unique-parallel axiom. Now we have elliptic geometry (no parallels) and hyperbolic geometry (multiple parallels) as major tools in theoretical physics.
Sometimes I think the term “superposition” is an overly scientifikky way to say “I don't know”. As an afterthought, maybe it would be better if we substitute a black and white ball for Schroedinger's cat analogy. Sometimes we open the box and the black side of the ball is facing up, and other times the white part is facing up. I really have a lot of confidence in people working on quantum computers. There are some very brilliant and sharp people working on those computers with top notch equipment. We live in an age where it should not take a hundred years to get a quantum computer to work. Not even 10 years. I don't know much about qubits, but I suspect two qubits might function properly about twenty-five percent of the time. But I don't know. It may still take us a little more time to determine that maybe qubits are mythical, but I don't think we should give up. Perhaps our current vision of a quantum computer is just a little inaccurate.
I believe this is down to a fundamental terminological problem. In English we refer to Heisenberg's "uncertainty" principle, but I believe the original German has it as the "indeterminacy" principle. This marks the distinction between hidden variables (apparent "quantumness" comes from us not having all the information) and genuine indeterminacy (the statistics are actually incompatible with hidden variables). This distinction is very important because Bell showed that if you assume locality (no FTL influence between quantum states), hidden variables are NOT compatible with quantum statistics. See ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-XL9wWeEmQvo.html for more on this. Despite all this, though, I'm vaguely aligned with you. My hunch is that the world's "quantumness" will somehow turn out to be only skin deep (localised in the combination of spacetime and complexity), and that's the main reason for my interest in quantum computers. I believe that somewhere in the evolution from toy prototypes to a machine capable of attacking real problems (like breaking decent-size RSA keys), some problem will show up that makes true quantum computing unworkable. That would immediately tell us something earth-shatteringly important about what's wrong with our mainstream physics theories and drive the field forward in a huge way. Your comment about qubits highlights the least-well-understood aspect of quantum computing: it's all very well to suppose that you can make the wavefunction explore all parameter-value permutations simultaneously, but that's nothing more than an interpretation unless you can somehow make the measurement process return an answer "out of proportion". This is what you seemed to be alluding to with your 25% comment, and in the case of Shor's algorithm it seems to involve the quantum Fourier transform. I'm therefore guessing that the "problem" with QC will turn up as either an inability to scale up the QFT or an inability to get the desired measurement behaviour.
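To make the Bell point above concrete, here is a small sketch (my own, not from the comment) checking the standard CHSH numbers: for a singlet pair the quantum correlation at analyzer angles a and b is E(a, b) = -cos(a - b), and with the usual angle choices the CHSH combination reaches 2*sqrt(2), above the limit of 2 that any local hidden-variable model must obey.

```python
# Numerical check of the CHSH violation for a spin singlet.
# E(a, b) = -cos(a - b) is the textbook quantum prediction;
# local hidden variables force |S| <= 2.
import math

def E(a, b):
    """Quantum correlation for spin measurements at angles a, b on a singlet."""
    return -math.cos(a - b)

# Standard angle choices that maximize the violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # 2*sqrt(2) ~ 2.828, i.e. above the classical bound of 2
```

The interesting part is that no assignment of predetermined local outcomes can reproduce this value, which is exactly the incompatibility Bell proved.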
So a diagonal density matrix is to a coherent state what a macroscopic statistical distribution of molecular states (e.g. Boltzmann, Maxwell) is to the actual underlying microstates: much less information due to averaging. In this analogy, the phase information of a coherent state would be like coordinated motion of some of the molecules that gets lost gradually. The solution to the measurement problem would be an explanation of why this coordinated motion is so fragile, and why the probability distribution in the limit is so robust, just like Gauss's bell curve pops up reliably under certain conditions. Somehow, measuring spin in a given direction macroscopically, in an interaction with the many particles of the apparatus, shapes the distribution of possible macroscopic outcomes into "only 1 or 2", regardless of the coherent state of the original microscopic particle. It is like all coordinated molecular motion gets lost in a bath of Brownian motion, and only the energy contribution of that motion survives in the macroscopic temperature.
@@skebess Have you watched the video? Sabine was talking about a survey among physicists working in *various* fields. That obviously covers me. By the way, I did take the time to read several books on the problems of quantum mechanics (also the book Sabine wrote on more general problems), unlike probably 95% of my peers.
@@SabineHossenfelder Thanks. It helps me a lot. By the way, my intuition as a student in the 1980s was that interaction of a quantum system with the rest of the Universe must explain a lot of the measurement problem, years before I first learned it is called decoherence. It is always nice to learn I have good physical intuition.
@@arctic_haze What I'm saying is, it shouldn't be surprising to anyone if most physicists get this wrong, because most physicists don't work in this field (just like most MDs aren't surgeons). Your first comment suggested the opposite.
I have a rather silly question to be honest: Suppose we have a double slit experiment setup where we shoot one high-frequency photon through at a time. To my understanding (which could be completely wrong) the 'measurement problem' in this setup is that though the photon is a wave and interferes with itself, it still appears as only a single point on the detector screen. Now I don't know exactly how the screen works, but I assume the photon would excite one of the electrons of one of the atoms, and the photon would be registered as being detected at the location of that atom. But if I understand correctly, then it should make perfect sense that the photon is only detected at one place, because a single photon can only excite one electron at a time. We wouldn't be able to see it spread out over the screen because a photon cannot partially excite many electrons; it is the lowest denomination of energy transfer and thus is either entirely absorbed by one electron or it passes through. Also, I assume that it is more likely for an electron to be excited wherever the electromagnetic field is stronger, i.e. at the crests of the interfered photon wave, producing the overall interference pattern. Sorry for rambling, but if someone could tell me what's wrong with what I said (because it has to be wrong), it would be great.
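The picture described above, single-point detections whose probability tracks the field intensity, can be simulated in a few lines. This is my own toy model with made-up numbers (an idealized cos^2 two-slit intensity on an arbitrary screen grid), just to show how one-hit-at-a-time detections build up the fringe pattern statistically.

```python
# Toy double-slit: each "photon" lands at exactly one screen position,
# sampled with probability proportional to the interference intensity.
import math
import random

random.seed(0)

positions = [i / 10 for i in range(-50, 51)]             # screen coordinates
intensity = [math.cos(2.0 * x) ** 2 for x in positions]  # idealized fringes

# Fire photons one at a time; each is detected at a single position.
hits = random.choices(positions, weights=intensity, k=10000)

counts = {x: 0 for x in positions}
for h in hits:
    counts[h] += 1

# Bright fringe (intensity ~ 1) vs near-dark fringe (intensity ~ 0):
print(counts[0.0], counts[0.8])
```

Over many runs the histogram of single detections reproduces the fringes, which matches the questioner's intuition; the part that remains puzzling is why each individual photon ends up at one particular position rather than another.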
Nothing wrong with your explanation. Measurement is only a 'problem' because some physicists don't like the fact that, from the many choices available to the photon (the crests of the wave), the universe chooses only one. But this is what happens, so this should be the end of it: the photon was emitted and then it is absorbed. When there are equiprobable absorbers, the universe randomly chooses one of them. The deterministic evolution of the wave function according to the Schrödinger equation comes to an end and we get an indeterminate 'collapse' to a single outcome. The wave function and any superpositions disappear because they are waves of possibility, and the possibility of the photon being elsewhere after it's absorbed is zero.
The double slit experiment is a classic gotcha. When a particle hits the detection screen, the uncertainty is resolved. Any attempt to measure or determine the path of the particle before it hits the detection screen counts as a detection, and any uncertainty is resolved at that instant. Hence you either get the classic interference bar pattern, or a circle of decreasing brightness towards the edges if the photons are disturbed by any attempt at measurement during flight from the source to the detector screen. The classic interference pattern arises even when single photons are fired through the slits, because each photon's wave passes through both slits and interferes with itself. The slits need to be of the order of the wavelength of light for this to work. The photons are influenced by the slits as they pass through, and it is this influence that creates the interference pattern that builds up over time. If any attempt is made to identify which slit photons are passing through, this also influences the photons and you just end up with a roughly circular pattern of intensity, bright in the middle and darker towards the edges, i.e. roughly a classic beam of light. This is (very approximately) a way in which we can determine whether a given quantum signal has been intercepted prior to being received, i.e. is someone spying on our quantum communications?
A classical particle is observed or measured as something separate from the wave function (decoherence); conscious awareness is of the observed or measured particle, not of the wave function. Conscious awareness may require a particle measured by an observer, rather than a quantum wave function.
Thank you Sabine for your videos, I always enjoy seeing you clear up physical ideas. In this video, however, there is something I don't quite understand about your argument. You asserted: "Decoherence only partially solves the measurement problem. It tells you why we normally do not observe quantum effects for large objects. It does not tell you how a particle ends up in one and only one possible measurement outcome." Is this really a problem? As I understand it, the framework of decoherence actually avoids the collapse of the wave function altogether: there is no collapse. That should mean that an isolated system never shows collapses, but a subsystem of it can experience a classical reality through just the decoherence mechanism you explained. Isn't decoherence just an approximation to the fact that we can't follow the vast number of interactions between the observed system and the environment? The density matrix formalism is then simply handy, because it can handle both pure and mixed states, and therefore lends itself to representing this loss of information induced by the interaction with the environment.
Or in other words, decoherence solves the measurement problem by exhibiting a mechanism that explains the emergence of one random state out of the possible ones during measurement with a "classical apparatus" (i.e., a measurement system so big compared to the measured system that it acts as an environment), without ever speaking of collapses.
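The point above, that the density matrix handles pure and mixed states in one formalism, can be checked with a small sketch (my own illustration): the purity Tr(rho^2) equals 1 for a pure state and drops below 1 once decoherence has wiped out the off-diagonal terms.

```python
# Purity Tr(rho^2) distinguishes pure states (1) from mixed states (< 1).
def trace_of_square(rho):
    """Tr(rho @ rho) for a small real/complex matrix given as nested lists."""
    n = len(rho)
    return sum(rho[i][j] * rho[j][i] for i in range(n) for j in range(n)).real

# Pure equal superposition (|0>+|1>)/sqrt(2): all entries 0.5.
pure = [[0.5, 0.5],
        [0.5, 0.5]]

# Fully decohered "classical" mixture: same diagonal, zero off-diagonals.
mixed = [[0.5, 0.0],
         [0.0, 0.5]]

print(trace_of_square(pure))   # 1.0  (pure state)
print(trace_of_square(mixed))  # 0.5  (maximally mixed qubit)
```

The loss of information the commenter describes is precisely this drop in purity: both matrices predict the same 50/50 measurement statistics, but only the first still carries the phase relation.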