For example, the position of electrons varies depending on the observer, but why not, say, a plane's? Why don't people disappear when they go to bed alone, or appear in a different location when they wake up and resume observing themselves?
>>8211663
>but why not, say, a plane?
Interference with whole molecules has been experimentally confirmed. Why not test it with planes, by flying them through a slit between two tall buildings?
>>8211670
>the realization
>>8211670
Mel
>>8211663
Depending on the size of your box, somewhere between [math]n = 1000[/math] and [math]n \to \infty[/math]
>At what point do quantum and Newtonian physics merge?
Assuming the current best theories, the world is quantum mechanical on every scale.
The question is when Newtonian mechanics becomes a good approximation, which is the same as asking when quantum mechanical effects like entanglement and tunneling become effectively unmeasurable.
It depends on the system.
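To make the "somewhere between n = 1000 and n → ∞" answer concrete, here is a minimal sketch for a particle in a 1D box, where [math]E_n = n^2 h^2 / (8 m L^2)[/math]. The relative spacing between adjacent energy levels is [math](E_{n+1} - E_n)/E_n = (2n+1)/n^2[/math], which shrinks toward zero at large [math]n[/math], so the spectrum looks effectively continuous (classical). The function name and the printed values of n are just illustrative choices, not anything from the thread.

```python
def relative_level_spacing(n: int) -> float:
    """Relative gap (E_{n+1} - E_n) / E_n for a particle in a 1D box.

    E_n = n^2 h^2 / (8 m L^2), so the ratio is (2n + 1) / n^2,
    independent of the box size L and the mass m.
    """
    return (2 * n + 1) / n**2

# At low quantum numbers the spectrum is obviously discrete;
# at high quantum numbers the gaps become unmeasurably small.
for n in [1, 10, 1000, 10**6]:
    print(n, relative_level_spacing(n))
```

The ratio drops from 3 at n = 1 to about 0.2% at n = 1000, which is one way to read the "depends on the size of your box" answer: how large n must be before the discreteness is unmeasurable depends on your experimental resolution.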
Uncertainty exists on a very small scale. When you observe very large things, you don't notice that uncertainty, so classical mechanics will describe your observations pretty well. If you go too large, though, relativistic phenomena become more noticeable, and relativity is the better description. There isn't a single point where quantum and classical mechanics merge. It is more precise to say that as you look at larger things, quantum effects become increasingly negligible.
>>8211663
I think Holly Holm knocked that bitch so hard some low-level quantum physics happened on her face that day
>>8211663
pro tip: virtually all of physics is educated guessing and patchwork models.
it's literally all approximation. the models we use depend entirely on the application and are more than likely empirical. all of thermodynamics is empirical for example, and any modeling of heat is done with an ass-pull truncated Taylor series expansion.
>>8212347
this
everything else itt is theological grasping.
>>8211663
When you're a big guy.
>>8211663
Look at [math]\sin(x)[/math] and [math]x - \frac{x^3}{6}[/math]. When do the two functions merge? You could look at the error: if you demand it stay below 1%, you end up with something like [math]|x| < 1[/math]. But you could just as well use any other threshold. For every point except 0 you can find a threshold smaller than the error, so arguably the two functions never merge and are never the same. Still, for many purposes the approximation is good enough.
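The 1% threshold above is easy to check numerically. A minimal sketch (the function names are mine, not from the thread):

```python
import math

def taylor_sin(x: float) -> float:
    """Third-order Taylor approximation of sin(x) around 0."""
    return x - x**3 / 6

def relative_error(x: float) -> float:
    """Relative error of the Taylor approximation at x (x != 0)."""
    return abs(taylor_sin(x) - math.sin(x)) / abs(math.sin(x))

# The error stays under 1% roughly for |x| < 1, but it is never
# exactly zero away from x = 0, so the functions never truly merge.
for x in [0.1, 0.5, 1.0, 2.0]:
    print(x, relative_error(x))
```

At [math]x = 1[/math] the relative error is just under 1%, at [math]x = 2[/math] it is already above 25%: "good enough" depends entirely on where you draw the threshold.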
It's the same with Newtonian mechanics and quantum mechanics: the smaller the scale of your system, the worse the Newtonian approximation gets. Exactly where the point is at which both merge is entirely up to your required level of precision. In the end you can always argue that they are never completely the same, even when the difference is on the order of [math]10^{-80}[/math].
Then again, even what is usually called quantum mechanics is just an approximation of quantum field theory, which in turn underlies the Standard Model. And we have found phenomena that do not fit in the Standard Model (e.g. neutrino oscillations). So the best theory we have is known to be incomplete: where does that leave us?
The question you ask is not really trivial in the end. It's not that interesting in this specific case, since we have multiple better theories, but determining exactly how wrong we are is basically what all of experimental particle physics does atm. It's particularly interesting with particle theory because the things we can measure are mostly statistical, so if you do the whole calculation carefully enough, you can say almost exactly how probable it is that we are right or wrong.