> all closed systems tend towards entropy
> every measurement you make destroys at least as much information as you gain (usually it destroys way more)
I don't even know what information means in physics, but I remember reading that it's conserved? is that wrong?
destruction of information creates at least as much information
>>37477888
>all closed systems tend towards entropy
That's not technically true. Entropy is only a manifestation of some underlying statistical process, so there is a small chance of entropy decreasing. In fact, on a fundamental level all things are just statistical, perhaps we've all been here before anon.
https://en.wikipedia.org/wiki/Poincar%C3%A9_recurrence_theorem
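a toy version of the recurrence idea, not the actual theorem: any invertible dynamics on a finite state space is periodic, so every state eventually comes back (pigeonhole). the state count and random permutation here are made up for illustration:

```python
import random

# Toy recurrence: an invertible map on a finite state space must
# eventually return to its starting state (pigeonhole argument).
random.seed(0)
states = list(range(10))
perm = states[:]
random.shuffle(perm)          # a random invertible dynamics: i -> perm[i]

start = 0
x = perm[start]
steps = 1
while x != start:             # iterate until the system recurs
    x = perm[x]
    steps += 1
print(f"returned to the initial state after {steps} steps")
```

real Poincaré recurrence is about volume-preserving flows in a bounded phase space, but the finite pigeonhole picture is the same flavor.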
>>37478417
The average amount of order will decrease.
Sometimes you get lucky, but rarely.
It's like going into a casino and gambling indefinitely until you go broke
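the casino line is just gambler's ruin: a finite bankroll making even-odds bets against a house that never runs out hits zero with probability 1 if you never stop. a minimal Monte Carlo sketch, where the bankroll, bet size, and step cap are all made-up numbers:

```python
import random

# Gambler's ruin sketch: start with a finite bankroll, bet 1 unit on a
# fair coin each round, play until broke (or until a step cap, since a
# simulation can't literally run forever).
random.seed(1)

def goes_broke(bankroll=20, max_steps=100_000):
    steps = 0
    while bankroll > 0 and steps < max_steps:
        bankroll += random.choice((1, -1))   # win or lose one unit
        steps += 1
    return bankroll == 0

trials = 200
ruined = sum(goes_broke() for _ in range(trials))
print(f"{ruined}/{trials} gamblers went broke")
```

the few survivors are only survivors because of the step cap; with no cap, ruin is certain for a fair (or unfavorable) game.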
>>37478613
I went to the casino and became a gorillionaire you stupid idiot
retard
>>37478613
No. It's all just statistics, there is a small (but non-zero) probability of entropy spontaneously decreasing.
>>37478791
It will decrease with very small probability
and it isn't a permanent decrease
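same point with coins instead of gas molecules (a made-up toy, not a real thermodynamic system): a "low entropy" fluctuation like all N coins landing heads has probability 2^-N, tiny but non-zero, and the next shuffles drift it back toward half-and-half.

```python
from math import comb

# "Entropy" toy: N fair coins, the macrostate is the number of heads.
# All-heads is a perfectly legal microstate, it just has probability 2**-N.
N = 10
p_all_heads = 0.5 ** N
print(f"P(all {N} heads) = {p_all_heads}")       # 1/1024, rare but non-zero

# Count microstates: exactly 1 of 2**N is all-heads, while the
# half-heads macrostate is backed by comb(N, N//2) microstates.
print(comb(N, N // 2), "microstates at 5 heads vs 1 at 10 heads")
```

scale N up to ~10^23 particles and 2^-N is why you never see it happen, even though nothing forbids it.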
>>37477888
All operators corresponding to observables commute you dumb wikipedia educated fuck
>>37479060
Don't know what that means but I'm pretty sure copying and pasting a law of thermodynamics is bulletproof
>>37479436
>law of thermodynamics
kek.
you are parroting a phrasing intended to make sense to a layman, not a true statement of the physical constraints.
you might as well "analyze" and draw conclusions from the computational theory of the mind. "oh hurr durr the brain is just an input output machine XDDD"
have you ever considered reading for longer than 20 minutes to "understand" something?
>>37479542
do you have any suggestions for a retarded layman to read to get more than a superficial understanding of physics, or at least to be less wrong?
>>37479542
Ok feynman, we get it, you are an undergrad, good for you nigga