Post nice limits.

>>
e
>>
These are my favourites

[eqn]\lim_{n\rightarrow \infty} 0 = 0[/eqn]
[eqn]\lim_{n\rightarrow \infty} m= m[/eqn]

[eqn]\lim_{n\rightarrow \infty} \pi = \pi[/eqn]
[eqn]\lim_{n\rightarrow \infty} e = e [/eqn]
>>
[math]\int x^{-1}{\mathrm d}x = \log(x)[/math]

and for non-zero z

[math]\int x^{z-1}{\mathrm d}x=\dfrac{x^{z}}{z}[/math]

and so

[math]\lim_{z\to 0}\left(\int x^{-1}x^{z}{\mathrm d}x\right)[/math]

isn't the logarithm - it actually doesn't even exist.

But we have

[math]x^z = {\mathrm e}^{z\,\ln(x)} = 1 + z\,\ln(x) + {\mathcal O}(z^2)[/math]

which we can write as

[math]\dfrac{x^z - 1}{z} = \ln(x) + {\mathcal O}(z)[/math]

which we can write as

[math]\int x^{-1+z}{\mathrm d}x - \dfrac{1}{z} = \int x^{-1} {\mathrm d}x + {\mathcal O}(z)[/math]

or

[math]\lim_{z\to 0}\left(\int x^{-1}x^{z}{\mathrm d}x - \dfrac{1}{z}\right) = \int x^{-1}x^0 {\mathrm d}x = \ln(x)[/math]

I.e. the discrepancy from switching the limit with the integral is exactly accounted for by a simple counterterm.

Similarly

[math]\lim_{z\to 0} \left( \sum_{n=1}^\infty n^1 (1+z)^n - (-1)^{1+1} \dfrac{1!}{\log(1+z)^{1+1}} \right) = -\dfrac{1}{1+1} \dfrac{1}{6}[/math]

[math]\lim_{z\to 0} \left( \sum_{n=1}^\infty n^3 (1+z)^n - (-1)^{3+1} \dfrac{3!}{\log(1+z)^{3+1}} \right) = -\dfrac{1}{3+1} \left(-\dfrac{1}{30}\right)[/math]

and so on...
>>
in general
[math] \lim_{z\to 0} \left(\sum_{n=1}^\infty n^m (1+z)^n - (-1)^{m+1} \dfrac{m!}{\log(1+z)^{m+1}} \right) = -\dfrac{1}{m+1} B_{m+1} [/math]
with B_k the Bernoulli numbers
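
Quick numerical check in Python, if anyone wants to see the counterterm at work (just a sketch: the choice z = -0.01, the cutoff N and the hard-coded B_2 = 1/6, B_4 = -1/30 are mine):

import math

def regularized_sum(m, z, N=10_000):
    # Sum_{n=1}^N n^m (1+z)^n minus the divergent counterterm
    # (-1)^(m+1) m! / log(1+z)^(m+1); take z slightly negative so |1+z| < 1.
    q = 1.0 + z
    s = sum(n**m * q**n for n in range(1, N + 1))
    counterterm = (-1)**(m + 1) * math.factorial(m) / math.log(q)**(m + 1)
    return s - counterterm

for m, B in [(1, 1/6), (3, -1/30)]:
    print(m, regularized_sum(m, z=-0.01), -B / (m + 1))
# m=1 gives roughly -1/12, m=3 gives roughly 1/120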
>>
[eqn]\lim_{n\to\infty} f\bigg(x_{n} - \frac{f\big(x_{n-1} - \frac{f(x_{n-2} - \frac{f(...)}{f'(...)})}{f'(x_{n-2} - \frac{f(...)}{f'(...)})}\big)}{f'\big(x_{n-1} - \frac{f(x_{n-2} - \frac{f(...)}{f'(...)})}{f'(x_{n-2} - \frac{f(...)}{f'(...)})}\big)}\bigg)=0[/eqn]
>>
How nice that n divided by the n-th root of its factorial approaches e. A satisfying limit to look at.
It might go unnoticed that (n!)^(1/n) is actually a geometric mean (of 1, ..., n).
>>
>>8905200

[math] m(n):=(n!)^{1/n} [/math]

[math] \lim_{n\to \infty}\frac{n}{m(n)}={\rm e} [/math]

Plot[n/(n!)^(1/n), {n, -2, 5}]

neat
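
Same thing as a quick Python check (just a sketch, the values of n are arbitrary):

import math

def ratio(n):
    # n divided by the geometric mean of 1..n, i.e. n / (n!)^(1/n),
    # computed with logarithms to avoid overflowing the factorial
    log_gmean = sum(math.log(k) for k in range(1, n + 1)) / n
    return n / math.exp(log_gmean)

for n in (10, 100, 1000, 10000):
    print(n, ratio(n))   # creeps up towards e = 2.71828...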
>>
>>8904287
Woah, that's crazy how constants like [math]e[/math] and [math]\pi[/math] just randomly pop up in math like that. This is one of the reasons why I'm so interested in STEM. I hope numberphile does a video on the equations you posted.
>>
>>8906085
>numberphile
>>
>>8904025
(sorry for being a brainlet in advance, i know i don't know much)

I'm having trouble proving that limit. My first idea was to use l'hopital's, but to do that I need a function that I can take the derivative of, and I can't figure out how to describe the square root stuff as a function. Is this even the right approach?
>>
Literally how?
[eqn] \lim_{n\to\infty}\left( \left(\sum_{k=1}^{n} \frac{1}{k} \right) - \log(n) \right) = 1 - \int_{1}^{\infty} \frac{ t - \lfloor{t}\rfloor }{t^2}dt[/eqn]
>>
>>8906098
You definitely, absolutely, cannot use l'Hopital's rule here.
>>
>>8906121
how do i become unretarded
>>
>>8904340
but what about -1/z going to minus infinity?
>>
>>8906156
Probably study sequences. The first thing you should learn is that most sequences need to be tackled case by case by studying their intrinsic properties. A good analysis textbook will have a couple of nice examples of various proofs for really specific formulas.

After that you should research Viete's formula, as the formula in the OP is simply a reformulation of it where you have to use a lot of trig identities. I tried to look for a proof to link but all of them are locked behind a paywall in journals.
>>
>>8906182
At what step? And why are you asking? Note, in any case, that the Taylor expansion I use is around z=0.

>>8906107
[math] \int_{n-1}^m f(\lfloor{t}\rfloor) \,dt = \sum_{k=n}^m f(k) [/math]
and the log is defined as the int over 1/t.

>>8906098
may be a re-writing of Vieta's formula
https://proofwiki.org/wiki/Vieta's_Formula_for_Pi

It's funky, though. The expression with the positive 2's can be read as iteration of
[math] x \mapsto \sqrt{ 2+ x } [/math]
applied to [math] x_0=0 [/math]:
[math] x_0 \mapsto I(1) =\sqrt{ 2+x_0} [/math]
[math] \sqrt{ 2+x_0} \mapsto I(2) = \sqrt{ 2+ \sqrt{ 2+x_0} } [/math]
[math] \sqrt{ 2+ \sqrt{ 2+x_0} } \mapsto I(3) = \sqrt{ 2+ \sqrt{ 2+ \sqrt{ 2+x_0} } } [/math]
...
and after k applications you might wanna call the result [math]I(k)[/math].
Fixed points often swallow the initial condition. Here the limit corresponds to the solution of
[math] x = \sqrt{ 2+ x } [/math]
which gives [math]I(\infty)=2[/math].

OP's claim then is
[math] I(\infty) - I(k) \sim \left( \dfrac{\pi}{2^{k+1}} \right)^2 [/math]
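
A small Python sketch of that convergence rate (the comparison column (pi/2^(k+1))^2 is just the claim above, the cutoff k <= 12 is arbitrary):

import math

x = 0.0                            # I(0) = x_0 = 0
for k in range(1, 13):
    x = math.sqrt(2.0 + x)         # I(k) = sqrt(2 + I(k-1))
    gap = 2.0 - x                  # distance to the fixed point I(inf) = 2
    print(k, gap, (math.pi / 2**(k + 1))**2)
# the two columns agree better and better as k grows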

>>8904025
Here are some more methods to compute limits, via Cesaro and Abel. Those are methods saying
>If the limit exists, then it's Y.

So if a sequence [math](x_n)[/math] converges, then [math]|x_k - x_{k-1}|[/math] goes to zero as k grows, i.e. the approximation [math]x_k \approx x_{k-1}[/math] gets more accurate. This implies [math]x_k\approx\tfrac{1}{2}(x_{k-1}+x_k)[/math] gets more accurate with k, and so does [math]x_k\approx\tfrac{1}{3}(x_{k-2}+x_{k-1}+x_k)[/math]. In fact

[math]x_k \approx \dfrac{1}{d_k}\sum_{j=k-(d_k-1)}^kx_j[/math]

gets better with growing [math]k[/math] for any [math]1\le d_k\le k[/math]:
on the right hand side you average more and more nearly equal terms, so the limits [math]\lim_{k\to\infty}[/math] of both sides are the same.
>>
>>8906448
Adding to the summation methods... the right hand side may exist even if the left hand side doesn't. For example, [math]x_k=\frac{1}{2}(1+(-1)^k)[/math], giving the sequence [math]0,1,0,1,0,1,0,\dots[/math], has no limit, but

[math] \sum_{j=1}^{m}1=m[/math]

so that

[math]\dfrac{1}{2m}\sum_{j=1}^{2m}x_j=\dfrac{1}{2m}\sum_{j=1}^{m}(0+1)=\dfrac{1}{2}[/math]

and

[math]\dfrac{1}{2m+1}\sum_{j=1}^{2m+1}x_j=\dfrac{1}{2m+1}\left(\sum_{j=1}^{m}(0+1)+0\right)=\dfrac{m}{2m+1}[/math]

and so

[math]\lim_{k\to\infty}^\text{Cesaro}x_k = \lim_{k\to\infty}\dfrac{1}{k}\sum_{j=1}^{k}x_j = \dfrac{1}{2}[/math]

The point is that if the standard [math]\lim[/math] exists, then this Cesaro limit agrees AND it also has a finite result for some sequences for which the standard [math]\lim[/math] does not exist.

A similar argument, using differences (a telescoping sum) instead of an arithmetic mean, starts from

[math]\lim_{k\to\infty}x_k = x_a + (x_{a+1}-x_{a}) + (x_{a+2}-x_{a+1}) + (x_{a+3}-x_{a+2}) + \dots[/math]

and leads to

[math]\lim_{k\to\infty}^\text{Abel}x_k = \lim_{t\to 1^-}(1-t)\sum_{k=0}^\infty x_k t^k[/math]

And e.g. our above example

[math]x_k=\frac{1}{2}(1+(-1)^k)[/math]

leads to

[math](1-t)\sum_{k=0}^\infty x_k t^k = \dfrac{1}{1+t}[/math]

and thus the Abel limit [math]\dfrac{1}{2}[/math], as in the Cesaro case.
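
Both methods in a few lines of Python for the 0,1,0,1,... example (a sketch; the cutoffs and the values of t are arbitrary):

def x(k):
    # the sequence 0, 1, 0, 1, ... (no ordinary limit)
    return (1 + (-1)**k) / 2

# Cesaro: average of the first k terms
for k in (10, 100, 1000, 10001):
    print("Cesaro", k, sum(x(j) for j in range(1, k + 1)) / k)   # -> 1/2

# Abel: (1 - t) * sum_k x_k t^k as t -> 1 from below
for t in (0.9, 0.99, 0.999):
    s = (1 - t) * sum(x(k) * t**k for k in range(0, 20_000))
    print("Abel", t, s, 1 / (1 + t))                             # -> 1/2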
>>
>>8906448
(the floor equation is for f's that vary appropriately in an interval [k-1, k])
>>
>>8906448
>[math] \int_{n-1}^m f(\lfloor{t}\rfloor) \,dt = \sum_{k=n}^m f(k) [/math]

I'm the guy you are answering to there. I am very sorry but could you elaborate on how I'd use that to sketch a proof of the identity I posted?

Just for background, I found that identity some weeks ago while reading Apostol's analytic number theory. It is presented without proof and used to find asymptotic formulas for the Riemann zeta function.
>>
>>8906470
The numerical value of the left hand side is the definition of the Euler–Mascheroni constant
https://en.wikipedia.org/wiki/Euler%E2%80%93Mascheroni_constant

[math] -\frac{t-f(t)}{t^2} = \frac{f(t)}{t^2} - \frac{1}{t}[/math]
where [math]f(t):=\lfloor t\rfloor[/math].

That the logarithm terms on both sides agree is clear from the very definition of the logarithm
[math] \log(k) := \int_{1}^k \frac{1}{t}\, dt [/math]

It remains to show the identity
[math] \int_{1}^\infty \left( \dfrac{t}{f(t)} - \dfrac{f(t)}{t} \right) \dfrac{dt}{t} = 1 [/math]
pic related
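
Rough numerical check in Python that both sides are the Euler-Mascheroni constant (a sketch; N is arbitrary, and the closed form for the integral over each interval [k, k+1] is my own shortcut):

import math

N = 10**5
lhs = sum(1.0 / k for k in range(1, N + 1)) - math.log(N)

# int_k^{k+1} (t - floor(t)) / t^2 dt = log(1 + 1/k) - 1/(k+1)
integral = sum(math.log(1 + 1/k) - 1/(k + 1) for k in range(1, N))
rhs = 1 - integral

print(lhs, rhs)   # both are close to gamma = 0.5772156649...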
>>
>>8906526
It's btw interesting to note that

[math] \lim_{n\to\infty} \left( \sum_{k=1}^n f(k) - \int_1^n f(k)\,dk \right) [/math]

is finite whenever f is decreasing.
So e.g. the expression for [math] f(n)=n^{-s} [/math] is well behaved also for s<1.
>>
[eqn]\lim_{n\to\infty}\sum_{k=0}^n k = \frac{-1}{12}[/eqn]
>>
>>8906583
deleter this
>>
>>8906583
I posted that one above, pic related.

Also related is

[math] \left(\dfrac{z}{\ln(1+z)}\right)^n=1+\dfrac{n}{2}z+(3n-5)\dfrac{n}{2}\dfrac{1}{12}z^2+\dfrac{n}{2}\dfrac{(n-2)(n-3)}{24}z^3+\dots [/math]

which you can use to generate a whole bunch of limits of the form

[math] c_n = \lim_{z\to 1} \left( \dfrac{p_n(z)}{q_n(z)} - \dfrac{1}{\ln(z)^n} \right) [/math]
>>
>>8904025
These are not limits, but still very nice.

[eqn]2^2+3^2+5^2+7^2+11^2+13^2+17^2=666[/eqn]
[eqn]\sum\limits_{n=1}^{666}2n\left(-1\right)^{n} = 666[/eqn]
[eqn]666\prod\limits_{p|666}\left(1-\frac{1}{p}\right)=6\times6\times6[/eqn]
[eqn]\phi = -2\sin\left(666^{\circ}\right)\quad \text{ Where }\phi \text{ is the golden ratio}[/eqn]
>>
I just worked out the ones marked in green in pic related, but maybe I'm just adding opaque information
>>
This one is comfy.
[eqn]\forall \theta \,\in\, \left]0,\, 2\,\pi\right[,\, \sum_{n \,=\, 1}^\infty \frac{\sin \left(n\,\theta\right)}{n} \,=\, \frac{\pi \,-\, \theta}{2}[/eqn]
>>
>>8907132
semi-related

[math] \int_{0}^\infty \dfrac{\sin(s\,x)}{x} \dfrac{\sin(t\,x)}{x}\,{\mathrm d}x = \dfrac{\pi}{4}(|s+t|-|s-t|) [/math]

also

[math] \sum_{n=a}^{b}\sin(2kn)=\dfrac{\sin (k (b-a+1)) \sin (k (a+b))} {\sin(k)} [/math]

Unrelated:

I conjecture

[math] \Gamma(s)\sum_{n=0}^{M-1}\, \dfrac{Z^n}{(n+A)^s} = \int_0^\infty x^{s-1}e^{-A\,x}\dfrac{1-(Z\,e^{-x})^M}{1-(Z\,e^{-x})}{\mathrm d}x [/math]
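
The conjecture at least survives a numerical test (Python sketch; it assumes scipy is available, and the parameter values s, A, Z, M are arbitrary picks):

import math
from scipy.integrate import quad

s, A, Z, M = 2.5, 1.3, 0.7, 5

lhs = math.gamma(s) * sum(Z**n / (n + A)**s for n in range(M))

def integrand(x):
    u = Z * math.exp(-x)
    return x**(s - 1) * math.exp(-A * x) * (1 - u**M) / (1 - u)

rhs, _err = quad(integrand, 0, math.inf)
print(lhs, rhs)   # the two numbers agree to quadrature precision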
>>
[eqn]\lim_{n\rightarrow\infty} \prod_{k=1} ^n \left(1+\frac{k}{n}\right)^n=\lim_{n\rightarrow\infty}\left(1-\frac{1}{12n}\right)^n[/eqn]
>>
>>8907276
Things at infinity are really weird
>>
>>8907276
>[math] -\frac{1}{12} [/math]

I call bullshit.
>>
>>8907276

memes aside, you got identities like

[math] \lim_{n\rightarrow\infty} \prod_{k=1} ^n \left(1+\frac{1}{2^{k}n}\right)^n=\lim_{n\rightarrow\infty}\left(1+\frac{1}{n}\right)^n [/math]
>>
>>8906156
You can only use l'hopital if you have a function divided by another function.

[eqn]\lim_{x\to anything} \frac{f(x)}{g(x)}[/eqn]

And it also must be in indeterminate form when you try to take the limits separately. So equal to 0/0 or [math]\frac{\infty}{\infty}[/math].
>>
>>8907329
Yea, I was thinking of trying to make f(x) = 2^x and define a clever function g(x) equal to the reciprocal of the square root thing, so I could apply l'Hopital's to f(x)/g(x), but I see now that was way off.
>>
>>8907099
this is some good stuff
>>
Not a limit, but proved using limits

[eqn] \frac{n}{2^{n-1}} = \sin\Big(\frac{\pi}{n}\Big)\sin\Big(\frac{2\pi}{n}\Big)\sin\Big(\frac{3\pi}{n}\Big)\cdots\sin\Big(\frac{(n-1)\pi}{n}\Big) [/eqn]
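
Quick Python check (a sketch, the values of n are arbitrary):

import math

for n in (3, 7, 12, 25):
    lhs = n / 2**(n - 1)
    rhs = math.prod(math.sin(k * math.pi / n) for k in range(1, n))
    print(n, lhs, rhs)   # the two columns agree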
>>
>>8908110
though stuff like

[math] \sum\limits_{n=1}^{m}2n\left(-1\right)^{n} = m [/math]

holds for every even number m anyway
>>
>>8908125
How would this be proved using limits? It looks more like induction to me but what would I know.
>>
>>8908337
yea that one isn't too special, I admit.
>>
[math]\displaystyle \lim_{x \rightarrow \infty} \frac{\pi(x) \cdot \log(x)}{x} = 1 [/math]
>>
>>8908768
what
>>
>>8908783
Prime number theorem motherfucker.

Complex analysis was invented for this shit
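
You can watch pi(x) log(x) / x crawl toward 1 with a basic sieve (Python sketch; the cutoff 10^6 is arbitrary and the convergence is famously slow):

import math

def primes_up_to(n):
    # simple sieve of Eratosthenes; sieve[k] is True iff k is prime
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(range(p * p, n + 1, p))
    return sieve

X = 10**6
is_prime = primes_up_to(X)
pi_x = 0
for n in range(2, X + 1):
    pi_x += is_prime[n]
    if n in (10**3, 10**4, 10**5, 10**6):
        print(n, pi_x * math.log(n) / n)   # tends, slowly, to 1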
>>
>>8908756
I didn't want to be an asshole...

Take those as an apology:

[math] \prod_{n=0}^{\infty}\left(1 + x^{2^n}\right) = \frac{1}{1-x}[/math]

[math] \prod_{m=1}^\infty \left( 1 - q^{2m}\right)\left( 1 + w^{2}q^{2m-1}\right)\left( 1 + w^{-2}q^{2m-1}\right)= \sum_{n=-\infty}^\infty w^{2n}q^{n^2} [/math]
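
The first one is just telescoping: multiplying by (1-x) collapses the partial products to 1 - x^(2^(N+1)), which tends to 1 for |x| < 1. Quick Python check (a sketch, x = 0.7 is an arbitrary choice):

x = 0.7
prod = 1.0
for n in range(12):
    prod *= 1 + x**(2**n)   # partial product up to n = 11
print(prod, 1 / (1 - x))    # both ~3.3333...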

>>8908788
Riemann's paper came completely out of the blue and really it's his only paper on number theory - and given he was an expert in complex integral transforms, the claim is questionable.

There is an English translation of the paper online btw., and to me it's the greatest of all
http://www.claymath.org/sites/default/files/ezeta.pdf
>>
>>8908833
Oh very kind of you
>>
>>8904580
For what sequence [math]\left(x_n\right)[/math]?
>>
>>8908910
If I had to guess, I think he posted Newton's method for finding zeroes of a function, which has some continuity requirements.
https://en.wikipedia.org/wiki/Newton%27s_method
Meaning the sequence consists of the iterates of the recursion, and the starting point is arbitrary.
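
Something like this minimal sketch (my own illustration, not anything that anon posted; f, its derivative and the starting point are example choices):

def newton(f, fprime, x0, steps=20):
    # x_{n+1} = x_n - f(x_n) / f'(x_n); under the usual smoothness and
    # nondegeneracy assumptions f(x_n) -> 0, which is the nested limit above
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root, root**2)   # ~1.41421356, ~2.0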

>>8908868
Here are some more

[math] \sin(z) = z\prod_{n=1}^\infty \left(1-\left(\frac{z}{n\,\pi}\right)^2\right) [/math]

it's awesome, I agree

[math] \dfrac{1}{\sin(z)} = \dfrac{1}{z} + 2z\sum_{n=1}^\infty (-1)^n \dfrac{1}{z^2 - (n\,\pi)^2} [/math]
>>
while we're at sums: I think in Set, for the pullback of [math]f:X\to Z[/math] and [math]g:Y\to Z[/math],

[math] |X\times_{f,g}Y|=\sum_{z\in{}Z}|f^{-1}(z)|\cdot{}|g^{-1}(z)| [/math]

implying any such sum

[math] \sum_{k\in{}K} n_k\cdot{}m_k [/math]

can be understood as the result of a cardinality computation
>>
Here are ones that are cool; the nice thing is that you can prove some of these with elementary mathematics (with a tiny bit of formalising help)

[math]\sum_{n=0}^{\infty} \frac{1}{(2n+1)\binom{2n}{n}} = \frac{2\pi}{3\sqrt{3}}[/math]

[math]\sum_{n=0}^{\infty} \frac{1}{\binom{2n}{n}} = \frac{2}{27}(18 + \pi \sqrt{3})[/math]

Many other similar results pop up, for a start, I'd say to have a look at the following integrals:

[math]\int_0^1 x^n(1-x)^n \ dx [/math] (this is the Beta function value B(n+1, n+1))

[math]\int_0^{\pi/2}\sin^{2n+1}x \ dx [/math]

[math]\int_0^{\pi/2}\sin^{2n}x \ dx [/math]
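
Both sums can at least be confirmed numerically before attacking the integrals (Python sketch; truncating at 60 terms is arbitrary, the terms die off fast):

import math

s1 = sum(1 / ((2*n + 1) * math.comb(2*n, n)) for n in range(60))
s2 = sum(1 / math.comb(2*n, n) for n in range(60))

print(s1, 2 * math.pi / (3 * math.sqrt(3)))          # ~1.2092
print(s2, (2 / 27) * (18 + math.pi * math.sqrt(3)))  # ~1.7364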
>>
Here is another that can be worked out by playing with geometric series and integrals:

[math]\sum_{n=1}^{\infty} \frac{\sin nx}{n} = \frac{\pi}{2} - \frac{x}{2} [/math]
>>
>>8910301
That was actually posted here
>>8907132

>>8910297
How did you prove em?

When I see something like sqrt(k) in a result, I'm immediately tempted to consider the summands as coefficients in a series expansion and check what the "origin function" is, pic related.