Meeting Mr. Bernoulli


The differential equation

{d \over dx} y + f(x) y = g(x) y^{\alpha}\quad\quad\quad(1)

where \alpha \neq 0, 1 and g(x) \not \equiv 0, is known as Bernoulli’s equation.

When \alpha > 0, (1) has the trivial solution y(x) \equiv 0.

To obtain a nontrivial solution, we divide each term of (1) by y^{\alpha} to get

\boxed{y^{-\alpha}{d \over dx}y} + f(x) y^{1-\alpha} = g(x)\quad\quad\quad(2)

Since  {d \over {dx}}({{1 \over {1-\alpha}}y^{1-\alpha}}) ={1 \over {1-\alpha}}\cdot (1-\alpha) y^{1-\alpha-1}{d \over dx}y=\boxed{y^{-\alpha}{d \over dx}y}

(2) can be expressed as

{d \over dx} ({{1 \over {1-\alpha}} y^{1-\alpha}}) + f(x) y^{1-\alpha} = g(x)

which is

{{1 \over {1-\alpha}} {d \over dx} y^{1-\alpha}} + f(x) y^{1-\alpha} = g(x) .

Multiplying by 1-\alpha throughout,

{d \over dx} y^{1-\alpha} + (1-\alpha) f(x) y^{1-\alpha} = (1-\alpha) g(x)\quad\quad\quad(3)

Letting z = y^{1-\alpha}, (3) is transformed into the first-order linear equation

{d \over dx} z + (1-\alpha) f(x) z = (1-\alpha) g(x),

giving the general solution of Bernoulli’s equation (see Fig. 1).

[Fig. 1: the general solution (screenshot)]
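In case Fig. 1 is hard to read, here is the general solution written out (the standard integrating-factor computation, in my own rendering): multiplying the linear equation for z by e^{(1-\alpha)\int f(x) dx} and substituting back z = y^{1-\alpha} gives

y^{1-\alpha} = e^{-(1-\alpha)\int f(x) dx}\left((1-\alpha)\int g(x)\, e^{(1-\alpha)\int f(x) dx}\, dx + C\right).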

For a concrete example of Bernoulli’s equation, see “What moves fast, will slow down”.
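If you want to experiment first, here is a minimal SymPy sketch of my own (not the author's Omega CAS session), using the concrete choices f(x) = 1, g(x) = 1, \alpha = 2:

```python
# A concrete Bernoulli equation, y' + y = y^2 (f = g = 1, alpha = 2).
# The substitution z = y^{1-alpha} = 1/y turns it into z' - z = -1.
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

ode = sp.Eq(y(x).diff(x) + y(x), y(x)**2)
sol = sp.dsolve(ode)             # SymPy recognizes the Bernoulli form
print(sol)                       # expect something equivalent to y(x) = 1/(C1*exp(x) + 1)
print(sp.checkodesol(ode, sol))  # (True, 0): the solution satisfies the ODE
```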


Pandora’s Box


Summations arise regularly in mathematical analysis. For example,

\sum\limits_{i=1}^{n}{1 \over {i (i+1)}} = {n \over {n+1}}

Having a simple closed form expression such as {n \over {n+1}} makes the summation easier to understand and evaluate.
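That particular closed form follows from a telescoping partial-fraction decomposition (a standard derivation, written out here for completeness):

{1 \over {i(i+1)}} = {1 \over i} - {1 \over {i+1}} \implies \sum\limits_{i=1}^{n}{1 \over {i(i+1)}} = \left(1 - {1 \over 2}\right) + \left({1 \over 2} - {1 \over 3}\right) + ... + \left({1 \over n} - {1 \over {n+1}}\right) = 1 - {1 \over {n+1}} = {n \over {n+1}}.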

The summation we focus on in this post is

\sum\limits_{i=1}^{n}i 2^i\quad\quad\quad(1)

We will find a closed form for it.

In a recent post, I derived the closed form of a simpler summation (see “Beer theorems and their proofs”). Namely,

\sum\limits_{i=0}^{n}x^i={{x^{n+1}-1} \over {x-1}}\quad\quad\quad(2)

From (2) it follows that

{d \over {dx}}{\sum\limits_{i=0}^{n}x^i} = {d \over {dx}}({ {x^{n+1}-1} \over {x-1} })

which gives us

{\sum\limits_{i=0}^{n}{{d \over dx}x^i}}={{(n+1)x^{n}(x-1)-(x^{n+1}-1)} \over {(x-1)^2}}.

Or,

{\sum\limits_{i=0}^{n}{i x^{i-1}}}={{\sum\limits_{i=0}^{n}{i x^{i}}} \over {x}} = {{\sum\limits_{i=1}^{n}{i x^{i}}} \over {x}} = {{(n+1)x^{n}(x-1)-(x^{n+1}-1)} \over {(x-1)^2}}.

Therefore,

{\sum\limits_{i=1}^{n}{i x^{i}}}={{(n+1)x^{n+1}(x-1)-x^{n+2}+x} \over {(x-1)^2}}.

Letting x=2, we arrive at (1)’s closed form:

{\sum\limits_{i=1}^{n}i 2^i} = {{(n+1)2^{n+1}(2-1) - 2^{n+2} + 2} \over {(2-1)^2}} = 2^{n+1}(n-1) + 2.
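A quick brute-force check of this closed form (a sketch of my own, not part of the original post):

```python
# Verify sum_{i=1}^{n} i*2^i == 2^(n+1)*(n-1) + 2 for small n.
for n in range(1, 11):
    lhs = sum(i * 2**i for i in range(1, n + 1))
    rhs = 2**(n + 1) * (n - 1) + 2
    assert lhs == rhs, (n, lhs, rhs)
print("closed form holds for n = 1..10")
```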


I also have a computer-algebra-aided solution.

Let s_n \triangleq \sum\limits_{i=1}^{n} i x^i,

we have

s_1 = x,  s_{n}-s_{n-1}=n x^n

Therefore, the closed form of s_n is the solution of the initial-value problem

\begin{cases} {s_{n}-s_{n-1} }= {n x^n} \\ s_1=x\end{cases}

It is solved by Omega CAS Explorer (see Fig. 1)

[Fig. 1: Omega CAS Explorer session solving the recurrence (screenshot)]
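For readers without Omega CAS Explorer (a Maxima front end), here is a rough SymPy equivalent of my own for the x = 2 instance; the independent check at the end does not depend on rsolve succeeding:

```python
import sympy as sp

n = sp.symbols('n', integer=True, positive=True)
s = sp.Function('s')

# Ask SymPy to solve s(n) - s(n-1) = n*2^n with s(1) = 2 directly.
sol = sp.rsolve(s(n) - s(n - 1) - n * 2**n, s(n), {s(1): 2})
print(sol)  # expect an expression equivalent to 2**(n+1)*(n-1) + 2

# Independent check: the claimed closed form satisfies the recurrence.
closed = 2**(n + 1) * (n - 1) + 2
print(sp.simplify(closed - closed.subs(n, n - 1) - n * 2**n))  # 0
print(closed.subs(n, 1))                                       # 2
```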

At ACA 2017 in Jerusalem, I gave a talk on “Generating Power Summation Formulas using a Computer Algebra System”.

I had a dream that night. In the dream, I was taking a test.

It reads:

Derive the closed form for

\sum\limits_{i=1}^{n} {1 \over {(3i-2)(3i+1)}}

\sum\limits_{i=1}^{n} {1 \over {(2i+1)^2-1}}

\sum\limits_{i=1}^{n} {i \over {(4i^2-1)^2}}

\sum\limits_{i=1}^{n} {{i^2 4^i} \over {(i+1)(i+2)}}

\sum\limits_{i=1}^{n} { i \cdot i!}

I woke up in a sweat.

My shot at Harmonic Series


To prove Beer Theorem 2 (see “Beer theorems and their proofs”) is to show that the Harmonic Series 1 + {1 \over 2} + {1 \over 3} + ... diverges.

Below is my shot at it.

Yaser S. Abu-Mostafa proved a theorem in an article titled “A differentiation test for absolute convergence” (see Mathematics Magazine 57(4), 228-231)

His theorem states that

Let f be a real function such that {{d^2 f} \over {dx^2}} exists at x = 0. Then \sum\limits_{n=1}^{\infty} f({1 \over n}) converges absolutely if and only if f(0) = f'(0) = 0.

Letting f(x) = x, we have

\sum\limits_{n=1}^{\infty}f({1 \over n}) = \sum\limits_{n=1}^{\infty}{1 \over n},

the Harmonic Series. And,

f'(x) = {d \over dx} x = 1 \implies f'(0) \neq 0.

Therefore, by Abu-Mostafa’s theorem, the Harmonic Series diverges.
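The divergence is notoriously slow, though. A short numeric illustration (my own, not part of the argument above) compares the partial sums H_n with \ln n:

```python
# Partial sums of the harmonic series grow roughly like ln(n),
# consistent with divergence: H_n ~ ln(n) + 0.5772...
import math

for n in (10, 100, 1000, 10**4, 10**5):
    h = sum(1.0 / k for k in range(1, n + 1))
    print(f"n = {n:>6}   H_n = {h:8.4f}   ln(n) = {math.log(n):8.4f}")
```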

Beer theorems and their proofs

Beer Theorem 1.

An infinite crowd of mathematicians enters a bar.

The first one orders a pint, the second one a half pint, the third one a quarter pint…

“Got it”, says the bartender – and pours two pints.

Proof.

Let s_n = \sum\limits_{i=1}^{n} a \cdot r^{i-1} = a + a\cdot r + a \cdot r^{2} + ...+ a\cdot r^{n-2} + a \cdot r^{n-1}.

Then r\cdot s_{n} = \sum\limits_{i=1}^{n} a\cdot r^{i} = a\cdot r  + a\cdot r^2+ ... + a\cdot r^{n-1} + a\cdot r^{n}

\implies s_{n}-r\cdot s_{n} = a  - a\cdot r^{n}.

Therefore,

s_{n} = {{a\cdot(1-r^{n})} \over {1-r}} .

When a = 1, r={{1} \over {2}},

s_{n} = \frac{1 \cdot (1-({1 \over 2})^n)}{1-{1 \over 2}} = 2\cdot (1-({1 \over 2})^n)

i.e.,

1+ {1 \over 2} + {1 \over 4} + {1 \over 8}+...+({1 \over 2})^{n-1}= 2\cdot (1-({1 \over 2})^n)

\implies \lim\limits_{n \rightarrow \infty} s_{n} = \lim\limits_{n \rightarrow \infty} {2\cdot (1-({1 \over 2})^n)} = 2.
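A few partial sums, computed directly (my own check), show how quickly the pour approaches two pints:

```python
# Partial sums 1 + 1/2 + ... + (1/2)^(n-1) against the formula 2*(1 - (1/2)^n).
for n in (1, 2, 5, 10, 20):
    s = sum(0.5**i for i in range(n))
    print(n, s, 2 * (1 - 0.5**n))
```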

There is also a proof without words:

[Figure: proof without words]

Beer Theorem 2.

An infinite crowd of mathematicians enters a bar.

The first one orders a pint, the second one a half pint, the third one a third of a pint…

“Get out of here! Are you trying to ruin me?”, bellows the bartender.

Proof.

See “My shot at Harmonic Series”.

A brain teaser with an Euclidean origin

It’s time for a brain teaser:

There is a triangle \triangle ABC, and D is an arbitrary interior point of this triangle (see Fig. 1). Prove that AD + DB < AC + CB.

[Fig. 1: \triangle ABC with interior point D]

Here is my solution:

Extend line AD to meet CB at a point E (see Fig. 2),

[Fig. 2: AD extended to meet CB at E]

we have

AD + DB < AD + (DE + EB)\quad\quad\because \triangle DEB: DB < DE + EB

= (AD + DE) + EB

=AE + EB\quad\quad\because AD + DE = AE

< (AC +CE) +EB\quad\quad\because \triangle ACE: AE < AC +CE

= AC + (CE + EB)

= AC +CB\quad\quad\because CE + EB = CB

\implies AD +DB < AC +CB
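Here is a quick numeric sanity check of the inequality (my own sketch; the particular triangle and the sampling scheme are arbitrary choices):

```python
# Sample random interior points D of a fixed triangle ABC and
# confirm AD + DB < AC + CB every time.
import math
import random

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

A, B, C = (0.0, 0.0), (4.0, 0.0), (1.0, 3.0)
for _ in range(10000):
    # uniform barycentric coordinates give a point inside the triangle
    u, v = sorted((random.random(), random.random()))
    a, b, c = u, v - u, 1.0 - v
    D = (a*A[0] + b*B[0] + c*C[0], a*A[1] + b*B[1] + c*C[1])
    assert dist(A, D) + dist(D, B) < dist(A, C) + dist(C, B)
print("AD + DB < AC + CB held for 10000 random interior points")
```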

My solution relies on a well-known theorem:

Given a triangle ABC, the sum of the lengths of any two sides is greater than the length of the third side.

In the words of Euclid:


“In any triangle two sides taken together in any manner are greater than the remaining one” (The Elements: Book I: Proposition 20)

I have conjured up the following algebraic proof of Euclid’s proposition:

Any \triangle ABC can be placed in a rectangular coordinate system with B = (-x_1, 0), C = (x_1, 0) and A = (x_2, y_2), where x_1 > 0, x_2 \ge 0, y_2 > 0 (see Fig. 3).

[Fig. 3: \triangle ABC with B = (-x_1, 0), C = (x_1, 0), A = (x_2, y_2)]

It follows that

AB + AC = \sqrt{(x_2-(-x_1))^2+y_2^2} + \sqrt{(x_2-x_1)^2+y_2^2}

= \sqrt{(x_2+x_1)^2+y_2^2} + \sqrt{(x_2-x_1)^2+y_2^2}

> \sqrt{(x_2+x_1)^2} + \sqrt{(x_2-x_1)^2}\quad\quad\quad\because y_2 >  0

\geq \sqrt{(x_1)^2} + \sqrt{(-x_1)^2}\quad\quad\quad\because x_2 \geq 0

= 2 |x_1|

= BC

\implies AB+AC > BC
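A remark on the hypotheses (my own addition, not part of the original proof): y_2 > 0 is exactly what makes the inequality strict. In the degenerate case y_2 = 0 with 0 \le x_2 \le x_1, the point A falls on segment BC and

AB + AC = (x_2 + x_1) + (x_1 - x_2) = 2x_1 = BC.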