# Inching towards Definite Integral

In a blog post titled “Introducing Lady L”, we showed that

$\lim\limits_{h \to 0} {{\log(x+h) - \log(x)} \over h} = {1 \over x}\quad\quad\quad(1)$
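As a quick numerical sanity check (not part of the derivation), we can evaluate the difference quotient of the natural logarithm at a sample point, say $x = 2$, for shrinking $h$:

```python
import math

# Numerically check (1): the difference quotient of log at x should
# approach 1/x as h shrinks. x = 2.0 is an arbitrary sample point.
x = 2.0
for h in [0.1, 0.01, 0.001, 0.0001]:
    quotient = (math.log(x + h) - math.log(x)) / h
    print(f"h = {h:>7}: quotient = {quotient:.6f}, 1/x = {1 / x}")
```

Each smaller $h$ brings the quotient closer to $1/x = 0.5$.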

In light of the fact that $1 \over x$ is a monotonic function on $R^+$, i.e.,

$\forall x_1, x_2\in R^+, x_1 < x_2 \implies {1 \over x_1} > {1 \over x_2}$,

we can prove that

Let $F(x) = \int\limits_{a}^{x} f(t)\;dt$ for $x \in [a, b]$. If $f(x)$ is continuous and monotonic on $[a, b]$, then $\forall x \in [a, b],\; {\lim\limits_{h \to 0}{ {F(x+h) - F(x)} \over h } }= f(x)\quad\quad\quad(2)$

The proof is simple, rigorous and similar to what we have done in “Introducing Lady L”.

Let $f(x)$ be a monotonically increasing function. Then

$h > 0 \implies \forall t \in (x, x+h), f(x) < f(t) < f(x+h)$.

and, integrating this inequality over $(x, x+h)$,

$h\cdot f(x) < \int\limits_{x}^{x+h} f(t)\;dt < h\cdot f(x+h)$.

Since

$\int\limits_{x}^{x+h} f(t)\; dt = \int\limits_{x}^{a} f(t)\;dt + \int\limits_{a}^{x+h}f(t)\;dt$

$= \int\limits_{a}^{x+h}f(t)\;dt - \int\limits_{a}^{x}f(t)\;dt$

$= F(x+h)- F(x)\quad\quad\quad(3)$

It follows that

$f(x) < {{F(x+h) - F(x)} \over {h}} < f(x+h)$.

The fact that $f(x)$ is continuous tells us

$\lim\limits_{h \to 0} f(x + h) = f(x + 0) = f(x)$.

In addition,

$\lim\limits_{h \to 0} f(x) = f(x)$.

Therefore,

$\lim\limits_{h \to 0} {{F(x+h) - F(x) } \over {h} }= f(x)$.

The case for $h < 0$ can be handled in a similar fashion.
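The claim in (2) can be illustrated numerically (a sketch, not a proof): take the monotonically decreasing $f(t) = 1/t$ with $a = 1$, approximate $F$ by a midpoint Riemann sum, and watch the quotient approach $f(x)$. The sample point $x = 2$ and the subinterval count are arbitrary choices.

```python
import math

# Illustrate (2) with the monotonic (decreasing) f(t) = 1/t and a = 1.
# F(x) is approximated by a midpoint Riemann sum with n subintervals.
def f(t):
    return 1.0 / t

def F(x, a=1.0, n=100_000):
    w = (x - a) / n                      # width of each subinterval
    return w * sum(f(a + (i + 0.5) * w) for i in range(n))

x = 2.0
for h in [0.1, 0.01, 0.001]:
    quotient = (F(x + h) - F(x)) / h
    print(f"h = {h:>6}: quotient = {quotient:.6f}, f(x) = {f(x)}")
```

The printed quotients converge to $f(2) = 0.5$, exactly as (2) predicts.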

(2) becomes more general when the condition that $f(x)$ be monotonic is removed:

Let $F(x) = \int\limits_{a}^{x} f(t)\;dt$ for $x \in [a, b]$. If $f(x)$ is continuous on $[a, b]$, then $\forall x \in [a, b],\; {\lim\limits_{h \to 0}{ {F(x+h) - F(x)} \over h } }= f(x)\quad\quad\quad(4)$

Let’s prove it.

By definition, $f(x)$ being continuous at $x$ means

$\forall \epsilon > 0, \exists \delta > 0$ such that $0 <|t - x| <\delta \implies |f(t)-f(x)| < \epsilon$.

It implies $-\epsilon < f(t)-f(x) < \epsilon$.

For $0 < h < \delta$, every $t \in (x, x+h)$ satisfies $0 < |t - x| < \delta$, so

$\int\limits_{x}^{x+h} -\epsilon \;dt < \int\limits_{x}^{x+h} {f(t)-f(x)}\;dt < \int\limits_{x}^{x+h} \epsilon\;dt$

That is

$-\epsilon \int\limits_{x}^{x+h}\;dt < \int\limits_{x}^{x+h}f(t)\;dt - \int\limits_{x}^{x+h}f(x)\;dt < \epsilon \int\limits_{x}^{x+h}\;dt$

Since $\int\limits_{x}^{x+h}\;dt = h, \int\limits_{x}^{x+h}f(x)\;dt = f(x)\cdot\int\limits_{x}^{x+h}\;dt = h \cdot f(x)$,

it follows that $-h\cdot\epsilon < \int\limits_{x}^{x+h} f(t)\;dt -h\cdot f(x) < h \cdot\epsilon$

or, $-\epsilon < { \int\limits_{x}^{x+h} f(t)\;dt \over h }- f(x) < \epsilon$

By (3), we have $-\epsilon < { {F(x+h)-F(x)} \over h }- f(x) < \epsilon$

i.e., $|{{F(x+h)-F(x)} \over h}- f(x)|< \epsilon$

As a result,

${\lim\limits_{h \to 0^+}{ {F(x+h) - F(x)} \over h } }= f(x)$.

For $-\delta < h < 0$, since $x+h < x$,

$\int\limits_{x+h}^{x} -\epsilon\;dt < \int\limits_{x+h}^{x}{f(t)-f(x)}\;dt < \int\limits_{x+h}^{x}\epsilon\;dt$.

That is $h\cdot\epsilon < -\int\limits_{x}^{x+h}f(t)\;dt + h \cdot f(x) < -h\cdot\epsilon$

or, $h\cdot\epsilon < -(\int\limits_{x}^{x+h}f(t)\;dt - h \cdot f(x)) < -h\cdot\epsilon$

Dividing by $-h\;(h < 0 \implies -h >0)$ throughout, and expressing $\int\limits_{x}^{x+h}f(t)\;dt$ as $F(x+h)-F(x)$, we arrive at

$-\epsilon < { {F(x+h)-F(x)} \over h }- f(x) < \epsilon$

as before. Hence the limit from either side is $f(x)$, and (4) is proved.
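(4) can be probed numerically as well (a sketch under the same Riemann-sum approximation as before, with arbitrary sample choices): $f(t) = \sin t$ is continuous but not monotonic, and both signs of $h$ give the same limit.

```python
import math

# Illustrate (4) with the continuous but non-monotonic f(t) = sin(t), a = 0.
# F(x) is approximated by a midpoint Riemann sum with n subintervals.
def F(x, a=0.0, n=100_000):
    w = (x - a) / n
    return w * sum(math.sin(a + (i + 0.5) * w) for i in range(n))

x = 2.0  # sin is increasing near 0 but decreasing near 2: not monotonic on [0, 3]
for h in [0.01, -0.01]:  # both the h > 0 and h < 0 cases treated above
    quotient = (F(x + h) - F(x)) / h
    print(f"h = {h:>5}: quotient = {quotient:.6f}, f(x) = {math.sin(x):.6f}")
```

Both quotients land near $\sin 2 \approx 0.909$, even though monotonicity fails.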

We are now poised to define the derivative of a function:

Let $f$ be a function on an open subset $I$ of $R$. Let $x \in I$. We say that $f$ is differentiable at $x$ if

$\lim\limits_{h \to 0}{ {f(x+h) - f(x)} \over h }\quad\quad\quad(5)$

exists. If it exists, this limit, commonly denoted by $f'(x)$ or ${d \over dx} f(x)$, is called the derivative of $f$ at $x$.

For function $f_1(x) - f_2(x)$, the difference of two differentiable functions,

${{f_1(x+h)-f_2(x+h) - (f_1(x)-f_2(x))} \over h }= {{f_1(x+h) - f_1(x)} \over h}- {{f_2(x+h) - f_2(x)} \over h}$.

By definition,

${d \over dx} (f_1(x) - f_2(x)) = \lim\limits_{h \to 0}{{f_1(x+h)-f_2(x+h) - (f_1(x)-f_2(x))} \over h }$

${d \over dx} f_1(x) = \lim\limits_{h \to 0}{{f_1(x+h) - f_1(x)} \over h}$

${d \over dx} f_2(x) = \lim\limits_{h \to 0}{{f_2(x+h) - f_2(x)} \over h}$

We have

$\lim\limits_{h \to 0}{{f_1(x+h)-f_2(x+h) - (f_1(x)-f_2(x))} \over h }=\lim\limits_{h \to 0}{{f_1(x+h) - f_1(x)} \over h}-\lim\limits_{h \to 0}{{f_2(x+h) - f_2(x)} \over h}$

or,

${d \over dx} (f_1(x) - f_2(x)) = {d \over dx} f_1(x) - {d \over dx} f_2(x)\quad\quad\quad(6)$
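To see (6) at work numerically (with illustrative sample functions of our own choosing, not from the derivation above): take $f_1(x) = x^2$ and $f_2(x) = \sin x$, and compare the difference quotient of $f_1 - f_2$ against the difference of the two quotients.

```python
import math

# Check (6) numerically with sample functions f1(x) = x**2, f2(x) = sin(x).
def quotient(f, x, h=1e-6):
    # the difference quotient of (5) at a small fixed h
    return (f(x + h) - f(x)) / h

x = 1.3  # arbitrary sample point
lhs = quotient(lambda t: t**2 - math.sin(t), x)            # quotient of f1 - f2
rhs = quotient(lambda t: t**2, x) - quotient(math.sin, x)  # difference of quotients
print(lhs, rhs, 2 * x - math.cos(x))  # both approximate f1'(x) - f2'(x)
```

The two sides agree to floating-point precision, and both approximate $2x - \cos x$, the difference of the known derivatives.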

With this definition, we can also re-state (4) as:

Let $F(x) = \int\limits_{a}^{x} f(t)\;dt$ for $x \in [a, b]$. If $f(x)$ is continuous on $[a, b]$, then $\forall x \in [a, b],\; {d \over dx} F(x)= f(x)\quad(7)$

From (7), it is clear that $F(x) = \int_{a}^{x} f(t)\;dt$ is a solution of the following equation:

${d \over dx} y(x) = f(x)\quad\quad\quad(8)$

where $y(x)$ is the unknown function.

In fact, for any function $g(x)$ that satisfies (8), we have

${d \over dx} F(x) = {d \over dx} g(x) = f(x)$.

Therefore by (6),

${d \over dx} (F(x) - g(x)) = {d \over dx} F(x) - {d \over dx} g(x) = f(x) - f(x) = 0\quad\quad\quad(9)$

That is,  $F(x) - g(x)$ is a function whose derivative is everywhere zero.

Geometrically, if the tangent to a function's curve is horizontal at every point, the curve represents a constant function.

It is even more obvious if one considers a function $f(t)$ that describes the position (on some axis) of a car at time $t$. Then the derivative of the function, $\lim\limits_{\Delta t \to 0} {{f(t+{\Delta t}) -f(t)}\over {\Delta t}}$, is the instantaneous velocity of the car. If the derivative is zero for some time interval (the car does not move), then the value of the function is constant (the car stays where it is).

Hence, we assert

A function $f(x)$ on an open interval $I$ has derivative zero at each point $\implies \forall x \in I, f(x) = c$, a constant.

From (9) and the above assertion, whose rigorous proof we postpone until later in “Sprint to FTC”, it follows that

$F(x) - g(x) = c$

or,

$F(x) = \int\limits_{a}^{x} f(t)\;dt = g(x) + c\quad\quad\quad(10)$

where $c$ is a constant.

We know

$\int\limits_{a}^{a} f(t)\;dt = 0$.

It implies

$\int\limits_{a}^{a} f(t)\;dt = g(a) + c = 0$.

i.e.,

$c = -g(a)$,

and, (10) becomes

$\int\limits_{a}^{x} f(t)\;dt = g(x) - g(a)$

Letting $x=b$, we have

$\int\limits_{a}^{b} f(x)\;dx = g(b) - g(a)$

This is the well-known Newton-Leibniz formula. It expresses an algorithm for evaluating the definite integral $\int_{a}^{b} f(x)\;dx$:

Find any function $g(x)$ whose derivative is $f(x)$, and the difference $g(b)-g(a)$ gives the answer.
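As a closing sanity check (a sketch with arbitrary sample choices): take $f(x) = 1/x$ on $[1, 2]$, whose derivative relationship with $g(x) = \log x$ we established in (1), and compare a midpoint Riemann sum against $g(b) - g(a)$.

```python
import math

# Newton-Leibniz in action: integrate f(x) = 1/x over [1, 2].
# g(x) = log(x) satisfies g'(x) = f(x), so the integral should be log 2 - log 1.
a, b, n = 1.0, 2.0, 100_000
w = (b - a) / n
riemann = w * sum(1.0 / (a + (i + 0.5) * w) for i in range(n))  # midpoint sum
newton_leibniz = math.log(b) - math.log(a)
print(f"Riemann sum: {riemann:.8f}, g(b) - g(a): {newton_leibniz:.8f}")
```

The brute-force sum and the one-line Newton-Leibniz evaluation agree to many digits, which is precisely the practical power of the formula.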