Mean Value Theorem
If f is differentiable on an open interval (a,b) and f assumes its maximum or minimum at a point c∈(a,b), then f′(c)=0.
Suppose that f assumes its maximum at c. That is, f(x)≤f(c) for all x∈(a,b). Let (xn) be a sequence converging to c such that a<xn<c for all n. Since f is differentiable at c, the sequence
((f(xn)−f(c))/(xn−c))
converges to f′(c). Since f(xn)≤f(c) and xn<c for all n, each term of this sequence is nonnegative. Thus f′(c)≥0.
We repeat this argument with a sequence (yn) converging to c such that c<yn<b for all n. Each term in the sequence
((f(yn)−f(c))/(yn−c))
is nonpositive, since f(yn)≤f(c) and yn>c, so f′(c)≤0. We therefore conclude that f′(c)=0. If f assumes its minimum at c, we apply the above result to the function −f.
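As a concrete illustration, consider f(x)=1−x² on (−1,1), which assumes its maximum at the interior point c=0; the difference quotients behave exactly as in the proof.

```latex
% Illustration: f(x) = 1 - x^2 on (-1,1) assumes its maximum at c = 0.
\[
  \frac{f(x) - f(0)}{x - 0} = \frac{-x^{2}}{x} = -x ,
\]
% which is nonnegative for x < 0 and nonpositive for x > 0, so the limit f'(0)
% can only be 0; indeed f'(x) = -2x and f'(0) = 0.
```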
Let f be a continuous function on [a,b] that is differentiable on (a,b) and such that f(a)=f(b). Then there exists at least one point c in (a,b) such that f′(c)=0.
Since f is continuous on the compact interval [a,b], f attains a minimum at some point x1 and a maximum at some point x2 by the Extreme Value Theorem.
If x1 and x2 are both endpoints of [a,b], then since f(a)=f(b) the minimum and maximum values of f coincide, so f is constant and f′(x)=0 for all x∈(a,b).
Otherwise, f assumes either a minimum or maximum at some point c∈(a,b), and by Theorem 3.2.1, f′(c)=0.
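For example, f(x)=x²−x on [0,1] is continuous on [0,1], differentiable on (0,1), and satisfies f(0)=f(1)=0, so Rolle's Theorem guarantees an interior zero of f′.

```latex
% Rolle's Theorem applied to f(x) = x^2 - x on [0,1], where f(0) = f(1) = 0.
\[
  f'(x) = 2x - 1 , \qquad f'(c) = 0 \iff c = \frac{1}{2} \in (0,1).
\]
```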
Let f be a continuous function on [a,b] that is differentiable on (a,b). Then there exists at least one point c∈(a,b) such that
f′(c)=(f(b)−f(a))/(b−a).
Let g be the function whose graph is the chord joining the points (a,f(a)) and (b,f(b)). More formally,
g(x)=((f(b)−f(a))/(b−a))(x−a)+f(a), for all x∈[a,b].
Then the function h=f−g is continuous on [a,b] and differentiable on (a,b). Since f(a)=g(a) and f(b)=g(b), we have that h(a)=h(b)=0. Applying Rolle’s Theorem we see that for some c∈(a,b),
0=h′(c)=f′(c)−g′(c)=f′(c)−(f(b)−f(a))/(b−a).
Thus
f′(c)=(f(b)−f(a))/(b−a).
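For example, take f(x)=x² on [0,2]; the theorem produces a point where the tangent slope equals the slope of the chord.

```latex
% Mean Value Theorem applied to f(x) = x^2 on [0,2].
\[
  \frac{f(2) - f(0)}{2 - 0} = \frac{4}{2} = 2 , \qquad f'(c) = 2c = 2 \iff c = 1 \in (0,2).
\]
```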
Let f be a continuous function on [a,b] that is differentiable on (a,b). If f′(x)=0 for all x∈(a,b), then f is constant on [a,b].
Proof by Contradiction
Suppose f were not constant on [a,b]. Then there exist two points a≤x1<x2≤b such that f(x1)≠f(x2). By the Mean Value Theorem, there exists some c∈(x1,x2) such that
f′(c)=(f(x2)−f(x1))/(x2−x1)≠0,
which contradicts the hypothesis that f′(x)=0 for all x∈(a,b).
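This corollary is often used to establish identities. For instance, applying it to h(x)=arcsin x+arccos x on [−1,1] recovers a familiar one.

```latex
% h(x) = arcsin(x) + arccos(x) is continuous on [-1,1] and differentiable on (-1,1), with
\[
  h'(x) = \frac{1}{\sqrt{1 - x^{2}}} - \frac{1}{\sqrt{1 - x^{2}}} = 0 ,
\]
% so h is constant on [-1,1]; evaluating at x = 0 gives
\[
  \arcsin x + \arccos x = \frac{\pi}{2} \qquad \mbox{for all } x \in [-1,1].
\]
```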
Let f,g be continuous on [a,b] and differentiable on (a,b). Suppose that f′(x)=g′(x) for all x∈(a,b). Then there exists a constant C such that
f(x)=g(x)+C for all x∈[a,b].
Direct Proof
Apply Theorem 3.2.4 to the function f−g: since (f−g)′(x)=f′(x)−g′(x)=0 for all x∈(a,b), f−g is constant on [a,b], say f−g=C.
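For instance, f(x)=sin²x and g(x)=−cos²x have the same derivative on any interval [a,b], and the constant C produced by the corollary is exactly the Pythagorean identity.

```latex
% f(x) = sin^2(x) and g(x) = -cos^2(x) satisfy f'(x) = g'(x) = 2 sin(x) cos(x),
% so f = g + C; evaluating at x = 0 gives C = 1, i.e.
\[
  \sin^{2} x = -\cos^{2} x + 1 .
\]
```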
Let f be differentiable on an interval I. Then
(a) if f′(x)>0 for all x∈I, then f is strictly increasing on I, and
(b) if f′(x)<0 for all x∈I, then f is strictly decreasing on I.
Direct Proof
Let x1<x2 be two points of I. By the Mean Value Theorem, f(x2)−f(x1)=f′(c)(x2−x1) for some c∈(x1,x2). If f′(x)>0 for all x∈I, this quantity is positive, so f(x1)<f(x2); if f′(x)<0 for all x∈I, it is negative, so f(x1)>f(x2).
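For example, f(x)=eˣ satisfies f′(x)=eˣ>0 at every point, so by (a) it is strictly increasing on any interval I.

```latex
% f(x) = e^x: f'(x) = e^x > 0 everywhere, so f is strictly increasing on any interval.
\[
  f'(x) = e^{x} > 0 \quad \Longrightarrow \quad x_{1} < x_{2} \ \Rightarrow \ e^{x_{1}} < e^{x_{2}} .
\]
```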
Let f be differentiable on [a,b] and suppose there exists a number k between f′(a) and f′(b). Then there exists a point c∈(a,b) such that f′(c)=k.
Direct Proof
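As an illustration of the statement (rather than its proof), take f(x)=x² on [0,3] and any k between f′(0)=0 and f′(3)=6; note that the theorem assumes nothing about the continuity of f′.

```latex
% Darboux's theorem for f(x) = x^2 on [0,3]: f'(0) = 0 and f'(3) = 6.
% Any intermediate value k is attained; for k = 4,
\[
  f'(c) = 2c = 4 \iff c = 2 \in (0,3).
\]
```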