If $$f\in C^1([a,b])$$ is increasing and nonconstant, then

$\int_a^b\sqrt{1+{f^\prime}^2(x)}\, \mathrm dx \lt b-a+f(b)-f(a).$

For $$\alpha$$, $$\beta\geqslant0$$ we have $$\sqrt{\alpha^2+\beta^2}\leqslant\alpha+\beta$$, since squaring both sides gives $$(\alpha+\beta)^2=\alpha^2+\beta^2+2\alpha\beta\geqslant\alpha^2+\beta^2$$. Equality holds iff $$\alpha\beta=0$$, i.e. iff at least one of $$\alpha$$, $$\beta$$ equals $$0$$.
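As a quick sanity check, the lemma can be spot-checked numerically; the sample values below are illustrative, not exhaustive:

```python
import math

# Spot-check: sqrt(a^2 + b^2) <= a + b for a, b >= 0,
# with equality exactly when a * b == 0.
for a, b in [(0.0, 0.0), (3.0, 0.0), (0.0, 4.0), (3.0, 4.0), (1.0, 1.0)]:
    lhs = math.hypot(a, b)   # sqrt(a^2 + b^2)
    rhs = a + b
    assert lhs <= rhs
    # equality (up to floating-point tolerance) iff one factor is zero
    assert (abs(lhs - rhs) < 1e-12) == (a * b == 0)
```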

Since $$f^\prime\geqslant 0$$ on $$[a,b]$$, it follows from the lemma (with $$\alpha=1$$, $$\beta=f^\prime(x)$$) that

$\sqrt{1+{f^\prime}^2(x)}\leqslant 1+f^\prime(x),\quad x\in [a,b].$

Because $$f$$ is nonconstant, $$f^\prime(x_0)\gt0$$ at some point $$x_0$$, and by continuity of $$f^\prime$$ we have $$f^\prime\gt0$$ on a whole subinterval. On that subinterval $$\beta=f^\prime(x)\gt0$$ with $$\alpha=1\gt0$$, so the inequality above is strict there. Integrating over $$[a,b]$$ therefore gives

$\int_a^b\sqrt{1+{f^\prime}^2(x)}\,\mathrm dx \lt \int_a^b\bigl(1+f^\prime(x)\bigr)\,\mathrm dx = b-a+f(b)-f(a).$
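The inequality can also be checked numerically for a concrete choice of $$f$$. A minimal sketch, taking $$f(x)=x^2$$ on $$[0,1]$$ (so $$b-a+f(b)-f(a)=2$$) and approximating the arc-length integral with a midpoint rule; the helper name `arc_length` is my own:

```python
import math

def arc_length(fprime, a, b, n=100_000):
    # Midpoint-rule approximation of the arc-length integral
    # of a function with derivative fprime on [a, b]
    h = (b - a) / n
    return h * sum(
        math.sqrt(1 + fprime(a + (i + 0.5) * h) ** 2) for i in range(n)
    )

# f(x) = x^2 on [0, 1]: C^1, increasing, and nonconstant
fprime = lambda x: 2 * x
length = arc_length(fprime, 0.0, 1.0)       # ≈ 1.4789
bound = (1.0 - 0.0) + (1.0 - 0.0)           # (b - a) + (f(b) - f(a)) = 2
print(length, bound)
```

The approximation lands near the exact value $$\tfrac{\sqrt5}{2}+\tfrac14\operatorname{arcsinh}2\approx1.4789$$, comfortably below the bound $$2$$, as the theorem predicts.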