Integral inequality for the length of a curve
Let $f:\mathbb{R}\to\mathbb{R}$ be a continuously differentiable function. Prove that for any $a,b\in\mathbb{R}$,
$$\left(\int_a^b\sqrt{1+(f'(x))^2}\,dx\right)^2\ge (a-b)^2+(f(b)-f(a))^2.$$
I think the mean value theorem should settle it, but I can't make it work; I also tried the Cauchy-Schwarz inequality without reaching a conclusion.

Tags: real-analysis

asked 4 hours ago by RAM_3R, edited 3 hours ago by uniquesolution
The smallest distance between the two points $(a, f(a))$ and $(b, f(b))$ is the straight-line distance, which is the square root of your RHS; the square root of your LHS is the length of the curve joining them, so conclude from there. – Conrad, 3 hours ago

@Conrad But this is exactly what is to be proved, since the LHS is the definition of arc length. – Matematleta, 3 hours ago

This is classic stuff: you can do it locally using a Taylor approximation, making the curve piecewise linear and using elementary geometry, or as done in the various answers with various inequalities. – Conrad, 14 mins ago
4 Answers
Notice that the function $y \mapsto \sqrt{1+y^2}$ is strictly convex. So by Jensen's inequality,
$$ \frac{1}{b-a} \int_{a}^{b} \sqrt{1 + f'(x)^2} \, \mathrm{d}x \geq \sqrt{1 + \left(\frac{1}{b-a}\int_{a}^{b} f'(x) \, \mathrm{d}x\right)^2} = \sqrt{1 + \left(\frac{f(b) - f(a)}{b-a} \right)^2}. $$
Multiplying both sides by $b-a$ and squaring proves the desired inequality. Moreover, by strict convexity, equality holds if and only if $f'$ is constant on $[a, b]$.

answered 1 hour ago by Sangchul Lee

This is really nice! – Nastar, 55 mins ago
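As a quick numerical sanity check of this Jensen-based bound (an illustration only, not part of the proof), here is a minimal Python sketch comparing both sides of the inequality; the sample function $f(x)=\sin x$, the interval $[0,2]$, and the grid size are arbitrary choices.

```python
import numpy as np

# Sample function f(x) = sin(x) on [a, b]; the function, interval, and grid
# size are arbitrary illustrative choices.
a, b = 0.0, 2.0
x = np.linspace(a, b, 10_001)
f = np.sin(x)
fp = np.cos(x)  # f'(x)
dx = np.diff(x)

# Left-hand side: (arc length)^2, approximated by a left Riemann sum.
arc_length = np.sum(np.sqrt(1.0 + fp[:-1] ** 2) * dx)
lhs = arc_length ** 2

# Right-hand side: squared chord length between (a, f(a)) and (b, f(b)).
rhs = (a - b) ** 2 + (f[-1] - f[0]) ** 2

print(f"LHS = {lhs:.6f}, RHS = {rhs:.6f}, LHS >= RHS: {lhs >= rhs}")
```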
An easy way to do this is to note that, since distance is invariant under rotations, without loss of generality we may assume that $f(a)=f(b)$. Now, since $\sqrt{1+f'(x)^2}\ge 1$ on $[a,b]$, we get $\int_a^b\sqrt{1+f'(x)^2}\,dx\ge b-a$, with equality exactly when $f'(x)=0$ for all $x\in[a,b]$, that is, when $f$ is constant on $[a,b]$. Since $f(a)=f(b)$, the right-hand side of the claimed inequality is $(a-b)^2$, and the result follows.

If you want to do this without the wlog assumption, then argue as follows:

Let $\epsilon>0$, $f\in C^1([a,b])$, and choose a partition $P=\{a,x_1,\cdots,x_{n-2},b\}$.

The length of the polygonal path obtained by joining the points $(x_i,f(x_i))$ is $\sum_i \sqrt{(\Delta x_i)^2+(\Delta y_i)^2}$, and this is clearly $\ge \sqrt{(b-a)^2+(f(b)-f(a))^2}$. (You can make this precise by an induction argument on $n$.) And this is true for $\textit{any}$ partition $P$.

But the above sum is also $\sum_i\sqrt{1+\left(\frac{\Delta y_i}{\Delta x_i}\right)^2}\,\Delta x_i$, and upon applying the MVT we see that what we have is a Riemann sum for $\sqrt{1+f'(x)^2}$.

To finish, choose $P$ such that $\left|\int_a^b\sqrt{1+f'(x)^2}\,dx- \sum_i\sqrt{1+f'(c_i)^2}\,\Delta x_i \right|<\epsilon$. (The $c_i$ are the numbers $x_{i-1}<c_i<x_i$ obtained from the MVT.) Then
$$\sqrt{(b-a)^2+(f(b)-f(a))^2}\le \sum_i\sqrt{1+f'(c_i)^2}\,\Delta x_i<\int_a^b\sqrt{1+f'(x)^2}\,dx+\epsilon.$$
Since $\epsilon$ is arbitrary, squaring gives the result.

For a slick way to do this, use a variational argument: assuming a minimizer $f$ exists, consider $f+t\phi$, where $t$ is a real parameter and $\phi\in C^1([a,b])$ is arbitrary with $\phi(a)=\phi(b)=0$, so that the endpoints stay fixed. Substitute it into the integral:
$$l(t)=\int_a^b \sqrt{1+(f'+t\phi')^2}\,dx.$$
Since $f$ minimizes this integral, the derivative of $l$ at $t=0$ must equal zero. Then
$$0=l'(0)= \int_a^b \frac{f'\phi'}{\sqrt{1+(f')^2}}\,dx.$$
After an integration by parts and an appeal to the fundamental lemma of the calculus of variations (the step is spelled out below), we get
$$\frac{f'}{\sqrt{1+(f')^2}} = c$$
for some constant $c\in \mathbb R$, from which it follows that $f'$ is constant. This means, of course, that the graph of $f$ is the straight line connecting $(a,f(a))$ and $(b,f(b))$. The desired inequality follows.

edited 1 hour ago, answered 2 hours ago by Matematleta
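To make the integration-by-parts step above explicit (a sketch, under the assumptions that $\phi(a)=\phi(b)=0$ and that $f$ is smooth enough for the boundary term and the derivative below to make sense):
$$0=\int_a^b \frac{f'\phi'}{\sqrt{1+(f')^2}}\,dx=\underbrace{\left[\frac{f'}{\sqrt{1+(f')^2}}\,\phi\right]_a^b}_{=0}-\int_a^b \frac{d}{dx}\!\left(\frac{f'}{\sqrt{1+(f')^2}}\right)\phi\,dx.$$
Since this holds for every such $\phi$, the fundamental lemma of the calculus of variations forces $\frac{d}{dx}\bigl(f'/\sqrt{1+(f')^2}\bigr)=0$, so $f'/\sqrt{1+(f')^2}$ is constant and hence $f'$ is constant.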
Note that for every complex-valued integrable function $\phi :[a,b]\to \Bbb C$, it holds that
$$\left|\int_a^b \phi(x)\, dx\right|\le \int_a^b|\phi(x)|\, dx.$$
Let $\phi(x)=1+if'(x)$. Then we can see that
$$\begin{align*}\left|\int_a^b \phi(x)\, dx\right|&=\left|(b-a)+i(f(b)-f(a))\right|\\&=\sqrt{(b-a)^2+(f(b)-f(a))^2}\end{align*}$$
and
$$\int_a^b|\phi(x)|\, dx=\int_a^b \sqrt{1+(f'(x))^2}\, dx.$$
Now the desired inequality follows.

answered 28 mins ago by Song

(+1) Amazing, this should be the accepted answer! Anyway, is there any reason to work with $\mathbb{C}$ rather than $\mathbb{R}^2$, with $\phi(x) = \gamma'(x)$ and $\gamma(x) = (x, f(x))$? – Sangchul Lee, 24 mins ago

Umm, sorry, I see no specific reason, since both are essentially the same version of the triangle inequality in integral form. But I preferred the $\Bbb C$-version because it can be easily derived from the real triangle inequality: if $f$ is real-valued and integrable, then $\pm \int_a^b f\le \int_a^b |f|$. Thank you! – Song, 15 mins ago
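For illustration only, here is a minimal Python sketch of this complex-integral argument, reusing the same arbitrary sample function $f(x)=\sin x$ and interval as in the earlier numerical check:

```python
import numpy as np

# phi(x) = 1 + i*f'(x) with f(x) = sin(x); interval and grid size are
# arbitrary illustrative choices.
a, b = 0.0, 2.0
x = np.linspace(a, b, 10_001)
dx = np.diff(x)
phi = 1.0 + 1j * np.cos(x)  # f'(x) = cos(x)

# |integral of phi dx|: equals the chord length sqrt((b-a)^2 + (f(b)-f(a))^2).
lhs = abs(np.sum(phi[:-1] * dx))

# integral of |phi| dx: equals the arc length, i.e. integral of sqrt(1 + f'(x)^2) dx.
rhs = np.sum(np.abs(phi[:-1]) * dx)

print(f"|int phi| = {lhs:.6f} <= int |phi| = {rhs:.6f}: {lhs <= rhs}")
```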
Expanding upon what @Conrad said: the shortest distance between two points is the length of the straight segment between them, which is what your RHS measures (it is actually the square of the distance from $(a, f(a))$ to $(b, f(b))$).

Now if we assume $\left(\int_a^b\sqrt{1+(f'(x))^2}\,dx\right)^2 < (a-b)^2+(f(b)-f(a))^2$, then the curve $x\mapsto(x,f(x))$ from $(a,f(a))$ to $(b,f(b))$ would be shorter than the straight segment joining them, contradicting the fact that the shortest distance between $(a, f(a))$ and $(b, f(b))$ is $\sqrt{(a-b)^2+(f(b)-f(a))^2}$. Therefore, it must be the case that $\left(\int_a^b\sqrt{1+(f'(x))^2}\,dx\right)^2 \geq (a-b)^2+(f(b)-f(a))^2$.

answered 3 hours ago by se2018 (new contributor)