Chebyshev inequality in terms of RMS
I'm self-studying the book Introduction to Applied Linear Algebra – Vectors, Matrices, and Least Squares. On page 48, the author writes: "It says, for example, that no more than 1/25 = 4% of the entries of a vector can exceed its RMS value by more than a factor of 5."

Could someone explain this in more detail? In particular, why is the factor 5?

linear-algebra
asked 12 hours ago by H. Yong
1 Answer
According to Chebyshev's inequality, the probability that a value deviates from the mean by more than $t$ standard deviations is at most $1/t^2$.

Applied to vectors in your specific case, and following the book you cited, let $k$ be the number of entries of the vector $\vec{x}=(x_1,\ldots,x_n)$ with $|x_i| \geq a > 0$.

Hence $\|\vec{x}\|^2 = \sum x_i^2 \geq k a^2 + (n - k)\times 0$: the $k$ large entries each contribute at least $a^2$, and the remaining $n-k$ squared entries are at least zero.

Since the root-mean-square value is $\operatorname{rms}(\vec{x}) = \sqrt{\frac{\|\vec{x}\|^2}{n}}$, it follows that $\operatorname{rms}(\vec{x})^2 = \frac{\|\vec{x}\|^2}{n} \geq \frac{k a^2}{n}$.

Rearranging gives the final bound
$$\frac{k}{n} \leq \left( \frac{\operatorname{rms}(\vec{x})}{a} \right)^2.$$

In the book's example, $a = 5 \operatorname{rms}(\vec{x})$, so $\frac{k}{n} \leq \left( \frac{1}{5} \right)^2 = 4\%$: the fraction of entries larger in absolute value than $5\operatorname{rms}(\vec{x})$ is at most $4\%$, which is where the factor 5 comes from.

If we chose another factor, say $a = 2 \operatorname{rms}(\vec{x})$, we would get $\frac{k}{n} \leq \left( \frac{1}{2} \right)^2 = 25\%$.
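For a concrete check, here is a minimal NumPy sketch (not from the book; the random test vector and the factors 2 and 5 are just illustrative choices) comparing the observed fraction of large entries with the bound above:

```python
# Minimal numerical check of the bound k/n <= (rms(x)/a)^2.
# The vector x and the factors tested are illustrative choices;
# the inequality holds for any vector, not just random data.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)          # example vector with n = 10,000 entries

rms = np.sqrt(np.mean(x**2))             # root-mean-square value of x

for factor in (2, 5):
    a = factor * rms
    observed = np.mean(np.abs(x) >= a)   # fraction of entries with |x_i| >= a
    bound = (rms / a) ** 2               # Chebyshev-type bound, equal to 1/factor^2
    print(f"factor {factor}: observed fraction {observed:.4%} <= bound {bound:.2%}")
```

For a vector of standard normal samples the observed fractions should come out around 4.6% and essentially 0%, comfortably below the 25% and 4% bounds, which illustrates that the bound is worst-case rather than tight.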
answered 12 hours ago by Ertxiem, edited 8 hours ago by StubbornAtom