Variance of sine and cosine of a random variable
Suppose $X$ is a random variable drawn from a normal distribution with mean $E$ and variance $V$. How can I calculate the variance of $\sin(X)$ and $\cos(X)$?
(I thought the question was simple and tried to search for it, but did not find a good answer.)
What if there is no assumption about the distribution of $X$, and only the sample mean and variance are provided?
Both variables are bounded in $[-1, 1]$, so one can show that the variance is $\le \frac{2^2}{4} = 1$.
– angryavian
2 hours ago
That's an interesting point; it might be helpful, so I'll keep it in mind. However, I would like to know an exact formula/procedure if possible.
– Hùng Phạm
2 hours ago
3 Answers
What is below is for $\mu=0$ (and the variance renamed $\sigma^2$). Then $\mathbb{E}[\sin X]=0$, and you have
$$
\operatorname{Var} \sin X = \mathbb{E}[\sin^2 X]
= \frac{1}{2}\left(1-\mathbb{E}[\cos 2X]\right)
$$
and
$$
\mathbb{E}[\cos 2X] = \sum_{k=0}^\infty (-1)^k\frac{2^{2k}}{(2k)!}\, \mathbb{E}[X^{2k}]
= \sum_{k=0}^\infty (-1)^k\frac{2^{2k}}{(2k)!}\, \sigma^{2k} (2k-1)!!
= \sum_{k=0}^\infty (-1)^k \frac{2^{k}\sigma^{2k}}{k!} = e^{-2\sigma^{2}}
$$
and therefore
$$
\operatorname{Var} \sin X = \boxed{\frac{1-e^{-2\sigma^2}}{2}}
$$
You can deal with the variance of $\cos X$ in a similar fashion (but you now have to subtract a non-zero $\mathbb{E}[\cos X]^2$), especially recalling that $\mathbb{E}[\cos^2 X] = 1-\mathbb{E}[\sin^2 X]$.
Now, for non-zero mean $\mu$, you have
$$
\sin(X-\mu) = \sin X\cos \mu - \cos X\sin\mu
$$
(and similarly for $\cos(X-\mu)$).
Since $X-\mu$ is a zero-mean Gaussian with variance $\sigma^2$, we have already computed the mean and variance of $\sin(X-\mu)$ and $\cos(X-\mu)$. You can use these with the above trigonometric identities to find those of $\cos X$ and $\sin X$. (It's a bit cumbersome, but not too hard.)
Without knowing anything about the distribution of $X$, I don't think there's much you can do.
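If it helps, here is a quick numerical sanity check of the boxed formula (a minimal sketch in Python, assuming NumPy is available; the $\sigma$ values are arbitrary): it compares the sample variance of $\sin X$, for zero-mean Gaussian samples, with $(1-e^{-2\sigma^2})/2$.

```python
import numpy as np

rng = np.random.default_rng(0)

def var_sin_closed_form(sigma):
    """Variance of sin(X) for X ~ N(0, sigma^2), per the boxed formula above."""
    return (1.0 - np.exp(-2.0 * sigma**2)) / 2.0

for sigma in (0.1, 0.5, 1.0, 2.0):
    x = rng.normal(0.0, sigma, size=1_000_000)   # zero-mean Gaussian samples
    mc_estimate = np.var(np.sin(x))              # Monte Carlo estimate of Var(sin X)
    print(f"sigma={sigma:4.1f}  MC: {mc_estimate:.5f}  formula: {var_sin_closed_form(sigma):.5f}")
```

The agreement is within Monte Carlo error; the non-zero-mean case can be checked the same way after applying the angle-addition identities above.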
I might have missed something, but could you tell me why $\operatorname{Var}(\sin X) = \mathbb{E}[\sin^2 X]$? Also, why does the result not contain anything related to the mean or variance of $X$?
– Hùng Phạm
2 hours ago
@HùngPhạm Oh, my bad, I did it for mean $0$ and variance $1$. Let me fix that.
– Clement C.
2 hours ago
I think you'd need to consider $\mathbb{E}(\sin^2 X)-[\mathbb{E}(\sin X)]^2$.
– Sharat V Chandrasekhar
2 hours ago
@SharatVChandrasekhar It becomes a bit more complicated than that, actually.
– Clement C.
2 hours ago
@HùngPhạm I added the variance. Trying to see if this approach is amenable to a non-zero mean as well without too much pain.
– Clement C.
2 hours ago
Here is a general formulation, using the law of the unconscious statistician, that can be applied to other functions too. For the specific calculations with $\sin$ and $\cos$ here, though, I would say Clement C.'s answer is better!
The mean of $\color{blue}{h(X)}$ (for some function $h$) is given by the integral
$$\mathbb{E}[h(X)]=\int_{-\infty}^{\infty}\color{blue}{h(x)}\,f_X(x)\, dx,$$
where $f_X$ is the probability density function of $X$.
The second moment is found similarly as
$$\mathbb{E}\left[(h(X))^2\right] = \int_{-\infty}^{\infty}\color{blue}{(h(x))^2}\,f_X(x)\, dx.$$
Once you know the first two moments, you can calculate the variance using $\mathrm{Var}(Z) = \mathbb{E}[Z^2] - (\mathbb{E}[Z])^2$.
Replace $h(x)$ with $\cos x$ for the corresponding expectations for $\cos X$, and similarly with $\sin x$.
If the distribution of $X$ is not known, we cannot generally compute the exact mean and variance of $h(X)$. However, you may want to see this for some approximations that could be used. Some useful ones for you may be that if $X$ has mean $\mu_X$ and variance $\sigma^2_X$, then
$$\mathbb{E}[h(X)]\approx h(\mu_X) + \dfrac{h''(\mu_X)}{2}\,\sigma_X^2$$
and
$$\mathrm{Var}(h(X))\approx \left(h'(\mu_X)\right)^2\sigma^2_X + \dfrac{1}{2}\left(h''(\mu_X)\right)^2 \sigma^4_X.$$
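As a concrete illustration (a sketch only, assuming NumPy and SciPy are available; the values of $\mu_X$ and $\sigma_X$ are arbitrary), the two LOTUS integrals for $h(x)=\cos x$ can be evaluated by numerical quadrature with a normal density and compared against the approximations above:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu, sigma = 0.7, 0.4            # example mean and standard deviation of X
h = np.cos                      # the function of interest; use np.sin analogously

def pdf(x):
    """Normal density of X, used as f_X in the LOTUS integrals."""
    return norm.pdf(x, loc=mu, scale=sigma)

# First two moments of h(X) via the law of the unconscious statistician.
m1, _ = quad(lambda x: h(x) * pdf(x), -np.inf, np.inf)
m2, _ = quad(lambda x: h(x) ** 2 * pdf(x), -np.inf, np.inf)
var_lotus = m2 - m1 ** 2

# Second-order approximations from the answer, with h = cos:
# h'(mu) = -sin(mu), h''(mu) = -cos(mu).
mean_approx = np.cos(mu) - 0.5 * np.cos(mu) * sigma**2
var_approx = np.sin(mu) ** 2 * sigma**2 + 0.5 * np.cos(mu) ** 2 * sigma**4

print(f"E[cos X]:   LOTUS {m1:.5f}   approx {mean_approx:.5f}")
print(f"Var(cos X): LOTUS {var_lotus:.5f}   approx {var_approx:.5f}")
```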
$\cos^2(x) = \frac{\cos(2x)+1}{2}$, which averages out to $\frac12$. So as the variance of $X$ goes to infinity, the variance of $\cos(X)$ goes to $\frac12$, assuming the distribution of $X$ is "well-behaved". The lower bound is $0$ (the variance can be made arbitrarily small by choosing the variance of $X$ to be small enough), and as @angryavian says, the upper bound is $1$. Since $\left|\frac{d}{dx}\cos(x)\right| = |\sin(x)| \le 1$, and the inequality is strict for all but a measure-zero set, the variance of $\cos(X)$ is less than the variance of $X$.
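These limits are easy to see numerically; below is a minimal Monte Carlo sketch (Python with NumPy, assuming a normal $X$ as one "well-behaved" choice): the sample variance of $\cos(X)$ stays near $0$ for small spread and approaches $\frac12$ as the spread grows, never exceeding $1$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample variance of cos(X) for X ~ N(0, sigma^2), for increasing sigma.
for sigma in (0.1, 0.5, 1.0, 2.0, 5.0, 20.0):
    x = rng.normal(0.0, sigma, size=1_000_000)
    print(f"sigma={sigma:5.1f}  Var(cos X) ~ {np.var(np.cos(x)):.4f}")
```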