Bayesian Probability question — Pointwise Probability
I am stuck on this question:

If $a = 1$ then $m \sim U(0.2,1)$; else if $a=0$ then $m \sim U(0,0.5)$. The question is: if $m$ is $0.3$, what is the probability that $a$ equals $1$?

My thought is to compute $p(a=1\mid m=0.3)$ and $p(a=0\mid m=0.3)$, and whichever class gives the higher probability is the answer. However, when executing this plan I run into a problem computing $p(m=0.3\mid a=1)$, which is supposedly zero, since $m$ follows $U(0.2,1)$ and any single point of a continuous distribution has probability zero. I feel like I can use the density function to compute this probability, but I am not sure why that would be justified.
Tags: probability, bayesian
asked 8 hours ago by prony
edited 6 hours ago by Michael Hardy
Big hint: Bayes Rule. Think of the probability you want to compute, and the probabilities you already know.
– Cliff AB
8 hours ago
As I mentioned, I am already applying Bayes' rule, but I have to compute $p(m=0.3\mid a=1)$, where $m$ follows $U(0.2,1)$. That is the probability of a single point under a continuous pdf, which is zero. But I know it should not be zero.
– prony
8 hours ago
Oh sorry, I get your question now. I'll write up an answer.
– Cliff AB
7 hours ago
2 Answers
If $a=1$ then the density function of $m$ is $\displaystyle f_m(x) = \begin{cases} 1/(1-0.2) = 1.25 & \text{if } 0.2\le x\le 1, \\ 0 & \text{if } x<0.2 \text{ or } x>1. \end{cases}$

If $a=0,$ it is $\displaystyle f_m(x) = \begin{cases} 1/(0.5-0) = 2 & \text{if } 0\le x\le 0.5, \\ 0 & \text{if } x<0 \text{ or } x>0.5. \end{cases}$

Thus the likelihood function is
$$
\begin{cases} L(1 \mid m=0.3) = 1.25, \\ L(0 \mid m=0.3) = 2. \end{cases}
$$
Hence the posterior probability distribution is
$$
\begin{cases} \Pr(a=1\mid m=0.3) = c\times 1.25\times\Pr(a=1), \\[5pt] \Pr(a=0\mid m = 0.3) = c\times 2 \times \Pr(a=0). \end{cases} \tag 1
$$
The normalizing constant is
$$
c = \frac 1 {1.25\Pr(a=1) + 2\Pr(a=0)}
$$
(that is what $c$ must be to make the sum of the two probabilities in line $(1)$ above equal to $1$).

So for example, if $\Pr(a=1)=\Pr(a=0) = \dfrac 1 2$ then $\Pr(a=1\mid m=0.3) = \dfrac 5 {13}.$
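As a quick numerical sanity check of the $5/13$ figure, here is a short Python sketch; it assumes the equal priors $\Pr(a=1)=\Pr(a=0)=1/2$ of the example above (the question itself states no prior):

```python
# Sanity check of the posterior above, assuming equal priors
# Pr(a=1) = Pr(a=0) = 1/2 (the example case; the question states no prior).
from scipy.stats import uniform

m = 0.3

# Likelihoods are density values: uniform.pdf(x, loc, scale) is U(loc, loc+scale).
lik_a1 = uniform.pdf(m, loc=0.2, scale=0.8)  # m | a=1 ~ U(0.2, 1)  ->  1.25
lik_a0 = uniform.pdf(m, loc=0.0, scale=0.5)  # m | a=0 ~ U(0, 0.5)  ->  2.0

prior_a1 = prior_a0 = 0.5

# Bayes' rule with densities standing in for the (zero) point probabilities.
c = 1 / (lik_a1 * prior_a1 + lik_a0 * prior_a0)
print(c * lik_a1 * prior_a1)   # 0.3846... = 5/13
```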
answered 6 hours ago by Michael Hardy
Thanks a lot for your answer. Can you elaborate on how you constructed the likelihood from $f_m(x)$? I mean, how does $P(m=0.3\mid a=1) = f_m(0.3\mid a=1)$ become equal to $L(1\mid m=0.3)$?
– prony
5 hours ago
@prony : You have a density function $x\mapsto f_m(x \mid a=\alpha),$ where $\alpha\in\{0,1\},$ which is a function of $x$ with $\alpha$ fixed, and the likelihood function $\alpha\mapsto L(\alpha\mid m = x) = f_m(x\mid a=\alpha),$ which is a function of $\alpha\in\{0,1\}$ with $x$ fixed at the observed value, which is $0.3.$
– Michael Hardy
5 hours ago
Instead of Bayes' rule, we should look closely at the definition of conditional probability, since Bayes' rule is simply a small rearrangement of that definition.

With discrete probabilities, it is simple to define
$$P(A = a \mid B = b) = \frac{P(A = a \cap B = b)}{P(B = b)}.$$
As you pointed out, this would be ill-defined if both $A$ and $B$ were continuous, as it would result in $0/0$.

Instead, let's think about $P(A \in a \pm \varepsilon \mid B \in b \pm \varepsilon)$ for a continuous distribution. The value
$$\frac{P(A \in a \pm \varepsilon \cap B \in b \pm \varepsilon)}{P(B \in b \pm \varepsilon)}$$
is properly defined for all $\varepsilon$, as long as $\int_{b - \varepsilon}^{b + \varepsilon} f_B(x) \, dx > 0$. Finally, we define the conditional density of $A$ given $B$ as the limit of this ratio, with each probability rescaled by the size of its shrinking window:
$$\lim_{\varepsilon \rightarrow 0} \frac{P(A \in a \pm \varepsilon \cap B \in b \pm \varepsilon) / (2\varepsilon)^2}{P(B \in b \pm \varepsilon) / (2\varepsilon)}.$$
By construction, this limit is
$$\frac{f_{A, B}(a, b)}{f_B(b)}.$$
In your problem $a$ is discrete, so only $m$ needs this limiting argument, and the density value $f_m(0.3 \mid a)$ plays the role of the otherwise-zero point probability in Bayes' rule.
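To see the shrinking-window argument at work on the question itself, here is a small Monte Carlo sketch; it assumes, as a hypothetical choice, equal priors on $a$ (not given in the question). Conditioning on $m$ landing in a window around $0.3$ gives frequencies that settle at $5/13 \approx 0.385$, matching the density-based calculation:

```python
# Monte Carlo illustration of the shrinking-window argument.
# Assumes equal priors Pr(a=1) = Pr(a=0) = 1/2 (hypothetical; not given
# in the question).
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

a = rng.integers(0, 2, size=n)               # a ~ Bernoulli(1/2)
m = np.where(a == 1,
             rng.uniform(0.2, 1.0, size=n),  # m | a=1 ~ U(0.2, 1)
             rng.uniform(0.0, 0.5, size=n))  # m | a=0 ~ U(0, 0.5)

for eps in (0.1, 0.03, 0.01):
    window = np.abs(m - 0.3) < eps           # condition on m in 0.3 +/- eps
    print(eps, a[window].mean())             # -> 5/13 ~ 0.3846
```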
answered 7 hours ago by Cliff AB, edited 6 hours ago by Michael Hardy