I found this one (an interview question for a junior quant):
You have to walk from point A to point B 10 times, blindfolded. Every time before you start walking, a fair coin is tossed; if it comes up heads, a wall in the middle is present (situation 2 below), while if it comes up tails there is no wall (situation 1 below). As you are blindfolded, you do not know whether the wall is present.
Even though you are blindfolded, you can move in perfectly straight lines towards any point you choose. Can you give a prescription for how to walk from A to B so that the total distance you cover is as small as possible? What distance do you expect to cover over those 10 walks?
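Since the figures are missing, any concrete answer needs an assumed geometry. Here is a rough numerical sketch, assuming purely for illustration that A = (0,0), B = (2,0), and that the wall, when present, is the segment x = 1, -1 ≤ y ≤ 1. The strategy family considered: aim straight at the point (1, h); if you bump into the wall, feel your way along it to the tip (1, 1) and then walk straight to B. All of these choices are assumptions, not the original problem's data.

```python
import numpy as np

def expected_distance(h):
    to_aim = np.sqrt(1 + h**2)        # A -> (1, h), walked in every case
    no_wall = np.sqrt(1 + h**2)       # (1, h) -> B directly (prob 1/2)
    wall = (1 - h) + np.sqrt(2)       # up the wall to (1, 1), then tip -> B (prob 1/2)
    return to_aim + 0.5 * no_wall + 0.5 * wall

h = np.linspace(0, 1, 100001)
d = expected_distance(h)
i = np.argmin(d)
print(f"best aim height h ~ {h[i]:.4f}")           # ~0.3536 = 1/(2*sqrt(2))
print(f"expected distance per walk ~ {d[i]:.4f}")  # ~2.6213
print(f"expected total over 10 walks ~ {10 * d[i]:.2f}")
```

Under these assumptions the optimum comes out at h = 1/(2√2), for an expected total of about 26.2 over the 10 walks; a different wall size changes the numbers, but the same grid-search recipe carries over.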
Just joined this community. Your solution is baffling me. It says the truth is told 3 out of 4 times, but in your solution to the first part you made \(P(\text{tells the truth}) = 1/4\). Is there a reason for that, or am I getting something mixed up? Thanks.

Nifty question. My gut was to say that the answer here is \(\frac{1}{4}\), since we're conditioning on the report of the roll, which gives us more information to work with. A closer look shows that this is a reasonable answer to the question, but not the only possible one....
This is actually a conditional probability:
\(P(\text{rolls 6} \mid \text{reports 6}) = \frac{P(\text{rolls 6},\, \text{reports 6})}{P(\text{reports 6})}\)
Here's where the assumptions begin to come in. For one thing, we assume that the roll and the decision to lie are independent; under this assumption, the numerator is easy:
\(P(\text{rolls 6},\, \text{reports 6}) = P(\text{rolls 6},\, \text{tells the truth}) = \frac{1}{6}\cdot\frac{1}{4} = \frac{1}{24}\)
The denominator is trickier. This is where the biggest assumptions come in.
\(P(\text{reports 6}) = P(\text{rolls 6},\, \text{tells the truth}) + P(\text{rolls non-6},\, \text{lies},\, \text{reports 6})\)
We've already computed the first term, but how do we compute the second? We have to know something about how the person lies.
Are the person's lies always plausible? (That is, would the person lie by saying a 3.4 had been rolled, or a 529?) Given that they're always plausible, are they "fair?" (That is, is the person's lying strategy to choose a plausible lie at random with an equal probability of each lie, or is there some other distribution to the lies? Is the lie even random, aside from the initial throw of the die?)
For the sake of convenience, let's assume that, when the person lies, he or she does so by reporting some other plausible result at random, with equal probability of each lie. Then:
\(P(\text{reports 6}) = \frac{1}{24} + \frac{5}{6}\cdot\frac{3}{4}\cdot\frac{1}{5} = \frac{1}{24} + \frac{1}{8} = \frac{1}{6}\)
Overall, then:
\(P(\text{rolls 6} \mid \text{reports 6}) = \frac{1/24}{1/6} = \frac{1}{4}\), as expected.
For the sake of illustration, though, let's work the problem under a different lying strategy: suppose the person always says "6" when lying about a non-6. (We don't really care what he does when lying about a 6.) Then:
\(P(\text{reports 6}) = \frac{1}{24} + \frac{5}{6}\cdot\frac{3}{4}\cdot 1 = \frac{1}{24} + \frac{5}{8} = \frac{2}{3}\)
Under this alternative assumption:
\(P(\text{rolls 6} \mid \text{reports 6}) = \frac{1/24}{2/3} = \frac{1}{16}\)
As you can see, how you assume the person lies can have a rather dramatic impact on the answer....
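For anyone who wants to check the arithmetic, here is a small Monte Carlo sketch of both lying strategies, assuming (as in the worked solution above) that the truth is told with probability 1/4 and that a lie must differ from the actual roll:

```python
import random

def trial(always_six):
    """One roll-and-report; returns (actual roll, reported value)."""
    roll = random.randint(1, 6)
    if random.random() < 0.25:                      # tells the truth (prob 1/4)
        return roll, roll
    others = [x for x in range(1, 7) if x != roll]  # a lie must differ from the roll
    if always_six and roll != 6:
        return roll, 6                              # strategy 2: non-6 liars always say "6"
    return roll, random.choice(others)              # strategy 1: uniform plausible lie

for always_six, theory in [(False, "1/4"), (True, "1/16")]:
    hits = reports = 0
    for _ in range(1_000_000):
        roll, report = trial(always_six)
        if report == 6:
            reports += 1
            hits += (roll == 6)
    print(f"P(rolled 6 | reported 6) ~ {hits / reports:.4f}   (theory: {theory})")
```

A million trials lands comfortably close to 0.25 and 0.0625 respectively.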
3. (GS) There are 5 thieves, numbered 1, ..., 5, trying to divide 100 gold coins using this algorithm: thief 1 comes up with a way to divide the money; if there is more than 50% agreement among them about his method (including the dividing thief), then it's done. If not, they kill the first thief, and the second thief divides the money by coming up with his own method. If you are the first one, what method will you use to divide the money?
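Here is a backward-induction sketch of the standard attack on this kind of puzzle. The statement leaves the tie-breaking behaviour open, so the following rules are assumptions: a thief votes yes only if the proposal pays him strictly more than he expects from the continuation, survival outranks any number of coins, and the proposer votes for his own plan.

```python
DEATH = -1  # sentinel payoff: being killed is worse than getting 0 coins

def divide(thieves, coins=100):
    """Return {thief: payoff} when `thieves` (in proposing order) remain alive."""
    if len(thieves) == 1:
        return {thieves[0]: coins}
    proposer, rest = thieves[0], thieves[1:]
    cont = divide(rest, coins)        # what everyone gets if the proposer is killed
    # Price of a vote: one coin more than the voter's continuation value
    # (or 0 coins if his continuation is death -- he votes yes just to live).
    price = {t: 0 if cont[t] == DEATH else cont[t] + 1 for t in rest}
    need = len(thieves) // 2          # extra votes (beyond his own) for >50%
    bought = sorted(rest, key=lambda t: price[t])[:need]
    cost = sum(price[t] for t in bought)
    if cost > coins:                  # cannot afford a majority: the proposer dies
        return {proposer: DEATH, **cont}
    alloc = {t: price[t] if t in bought else 0 for t in rest}
    alloc[proposer] = coins - cost
    return alloc

print(divide([1, 2, 3, 4, 5]))
# -> under these assumptions thief 1 proposes (97, 0, 1, 2, 0)
```

The stable sort breaks the tie between thieves 4 and 5 (both cost 2 coins) in favour of thief 4; bribing thief 5 instead works equally well.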
Hi,
Here is a question I got stuck on:
X and Y are independent non-negative integer-valued random variables, with \(\Pr\{X=i\} = f(i)\) and \(\Pr\{Y=i\} = g(i)\), where \(f(i) > 0\) and \(g(i) > 0\) for \(i = 0, 1, \ldots\)
\(\sum_{i=0}^{\infty} f(i) = 1\) and \(\sum_{i=0}^{\infty} g(i) = 1\), of course.
Given that \(\Pr\{X=k \mid X+Y=l\} = \binom{l}{k}\, p^k (1-p)^{l-k}\) for \(0 \le k \le l\), prove that X and Y are Poisson distributed.
It is straightforward to prove "\(\Leftarrow\)" but very difficult to prove "\(\Rightarrow\)".

It is indeed easier (and more classic) to prove that if X and Y are Poisson, then the conditional law is Binomial.
But you can prove your result by using probability generating functions and the fact that they characterize discrete laws.
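Not a proof of anything, but here is a quick numerical illustration of the easy direction, with arbitrarily chosen parameters (Poisson(2) and Poisson(3), so that \(p = 2/5\)):

```python
from math import comb
import numpy as np

rng = np.random.default_rng(0)
a, b, l = 2.0, 3.0, 5                # arbitrary Poisson parameters and conditioning value
x = rng.poisson(a, 2_000_000)
y = rng.poisson(b, 2_000_000)
mask = (x + y) == l                  # condition on X + Y = l
p = a / (a + b)
for k in range(l + 1):
    empirical = np.mean(x[mask] == k)
    theory = comb(l, k) * p**k * (1 - p)**(l - k)
    print(f"k={k}: empirical {empirical:.4f}   Binomial(l, p): {theory:.4f}")
```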
If Z = X+Y follows a Poisson distribution, then X and Y must be Poisson distributed as well (this is Raikov's theorem).
I think I got it **
You have:
\(E(z^X \mid X+Y) = (pz + 1-p)^{X+Y}\)   (1)
\(E(z^Y \mid X+Y) = ((1-p)z + p)^{X+Y}\)   (2)
Then, using (1) and (2) together with the independence of X and Y (we will write f for the PGF of X+Y): since \(E(z^X) = E[E(z^X \mid X+Y)] = E[(pz+1-p)^{X+Y}] = f(pz+1-p)\), and similarly for Y,
\(f(z) = E(z^{X+Y}) = E(z^X)\,E(z^Y) = f(pz + 1-p)\, f((1-p)z + p)\)   (3)
This should imply* that, for every \(z_1\) and \(z_2\),
\(f(z_1+z_2) = f(z_1)\, f(z_2+1)\),
which gives \(f(z_1+z_2) = f(z_1)\, f(z_2)\, f(2)\).
...
* That might not be true. I thought it could be done using the Identity Theorem for holomorphic functions, but it doesn't actually seem possible to use the theorem here.
It would be sufficient to prove that relation (3) implies f is proportional to an exponential function in order to prove the result.
** Not too sure ^^
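As a sanity check on the direction of the argument (not a proof, of course), the Poisson PGF does satisfy relation (3) identically:

```python
import numpy as np

a, p = 1.7, 0.3                    # arbitrary test values

def f(z):
    return np.exp(a * (z - 1))     # PGF of Poisson(a)

z = np.linspace(-2, 2, 9)
# relation (3): f(z) = f(pz + 1 - p) * f((1-p)z + p)
print(np.allclose(f(z), f(p * z + 1 - p) * f((1 - p) * z + p)))   # True
```

Algebraically the check is exact: the exponents \(a(pz-p)\) and \(a((1-p)z+p-1)\) sum to \(a(z-1)\).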
Looks perfect up to (3) but I couldn't follow the rest. It seems you were close with the idea of using the Identity Theorem.
I can do the special case \(p=\frac12\) with the additional assumption that the generating function is an entire function:
\(f(z)=f(\frac{z+1}2)^2\qquad\) (4)
Then \(f'(z)=f'(\frac{z+1}2)f(\frac{z+1}2)\), and dividing by (4) we have
\(\frac{f'(z)}{f(z)}=\frac{f'(\frac{z+1}2)}{f(\frac{z+1}2)}\qquad\). (5)
Iterating \(z \mapsto \frac{z+1}2\) starting with \(z=0\), it follows that \(\frac{f'(z)}{f(z)}\) is constant on the infinite set \(\{0,\frac12,\frac34,\frac78,\ldots\}\) accumulating at 1, which is in the domain of \(f\) by the extra assumption. It now follows from the Identity Theorem that \((\log\circ f)'\) is constant, proving that \(f(z)=e^{az+b}\).
So it still needs to be checked that \(b=-a\); but this is immediate, since \(f\) is a PGF and so \(f(1)=1\), giving \(f(z)=e^{a(z-1)}\): the PGF of a Poisson distribution.
And the assumption on \(f\) needs to be proven, if possible.
It's not clear to me how this could be generalized to all p.
If I can prove that the factorial moments of X (or Y, for that matter) are of the form \(\lambda^k\), which is what one gets in the case of Poisson, would that be considered a proof?
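For what it's worth, the k-th factorial moment of a Poisson(\(\lambda\)) variable is indeed \(\lambda^k\), as a quick simulation suggests; note, though, that matching factorial moments pins down the law only when the moments don't grow too fast (which holds for Poisson), so that determinacy step would need to be argued as well.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.5                                  # arbitrary Poisson parameter
x = rng.poisson(lam, 4_000_000).astype(float)
for k in range(1, 5):
    falling = np.ones_like(x)
    for j in range(k):
        falling *= x - j                   # falling factorial x(x-1)...(x-k+1)
    print(f"k={k}: empirical {falling.mean():.3f}   lam**k = {lam**k:.3f}")
```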