Masterjun
He/Him
Site Developer, Skilled player (1988)
Joined: 10/12/2010
Posts: 1185
Location: Germany
Randil wrote:
2. Is s(n) bounded above or below? My intuition says no, but I cannot formally prove it. 3. Consider the histogram of the values s(1), s(2), ..., s(n), for ever increasing n. I suspect this should converge to a normal distribution, but again I cannot formally show this. If it does converge to a normal distribution, what are the parameters of said distribution?
Sounds like this problem would be easier to solve if we knew whether pi is a normal number or not.
Wikipedia wrote:
it is widely believed that the numbers √2, π, and e are normal, but a proof remains elusive.
Warning: Might glitch to credits I will finish this ACE soon as possible (or will I?)
Player (80)
Joined: 8/5/2007
Posts: 865
Masterjun wrote:
Randil wrote:
2. Is s(n) bounded above or below? My intuition says no, but I cannot formally prove it. 3. Consider the histogram of the values s(1), s(2), ..., s(n), for ever increasing n. I suspect this should converge to a normal distribution, but again I cannot formally show this. If it does converge to a normal distribution, what are the parameters of said distribution?
Sounds like this problem would be easier to solve if we knew whether pi is a normal number or not.
Wikipedia wrote:
it is widely believed that the numbers √2, π, and e are normal, but a proof remains elusive.
Yeah, that makes generalizing a proof to other irrational or even transcendental numbers almost impossible.

For example, a number like 0.90900090900000909000909000000090900090900000909000909000000000909... is non-repeating and therefore irrational (possibly even transcendental), but its s(n) grows without bound. Its histogram also looks nothing like a normal distribution.

We could play a similar game and construct this number: 0.110110011000110000110000011000000110000000110000000011..., which is certainly irrational because it never repeats. With the 1s always consecutive, s(n) only ever takes the values -1, 0, or 1, so it is bounded from above and below.

Now, if we were talking about normal numbers I'd have no idea how to proceed, although I suspect that you could construct similar numbers that are bounded from above and/or below as you please.
Player (36)
Joined: 9/11/2004
Posts: 2630
Constructing a normal number with an unbounded s(n):

Take the 10 digits 0,1,2,3,4,5,6,7,8,9. Partition them into two equally sized sets, one with a sum greater than the other; it doesn't matter how. One possibility is {3,5,6,8,9} and {0,1,2,4,7}. Shuffle and interleave the sets; this section will have a total positive sum. One possibility is: 0.5180349762

Next, take all pairs of digits: 00, 01, 02, 03, ..., 99. Shuffle them and append; order doesn't matter, and this section will have a total sum of 0.

Next, take all triples of digits: 000, 001, 002, 003, ..., 999. Partition them into two equally sized sets, such that the sum of the units and hundreds less the tens of one set is greater than the units and hundreds less the tens of the other; shuffle, interleave, and append. This section will have a total positive sum.

Repeat: even-length sections get shuffled and appended, odd-length sections get partitioned, shuffled, interleaved, and appended. s(n) is unbounded, and the number seems normal, at least in base 10, by the same logic as the Champernowne constant.
Build a man a fire, warm him for a day, Set a man on fire, warm him for the rest of his life.
Skilled player (1827)
Joined: 4/20/2005
Posts: 2161
Location: Norrköping, Sweden
It's not too hard to show that for any irrational number there exist two values x1 and x2 such that |s(x2) - s(x1)| > M for any M, i.e. the "span" of the function is arbitrarily big. Since pi is irrational, every finite decimal sequence will appear in it (right? correct me if I'm wrong here), therefore there will appear a decimal sequence that increases the function value by (at least) M after the sequence is finished (in its simplest form something like 9090909090909...). But I haven't been able to show that this means that s(x) is unbounded, since we don't know the values of s(x1) and s(x2).

Interesting results about its connection with normal numbers, I didn't think about that. And a nice unbounded normal number construction, OmnipotentEntity!
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Randil wrote:
Since pi is irrational, every finite decimal sequence will appear in it
That doesn't sound correct at all. A number whose decimal expansion completely lacks the digit '3' can still perfectly well be irrational. Or the digit '3' might appear but always be followed by the digit '5', without exception, and the number can still be irrational.
Patashu
He/Him
Joined: 10/2/2005
Posts: 4043
I think what is meant is that 'since pi is a normal number, every finite decimal sequence will appear in it'. https://en.wikipedia.org/wiki/Normal_number However, it has not been proven that pi is a normal number, so that property can't be used yet.
My Chiptune music, made in Famitracker: http://soundcloud.com/patashu My twitch. I stream mostly shmups & rhythm games http://twitch.tv/patashu My youtube, again shmups and rhythm games and misc stuff: http://youtube.com/user/patashu
Invariel
He/Him
Editor, Site Developer, Player (171)
Joined: 8/11/2011
Posts: 539
Location: Toronto, Ontario
OmnipotentEntity, you're missing one really important step in your generation function, I think: for it to be truly and properly uniform, (and thus normal) in base 10, you have to remove all existing sub-sequences at each stage of generation. Otherwise: 0.24680135791214161820... which is generated by taking all of the even numbers and then all of the odd numbers, for example, has overlap in 91, which will appear later in the series. Also, 912 will appear in the three-digit series, 9121 will appear in the four-digit series, and so on. So, in the double-digit step, you have to remove 24, 46, 68, 80, 13, 35, 57, 79. In the triple-digit step, you have to remove all visible subsequences, and so on.
I am still the wizard that did it. "On my business card, I am a corporate president. In my mind, I am a game developer. But in my heart, I am a gamer." -- Satoru Iwata <scrimpy> at least I now know where every map, energy and save room in this game is
Patashu
He/Him
Joined: 10/2/2005
Posts: 4043
Oh, I have two math questions that I thought about aaages ago. I don't know the answers to them and they might be interesting.

1) Think about the central limit theorem - the normal distribution you get from summing infinitely many uniform distributions. Now think about what kind of distribution you get from summing finitely many uniform distributions. For n = 2 you get a triangular shape, for n = 3 you get a shape that looks like parabolic sections stuck together, and so on. How do you generate the formulas that create these curves for each value of n? (For normalization purposes, let's say that the domain is x >= 0, x <= 1 and the integral of the curve should be equal to 1.)

2) We know that sin and cos (and all combinations thereof), if differentiated or integrated four times, equal themselves. And we know that e^x, if differentiated or integrated once, equals itself. Are there any functions that equal themselves after being differentiated or integrated 2, 3, 5, 6, etc. times? If so, construct them. If not, prove it's impossible. We also know that it's possible to differentiate/integrate something a non-integer number of times ( https://en.wikipedia.org/wiki/Fractional_calculus ). So are there any functions that equal themselves after being differentiated/integrated a non-integer number of times?
My Chiptune music, made in Famitracker: http://soundcloud.com/patashu My twitch. I stream mostly shmups & rhythm games http://twitch.tv/patashu My youtube, again shmups and rhythm games and misc stuff: http://youtube.com/user/patashu
Player (36)
Joined: 9/11/2004
Posts: 2630
Invariel wrote:
OmnipotentEntity, you're missing one really important step in your generation function, I think: for it to be truly and properly uniform, (and thus normal) in base 10, you have to remove all existing sub-sequences at each stage of generation. Otherwise: 0.24680135791214161820... which is generated by taking all of the even numbers and then all of the odd numbers, for example, has overlap in 91, which will appear later in the series. Also, 912 will appear in the three-digit series, 9121 will appear in the four-digit series, and so on. So, in the double-digit step, you have to remove 24, 46, 68, 80, 13, 35, 57, 79. In the triple-digit step, you have to remove all visible subsequences, and so on.
No, that's not actually required, because normality requires asymptotic uniformity, not absolute uniformity at every subsequence (which is actually impossible).

Take the Champernowne constant. It's extremely short on zeroes while starting out, because it doesn't attempt to pad with leading zeroes. However, asymptotically it doesn't matter, because most integers are gobsmackingly huge and contain insane numbers of digits, so the edge effects where they meet are negligible. (I did use leading zeroes, but only to guarantee lengths and simplify the algorithm.)

In fact, the shuffling steps are completely superfluous and don't contribute at all to the normality of the number. I only added those steps in order to demonstrate that it's possible to generate huge numbers of these objects. But this is not even the most general method of generating numbers with this property, because it's not strictly required that the odd steps always be positive, only that they're asymptotically positive.
Build a man a fire, warm him for a day, Set a man on fire, warm him for the rest of his life.
Player (36)
Joined: 9/11/2004
Posts: 2630
Patashu wrote:
How do you generate the formulas that create these curves for each value of n? (For normalization purposes, let's say that the domain is x >= 0, x <= 1 and the integral of the curve should be equal to 1.)
The domain restriction is a bit vexing. Normal distributions are necessarily infinite. Is this a strict requirement?
Patashu wrote:
Are there any functions that equal themselves after being differentiated or integrated 2, 3, 5, 6, etc times? If so, construct them. If not, prove it's impossible.
Absolutely. All you need to do is take a differential equation of the form d^n y/dx^n = y and solve it.

For n=2 this results in the function family c_1 e^x + c_2 e^(-x)
For n=3: c_1 e^x + c_2 e^(-x/2) sin(sqrt(3)/2 x) + c_3 e^(-x/2) cos(sqrt(3)/2 x)
For n=4: c_1 e^x + c_2 e^(-x) + c_3 sin(x) + c_4 cos(x)
For n=5: c_1 e^x + c_2 e^(-1/4 (1+sqrt(5)) x) sin(sqrt(5/8-sqrt(5)/8) x) + c_3 e^(1/4 (sqrt(5)-1) x) sin(sqrt(5/8+sqrt(5)/8) x) + c_4 e^(1/4 (sqrt(5)-1) x) cos(sqrt(5/8+sqrt(5)/8) x) + c_5 e^(-1/4 (1+sqrt(5)) x) cos(sqrt(5/8-sqrt(5)/8) x)

And in the general case one can see a relationship between these function families and the nth roots of unity. But I'll leave it to you to work out the formalism and proof of this process if you're interested.

1^(1/2) = {1, -1}
1^(1/3) = {1, -1/2 + sqrt(3)/2 i, -1/2 - sqrt(3)/2 i}
1^(1/4) = {1, -1, i, -i}
etc.
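The root-of-unity connection can be stated compactly; this is a sketch of why it works, not a full proof:

```latex
% Substituting the ansatz y = e^{\omega x} into y^{(n)} = y gives
\frac{d^n}{dx^n} e^{\omega x} = \omega^n e^{\omega x} = e^{\omega x}
\quad\Longleftrightarrow\quad \omega^n = 1,
% so the general solution is a linear combination over the n-th roots of unity:
y(x) = \sum_{k=0}^{n-1} c_k \, e^{\omega_k x},
\qquad \omega_k = e^{2\pi i k / n}.
% Grouping each conjugate pair \omega = a \pm b i yields the real-valued form
e^{a x}\bigl(A\cos(bx) + B\sin(bx)\bigr),
% which is where the e^{-x/2} sin/cos terms in the n = 3 family come from.
```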
We also know that it's possible to differentiate/integrate something a non-integer number of times. ( https://en.wikipedia.org/wiki/Fractional_calculus ) So are there any functions that equal themselves after being differentiated/integrated a non-integer number of times?
Because the functions seem to relate to the roots of unity somehow, I strongly suspect that the solutions to this problem are also related to the roots of unity. Due to this, I'd expect to find that d^(3/2)/dx^(3/2) y = y would generate the same function family as d^3/dx^3 y = y. Because the roots of unity are the same between 1^(1/3) and 1^(2/3). However I haven't tested this.
Build a man a fire, warm him for a day, Set a man on fire, warm him for the rest of his life.
Patashu
He/Him
Joined: 10/2/2005
Posts: 4043
OmnipotentEntity wrote:
Patashu wrote:
How do you generate the formulas that create these curves for each value of n? (For normalization purposes, let's say that the domain is x >= 0, x <= 1 and the integral of the curve should be equal to 1.)
The domain restriction is a bit vexing. Normal distributions are necessarily infinite. Is this a strict requirement?
Right, the normal distribution is infinite - but every successive distribution made from combining 1, 2, 3, etc. uniform distributions is not. Basically, imagine creating the distribution that's the result of multiplying two uniform distributions together. It'll look like a triangle shape. Then multiply that with another uniform distribution - it'll look somewhat like the normal distribution, but more basic. If you do this infinitely many times, you get the normal distribution. But I'm curious about what all the previous steps look like. (Thanks for the answer to question 2!)
My Chiptune music, made in Famitracker: http://soundcloud.com/patashu My twitch. I stream mostly shmups & rhythm games http://twitch.tv/patashu My youtube, again shmups and rhythm games and misc stuff: http://youtube.com/user/patashu
Player (36)
Joined: 9/11/2004
Posts: 2630
Patashu wrote:
OmnipotentEntity wrote:
Patashu wrote:
How do you generate the formulas that create these curves for each value of n? (For normalization purposes, let's say that the domain is x >= 0, x <= 1 and the integral of the curve should be equal to 1.)
The domain restriction is a bit vexing. Normal distributions are necessarily infinite. Is this a strict requirement?
Right, the normal distribution is infinite - but every successive distribution made from combining 1, 2, 3, etc. uniform distributions is not. Basically, imagine creating the distribution that's the result of multiplying two uniform distributions together. It'll look like a triangle shape. Then multiply that with another uniform distribution - it'll look somewhat like the normal distribution, but more basic. If you do this infinitely many times, you get the normal distribution. But I'm curious about what all the previous steps look like. (Thanks for the answer to question 2!)
Multiplication isn't the operation you're looking for. I'm reasonably certain (if I'm recalling my probability class) that you're looking for convolution. The good news about convolution is that when you convolve two uniform distributions, the area under the triangle you get is still 1. As you convolve the rectangle function with itself again and again, you get a curve that approaches a normal distribution. The domain isn't 0 to 1, though.

The definition of convolution is available at the wiki page, but stating it here:

(f*g)(t) = integral of f(u) g(t-u) du, from -inf to inf

Take our uniform distribution to be 1 from -1/2 to 1/2 (for simplicity) and zero elsewhere (the rectangle, or "boxcar", function). Convolving it with itself gives the unit triangle function: y = x+1 for -1 < x <= 0, y = 1-x for 0 < x <= 1, and 0 elsewhere. We can then repeatedly convolve the rectangle function with our result to get the desired nth function. It rapidly becomes extremely piecewise.

n = 3:
(3 - 4x^2)/4        for -1/2 < x < 1/2
(4x^2 - 12x + 9)/8  for 1/2 <= x < 3/2
(4x^2 + 12x + 9)/8  for -3/2 < x <= -1/2
0                   elsewhere

n = 4:
(-3x^3 - 6x^2 + 4)/6      for -1 < x <= 0
(3x^3 - 6x^2 + 4)/6       for 0 < x <= 1
(-x^3 + 6x^2 - 12x + 8)/6 for 1 < x < 2
(x^3 + 6x^2 + 12x + 8)/6  for -2 < x < -1
0                         elsewhere

and so on. I don't have a proof prepared for why this converges to the normal distribution. However, I'm reasonably certain that it does, and I doubt that a proof of this fact would be particularly difficult or novel, provided you approached it correctly.
Build a man a fire, warm him for a day, Set a man on fire, warm him for the rest of his life.
Patashu
He/Him
Joined: 10/2/2005
Posts: 4043
Thanks for the reply!
My Chiptune music, made in Famitracker: http://soundcloud.com/patashu My twitch. I stream mostly shmups & rhythm games http://twitch.tv/patashu My youtube, again shmups and rhythm games and misc stuff: http://youtube.com/user/patashu
Joined: 2/19/2010
Posts: 248
Patashu wrote:
1) Think about the central limit theorem - the normal distribution you get from summing infinitely many uniform distributions. Now think about what kind of distribution you get from summing finitely many uniform distributions. For n = 2 you get a triangular shape, for n = 3 you get a shape that looks like parabolic sections stuck together and so on. How do you generate the formulas that create these curves for each value of n? (For normalization purposes, let's say that the domain is x >= 0, x <= 1 and the integral of the curve should be equal to 1.)
If we take the special case of the discrete uniform distribution with exactly two possibilities, you've just described the binomial distribution. EDIT: and in fact for general discrete uniform distributions, you have a multinomial distribution.
Player (147)
Joined: 11/14/2011
Posts: 68
Location: Brookline, MA
rhebus wrote:
Patashu wrote:
1) Think about the central limit theorem - the normal distribution you get from summing infinitely many uniform distributions. Now think about what kind of distribution you get from summing finitely many uniform distributions. For n = 2 you get a triangular shape, for n = 3 you get a shape that looks like parabolic sections stuck together and so on. How do you generate the formulas that create these curves for each value of n? (For normalization purposes, let's say that the domain is x >= 0, x <= 1 and the integral of the curve should be equal to 1.)
If we take the special case of the discrete uniform distribution with exactly two possibilities, you've just described the binomial distribution. EDIT: and in fact for general discrete uniform distributions, you have a multinomial distribution.
Actually, the multinomial distribution is defined as follows:

Parameters:
n - number of trials
k - number of categories
p - vector of probabilities (p_i = P(ith category))

Support: k-dimensional vectors of nonnegative integers. The sum of the integers in the vector is always n.

Basically, you perform n independent trials and in each one you select one of k categories, where the ith category is selected with probability p_i. The outcome is a vector in which, for i = 1,...,k, the ith component is the number of times you chose the ith category.

Some properties of the vector X ~ Mult(n,k,p):
E(X) = np
Var(X) = n(diag(p) - pp^T)
X_i ~ Bin(n, p_i)
arflech
He/Him
Joined: 5/3/2008
Posts: 1120
This puzzle is relevant for RTAs, and I think I found an answer: http://fivethirtyeight.com/features/can-you-slay-the-puzzle-of-the-monsters-gems/ I have rephrased it below: If a certain type of monster drops common, uncommon, and rare gems in the respective ratio 3:2:1, and you slay the monsters until you have at least one of each type of gem, how many gems of each type will you have, on average?
i imgur com/QiCaaH8 png
Skilled player (1651)
Joined: 7/25/2007
Posts: 299
Location: UK
Well, surely the expected value is just 3:2:1 respectively. Pretty easy.

Given that their chances are P(c) = 3/6 = 1/2, P(u) = 2/6 = 1/3, P(r) = 1/6, it'll obviously take longest on average to find the rare gem. Repeating a trial until you get a success takes average time E(t) = 1/p = 1/(1/6) = 6. The number of events with probability p that you expect to get in this time is just E(x) = np: E(c) = 6*(1/2) = 3, E(u) = 6*(1/3) = 2, E(r) = 6*(1/6) = 1, as required.
Invariel
He/Him
Editor, Site Developer, Player (171)
Joined: 8/11/2011
Posts: 539
Location: Toronto, Ontario
That's what I thought as well; things would be different if the subsequent drops only became available once their "unlock" condition was met, that is:

The enemy's drop table is item 1, until you have three of them. Then the enemy's drop table becomes item 1 (3/5), item 2 (2/5) until you have two of item 2. Then the enemy's drop table becomes item 1 (3/6), item 2 (2/6), item 3 (1/6).

Then the chance of getting all the items in six drops is: 1 * 1 * 1 * 0.4 * 0.4 * 0.1666... ≈ 2.7%
I am still the wizard that did it. "On my business card, I am a corporate president. In my mind, I am a game developer. But in my heart, I am a gamer." -- Satoru Iwata <scrimpy> at least I now know where every map, energy and save room in this game is
Joined: 2/19/2010
Posts: 248
The answer cannot be 3:2:1. You will always get at least one of each gem, but you have a one in 36 chance of having 2 rare gems from the first two monsters. This puts a lower bound on the expected number of rare gems at (1/36 * 2 + 35/36 * 1) = 37/36. The exact number will be higher than this, but it's enough to disprove the 3:2:1 answer.
Player (80)
Joined: 8/5/2007
Posts: 865
rhebus wrote:
The answer cannot be 3:2:1. You will always get at least one of each gem, but you have a one in 36 chance of having 2 rare gems from the first two monsters. This puts a lower bound on the expected number of rare gems at (1/36 * 2 + 35/36 * 1) = 37/36. The exact number will be higher than this, but it's enough to disprove the 3:2:1 answer.
I'm not so sure about that. A standard argument in favor of 3:2:1 goes as follows: Line up several players. Name them A, B, C, etc. for clarity. Instruct player A to continue killing enemies until they have at least one of each gem, then stop. After player A is done, instruct player B to continue killing enemies until they have one of each gem, then continue on to player C and so on.

After this has been done many times, examine their totals. Surely some will have ratios different from 3:2:1, but as an aggregate, their procedure was no different from one player collecting gems consecutively without restriction. Because this (imaginary) player would surely collect the gems in the ratio 3:2:1 and our individual players are indistinguishable from one another, surely the individual ratios are also 3:2:1. Someone please fill in any gaps in my proof or correct me if I've made a mistake.

Edit: Okay, I now see that two different things are being said. The ratio will be 3:2:1, but that doesn't mean the player will pick up 1 rare gem on average. It seems no one has (correctly) determined the expected number of gems to be picked up. I'll think about it for a bit, but someone else is likely to figure it out before I do.
Joined: 2/19/2010
Posts: 248
I tried tackling this by simplifying the problem and solving that. Imagine flipping a biased coin with 2/3 chance of heads and 1/3 chance of tails, and you keep flipping until you have seen both a head and a tail. How many heads and tails do you expect to see?

My answer: 7/3 heads and 7/6 tails.

My reasoning: I built a state machine diagram. You have two goals to satisfy: a head and a tail. You can therefore be in one of four states: neither seen (the initial state), head seen, tail seen, both seen (the terminal state). In the initial state, you have a 1/3 chance of entering the "tails seen" state, and a 2/3 chance of entering the "heads seen" state.

In the "heads seen" state, you have a 2/3 chance of getting an extra (unneeded) head, and a 1/3 chance of entering the terminal state. You can expect to see 2 extra heads in this state. (I solved the infinite series E(h) = 2/3 + 2/3*2/3 + 2/3*2/3*2/3 + ... to obtain this figure.) A similar argument shows that in the "tails seen" state, you will continue flipping tails until you flip your first head, and you can expect to see 1/2 extra tails.

Therefore, in total, you can expect to see 1 + 2/3 * 2 = 7/3 heads, and 1 + 1/3 * 1/2 = 7/6 tails. (The 1 term is the head/tail you get on the state transition that satisfies the "heads/tails seen" goal.)

This technique can be generalized to the monsters and gems problem, but there are two niggles: 1. there are twice as many states (8 in total); 2. the expected-number-of-gems calculation is more complex in the "seen two types" states. I'm going to have a go at solving it without resorting to drawing out the whole state diagram.
Player (36)
Joined: 9/11/2004
Posts: 2630
arflech wrote:
If a certain type of monster drops common, uncommon, and rare gems in the respective ratio 3:2:1, and you slay the monsters until you have at least one of each type of gem, how many gems of each type will you have, on average?
This is an interesting problem.

Let P(nCmNoR) be the probability of getting n common drops, m uncommon drops, and o rare drops, and let P(nD) be the probability of stopping after n drops. So P(1C0N0R) = 1/2, P(0C1N0R) = 1/3, P(0C0N1R) = 1/6. In the general case:

P(nCkN0R) = (1/2)^n * (1/3)^k * (n+k)!/(n!k!)
P(nC0NkR) = (1/2)^n * (1/6)^k * (n+k)!/(n!k!)
P(0CnNkR) = (1/3)^n * (1/6)^k * (n+k)!/(n!k!)

This is found by considering the probability of a single case, multiplied by the number of possible orderings (taking into account the indistinguishable alternatives).

If you stop after getting one of each gem, then the shortest trial is:

P(3D) = P(1C1N1R)
= P(C|0C1N1R U N|1C0N1R U R|1C1N0R)
= P(C|0C1N1R) + P(N|1C0N1R) + P(R|1C1N0R)
= 1/2 * P(0C1N1R) + 1/3 * P(1C0N1R) + 1/6 * P(1C1N0R)
= 1/2 * P(N|R U R|N) + 1/3 * P(C|R U R|C) + 1/6 * P(C|N U N|C)
= 1/2*(1/3*1/6 + 1/6*1/3) + 1/3*(1/2*1/6 + 1/6*1/2) + 1/6*(1/2*1/3 + 1/3*1/2)
= 3! * 1/2 * 1/3 * 1/6
= 1/6

And the EV of rare gems given 3 drops is trivially 1.

Let's consider the case of stopping after 4 drops to get a handle on something a little less trivial:

P(4D) = P(2C1N1R) + P(1C2N1R) + P(1C1N2R)

Because we stop after getting one of each, we can ignore a possibility in each case; this will be denoted by 0 times the probability, and I'll only show it the first time.

P(2C1N1R) = P(N|2C0N1R) + P(R|2C1N0R) + 0*P(C|1C1N1R)
= 1/3 * P(2C0N1R) + 1/6 * P(2C1N0R)
= 1/3 * (1/2)^2 * 1/6 * 3!/(2!1!) + 1/6 * (1/2)^2 * 1/3 * 3!/(2!1!)
= 1/12

P(1C2N1R) = P(C|0C2N1R) + P(R|1C2N0R)
= 1/2 * (1/3)^2 * 1/6 * 3!/(2!1!) + 1/6 * (1/3)^2 * 1/2 * 3!/(2!1!)
= 1/18

P(1C1N2R) = P(C|0C1N2R) + P(N|1C0N2R)
= 1/2 * (1/6)^2 * 1/3 * 3!/(2!1!) + 1/3 * (1/6)^2 * 1/2 * 3!/(2!1!)
= 1/36

P(4D) = 1/12 + 1/18 + 1/36 = 1/6

And the EV for rare gems given 4 drops is:

EV(R|4D) = (1*1/12 + 1*1/18 + 2*1/36) * (1/6)^-1 = 7/6

Now let's consider P(nD) and EV(R|nD) in general.
Importantly, we only need to consider the cases where there is exactly 1C or 1N or 1R, because a state with more than one of each is unreachable (we would already have stopped).

P(nD) = P(1CnD U 1NnD U 1RnD)

These events are not mutually disjoint, so we need to account for that by inclusion-exclusion:

P(nD) = P(1CnD) + P(1NnD) + P(1RnD) - P(1C1NnD) - P(1C1RnD) - P(1N1RnD) + P(1C1N1RnD)

The final term is always 0 unless n = 3, a case we handled separately, so:

P(nD) = P(1CnD) + P(1NnD) + P(1RnD) - P(1C1NnD) - P(1C1RnD) - P(1N1RnD)

We can make the first three terms mutually disjoint again by not counting the 1,1,(n-2) cases within them, and then correcting by adding the 1,1,(n-2) cases back. For instance, we can define:

P(1C'nD) = P(1C(n-3)N2R) + P(1C(n-4)N3R) + ... + P(1C2N(n-3)R) = sum{k=2, n-3} P(1C(n-1-k)NkR)

And we have:

P(nD) = P(1C'nD) + P(1N'nD) + P(1R'nD) + P(1C1NnD) + P(1C1RnD) + P(1N1RnD)

Let's consider P(1C(n-1-k)NkR) for k >= 2, n >= 5:

P(1C(n-1-k)NkR) = P(C|0C(n-1-k)NkR) = 1/2 * (1/3)^(n-1-k) * (1/6)^k * (n-1)!/((n-1-k)!k!)

So, via Wolfram Alpha:

P(1C'nD) = sum{k=2, n-3} P(1C(n-1-k)NkR) = 2^(-2-n) 3^(-n) (12 - 3*2^n + 4*3^n - 3(8 + 2^n)n)
P(1N'nD) = sum{k=2, n-3} P((n-1-k)C1NkR) = 2^(-n-1) 3^(-n-2) (-4(3^n + 27)n - 8*3^n + 9*4^n + 72)
P(1R'nD) = sum{k=2, n-3} P((n-1-k)CkN1R) = 1/5 6^(-n-2) (-5(27*2^n + 8*3^n)n + 45*2^n - 20*3^n + 36*5^n)

Now for the edge cases:

P(1C1NnD) = P(C|0C1N(n-2)R) + P(N|1C0N(n-2)R)
= 1/2 * 1/3 * (1/6)^(n-2) * (n-1)!/((n-2)!1!) + 1/3 * 1/2 * (1/6)^(n-2) * (n-1)!/((n-2)!1!)
= 1/3 * (1/6)^(n-2) * (n-1)
P(1C1RnD) = 1/6 * (1/3)^(n-2) * (n-1)
P(1N1RnD) = 1/9 * (1/2)^(n-2) * (n-1)

I'll stop here, because it's getting a bit too hairy for me, and I didn't properly separate the different possibilities for rares, so this mathematical machinery won't help directly. Instead I wrote a program to get an empirical answer:
#include <random>
#include <iostream>
#include <sstream>

std::random_device rd;
std::uniform_real_distribution<double> dist(0.0, 1.0);

void usage(char* progName) {
  std::cout << progName << " <number of trials>" << std::endl;
}

enum Gem {
  Common,
  Uncommon,
  Rare
};

Gem gen_drop() {
  double drawing = dist(rd);
  if (drawing <= 1./2.) {
    return Common;
  } else if (drawing <= 5./6.) {
    return Uncommon;
  } else {
    return Rare;
  }
}

int main(int argc, char** argv) {
  if (argc < 2) {
    usage(*argv);
    return -1;
  }

  size_t trials;
  std::stringstream s(argv[1]);
  s >> trials;

  size_t results[3] = {0,0,0};
  for (size_t i = 0; i < trials; ++i) {
    size_t this_trial[3] = {0,0,0};
    while (!this_trial[Rare] ||
           !this_trial[Uncommon] ||
           !this_trial[Common]) {
      ++this_trial[gen_drop()];
    }

    for (size_t i=0; i<3; ++i) {
      results[i] += this_trial[i];
    }
  }

  std::cout << "Results" << std::endl;
  for (size_t i=0; i<3; ++i) {
    std::cout << (double)results[i]/(double)trials << " ";
  }
  std::cout << std::endl;
}
The result for 10,000,000 trials: 3.64914 2.43294 1.21663
Build a man a fire, warm him for a day, Set a man on fire, warm him for the rest of his life.
Editor, Expert player (2073)
Joined: 6/15/2005
Posts: 3282
All this problem is, is what Bobo the King said above with the ratio tending to 3:2:1, and the negative binomial distribution applied to the expected number of total drops required to get one of each gem.

If the probability of success is p and the probability of failure is q = 1-p, then the expected number of trials required to obtain a success is 1(p) + 2(qp) + 3(q^2 p) + ... = p(1 + 2q + 3q^2 + ...). Using the identity 1/(1-x)^2 = 1 + 2x + 3x^2 + ..., the expected number is p(1/(1-q)^2) = p/p^2 = 1/p. Ex. The expected number of rolls of a 6-sided die required to roll a 1 is 1/(1/6) = 6.

To calculate the expected number of total drops required to get one of each gem, condition on the order in which the types of gems are first encountered. That is, we calculate:

1
+ P(Common)*{E(# of drops to get Uncommon or Rare) + P(Uncommon|Uncommon or Rare)*E(# of drops to get Rare) + P(Rare|Uncommon or Rare)*E(# of drops to get Uncommon)}
+ P(Uncommon)*{E(# of drops to get Common or Rare) + P(Common|Common or Rare)*E(# of drops to get Rare) + P(Rare|Common or Rare)*E(# of drops to get Common)}
+ P(Rare)*{E(# of drops to get Common or Uncommon) + P(Common|Common or Uncommon)*E(# of drops to get Uncommon) + P(Uncommon|Common or Uncommon)*E(# of drops to get Common)}

= 1 + (1/2)*{2 + (2/3)*6 + (1/3)*3} + (1/3)*{3/2 + (3/4)*6 + (1/4)*2} + (1/6)*{6/5 + (3/5)*3 + (2/5)*2}
= 1 + (1/2)*7 + (1/3)*(13/2) + (1/6)*(19/5)
= 1 + 7/2 + 13/6 + 19/30
= 30/30 + 105/30 + 65/30 + 19/30
= 219/30
= 7.3

Since the ratio of the types of gems tends to 3:2:1, the expected numbers of common, uncommon, and rare gems, respectively, are 7.3/2 = 3.65, 7.3/3 = 2.43333..., and 7.3/6 = 1.21666.... This is consistent with the numbers given by OmnipotentEntity's program.
Player (80)
Joined: 8/5/2007
Posts: 865
FractalFusion wrote:
All this problem is, is what Bobo the King said above with the ratio tending to 3:2:1, and the negative binomial distribution applied to the expected number of total drops required to get one of each gem. If the probability of success is p and the probability of failure is q = 1-p, then the expected number of trials required to obtain a success is 1(p) + 2(qp) + 3(q^2 p) + ... = p(1 + 2q + 3q^2 + ...). Using the identity 1/(1-x)^2 = 1 + 2x + 3x^2 + ..., the expected number is p(1/(1-q)^2) = p/p^2 = 1/p. Ex. The expected number of rolls of a 6-sided die required to roll a 1 is 1/(1/6) = 6. To calculate the expected number of total drops required to get one of each gem, condition on the order in which the types of gems are first encountered. That is, we calculate: 1 + P(Common)*{E(# of drops to get Uncommon or Rare) + P(Uncommon|Uncommon or Rare)*E(# of drops to get Rare) + P(Rare|Uncommon or Rare)*E(# of drops to get Uncommon)} + P(Uncommon)*{E(# of drops to get Common or Rare) + P(Common|Common or Rare)*E(# of drops to get Rare) + P(Rare|Common or Rare)*E(# of drops to get Common)} + P(Rare)*{E(# of drops to get Common or Uncommon) + P(Common|Common or Uncommon)*E(# of drops to get Uncommon) + P(Uncommon|Common or Uncommon)*E(# of drops to get Common)} = 1 + (1/2)*{2 + (2/3)*6 + (1/3)*3} + (1/3)*{3/2 + (3/4)*6 + (1/4)*2} + (1/6)*{6/5 + (3/5)*3 + (2/5)*2} = 1 + (1/2)*7 + (1/3)*(13/2) + (1/6)*(19/5) = 1 + 7/2 + 13/6 + 19/30 = 30/30 + 105/30 + 65/30 + 19/30 = 219/30 = 7.3 Since the ratio of the types of gems tends to 3:2:1, the expected numbers of common, uncommon, and rare gems, respectively, are 7.3/2 = 3.65, 7.3/3 = 2.43333..., and 7.3/6 = 1.21666.... This is consistent with the numbers given by OmnipotentEntity's program.
Bravo, FractalFusion! I knew it had to do with the negative binomial distribution but I couldn't quite put all the pieces together. Your explanation is clear and it appears to be correct.
fcxiaopengyou
He/Him
Experienced player (559)
Joined: 7/25/2015
Posts: 123
Location: Republic of China
Working on: [NES] Downtown Special - Kunio-kun no Jidaigeki Dayo Zenin Shuugou! (J) ''2 players 100%'' Plan: [SNES] Kenyuu Densetsu Yaiba (Japan) _________________ My English is pour.