
The Race To December 31 – Sunday Puzzle

Here’s a fun game you can play with another person.

The game starts on January 1. Each of two players takes turns calling out another date. The new date has to be a later date in the year with either the same month OR the same day (from January 1, a player can call out a later day in January or another month with the day 1 like February 1, March 1, etc.).

The person who calls out December 31 wins the game.

Who has the winning strategy, the first or second player? Does it matter if the game is played in a leap year?

This game was covered by Grey Matters in 2010, which communicated it to Scam School, who then made a video about it.

I have posted a video that explains the winning strategy.

Can You Solve The Race To December 31 Riddle?

Can you figure it out? Keep reading for my explanation.

.
.
.
.
M
I
N
D
.
Y
O
U
R
.
D
E
C
I
S
I
O
N
S
.
P
U
Z
Z
L
E
.
.
.
.
Answer To First To December 31 Wins

The winning strategy belongs to the first player, and it does not matter if it is a leap year.

The first player should call out January 20 on the first turn. The second player then names a later day in January or another month with the day 20.

Regardless, the first player can then name one of the following dates (month/day), and continue to do so no matter what the second player does.

1/20 (first turn)
2/21
3/22
4/23
5/24
6/25
7/26
8/27
9/28
10/29
11/30
12/31

The pattern is:

day = month + 19

Played correctly, the first player always calls out December 31.

Solving the game

Understanding the strategy is easy enough. But how could one derive the strategy?

The game is closely related to Nim, a well-known combinatorial game in which two players remove objects from piles of objects. Nim is a completely solved game for any number of objects and piles, so it is always possible to figure out a winning strategy for one of the players.

In specific games, we can use backwards induction, which means to think ahead and reason backwards. So let us start from the end of the calendar year.

Let’s start out by solving for winning dates. By rule, December 31 is a winning date.

12/31

If you name any other date in December, you would lose, as your opponent could then name 12/31. So all other dates in December are losing dates. You also want to avoid calling earlier months with the day value 31–as your opponent could then name 12/31.

What about November? The next date to analyze is 11/30. If you could call that out, there is no later date in November, so your opponent would have to name 12/30. This allows you to call 12/31 and win. So 11/30 is another winning date. If you can name it, you are sure to win.

11/30
12/31

Now we reason backwards. Similarly, any earlier date in November is a losing date, as are the dates 10/30 and 10/31. So the next candidate winning date is one month and one day earlier, which is 10/29.

If you could name 10/29, your opponent would have to name 10/30, 10/31, 11/29, or 12/29. From 10/30 and 11/29 you can get to 11/30, and from 10/31 or 12/29 you can get to 12/31. In other words, if you can name 10/29, you can always get to a winning date.

10/29
11/30
12/31

Now we can extend the logic that the next winning date is 1 month backwards and 1 day backwards. So we get the winning date 9/28, then 8/27, and so on, until we get all of the winning dates.

1/20
2/21
3/22
4/23
5/24
6/25
7/26
8/27
9/28
10/29
11/30
12/31

The first player can name 1/20 on the first turn, and then keep naming a winning date for any choice that the second player is allowed to make.

The pattern in the dates is:

day = month + 19
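The backwards induction can also be run as a short computer search. Here is a minimal Python sketch (my own illustration, not part of the original puzzle write-up) that classifies every date of a non-leap year; changing February to 29 days gives the same list of winning dates, confirming the leap year does not matter.

```python
# Backwards induction for the Race to December 31.
# A date is "winning" if the player who names it can force a win.

DAYS_IN_MONTH = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]  # non-leap year

def moves(month, day):
    """All dates reachable in one move: later day in same month, or later month with same day."""
    for d in range(day + 1, DAYS_IN_MONTH[month - 1] + 1):
        yield (month, d)
    for m in range(month + 1, 13):
        if day <= DAYS_IN_MONTH[m - 1]:
            yield (m, day)

winning = {(12, 31)}     # naming 12/31 wins by rule
losing = set()

# Work backwards from the end of the year; all later dates are already classified.
for month in range(12, 0, -1):
    for day in range(DAYS_IN_MONTH[month - 1], 0, -1):
        if (month, day) == (12, 31):
            continue
        # A date is losing if some move from it reaches a winning date
        # (the opponent will take it); otherwise it is winning.
        if any(pos in winning for pos in moves(month, day)):
            losing.add((month, day))
        else:
            winning.add((month, day))

print(sorted(winning))   # [(1, 20), (2, 21), ..., (12, 31)] -- day = month + 19
```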

Sources

Grey Matters
http://headinside.blogspot.com/2010/06/scam-school-meets-grey-matters.html

Scam School
https://www.youtube.com/watch?v=J3_XJtscvnQ


The Friendship Theorem – Sunday Puzzle

In a group of 6 people, you might find that some people are friends with each other on Facebook, or you might find that no one is friends with each other on Facebook. There are many combinations for how people can be friends or not.

Show that there is always a group of 3 people in which either:

–All 3 people are mutual friends on Facebook.
–All 3 people are mutual strangers (no one is friends on Facebook).

Watch the video for a solution.

Can You Solve The Friendship Riddle?

Or keep reading.

.
.
.
.
M
I
N
D
.
Y
O
U
R
.
D
E
C
I
S
I
O
N
S
.
P
U
Z
Z
L
E
.
.
.
.
Answer To The Friendship Problem

Consider 1 person. That person is friends with 0, 1, 2, 3, 4, or 5 of the other 5 people, which correspondingly means the person is not friends with 5, 4, 3, 2, 1, or 0 of them.

In every single case, either the person is friends with 3 or more people, or the person is not friends with 3 or more people.

Consider the first case of 3 or more friends and take 3 of those friends. If any of those are friends with each other, then that forms a group of 3 people who are mutual friends. If none of them are friends with each other, then that is a group of 3 people who are mutual strangers where no one is friends on Facebook.

The other case is similar. Suppose the person is not friends with 3 or more people and take 3 of those people. If any of those 3 people are not friends with each other, then that forms a group of 3 people who are mutual strangers. If all of them are friends with each other, then that is a group of 3 people who are mutual friends.

In every possible case, there is always a group of 3 people who are mutual friends or a group of 3 people who are mutual strangers.

The problem can also be visualized. Each person can be represented as a vertex of a graph. Draw a blue line between two vertices if the two people are friends, and a red line if the two people are not friends. No matter how the graph is colored, there will always be a monochromatic triangle–either a red triangle or a blue triangle. The video illustrates the proof using the colored graph.
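Since there are only 2^15 = 32,768 ways to mark each of the 15 pairs among 6 people as friends or strangers, the claim can also be verified by an exhaustive computer check. Here is a short Python sketch of that check (my own illustration, not part of the original post).

```python
# Brute-force check: color each of the C(6,2) = 15 edges of K6 as "friend" (1)
# or "stranger" (0) and verify every coloring contains a monochromatic triangle.
from itertools import combinations, product

people = range(6)
edges = list(combinations(people, 2))  # the 15 pairs

def has_mono_triangle(coloring):
    """coloring maps each pair (a, b) with a < b to 0 or 1."""
    return any(
        coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]
        for a, b, c in combinations(people, 3)
    )

all_have_triangle = all(
    has_mono_triangle(dict(zip(edges, colors)))
    for colors in product((0, 1), repeat=len(edges))  # 2^15 = 32768 colorings
)
print(all_have_triangle)  # True
```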

Can You Solve The Friendship Riddle?

This problem is also known as the theorem on friends and strangers. It is an example of Ramsey theory, a field that studies conditions under which certain properties must be true.

Further reading

Plus Magazine
https://plus.maths.org/content/friends-and-strangers

Wikipedia: Theorem On Friends And Strangers
https://en.wikipedia.org/wiki/Theorem_on_friends_and_strangers

Math Garden blog
http://mathgardenblog.blogspot.com/2012/12/Ramsey33.html


Spies Sharing Secrets – Sunday Puzzle

On a successful mission, each spy obtains a unique secret. The n spies need to share all their secrets with each other, which they accomplish through a series of phone calls between 2 spies at a time. During a call, a spy can share all the secrets the spy knows, including secrets learned from previous calls.

What is the minimum number of phone calls needed so every spy learns all of the secrets? Solve the problem for the following two specifications of phone calls.

a) Each phone call is a 1-way call. Only the spy who makes the call can share secret information with the spy on the receiving end.

b) Each phone call is a 2-way call. Both spies can share their secret information with each other.

*Warning: it is very difficult to prove minimality for case b).

Watch the video for a solution.

Can You Solve The Spies Sharing Secrets Puzzle?

Or keep reading.

.
.
.
.
M
I
N
D
.
Y
O
U
R
.
D
E
C
I
S
I
O
N
S
.
P
U
Z
Z
L
E
.
.
.
.
Answer To Spies Sharing Knowledge

I will present the answer first and then justify why these are minimal procedures in the proof section later.

I will label the spies with letters in alphabetical order (A, B, C, etc.). I will write X→Y to indicate a 1-way call from X to Y, so spy X shares information to spy Y. I will write X↔Y to indicate a 2-way call between spies X and Y where both spies exchange information.

Here is an optimal procedure for 1-way calls. Call one of the spies “the hub.” All of the other spies call the hub (n – 1 calls), and then the hub calls everyone back (n – 1 calls). If spy A is the hub, for example, the calling procedure with 3 spies would be B→A, C→A, A→B, A→C, with the calls in that order.

[Figure: optimal 1-way calling pattern with a hub]

For n spies, there are 2n – 2 calls made:

1-way calls
2 for 2 spies
4 for 3 spies
6 for 4 spies
2n – 2 for n ≥ 5 spies

The minimum number of 1-way calls is 2n – 2.

There can be other minimal methods too. Consider a “circular pattern” of calls. Imagine arranging the spies in a circle in alphabetical order. The spies first make a clockwise chain of calls starting and ending with A, which is n calls. At this point the last spy and spy A are fully informed. Now spy A starts a clockwise chain of calls which ends two calls before spy A; this is n – 2 calls.

[Figure: optimal 1-way circular (greedy) calling pattern]

For example, with 4 spies the procedure would be A→B, B→C, C→D, D→A for the first clockwise chain (4 calls), and then A→B, B→C for the second clockwise chain (2 calls). This greedy algorithm has the advantage that the first call shares 1 secret, the second call shares 2 secrets, and so on until n secrets, so each call shares a maximum number of secrets. This method also requires n + (n – 2) = 2n – 2 calls.
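As a sanity check, here is a short Python sketch (my own illustration, not from the original post) that simulates both 1-way protocols and confirms every spy ends up knowing all n secrets after exactly 2n – 2 calls.

```python
# Simulate the two 1-way protocols; a call (x, y) means x tells y everything x knows.

def run_calls(n, calls):
    knows = [{i} for i in range(n)]          # spy i starts knowing only secret i
    for x, y in calls:
        knows[y] |= knows[x]
    return all(len(k) == n for k in knows), len(calls)

def hub_protocol(n, hub=0):
    others = [i for i in range(n) if i != hub]
    return [(i, hub) for i in others] + [(hub, i) for i in others]

def circular_protocol(n):
    first_chain = [(i, (i + 1) % n) for i in range(n)]   # A->B->...->A, n calls
    second_chain = [(i, i + 1) for i in range(n - 2)]    # A->B->... again, n - 2 calls
    return first_chain + second_chain

for n in range(2, 8):
    for protocol in (hub_protocol, circular_protocol):
        everyone_informed, count = run_calls(n, protocol(n))
        assert everyone_informed and count == 2 * n - 2
print("both 1-way protocols inform everyone in 2n - 2 calls")
```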

With 2-way calls, you might expect that about half the number of calls are needed because both spies can share information during a call. Surprisingly this is not the case! With 2-way calls you can only save 2 calls, and the minimum number of calls is 2n – 4.

For 2 spies, only 1 call is needed: A↔B. For 3 spies, 3 calls are needed: A↔B, B↔C, C↔A. For 4 spies, 4 calls are needed: A↔B, C↔D, A↔C, B↔D. The pattern for 4 spies is particularly important. The graph shows the pattern and the number on each edge indicates the order of the call.

[Figure: optimal 2-way calling pattern for 4 spies, with the number on each edge giving the order of the call]

For 5 or more spies, the pattern can be generalized by adding 2 calls. The new spy calls A, B, C, or D at the beginning, and then the new spy gets a call from A, B, C, or D at the end. For example with spy A, the new spy calls A at the beginning, and A calls the new spy back at the end. In this way the new spy shares information with A (which then gets shared with all other spies), and then A calls back at the end (so the new spy learns all the other secrets too). For 5 spies, for example, the calling pattern results in 6 calls: A↔E, A↔B, C↔D, A↔C, B↔D, A↔E.

The pattern continues like this, and we have the result:

2-way calls
1 for 2 spies
3 for 3 spies
4 for 4 spies
2n – 4 for n ≥ 5 spies

With 2-way calls, the efficient procedures for n > 4 have the following pattern. Designate 4 spies as hubs A, B, C, and D. Each of the other n – 4 spies calls one of the hubs. Then the 4 hubs share information in the “square” pattern for the case n = 4, which takes 4 calls. Finally, each of the n – 4 spies receives a call from exactly one hub spy. The total number of calls is (n – 4) + 4 + (n – 4) = 2n – 4, and this is the minimum number of calls. The key to the procedure is that 4 hub spies can share information efficiently in 4 calls. The general pattern looks like the following graph (here only spy B is being called by all other spies):

[Figure: optimal 2-way calling pattern for n spies with 4 hub spies]
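Here is a companion Python sketch (again my own illustration, not from the original post) that simulates the 4-hub procedure, routing all non-hub spies through a single hub as in the figure, and checks that 2n – 4 two-way calls inform everyone.

```python
# Simulate the 2-way "four hubs" protocol for n >= 4 spies. Spies 0-3 are hubs A, B, C, D.

def run_two_way(n, calls):
    knows = [{i} for i in range(n)]
    for x, y in calls:                       # a 2-way call merges both spies' knowledge
        merged = knows[x] | knows[y]
        knows[x] = knows[y] = merged
    return all(len(k) == n for k in knows), len(calls)

def four_hub_protocol(n):
    assert n >= 4
    A, B, C, D = 0, 1, 2, 3
    others = range(4, n)
    calls = [(i, A) for i in others]            # n - 4 calls into hub A
    calls += [(A, B), (C, D), (A, C), (B, D)]   # the "square" among the hubs, 4 calls
    calls += [(A, i) for i in others]           # n - 4 calls back out
    return calls

for n in range(4, 10):
    everyone_informed, count = run_two_way(n, four_hub_protocol(n))
    assert everyone_informed and count == 2 * n - 4
print("the 2-way protocol informs everyone in 2n - 4 calls")
```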

This is the least number of calls needed with 2-way calls. Now compare the result with 1-way calls.

The surprising part is that sharing information is a fairly inefficient process! Although 2-way calls allow for twice the information exchange as 1-way calls, they only reduce the total number of calls by 2. The process of sharing secrets is not much better than everyone calling the hub, and then the hub calling everyone back.

The proofs that these formulas are minimal appear below.

Video with proofs

You can watch the proofs.

Spies Sharing Secrets Proof Minimum Number Calls

Or keep reading.

Proof for 1-way calls

For 1-way calls, each new spy adds on a minimum of 2 calls–one to share the secret and one to learn other secrets. From the base cases where 2n – 2 calls are necessary, it is intuitive that you need 2 more calls for each spy, hence the formula 2n – 2 makes sense.

Why is this minimal? The proof comes from Harary and Schwenk in 1974. Consider any n – 2 calls. It is impossible for any spy to be fully informed, as fully informing a single spy requires the transmission of n – 1 secrets, which takes at least n – 1 calls. After those n – 2 calls, each subsequent call can fully inform at most 1 spy, since only the receiving spy learns anything in a 1-way call. To inform all n spies therefore takes at least n additional calls, so the minimum number of calls is (n – 2) + n = 2n – 2. The protocols of calling the hub and calling circularly achieve this lower bound, and therefore they are minimal.

Proof for 2-way calls

For 2-way calls, it is not possible to get fewer than 2n – 4 calls when n ≥ 4. The formal proof involves many steps. The idea is that assuming fewer calls could inform everyone forces a specific condition on the calling procedure: no spy can share his secret and then have that secret shared back to himself. This condition then leads to the conclusion, after a series of logical deductions, that each spy is involved in at least 5 calls. This means the total number of 2-way calls is at least 5n/2 > 2n, which is more than 2n – 4 calls. To summarize: assuming fewer than 2n – 4 calls are possible leads to the contradiction that more than 2n calls are needed. This proves that 2n – 4 is a lower bound on the number of calls. The procedure with 4 hub spies above achieves this lower bound, so it is the theoretical minimum number of calls required.

This proof method is due to Brenda Baker and Robert Shostak “Gossips and Telephones” in Discrete Mathematics Volume 2 (1972), 191–193. There are other proofs of this problem; see the references at the end.

Assume that 2n – 4 is not the minimal number of calls for some n > 4. Let m be the smallest such value, so that the m spies can share all their secrets in some procedure using at most 2m – 5 calls.

Claim (NOHOI): The procedure obeys the principle that no one hears their own information (NOHOI) from another person. That is, no spy shares a secret and then has that secret told back to them by another person. If the calling procedure is drawn as an ordered graph (the spies are vertices and the edge number equals order of the call), then NOHOI means that there is no closed walk of edges in increasing order.

(Side note: the optimal method with 2n – 4 calls does not obey NOHOI–a spy calls the hub, and then the hub calls back, so that spy hears his own information.)

Proof (NOHOI): This is a proof by contradiction. If the procedure does not obey NOHOI, we will show we can share secrets with m – 1 spies in fewer than 2(m – 1) – 4 calls. This would be a contradiction because m is defined as the smallest value for which spies can share secrets in at most 2m – 5 calls.

Suppose there is a closed walk of edges in increasing order starting and ending with spy X. This is a sequence of calls X↔S1, S1↔S2, …, Sk↔X. If other spies also call X, group those calls into the sets E0, E1, …, Ek+1, where Ei denotes the calls to X made before the call Si↔X, and Ek+1 is any calls to X made after Sk↔X.

Now let’s make a new procedure for m – 1 agents that shares all secrets in less than 2(m – 1) – 4 calls.

All calls that do not involve X remain the same (they are omitted in the diagrams below). Then we will eliminate all calls with X as follows. First, eliminate 2 calls X↔S1 and Sk↔X.

Next, keep all the remaining calls in the same order with the following adjustments. Replace each call in E0 with X by a call with S1, and replace each call in Ek+1 with X by a call with Sk. Replace each remaining call in Ej with X by a call with Sj.

This is a new calling procedure excluding spy X in which all the other spies share all their secrets. The adjustment process looks like the following:

[Figure: the call adjustment that removes spy X (before)]

[Figure: the call adjustment that removes spy X (after)]

The calling procedure with m spies had at most 2m – 5 calls. We eliminated 2 calls, so the reduced calling procedure with m – 1 spies has at most 2m – 7 = 2(m – 1) – 5 calls. If the original calling procedure informed all spies, then this adjusted calling procedure also informs all m – 1 spies excluding spy X.

But this means we have created a calling procedure with m – 1 spies that requires at most 2(m – 1) – 5 calls. This contradicts that m is the minimal value.

Therefore, we must have NOHOI in the minimal procedure.

Main proof of minimality: If NOHOI holds, then a call between two spies must be the last call for both, or for neither. Why? Suppose the call is A↔B. If it is the last call for spy A, then spy A is fully informed after the call. This means spy B is also fully informed and knows all secrets. If spy B then calls another spy (say C), spy B will be sharing the secret of spy C back to spy C. This contradicts NOHOI.

Similarly, a call between two spies must be the initial call for both, or for neither. Otherwise, imagine B↔C takes place and then A↔B, which is A’s initial call. Since spy A learns the secret of C, spy A’s information and spy C’s information will be paired together. Eventually spy C needs to learn spy A’s secret. But at that time spy C will hear back his own secret, which contradicts NOHOI.

The remaining calls are all intermediate calls between spies.

For the m > 4 spies, no initial call can also be a final call (such a call would be the only call its two participants make, and they could never learn the other spies’ secrets). Each spy has to make an initial call and a final call, so there will be m calls that are either initial or final calls. This means there can be at most m – 5 intermediate calls to stay within the limit of 2m – 5 calls.

For a graph of m vertices, it takes a minimum of m – 1 edges for the graph to be connected. Since there can be at most m – 5 intermediate calls, the graph of intermediate calls has at most m – 5 edges, and so it must have at least 5 disjoint connected components.

Spy A is part of one component. Spy A’s initial call might connect to a second component of intermediate calls, and spy A’s final call to a third component at most. So at least 2 of the components are not involved, or are wasted, from spy A’s perspective of learning secrets. Write w(A) for the number of calls in these wasted components.

It takes at least m – 1 calls for spy A to be informed, and at least m – 1 calls for spy A’s secret to be shared with everyone. By NOHOI, the only calls that can do both are the calls involving spy A; denote the number of such calls by p(A).

The total calls for spy A to share the secret and learn others is at least 2m – 2 – p(A). Taking into account wasted calls, there are at least 2m – 2 – p(A) + w(A) total calls. The total number of calls must be less than or equal to the maximum number of calls, 2m – 5. So:

2m – 2 – p(A) + w(A) ≤ 2m – 5
p(A) ≥ 3 + w(A)

Because the number of wasted calls is non-negative, w(A) ≥ 0, this implies:

p(A) ≥ 3

This means spy A has to make at least 3 calls. Since spy A makes an initial call and a final call, spy A must also make at least one intermediate call. All of this logic is true for each spy, and so every spy makes an intermediate call.

There are at least 2 components in the graph of intermediate calls that are wasted from spy A’s perspective. Since each spy makes an intermediate call, each of the 2 components has to have at least 1 edge. This means w(A) ≥ 2.

Using this information with a previous inequality gives the following result:

p(A) ≥ 3 + w(A)
p(A) ≥ 5

This means each spy is involved in at least 5 calls. Since each call involves two spies, this means there are at least 5m/2 > 2m calls.

This result contradicts that the spies used at most 2m – 5 calls.

Therefore, 2n – 4 is the minimal number of calls for n spies to share their secrets.

Sources

This puzzle is known as the “gossiping problem” which is the primary name used in the references.

I first came across this problem in Algorithmic Puzzles by Anany Levitin and Maria Levitin (“Rumor Spreading”), but the text does not include a proof.

The problem does appear with the above proof in The Art of Mathematics: Coffee Time in Memphis by Béla Bollobás (“Gossiping Dons”).

MathWorld entry gossiping
http://mathworld.wolfram.com/Gossiping.html

One-way communications proof. Harary, F. and Schwenk, A. J. “The Communication Problem on Graphs and Digraphs.” J. Franklin Inst. 297, 491-495, 1974.
https://www.researchgate.net/publication/30825362_The_communication_problem_on_graphs_and_digraphs

Gossiping Dons (several proofs, including Brenda Baker and Robert Shostak “Gossips and Telephones” in Discrete Mathematics Volume 2 (1972), 191–193)
https://www.math.uni-bielefeld.de/~sillke/PUZZLES/gossips.pdf

Spreading Gossip Efficiently, C.A.J. Hurkens.
http://www.nieuwarchief.nl/serie5/pdf/naw5-2000-01-2-208.pdf

Gossiping problem proven
http://web.pdx.edu/~caughman/Gossip.pdf

Thanks to Patrons!

Kyle
Alberto Nishikawa
Brian M. Mooney

You can support me and this site at Patreon.


Brain Exercises, Brain Fitness, Brain Games – BrainHQ from …

REAL SCIENCE TO BELIEVE IN

BrainHQ is built and tested by more than 100 neuroscientists with more than 100 published papers showing real benefits. No other program has this level of proof.

Do you want to excel at work? Hear better in crowded places? You can choose courses to meet your goal or use the personal trainer to set up a daily schedule based on your performance.

BrainHQ’s exercises work out your attention, memory, brain speed, people skills, navigation and intelligence. There are hundreds of levels of exercises that automatically adapt to your skill level.

BrainHQ’s robust progress features show you how much you’ve trained, how much you have improved, and how you compare to others. (If you want to know.)

We’ve heard stories from all kinds of people about how BrainHQ has done everything from enabling them to get a job, to reviving their creativity, to making them feel more confident about their future. What could you do with a sharper brain?

Set goals and track your progress

Stay in the game with daily reminders

Get daily brain tips on exercise, diet, and activities (iPad Only)

BrainHQ works on the web, iOS, and Android (in browser)


The Coin Flipping Sequences Riddle – Sunday Puzzle

What is the expected number of coin flips until you get two heads in a row (HH)?

What is the expected number of flips until you get a heads followed by a tails (HT)?

(Alternate story by John Baez: one couple decides to have children until getting two boys in a row. Another couple has children until getting a boy immediately followed by a girl. Will the two couples expect to have the same number of children, or will the numbers be different?)

Watch the video for a solution.

The Coin Flipping Sequences Riddle. Can You Solve This HARD Probability Problem?

Or keep reading.

(Note: I did post a similar problem a year ago (problem and solution). If you remember it, this post still has new material: I am presenting an alternate explanation and I also made a video on it. Numberphile also made a video on this topic, consecutive coin flips.)

.
.
.
.
M
I
N
D
.
Y
O
U
R
.
D
E
C
I
S
I
O
N
S
.
P
U
Z
Z
L
E
.
.
.
.
Answer To Coin Flips Until An Outcome

When we flip a coin two times, there are four equally likely outcomes:

HH
HT
TH
TT

Since each pair of flips has the same 25% chance of occurring in two flips, intuitively it seems like it should take the same number of flips until we see any of the outcomes.

Somewhat strangely this is not true!

In repeated coin flips, the expected number of flips until HT or TH is 4, while the same expectation for HH or TT is 6. Why is there a difference? And how can we prove it?

Why flips to HH and HT are intuitively different problems

When we first flip a coin, there is a 50% chance of getting H or T. This can be represented in a kind of tree where we can proceed to the two states of H or T. (Each arrow represents a 50% transition probability.)

[Figure: state diagram after the first flip, leading to state H or T]

Now what happens depends on which state we are in. If we are in T, then there is a 50% chance we flip a tails and stay there, and there is a 50% chance we flip a heads and move to the state H.

[Figure: state diagram for HH, transitions from state T]

If, on the other hand, we are in the state H, then there is a 50% chance we flip a tails and move to T, and there is a 50% chance we flip a heads. In the latter case we have flipped two heads in a row, so we move to the state HH and the game ends (we stay in that state).

[Figure: state diagram for HH, transitions from state H]

This kind of diagram is an example of a Markov chain: a set of states from which there are fixed transition probabilities to other states. Once we get to HH the game ends and we stay in that state forever, so HH is called an absorbing state. (Technically there should be an arrow looping on HH with 100% probability, but I have omitted that as it is understood the game ends).

Suppose instead we were flipping until HT. We can similarly draw the state diagram. From the first flip we either go to H or T with 50% probability each. If we are in T, then there is a 50% chance we flip T and stay there, or a 50% chance we flip H and go to state H. If we are in state H, there is a 50% chance we flip H and stay in H, or there is a 50% chance we flip T and move to the state HT where we win the game.

[Figure: state diagram for flipping until HT]

Now consider the two state diagrams side by side.

[Figure: the HH and HT state diagrams side by side]

Note these are different diagrams so it is understandable the expected flips until absorption could be different. In each diagram you need to be at H in the flip before you win. In the HT diagram, once you get to H you stay there or you win. In the HH diagram, on the other hand, once you get to H, you might go back to state T, and you will need to flip yourself back to H.

So this is a reason to think HT would take less time. However, intuition is not always the best guide, so let us prove this.

Proof HT takes 4 flips

Let us solve for the expected number of flips until HT. Let E(x) denote the expected number of flips to reach HT from state H, and let E(y) denote the analogous number from state T.

We want to find E(HT). We will use 1 flip in order to get to either the state H or T, from which we then have some number of flips until getting to HT. So we have the equation:

E(HT) = (first flip) + Pr(H) E(HT | H) + Pr(T) E(HT | T)
E(HT) = 1 + 0.5 E(x) + 0.5 E(y)

If we are in state H, then with 50% chance we go to HT, and with 50% chance we remain in state H. In the latter case, we have used 1 flip, and then our expected time to HT is still equal to E(x). So we have the equation:

E(x) = 0.5 E(x| flip T) + 0.5 E(x | flip H)
E(x) = 0.5(1) + 0.5(1 + E(x))
0.5 E(x) = 1
E(x) = 2

On the other hand, if we are in state T, then with 50% chance we stay in T, so we use 1 flip and we still have E(y) flips until HT. There is also a 50% chance we go to state H, which means we have used 1 flip and our expected time to HT is equal to E(x). So we have the equation:

E(y) = 0.5 E(y| flip T) + 0.5 E(x | flip H)
E(y) = 0.5(1 + E(y)) + 0.5(1 + E(x))
E(y) = 1 + 0.5 E(y) + 0.5 E(x)
0.5 E(y) = 1 + 0.5 E(x)
0.5 E(y) = 1 + 0.5(2)
E(y) = 4

We can solve our original equation.

E(HT) = 1 + 0.5 E(x) + 0.5 E(y)
E(HT) = 1 + 0.5(2) + 0.5(4)
E(HT) = 4

Proof HH takes 6 flips

We can solve for the number of flips until HH in a similar manner. Let E(x) denote the expected number of flips to reach HH from state H, and let E(y) denote the similar number from state T.

We want to find E(HH). The first flip takes us to either H or T, so we have the equation:

E(HH) = (first flip) + Pr(H) E(HH | H) + Pr(T) E(HH | T)
E(HH) = 1 + 0.5 E(x) + 0.5 E(y)

If we are in state H, then with 50% chance we go to HH and end, and with 50% chance we go to state T. In the latter case, we have used 1 flip, and then our expected time to HH is equal to E(y). So we have the equation:

E(x) = 0.5 E(x| flip H) + 0.5 E(x | flip T)
E(x) = 0.5(1) + 0.5(1 + E(y))
E(x) = 1 + 0.5 E(y)

If we are in state T, then with 50% chance we stay in T, and with 50% chance we go to state H. In the latter case, we have used 1 flip, and then our expected time to HH is equal to E(x). So we have the equation:

E(y) = 0.5 E(y| flip T) + 0.5 E(y | flip H)
E(y) = 0.5(1 + E(y)) + 0.5(1 + E(x))
E(y) = 1 + 0.5 E(y) + 0.5 E(x)
0.5 E(y) = 1 + 0.5 E(x)

We now have a system of equations with two equations for the two variables E(x) and E(y).

E(x) = 1 + 0.5 E(y)
0.5 E(y) = 1 + 0.5 E(x)

We can substitute the value for E(x) from the first equation into the second equation, and then solve:

0.5 E(y) = 1 + 0.5 (1 + 0.5 E(y))
0.5 E(y) = 1.5 + 0.25 E(y)
0.25 E(y) = 1.5
E(y) = 6

We can then substitute into the first equation to find:

E(x) = 1 + 0.5 E(y)
E(x) = 1 + 0.5(6)
E(x) = 4

We now substitute to our original equation.

E(HH) = 1 + 0.5 E(x) + 0.5 E(y)
E(HH) = 1 + 0.5(4) + 0.5(6)
E(HH) = 6
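The same numbers fall out of the Markov chain picture directly. Here is a small numpy sketch (my own illustration, not part of the original post) that solves for the expected number of steps to absorption, using the non-absorbing states start, H, and T.

```python
# Expected steps to absorption from the start state, via t = (I - Q)^(-1) * 1,
# where Q holds the transition probabilities among the non-absorbing states.
import numpy as np

def expected_flips(target):
    # Non-absorbing states in order: "start", "H", "T".
    if target == "HH":
        # From H: 1/2 absorb (HH), 1/2 -> T.  From T: 1/2 stay T, 1/2 -> H.
        Q = np.array([[0, 0.5, 0.5],
                      [0, 0.0, 0.5],
                      [0, 0.5, 0.5]])
    else:  # "HT"
        # From H: 1/2 stay H, 1/2 absorb (HT).  From T: 1/2 stay T, 1/2 -> H.
        Q = np.array([[0, 0.5, 0.5],
                      [0, 0.5, 0.0],
                      [0, 0.5, 0.5]])
    t = np.linalg.solve(np.eye(3) - Q, np.ones(3))  # expected steps from each state
    return t[0]                                     # expected steps from "start"

print(expected_flips("HH"), expected_flips("HT"))   # 6.0 4.0
```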

To solve for TT and TH

The proof for TT is analogous to the proof for HH, and the proof for TH is analogous to the proof for HT, so we can conclude E(TT) = E(HH) = 6 and E(TH) = E(HT) = 4.

A couple that wants two boys or two girls in a row can expect to have 6 children, while a couple that wants a boy/girl or girl/boy pair in order can expect to have 4 children.
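If you prefer an empirical check, here is a short Monte Carlo simulation in Python (my own sketch, not part of the original post); the estimates land near the exact values of 6 and 4 derived above.

```python
# Monte Carlo estimate of the expected number of flips to first see each pattern.
import random

def flips_until(pattern, rng):
    """Flip a fair coin until `pattern` (e.g. 'HH') appears; return the number of flips."""
    history = ""
    while not history.endswith(pattern):
        history += rng.choice("HT")
    return len(history)

rng = random.Random(0)
trials = 100_000
for pattern in ("HH", "HT", "TT", "TH"):
    avg = sum(flips_until(pattern, rng) for _ in range(trials)) / trials
    print(pattern, round(avg, 2))
# Typical output: HH and TT come out near 6, HT and TH near 4.
```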


The Magical Pond – Sunday Puzzle

I visit a shrine where I place flowers at 3 different temples. Before visiting each temple, I have to swim through a magical pond. (My path is pond-temple1-pond-temple2-pond-temple3).

Each time I swim in the pond, it magically triples the number of flowers that I have.

At the start of my trip, I have some flowers. At the end of my trip, I have no flowers, and I have placed an equal number of flowers at each temple.

What is the least number of flowers I could have bought and placed at each temple? I only buy and place whole numbers of flowers.

Watch the video for the answer.

Can You Solve The Magical Pond Puzzle?

Or keep reading.

.
.
.
.
M
I
N
D
.
Y
O
U
R
.
D
E
C
I
S
I
O
N
S
.
P
U
Z
Z
L
E
.
.
.
.
Answer To The Magical Pond Puzzle

Suppose I start the trip with x flowers, and I leave y flowers at each temple. Here are the number of flowers I have after each stage of the trip.

x: start
3x: pond (triple the flowers)
3x – y: temple 1 (leave y flowers)
9x – 3y: pond (triple the flowers)
9x – 4y: temple 2 (leave y flowers)
27x – 12y: pond (triple the flowers)
27x – 13y: temple 3 (leave y flowers)

After placing flowers at the 3rd temple, I have no flowers. This means the final amount is equal to 0.

27x – 13y = 0
x = (13/27) y

We want to find whole number solutions to this equation. As the fraction 13/27 cannot be reduced any further, the variable y needs to be a multiple of 27 to eliminate the denominator. The smallest value for y is 27, which means the smallest value for x is 13.

The answer is I started with 13 flowers and I left 27 flowers at each temple. We can check:

13 flowers: start
39 flowers: pond (triple the flowers)
12 flowers: temple 1 (leave 27 flowers)
36 flowers: pond (triple the flowers)
9 flowers: temple 2 (leave 27 flowers)
27 flowers: pond (triple the flowers)
0 flowers: temple 3 (leave 27 flowers)
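A brute-force search confirms that no smaller whole-number solution exists. Here is a short Python sketch (my own illustration, not from the original post) that tries all small starting amounts and per-temple amounts.

```python
# Search for the smallest whole numbers of flowers bought and placed.
def flowers_left(start, placed):
    """Simulate pond-temple1-pond-temple2-pond-temple3."""
    flowers = start
    for _ in range(3):
        flowers = flowers * 3        # the pond triples the flowers
        flowers -= placed            # leave `placed` flowers at the temple
        if flowers < 0:
            return None              # cannot place more flowers than we hold
    return flowers

best = min(
    (start, placed)
    for start in range(1, 100)
    for placed in range(1, 100)
    if flowers_left(start, placed) == 0
)
print(best)  # (13, 27)
```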

This kind of problem happens in financial settings and is known as an annuity problem (an annuity gives fixed, equal payments over many time periods).

Suppose you owe a loan of x dollars. At the start of each period, the unpaid debt grows by an interest rate of i percent. At the end of each period, you pay a fixed amount y.

The temple problem corresponds to paying an interest rate of 200% and having the debt paid in 3 periods.

The value of the annuity is the original loan grown by the interest factor (1 + i) compounded over the 3 time periods.

value = x(1 + i)^3

The value is also equal to the sum of the payments, each grown by the interest factor for its remaining periods. The first payment grows over 2 more periods, the second payment grows over 1 more period, and the third payment is final and does not grow.

value = y[(1 + i)^2 + (1 + i) + 1] = y((1 + i)^3 – 1)/i

The last step uses the formula for the sum of a geometric series.

Now we can set these formulas equal to each other.

x(1 + i)^3 = y((1 + i)^3 – 1)/i

The interest rate is i = 200% = 2, so we have:

x(1 + 2)^3 = y((1 + 2)^3 – 1)/2
27x = y(26)/2
27x = 13y

This results in the same equation as before and confirms the solution.

If the annuity went for n periods, then we can modify the time periods in the equations:

value = x(1 + i)^n

value = y[(1 + i)^(n – 1) + … + (1 + i) + 1] = y((1 + i)^n – 1)/i

So we would have the condition:

x(1 + i)^n = y((1 + i)^n – 1)/i
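As a quick check of the annuity view, here is a tiny Python sketch (my own illustration) of the standard annuity payment formula; plugging in the temple puzzle values x = 13, i = 2, n = 3 returns the payment 27.

```python
# Fixed payment y that repays a loan x at interest rate i per period over n periods.
def annuity_payment(x, i, n):
    return x * (1 + i) ** n * i / ((1 + i) ** n - 1)

print(annuity_payment(13, 2, 3))  # 27.0 -- the temple puzzle with i = 200%
```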

Sources
http://puzzling.stackexchange.com/questions/35902/how-many-flowers-to-buy

http://puzzling.stackexchange.com/questions/22725/temple-lake-question


Puzzle Games – Girl Games 1 – Games for Girls

Check back each week to play all new fun, and free online Puzzle Games for Girls only.

Puzzle games come in a wide variety of game types with all sorts of different gameplay. The genre is not limited to old-fashioned jigsaw puzzles, although these are still a lot of fun for girls and boys. Some puzzle games require you to solve complex problems by discovering combinations of things to do in order to advance to the next level and win. Other puzzles are all about matching and moving the right pieces into place; the most popular game of this style is Bejeweled, but many similar and just-as-fun games have since come out. These puzzle games for girls are great because they will challenge your mind and also entertain you. Enhance your vocabulary playing word search games or test your knowledge in crossword games. All of these can be found here, which is why it’s no mystery these games are so popular.

Is this like the coolest way to make money for a fun vacation in the sun on what.

Put the atoms in order, arrange three or more atoms with a similar color in a horizontal, vertical, and diagonal sequence.

A word puzzle that can really flip your mind.

An advanced memory match game where burger building masters win.

Analyze your power of observation by finding out the alphabets.

Jenny’s room is a mess and she can’t find anything in it.

When anyone wants to eat a delicious treat in summer the most common choice will always definitely be ice cream.

In ‘Memory Touch’ you are shown a set of cards.

Santa Claus is creating Christmas gifts with the help of the Elves.

Solve riddles by picking the correct picture in this fun puzzle game for girls.

The parents of this princess want to find a handsome prince for her, so they organized a mega ball and invited all the royal princes to take part in this magical event, so that the princess could choose the handsome, charming prince she wants to marry.

Complete this adorable puzzle.


The Pirate Game – Game Theory Tuesdays

image by Elliott Brown. CC BY 2.0

This is a fun puzzle I have posted about before with 3 pirates. This is a variation with 5 pirates accompanied by a video explanation.

A group of 5 pirates is dividing up 500 gold coins. How will they split the treasure?

The pirates are a disciplined and logical group, and they have a custom of how to split up treasure.

The 5 pirates (A, B, C, D, and E) have a strict organization by strength: pirate A is the strongest, followed by B, then C, then D, and then E. The voting process is a series of proposals with a lethal twist. Here are the rules:

1. The strongest pirate (A) offers a split of the gold. An example split is: “450 to A (me), 10 to B, 20 to C, 10 to D, and 10 to E.”

2. All of the pirates, including the proposer, vote on whether to accept the split. In the case of a tie, the proposer holds the casting vote to break the tie.

3. If the pirates agree to the split, it happens.

4. Otherwise, the pirate who proposed the plan gets thrown overboard from the ship and perishes.

5. The next strongest pirate takes over and then offers a split of the money. The process is repeated until a proposal is accepted.

Pirates care first and foremost about living, then about getting gold.

If a pirate is indifferent between saying “Yes” and “No” to a split in terms of money, then the pirate votes “No” because the pirate prefers to eliminate stronger pirates.

How does the game play out?

Video explanation

The YouTube channel EckoChamb3r offers a careful and detailed explanation of how to find the solution.

The Pirate Problem – Famous Game Theory Puzzle

If you cannot watch the video now, keep reading for an explanation.

Solution to game

The game can be solved by backwards induction. You want to think ahead and reason backwards.

Start at the end of the game. What would happen if the game continued so only pirate E remained? This is a trivial case as pirate E would take all 500 coins for himself.

Now reason backwards one step. What would happen if the game got to pirate D proposing a split? Similarly, pirate D would take all 500 coins for himself. While pirate E would oppose the split, pirate D is in favor, so the vote would be tied at 1-1. Pirate D could then cast the tie-breaking vote and make the proposal go through.

Now comes the interesting part when we reason one more step backwards. What would happen if pirate C is offering the split? Pirate C needs to buy 1 vote to make the plan go through. If pirate C dies, then pirate D would take all 500 coins and pirate E ends up with nothing. All the pirates know this. This presents an opportunity to buy the vote of pirate E.

Pirate C does not take all 500 coins for himself. Instead pirate C offers 1 coin to pirate E. Now pirate E can either vote for this split and get 1 coin, or pirate E can vote against it which leads to getting nothing when pirate D is in charge. So pirate E prefers this plan and would vote for it.

Therefore, pirate C would offer, “499 to C, 0 to D, and 1 to E.” Pirates C and E would vote for this plan and it would go through.

Now let’s reason one more step. What would happen if pirate B was in charge? Pirate B similarly needs to buy 1 vote. The easiest vote to buy is pirate D, who ends up with nothing if the split fails and pirate C ends up in charge.

Accordingly, pirate B would offer, “499 to B, 0 to C, 1 to D, and 0 to E.” Pirates B and D vote in favor while C and E vote against, and pirate B holds the tie-breaking vote, so the plan goes through.

Now we return to the original situation. What does pirate A do? Pirate A needs to buy two votes in order to make a proposal pass. If pirate A dies, then pirate B ends up in charge and that would leave pirates C and E with nothing. Pirate A can therefore buy the votes of pirates C and E by offering each 1 coin.

Pirate A offers, “498 to A, 0 to B, 1 to C, 0 to D, and 1 to E.” Pirates A, C, and E vote for the plan and it passes.

The surprising part of the problem is the strongest pirate can end up with most of the money, even though the other pirates hold the power to toss him overboard. The reason is some of the weaker pirates can be bought off for very little, knowing they would end up with nothing if the original split failed.
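The backwards induction can also be written as a short recursive program. Here is a Python sketch (my own illustration, not from the original post) that assumes, as in this 5-pirate version, there are enough coins to buy the needed votes: the proposer pays each of the cheapest voters 1 coin more than their fallback amount.

```python
# Backwards induction for the pirate game (a sketch under the stated assumptions).
# Pirates are numbered 0 (strongest, proposes first) through n-1 (weakest).
# Pirates prefer surviving, then more gold, and vote "No" when indifferent.

def pirate_split(n, coins=500):
    if n == 1:
        return [coins]                        # a lone pirate keeps everything
    # What each pirate would get if the current proposer were thrown overboard:
    # pirate k would become pirate k-1 of the (n-1)-pirate game.
    fallback = [None] + pirate_split(n - 1, coins)
    votes_needed = (n + 1) // 2 - 1           # proposer votes yes and breaks ties
    # Buy the cheapest votes: offer the pirates with the smallest fallback
    # 1 coin more than they would otherwise get (indifferent pirates vote "No").
    cheapest = sorted(range(1, n), key=lambda k: fallback[k])[:votes_needed]
    split = [0] * n
    for k in cheapest:
        split[k] = fallback[k] + 1
    split[0] = coins - sum(split)             # assumes enough coins remain
    return split

print(pirate_split(5))   # [498, 0, 1, 0, 1] -- matches the answer above
```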

Game theory in The Dark Knight

One of my first posts and videos is about how this relates to the opening scene of the Dark Knight.

The Joker plans a bank heist and uses similar planning to buy off weaker criminals. The scene plays out quite like the game theory outcome.

Blog post: Game Theory in The Dark Knight: The Bank Robbery And The Pirate Game (Spoilers)

Video: Dark Knight Game Theory: The Robbery Scene And The Pirate Game

On a concluding note, the pirate game can be extended to more pirates. What would happen if 202 pirates are dividing up 100 gold coins? What would happen if there are 500 pirates dividing up 100 gold coins? It is a crazy logical problem to figure out what happens here. Check out the puzzle by Stephen Omohundro in Ian Stewart’s article “A Puzzle For Pirates”:

A Puzzle For Pirates (500 pirates with 100 coins)


The Red Ball Lottery – Sunday Puzzle

This problem involves 100 red balls, 100 blue balls, and 2 urns.

First you distribute all of the balls between the 2 urns, however you wish, so each urn has at least 1 ball.

Then you select an urn at random, and you will draw a ball at random from that urn. If the ball you select is red, then you win $100.

What is your best strategy, and what is your maximum chance of winning?

Watch the video for a solution.

Can You Solve The Red Ball Lottery Puzzle?

[What do you think about the music in the video? Let me know! For those that prefer, here is a version of the video without music: Can You Solve The Red Ball Lottery Puzzle? (no background music).]

Or keep reading.

.
.
.
.
M
I
N
D
.
Y
O
U
R
.
D
E
C
I
S
I
O
N
S
.
P
U
Z
Z
L
E
.
.
.
.
Answer To The Red Ball Lottery

In urn A, place 1 red ball and 0 blue balls. In urn B, place the remaining 99 red balls and 100 blue balls.

If you select urn A you have a 100% chance of selecting a red ball. If you select urn B you have a 99/199 ≈ 49.7% chance of selecting a red ball. As each urn is equally likely to be picked, the overall probability of winning is the average of these cases:

Pr(red ball) = (1 + 99/199)/2 = 149/199 ≈ 74.87%.

This is a high success rate!

In many places where I have seen this puzzle, this solution is presented and stated to be optimal. Some people try out all the possibilities, but that is unnecessary: we can prove optimality using mathematical reasoning.

Let’s consider the different distributions. There are two main cases to consider.

Case 1: Suppose both urns have the same number of red and blue balls. Then each urn has a 50% chance of selecting a red ball, and therefore this strategy has an overall 50% success rate. But this is worse than the strategy above which has a 74.87% success rate, so any distribution of this type cannot be optimal.

Case 2: Suppose some urn has more red balls than blue balls. Call this urn A. Then the other urn, call it urn B, has to have more blue balls b than red balls r. In urn B, we have b ≥ r + 1.

We will prove the best we can do is the strategy described: place 1 red ball in urn A and the remaining balls in urn B.

To begin, what is the probability of drawing a red ball from urn B? This is the number of red balls r in the urn divided by the total number of balls r + b in the urn.

Pr(r | urn B) = r/(r + b)

In urn B we have more blue balls than red balls, so b ≥ r + 1. We have a lower bound on the denominator, which gives an upper bound on the ratio:

Pr(r | urn B) = r/(r + b) ≤ r/(r + r + 1) = r/(2r + 1)

Now notice the function r/(2r + 1) is strictly increasing because its derivative 1/(2r + 1)^2 is strictly positive (for r ≠ -1/2).

In other words, the more red balls we place in urn B–which must have more blue balls by assumption–the more likely it is we can draw a red ball.

Since urn A has more red balls than blue balls by assumption, urn A must have at least 1 red ball.

This means urn B can have at most 99 red balls. This will maximize the probability of drawing a red ball from urn B.

Pr(r | urn B) ≤ r/(2r + 1) ≤ 99/199

We have proven the best we can do for urn B is Pr(red | urn B) = 99/199. And naturally, the best we can do for urn A is Pr(red | urn A) = 1, because a probability cannot be larger than 100%.

The strategy of placing 1 red ball in urn A and the rest in urn B achieves these maximum probabilities, and any other distribution of this type would have a lower winning probability in urn B. Therefore this strategy is optimal.
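Although trying all the possibilities is unnecessary, a brute-force check is easy and agrees with the proof. Here is a short Python sketch (my own illustration, not from the original post) that evaluates every distribution of the 200 balls.

```python
# Check every way to distribute 100 red and 100 blue balls between two urns
# (each urn non-empty) and report the distribution with the best chance of red.
best_prob, best_split = 0.0, None
for red_a in range(101):
    for blue_a in range(101):
        red_b, blue_b = 100 - red_a, 100 - blue_a
        total_a, total_b = red_a + blue_a, red_b + blue_b
        if total_a == 0 or total_b == 0:
            continue                          # each urn must hold at least 1 ball
        prob = 0.5 * red_a / total_a + 0.5 * red_b / total_b
        if prob > best_prob:
            best_prob, best_split = prob, (red_a, blue_a, red_b, blue_b)

print(best_split, round(best_prob, 4))        # (1, 0, 99, 100) 0.7487
```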

Proof of optimality source (see solution to problem 5)
https://services.math.duke.edu/~dherzog/Math230_01HW2Sols.pdf

Calculus approach for optimality
http://www.radford.edu/wyang/480/Probability.pdf


Matchstick Puzzle Make 3 Squares In 3 Moves

This classic matchstick/toothpick puzzle is making the rounds because it is “extremely hard” for adults. Can you make 3 squares by moving 3 matchsticks?

[Figure: the starting matchstick arrangement]

The problem is exactly as it sounds and there are no tricks. For instance, you cannot break any of the sticks, the resulting squares must be of equal size, and each toothpick has to be part of a square.

Can you figure it out? Watch the video for a solution.

Matchstick Puzzle That’s “Extremely Hard” For Adults


.
.
.
.
M
I
N
D
.
Y
O
U
R
.
D
E
C
I
S
I
O
N
S
.
.
.
.
Answer To Matchstick Puzzle Make 3 Squares In 3 Moves

The only clever part is figuring out which matchsticks to move. Here are the 3 steps to making 3 squares of equal size.

[Figure: move 1]

[Figure: move 2]

[Figure: move 3]

Due to symmetry of the original shape, there are equivalent solutions by rotating the moves 90, 180, or 270 degrees from the center. For example, a rotation of 180 leads to solving for the upside down shape.

[Figure: rotated solution, move 1]

[Figure: rotated solution, move 2]

[Figure: rotated solution, move 3]

Did you figure it out?

Source “extremely hard”
https://www.quora.com/Puzzles-and-Trick-Questions-What-is-a-brain-teaser-that-is-very-short-and-extremely-hard-for-adults/answer/Brian-Hoang-3

Via
https://www.thesun.co.uk/uncategorized/1378872/this-quick-challenge-is-the-latest-brain-teaser-to-sweep-the-web-do-you-have-what-it-takes-to-solve-it/


