Math 20, Spring 2011

Schedule

Week Day Date Section in text
Week 1 Day 1 March 28 Introduction and basic concepts
Day 2 March 30 1.2
x-hour March 31 Optional: Proof-writing tips
Day 3 April 1 1.2
Week 2 Day 4 April 4 3.1
Day 5 April 6 3.1
x-hour April 7 3.2
Day 6 April 8 3.2
Week 3 Day 7 April 11 More combinatorics!
Day 8 April 13 4.1
x-hour April 14 Quiz #1
Day 9 April 15 4.1
Week 4 Day 10 April 18 4.1
Day 11 April 20 6.1
x-hour April 21 6.1
Day 12 April 22 6.1
Week 5 Day 13 April 25 no class
Day 14 April 27 no class
x-hour April 28 no x-hour
Day 15 April 29 6.2
Week 6 Day 16 May 2 5.1
Day 17 May 4 5.1
x-hour May 5 Quiz #2
Day 18 May 6 5.1
Week 7 Day 19 May 9 8.1
Day 20 May 11 8.1
x-hour May 12 no x-hour
Day 21 May 13 9.1
Week 8 Day 22 May 16 9.1
Day 23 May 18 9.2
x-hour May 19 Quiz #3
Day 24 May 20 Confidence intervals and embarrassing questions
Week 9 Day 25 May 23 11.1
Day 26 May 25 11.2
x-hour May 26 no x-hour
Day 27 May 27 11.3
Week 10 Day 28 May 30 no class: Memorial Day!
Day 29 June 1 Review

Day 1 (Monday, March 28)

Today we went over the outline of the course and started with some basic concepts. The problem I left you with was this: Suppose Alice and Bob flip a fair coin 50 times. If the coin comes up heads, Alice wins a penny from Bob, and if the coin comes up tails, Bob wins a penny from Alice. Let P_j be the probability that at the end of the game, Alice has won j pennies. What is the value of j that makes P_j highest? Now, can you guess how many times out of 50 Alice is likely to be in the lead? (We will say that if Alice and Bob are tied after a flip, the person who was in the lead before that flip is still in the lead.)
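If you'd like to check your answer to the first question exactly, here's a short Python sketch of mine (not part of the course materials). The key observation it relies on: Alice wins j pennies exactly when she gets (50 + j)/2 heads, so each P_j is a binomial probability.

```python
from math import comb

def p_j(j, flips=50):
    """Exact probability that Alice's net winnings equal j pennies.
    She needs (flips + j)/2 heads, so j must have the same parity as flips."""
    if (flips + j) % 2 != 0 or abs(j) > flips:
        return 0.0
    return comb(flips, (flips + j) // 2) / 2 ** flips

best_j = max(range(-50, 51), key=p_j)   # the single most likely net result
```

The second question, about how often Alice is in the lead, is much more surprising; try simulating it before you commit to a guess.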

I also told you that it's useful for us all to share a mathematical language. I suggest looking at this handout to understand mathematical statements a little better.

Day 2 (Wednesday, March 30)

Remember that when defining an experiment, it's very important to define your sample space correctly. In class today, I set up an experiment where we chose a sequence of two numbers at random from the set {1,2,3}. In this case, the outcomes were the possible sequences. What if I had asked for the outcomes to be the number of 2s in the resulting sequences? The number of odd numbers in the resulting sequences?
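Here's a small Python enumeration (my sketch, not from the text) showing how the same experiment yields different sample spaces depending on what you call an outcome:

```python
from collections import Counter
from itertools import product

# Sample space for a sequence of two numbers chosen from {1, 2, 3}:
# the outcomes are the 9 equally likely ordered pairs.
outcomes = list(product([1, 2, 3], repeat=2))

# Re-describing each outcome by the number of 2s it contains collapses the
# sample space to {0, 1, 2} -- and those outcomes are NOT equally likely.
twos = Counter(seq.count(2) for seq in outcomes)   # Counter({0: 4, 1: 4, 2: 1})
```

Try the same thing for the number of odd numbers in each sequence and see how the counts change.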

Day 3 (Friday, April 1)

Remember that the experiment I talked about at the end of class was flipping a fair coin until tails came up and counting the number of flips it took. We agreed that any number of coin flips was possible. I have a question for you: since the coin is fair, the probability of getting a tails on one flip is the same as the probability of getting a tails on any other flip. Why is it that the probabilities of needing different numbers of flips to get a tails are different?
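One way to see the resolution: the event "the first tails is on flip n" requires one specific run of n results, and longer runs are less likely, even though each individual flip is fair. A quick sketch of mine:

```python
# The first tails lands on flip n exactly when the first n - 1 flips are
# all heads and flip n is tails, so P(N = n) = (1/2)**(n-1) * (1/2) = (1/2)**n.
def p_first_tails(n):
    return 0.5 ** n

probs = [p_first_tails(n) for n in range(1, 6)]      # 1/2, 1/4, 1/8, 1/16, 1/32
total = sum(p_first_tails(n) for n in range(1, 60))  # the probabilities sum to 1
```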

Day 4 (Monday, April 4)

Here's the problem I promised you: Student parking is difficult to find at College, so one student parks in the faculty lots every day. He has just noticed that none of the last ten tickets he's gotten has been issued on a Monday or Friday. He is wondering if the campus police actually patrol the faculty lots on those days. If the campus police at College don't give tickets on weekends, is it reasonable to assume that the campus police don't actually patrol the faculty lots on Mondays and Fridays? Why?
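To put a number on the student's suspicion: under the hypothesis he doubts, namely that tickets land independently and uniformly across the five weekdays, here is the chance (a quick check of mine) that ten straight tickets all avoid Monday and Friday:

```python
# Each ticket avoids Monday and Friday with probability 3/5 under the
# uniform-weekday hypothesis; ten independent tickets all avoid them with:
p = (3 / 5) ** 10   # about 0.006
```

Whether a probability that small justifies rejecting the hypothesis is exactly the question to think about.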

Day 5 (Wednesday, April 6)

I mentioned in class that you might want to show that if < a_n > ~ < b_n >, then < b_n > ~ < a_n >. I'll also leave you with a calculation: Suppose that three math books, two physics books, and three government books are shelved randomly (and that all the books are different). What is the probability that all of the books in each subject are shelved together? (Think carefully about what needs to be ordered.)
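Here's one way to organize the shelving count in Python (my sketch; make sure you can explain why each factorial appears):

```python
from math import factorial

# Shelve 8 distinct books in a row. For every subject to sit together,
# first order the three subject blocks, then order the books inside each.
favorable = factorial(3) * (factorial(3) * factorial(2) * factorial(3))
total = factorial(8)    # all shelvings of 8 distinct books
p = favorable / total   # 432 / 40320 = 3/280, about 0.011
```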

x-hour, Week 2 (Thursday, April 7)

Here's a problem for you: If you draw 8 cards from a standard deck without replacement, what is the probability that you will draw at least one king? (Remember to be careful about not inducing order in your hand!)
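If you want to check your answer, complementary counting with unordered hands does it in two lines (a Python sketch of mine):

```python
from math import comb

# Count unordered 8-card hands: a hand with no king draws all 8 cards
# from the 48 non-kings.
p_no_king = comb(48, 8) / comb(52, 8)
p_at_least_one = 1 - p_no_king   # just under 1/2
```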

Day 6 (Friday, April 8)

The equation we talked about at the end of class today is called the Inclusion-Exclusion Principle in your book. I prefer the Principle of Inclusion and Exclusion because the acronym is nicer.

The problem I recommended to you about the professors grading multiple-choice tests is #18 in section 3.2.

Day 7 (Monday, April 11)

Here's another problem: Suppose that you've put all the possible nonnegative integer solutions of x_1 + x_2 + x_3 = 7 on slips of paper, put them into a container, and shaken it so well that you have equal probability of drawing out any slip. If you draw a slip, what is the probability that the solution it contains will have x_1=3?
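Here's a sketch of mine that counts the slips with stars and bars and then brute-forces the container as a sanity check:

```python
from math import comb

# Stars and bars: x_1 + x_2 + x_3 = 7 has C(7 + 2, 2) nonnegative solutions.
total = comb(9, 2)        # 36 slips in the container
# Fixing x_1 = 3 leaves x_2 + x_3 = 4, which has C(4 + 1, 1) solutions.
favorable = comb(5, 1)
p = favorable / total     # 5/36

# Brute-force check: list every slip directly.
slips = [(a, b, 7 - a - b) for a in range(8) for b in range(8 - a)]
```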

We aren't going over section 3.3 in this course, but it's a rather interesting piece on card shuffling. You might want to have a look at it (after this week's quiz, of course).

Day 8 (Wednesday, April 13)

First of all, the Monty Hall simulation I was using is here.

Second, several of you came up with nice variations on this problem. I'll write them here and leave them to you to think about. Analyzing them involves exactly the same strategy I used at the board today.
• Suppose Monty doesn't always open a door hiding a goat. If both doors hide goats, he has a 50% chance of choosing either door, but if one hides a goat and one hides a car, he has a 90% chance of choosing the door with the goat and a 10% chance of choosing the door with the car. Given this, should you still switch?
• Now suppose there are four doors and the game goes like this:
• You choose a door.
• Monty chooses one of the other three, always revealing a goat, and offers you the chance to switch to one of the other two doors.
• You decide whether to switch (and, if you do switch, which door to switch to).
• Monty opens one of the remaining two doors (if you switched, this includes your original choice of door) and offers you the chance to switch again.
• You decide whether to switch (at this point, you will have no choice about which to switch to).
For the second game, should your strategy be different in the following two cases: if you originally switched and then Monty opens the door you had originally chosen, and if you originally switched and then Monty opens the other door that you could have switched to and didn't?
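For the first variation, here's a Monte Carlo sketch of mine (the 90/10 behavior is the one described above). It estimates the probability that switching wins, conditioned on Monty actually revealing a goat; compare what you get with the 2/3 from the classic game.

```python
import random

def trial(rng):
    """One round of the variant where Monty is only 90% likely to reveal
    the goat when he has a choice between a goat and the car.
    Returns (goat_revealed, switch_wins)."""
    car = rng.randrange(3)
    pick = 0                                # by symmetry, always pick door 0
    others = [d for d in range(3) if d != pick]
    if car == pick:                         # both remaining doors hide goats
        opened = rng.choice(others)
    else:                                   # one goat, one car among the others
        goat = next(d for d in others if d != car)
        opened = goat if rng.random() < 0.9 else car
    switch_to = next(d for d in others if d != opened)
    return (opened != car), (switch_to == car)

rng = random.Random(42)
wins = shown = 0
for _ in range(100_000):
    revealed, win = trial(rng)
    if revealed:                            # condition on seeing a goat
        shown += 1
        wins += win

estimate = wins / shown   # conditional probability that switching wins
```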

Day 9 (Friday, April 15)

If P(F|E)=P(F), why must P(E|F)=P(E)? You may also want to work through the proof of the theorem I gave in class.

Day 10 (Monday, April 18)

For more information about the logical fallacy I was talking about at the end of class, you might want to take a look at the Wikipedia page on conditional probability. The book it mentions by John Allen Paulos, Innumeracy, is very good.

You might also want to try phrasing the medical example I did formally in terms of Bayes's Theorem.

Day 11 (Wednesday, April 20)

Here's another Bayes's formula problem for you: Suppose that you are a general who is listening to two colonels giving you advice about the enemy's plans. The enemy will attack from exactly one of the right, the center, and the left in the next two days. The first colonel tells you that the probability of an attack from the left is 1/5, from the right is 3/10, and from the center is 1/2. The second colonel tells you that he's tapped into the enemy's communications, and that the probability of hearing the messages he heard is 1/5 if the enemy plans to attack from the left, 7/10 if they plan to attack from the right, and 1/10 if they plan to attack from the center. Given that these messages have been heard, what are the probabilities of attacks from the left, right, and center?
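Once you've identified the priors and the likelihoods, the arithmetic is mechanical; here's a sketch of mine you can check your work against:

```python
# Bayes's formula with three hypotheses: priors from the first colonel,
# likelihoods of the intercepted messages from the second.
priors      = {"left": 1/5, "right": 3/10, "center": 1/2}
likelihoods = {"left": 1/5, "right": 7/10, "center": 1/10}

evidence = sum(priors[d] * likelihoods[d] for d in priors)   # P(messages heard)
posteriors = {d: priors[d] * likelihoods[d] / evidence for d in priors}
```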

x-hour (Thursday, April 21)

Here's a nice article about the St. Petersburg paradox. You might want to think about different variations on this game: what if you stop flipping after n flips whether or not a tails comes up? What if you change the payoff for the nth flip to something other than 2^n?
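For the first variation, here's a sketch under one convention (my own choice of rules): flip at most n times, win 2^k if the first tails shows up on flip k, and win nothing if no tails appears.

```python
# Truncated St. Petersburg game: each flip k contributes
# (1/2)**k * 2**k = 1 to the expectation, so the game is worth exactly n.
def expected_payoff(n):
    return sum((0.5 ** k) * (2 ** k) for k in range(1, n + 1))
```

That the truncated game is worth exactly n is one way to see why the expectation of the untruncated game diverges.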

Day 12 (Friday, April 22)

For your cultural education, here are Frank Sinatra and Dean Martin, and Marlon Brando, singing a couple of songs from "Guys and Dolls." Both songs involve playing craps.

You should also keep in mind that in a casino, the payout for a win may not be precisely what we would calculate it should be... it may be tilted slightly against you.

Day 15 (Friday, April 29)

Try proving that the standardized random variable X* (defined in #12, section 6.2) has the expected value and standard deviation I said it would in class. You might also want to read the section on p. 259 explaining why V(X+Y) isn't always V(X) + V(Y).

Day 16 (Monday, May 2)

The Galton board applet I used in class can be found here. Enjoy!

Day 17 (Wednesday, May 4)

As we talk about more and more distributions, try to keep track of the ways they're related to each other. The negative binomial distribution can be derived from the geometric distribution, for instance. Also try to remember what kind of situation each distribution we talk about applies to!

Day 18 (Friday, May 6)

The Poisson distribution is probably the least discrete distribution we've looked at. It's discrete in the sense that the collection of outcomes is a discrete set (0 events occur, 1 event occurs, 2 events occur, etc.), but these events can occur at any point. Contrast this with the binomial distribution, where events can only occur at specified points (drawing a ball out of a box, flipping a coin, drawing a card, etc.). If you're interested, here's a nice analysis of the original Prussian cavalry horse-kick data.

Here is a link to some information about the Benford distribution with some nice examples. If you'd like to find out what a log table looks like, go over to Rauner library and ask to see this.

Day 19 (Monday, May 9)

The coupon collector simulator I was using can be found here. Please make sure you understand the purpose of each of the random variables I talked about today!

Day 20 (Wednesday, May 11)

I stated the Weak Law of Large Numbers today in class:

The limit of the probability that a sample average differs from the expected value by more than epsilon is zero for every positive epsilon.

The Strong Law of Large Numbers is as follows:

The probability that the limit of the sample average differs from the expected value by more than epsilon is zero for every positive epsilon.

The difference between these two is subtle but important. The first says that as you take a larger and larger sample, the probability that your sample average will be outside some particular margin of error shrinks. The second says that as you take a larger and larger sample, your sample average will almost certainly get closer and closer to the actual expected value. The second implies the first, but not vice versa (see #16 in Section 8.1).
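Here's a small empirical illustration of the weak law for fair-coin flips, where the expected value is 1/2 (my sketch; the trial counts and epsilon are arbitrary choices). It estimates the probability that the sample average misses 1/2 by more than epsilon, for a small and a large sample:

```python
import random

def miss_probability(n, epsilon=0.05, trials=2000, seed=1):
    """Estimated probability that the average of n fair-coin flips
    lands outside (0.5 - epsilon, 0.5 + epsilon)."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        heads = sum(rng.random() < 0.5 for _ in range(n))
        if abs(heads / n - 0.5) > epsilon:
            misses += 1
    return misses / trials

small_sample = miss_probability(25)    # outside the margin quite often
large_sample = miss_probability(400)   # outside the margin rarely
```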

Day 21 (Friday, May 13)

If you're reading this, you know we're working through section 9.1 right now. Everything we've done today is on pp. 326-27. It's dense stuff, but I assure you that pp. 328-29 will fly right by!

Day 22 (Monday, May 16)

The moral of the Central Limit Theorem is that if you standardize a binomial distribution correctly, you can approximate it by a normal distribution with expected value 0 and variance 1. I showed you some tables of probabilities for this normal distribution. Please make sure you understand how to use them!
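To see how good the approximation is, here's a sketch of mine comparing an exact binomial probability with the standardized-normal value the tables give (including the usual continuity correction):

```python
from math import comb, erf, sqrt

n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))   # 50 and 5

def binom_cdf(k):
    """Exact P(X <= k) for X binomial(n, 1/2)."""
    return sum(comb(n, j) for j in range(k + 1)) / 2 ** n

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

exact  = binom_cdf(55) - binom_cdf(44)           # P(45 <= X <= 55), about 0.73
approx = (normal_cdf((55.5 - mu) / sigma)
          - normal_cdf((44.5 - mu) / sigma))     # the table-based estimate
```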

Day 23 (Wednesday, May 18)

Today we talked about confidence intervals and polling. Towards the end, questions came up about how to estimate the p and q to plug into the calculation of the standard deviation. One possibility is just to use the observed p and q from another poll. Another possibility (and a very good one!) is just to note that pq is never more than .25, so we could use .25 in place of pq. That might result in a larger n, but at least we can be sure it will be big enough!
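Here's what the worst-case pq = .25 buys you in practice (a sketch of mine; I've used z = 1.96 for 95% rather than the rounder 2 we used in class):

```python
from math import ceil

# With pq capped at 0.25, a 95% interval for a proportion has half-width
# about z * sqrt(0.25 / n) = z / (2 * sqrt(n)); solve for n.
def sample_size(margin, z=1.96):
    return ceil((z / (2 * margin)) ** 2)

n_needed = sample_size(0.03)   # a 3-point margin of error needs about 1068 people
```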

You might want to take the poll at Fallacy Watch, and you might be interested in this list of questions for understanding how much data from a given poll are worth.

Day 24 (Friday, May 20)

First of all, we found that approximately 0% of you have gone longer than a month without washing your sheets and that approximately 22% of you sleep with a stuffed animal or baby blanket.

To do confidence intervals for this sort of question, we consider the fraction of the "yes" answers that come from the coin flip. This is the only part of the problem for which we need to worry about a confidence interval (think about why!). Now we can just calculate the confidence interval as for a standard binomial distribution: we have (1-p)k coin tosses that matter, and each has probability of success .5.

Day 25 (Monday, May 23)

Ian Stewart wrote two wonderful articles about Markov chains and Monopoly for Scientific American in April and October 1996. You should be able to download the PDFs from the Scientific American archives as long as you're online via Dartmouth's secure network.

If you want to review matrix multiplication and finding inverses of matrices, the Dartmouth library has several e-books. Here's how to find them:

1. Go to the Dartmouth library website.
2. Type "linear algebra" into the "Summon" box.
3. When the search results come up, you'll see a list of check boxes on the left labeled "Content Type." Check the box that says "eBook."
4. Look through the list. The Schaum's outline is generally pretty good, but you might find others just as (or more) useful.

Day 26 (Wednesday, May 25)

Today we talked about absorbing Markov chains. Here's an extra example to work through: the Coin and Die game. There are two players: C and D. C has a fair coin, and D has a fair 6-sided die. On C's turn, she flips her coin. If it's heads, she wins. If it's tails, it's D's turn next. On D's turn, he rolls his die. If it's a 1, he wins. If it's a 6, it's C's turn next. If it's 2, 3, 4, or 5, he rolls again.

• Determine the transition matrix for this game and identify the transient and absorbing states. (Hint: the states should be "C's turn," "D's turn," "C wins," and "D wins.")
• Put the transition matrix into the form we used in class and compute its fundamental matrix.
• Given a starting transient state, find the expected number of times the game will be in each transient state.
• Given a starting transient state, find the expected number of steps before reaching an absorbing state.
• If there's more than one absorbing state, calculate the probabilities that if you start in a particular transient state, you finish in each of the various absorbing states. (We haven't done this yet, but we'll talk about it on Friday.)
The Chutes and Ladders example is here.
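To check your work on the Coin and Die game, here's a sketch of mine with exact fractions. It follows the canonical-form recipe from class, with the 2-by-2 inverse worked by hand; whether the numbers it produces match yours is the real test.

```python
from fractions import Fraction as F

# Transient states: 0 = "C's turn", 1 = "D's turn".
# Q holds transient-to-transient probabilities; R holds transient-to-absorbing
# probabilities, with absorbing states ("C wins", "D wins").
Q = [[F(0),    F(1, 2)],
     [F(1, 6), F(4, 6)]]
R = [[F(1, 2), F(0)],
     [F(0),    F(1, 6)]]

# Fundamental matrix N = (I - Q)^(-1), via the 2x2 inverse formula.
a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]
det = a * d - b * c
N = [[d / det, -b / det],
     [-c / det, a / det]]

# Expected visits to each transient state are the entries of N; expected
# steps to absorption from C's turn is the first row sum.
steps_from_C = N[0][0] + N[0][1]

# Absorption probabilities B = N * R (this is Friday's topic).
B = [[sum(N[i][k] * R[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]
p_C_wins = B[0][0]   # probability C wins, starting on C's turn
```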

Day 27 (Friday, May 27)

The Monopoly simulator I showed in class appears here.