CHANCE AND PROBABILITY.


In calculating the probability of any event, the difficulty is not, as many persons imagine, in the process, but in the statement of the proposition, and the great trouble with many of those who dispute on questions of chance is that they are unable to think clearly.

The chance is either for or against the event; the probability is always for it. The probability is expressed by a fraction, the denominator being the total number of equally likely events possible, and the numerator the number of events favourable. For instance: The probability of throwing an ace with one cast of a single die is expressed by the fraction 1/6; because six different numbers may be thrown, and they are all equally probable, but only one of them would be an ace. Odds are found by deducting the favourable events from the total, or the numerator from the denominator. In the example, the odds against throwing an ace are therefore 5 to 1. The greater the odds against any event the more improbable it is said to be, and the more hazardous it is to risk anything upon it.
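These figures are easily verified with a short Python sketch (modern code, not part of the original text; the function name is illustrative):

```python
from fractions import Fraction

def odds_against(favourable, total):
    """Odds against an event: unfavourable to favourable, in lowest terms."""
    g = Fraction(total - favourable, favourable)
    return (g.numerator, g.denominator)

prob_ace = Fraction(1, 6)      # one favourable face out of six
print(prob_ace)                # 1/6
print(odds_against(1, 6))      # (5, 1), i.e. 5 to 1 against an ace
```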

When an event happens which is very improbable, the person to whom it happens is considered lucky, and the greater the improbability, the greater his luck. If two men play a game, the winner is not considered particularly lucky; but if one wanted only two points to go out and the other wanted a hundred, the latter would be a very lucky man if he won.

It is a remarkable fact that luck is the only subject in the world on which we have no recognised authority, although it is a topic of the most universal interest. Strictly speaking, to be lucky simply means to be successful, the word being a derivative of gelingen, to succeed. There are a few general principles connected with luck which should be understood by every person who is interested in games of chance. In the first place, luck attaches to persons and not to things. It is useless for an unlucky man to change the seats or the cards, for no matter which he chooses the personal equation of good or bad luck adhering to him for the time being cannot be shaken off. In the second place, all men are lucky in some things, and not in others; and they are lucky or unlucky in those things at certain times and for certain seasons. This element of luck seems to come and go like the swell of the ocean. In the lives of some men the tide of fortune appears to be a long steady flood, without a ripple on the surface. In others it rises and falls in waves of greater or lesser length; while in others it is irregular in the extreme; splashing choppy seas to-day; a storm to-morrow that smashes everything; and then calm enough to make ducks and drakes with the pebbles on the shore. In the lives of all the tide of fortune is uncertain; for the man has never lived who could be sure of the weather a week ahead. In the nature of things this must be so, for if there were no ups and downs in life, there would be no such things as chance and luck, and the laws of probability would not exist.

The greatest fallacy in connection with luck is the belief that certain men are lucky, whereas the truth is simply that they have been lucky up to that time. They have succeeded so far, but that is no guarantee that they will succeed again in any matter of pure chance. This is demonstrated by the laws governing the probability of successive events.

Suppose two men sit down to play a game which is one of pure chance; poker dice, for instance. You are backing Mr. Smith, and want to know the probability of his winning the first game. There are only two possible events, to win or lose, and both are equally probable, so 2 is the denominator of our fraction. The number of favourable events is 1, which is our numerator, and the fraction is therefore ½, which always represents equality.

Now for the successive events. Your man wins the first game, and they proceed to play another. What are the odds on Smith’s winning the second game? It is evident that they are exactly the same as if the first game had never been played, because there are still only two possible events, and one of them will be favourable to him. Suppose he wins that game, and the next, and the next, and so on until he has won nine games in succession, what are the odds against his winning the tenth also? Still exactly an even thing.

But, says a spectator, Smith’s luck must change; because it is very improbable that he will win ten games in succession. The odds against such a thing are 1023 to 1, and the more he wins the more probable it is that he will lose the next game. This is what gamblers call the maturity of the chances, and it is one of the greatest fallacies ever entertained by intelligent men. Curiously enough, the men who believe that luck must change in some circumstances, also believe in betting on it to continue in others. When they are in the vein they will “follow their luck” in perfect confidence that it will continue. The same men will not bet on another man’s luck, even if he is “in the vein,” because “the maturity of the chances” tells them that it cannot last!

GAMES. ODDS.
One 1 to 1
Two 3 to 1
Three 7 to 1
Four 15 to 1
Five 31 to 1
Six 63 to 1
Seven 127 to 1
Eight 255 to 1
Nine 511 to 1
Ten 1023 to 1

If Smith and his adversary had started with an agreement to play ten games, the odds against either of them winning any number in succession would be found by taking the first game as an even chance, expressed by unity, or 1. The odds against the same player winning the second game also would be twice 1 plus 1, or 3 to 1; and the odds against his winning three games in succession would be twice 3 plus 1, or 7 to 1, and so on, according to the figures shown in the margin.
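The doubling rule in this paragraph, and the marginal table above, amount to saying that the odds against winning n games in succession are 2 to the nth power, less one, to 1. A few lines of Python (a modern addition) confirm both the closed form and the recurrence:

```python
# odds against winning n successive games: 2**n - 1 to 1,
# equivalently the text's rule odds(n) = 2 * odds(n - 1) + 1
def odds_against_run(n):
    return 2 ** n - 1

for n in range(1, 11):
    print(n, odds_against_run(n), "to 1")   # ends with: 10 1023 to 1
```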

GAMES.
1st 2nd
1 1
1 0
0 1
0 0

That this is so may easily be demonstrated by putting down on a sheet of paper the total number of events that may happen if any agreed number of games are played, expressing wins by a stroke, and losses by a cipher. Take the case of two games only. There are four different events which may happen to Smith, as shown in the margin. He may win both games or lose both; or he may win one and lose the other, either first. Only one of these four equally probable events being favourable to his winning both games, and three being unfavourable, the odds are 3 to 1 that he does not win both; but these are the odds before he begins to play. Having won the first game, there are only two events possible, those which begin with a win, and he has an equal chance to win again.

GAMES.
1st 2nd 3rd
1 1 1
1 1 0
1 0 1
1 0 0
0 0 0
0 0 1
0 1 0
0 1 1

If the agreement had been to play three games, there would have been eight possible events, one of which must happen but all of which were equally probable. These are shown in the margin. If Smith wins the first game, there are only four possible events remaining; those in which the first game was won. Of these, there are two in which he may win the second game, and two in which he may lose it, showing that it is still exactly an even thing that he will win the second game. If he wins the second game, there are only two possible events, the first two on the list in the margin, which begin with two wins for Smith. Of these he has one chance to win the third game, and one to lose it. No matter how far we continue a series of successive events it will always be found that having won a certain number of games, it is still exactly an even thing that he will win the next also. The odds of 1023 to 1 against his winning ten games in succession existed only before he began to play. After he has won the first game, the odds against his winning the remaining nine are only 511 to 1, and so on, until it is an even thing that he wins the tenth, even if he has won the nine preceding it.
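The enumeration argument above can be carried out mechanically. The following Python sketch (not part of the original) lists the eight possible three-game sequences and checks that, after two wins, the third game is still an even chance:

```python
from itertools import product

outcomes = list(product([1, 0], repeat=3))   # 1 = win, 0 = loss; 8 sequences

# before play: only one of the eight sequences is three straight wins
wins_all = [o for o in outcomes if o == (1, 1, 1)]

# after two wins: only two sequences remain possible, one of which
# wins the third game, so the third game is still an even chance
after_two = [o for o in outcomes if o[:2] == (1, 1)]
print(len(outcomes), len(wins_all), len(after_two))   # 8 1 2
```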

In the statistics of 4000 coups at roulette at Monte Carlo it was found that if one colour had come five times in succession, it was an exactly even bet that it would come again; for in twenty runs of five times there were ten which went on to six. In the author’s examination of 500 consecutive deals of faro, there were 815 cards that either won or lost three times in succession, and of these 412 won or lost out. In a gambling house in Little Rock a roulette wheel with three zeros on it did not come up green for 115 rolls, and several gamblers lost all they had betting on the eagle and O’s. When the game closed the banker informed them that the green had come up more than twenty times earlier in the evening. They thought the maturity of the chances would compel the green to come; whereas the chances really were that it would not come, as it had over-run its average so much earlier in the evening. The pendulum swings as far one way as the other, but no method of catching it on the turn has ever yet been discovered.

Compound Events. In order to ascertain the probability of compound or concurrent events, we must find the product of their separate probabilities. For instance: The odds against your cutting an ace from a pack of 52 cards are 48 to 4, or 12 to 1; because there are 52 cards and only 4 of them are aces. The probability fraction is therefore 1/13. But the probability of cutting an ace from each of two separate packs is 1/13 × 1/13 = 1/169, or 168 to 1 against it.
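A brief Python check of the compound-event multiplication (the code and its variable names are modern additions):

```python
from fractions import Fraction

p_one_pack = Fraction(4, 52)            # four aces in 52 cards: 12 to 1 against
p_both_packs = p_one_pack * p_one_pack  # concurrent events multiply
print(p_one_pack, p_both_packs)         # 1/13 1/169, i.e. 168 to 1 against
```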

Suppose a person bets that you will not cut a court card, K Q or J, from a pack of 52 cards, what are the odds against you? In this case there are three favourable events, but only one can happen, and as any of them will preclude the others, they are called conflicting events, and the probability that one or another of them happens is the sum of their separate probabilities. In this case the probability of each event separately is 1/13, and the sum of the three is therefore 1/13 + 1/13 + 1/13 = 3/13; or 10 to 3 against it.

In order to prove any calculation of this kind all that is necessary is to ascertain the number of remaining events, and if their sum, added to that already found, equals unity, the calculation must be correct. For instance: The probability of turning a black trump at whist is 13/52 + 13/52 = 26/52; because there are two black suits of 13 cards each. The only other event which can happen is a red trump, the probability of which is also 26/52, and the sum of these two probabilities is therefore 26/52 + 26/52 = 52/52, or unity.
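Both the conflicting-events sum and the proof by unity can be verified in Python (a modern addition to the text):

```python
from fractions import Fraction

p_court = 3 * Fraction(4, 52)      # K, Q or J: three conflicting events
print(p_court)                     # 3/13, i.e. 10 to 3 against

p_black = Fraction(26, 52)         # two black suits of 13 cards each
p_red = Fraction(26, 52)           # the only other possible event
print(p_black + p_red == 1)        # True: the probabilities sum to unity
```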

Another fallacy in connection with the maturity of the chances is shown in betting against two successive events, both improbable, one of which has happened. The odds against drawing two aces in succession from a pack of 52 cards are 220 to 1; but after an ace has been drawn the odds against the second card being an ace also are only 16 to 1, although some persons would be mad enough to bet 1000 to 1 against it, on the principle that the first draw was a great piece of luck and the second ace was practically impossible. While the four aces were in the pack the probability of drawing one was 4/52. One ace having been drawn, 3 remain in 51 cards, so the probability of getting the second is 3/51, or 1/17. Before a card was drawn, the probability of getting two aces in succession was the product of these fractions; 1/13 × 1/17 = 1/221. On the same principle the odds against two players cutting cards that are a tie, such as two Fours, are not 220 to 1, unless it is specified that the first card shall be a Four. The first player having cut, the odds against the second cutting a card of equal value are only 16 to 1.
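The two-ace calculation, in a short Python sketch (modern code, not in the original):

```python
from fractions import Fraction

p_first = Fraction(4, 52)    # four aces among 52 cards
p_second = Fraction(3, 51)   # three aces left among 51, i.e. 16 to 1 against
both = p_first * p_second
print(both)                  # 1/221, i.e. 220 to 1 against before any draw
```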

Dice. In calculating the probabilities of throws with two or more dice, we must multiply together the total number of throws possible with each die separately, and then find the number of throws that will give the result required. Suppose two dice are used. Six different throws may be made with each, therefore 6 × 6 = 36 different throws are possible with the two dice together. What are the odds against one of these dice being an ace? A person unfamiliar with the science of probabilities would say that as two numbers must come up, and there are only six numbers altogether, the probability is 2/6, or exactly 2 to 1 against an ace being thrown. But this is not correct, as will be immediately apparent if we write out all the 36 possible throws with two dice; for we shall find that only 11 of the 36 contain an ace, and 25 do not. The proper way to calculate this is to take the chances against the ace on each die separately, and then to multiply them together. There are five other numbers that might come up on each die, and the fraction of their probability is 5/6 × 5/6 = 25/36, or 25 to 11 in their favour.

Take the case of three dice: As three numbers out of six must come up, it might be supposed that it was an even thing that one would be an ace. But the possible throws with three dice are 6 × 6 × 6 = 216; and those that do not contain an ace are 5 × 5 × 5 = 125; so that the probability of missing the ace is 125/216, and the odds against getting an ace in one throw with three dice, or three throws with one die, are 125 to 91.
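Both dice results, the brute-force count of 11 in 36 and the 125/216 figure, can be checked with Python (a modern addition):

```python
from itertools import product
from fractions import Fraction

# two dice: count the throws containing an ace by writing out all 36
throws = list(product(range(1, 7), repeat=2))
with_ace = sum(1 for t in throws if 1 in t)
print(with_ace, len(throws) - with_ace)   # 11 25

# three dice: 5*5*5 of the 6*6*6 throws miss the ace entirely
no_ace = Fraction(5, 6) ** 3
print(no_ace)                             # 125/216, i.e. 125 to 91 against
```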

To find the probability of getting a given total on the faces of two or three dice we must find the number of ways that the desired number can come. In the 36 possible throws with two dice there are 6 which will show a total of seven pips. The probability of throwing seven is therefore 6/36, or 5 to 1 against it. A complete list of the combinations with two dice was given in connection with Craps.
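The count of six ways to throw seven can be confirmed by the same brute-force enumeration (Python, a modern addition):

```python
from itertools import product

# enumerate all 36 throws of two dice and count the totals of seven
sevens = sum(1 for a, b in product(range(1, 7), repeat=2) if a + b == 7)
print(sevens)   # 6 of 36 throws, i.e. 5 to 1 against
```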

Poker. In calculating the probability of certain conflicting events, both of which cannot occur, but either of which would be favourable, we must make the denominator of our fraction equal in both cases, which will, of course, necessitate a proportionate change in our numerator. Suppose a poker player has three of a kind, and intends to draw one card only; the probability of his getting a full hand is 1/16; of getting four of a kind, 1/48. To find the total probability of improvement, we must make the first fraction proportionate to the last, which we can do by multiplying it by 3. The result will be 3/48 + 1/48 = 4/48; showing that the total chance of improvement is 1 in 12, or 11 to 1 against it.
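The common-denominator step can be done exactly with Python's fractions module (modern code; the 1/16 and 1/48 figures are taken from the text as given):

```python
from fractions import Fraction

p_full = Fraction(1, 16)     # the text's figure for filling to a full hand
p_fours = Fraction(1, 48)    # and for making four of a kind
p_improve = p_full + p_fours # conflicting events: probabilities add
print(p_improve)             # 1/12, i.e. 11 to 1 against any improvement
```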

Whist. To calculate the probable positions of certain named cards is rather a difficult matter, but the process may be understood from a simple example. Suppose a suit so distributed that you have four to the King, and each of the other players has three cards; what are the probabilities that your partner has both Ace and Queen? The common solution is to put down all the possible positions of the two named cards, and finding only one out of nine to answer, to assume that the odds are 8 to 1 against partner having both cards. This is not correct, because the nine positions are not equally probable. We must first find the number of possible positions for the Ace and Queen separately, afterward multiplying them together, which will give us the denominator; and then the number of positions that are favourable, which will give us the numerator.

As there are nine unknown cards, and the Ace may be any one of them, it is obvious that the Queen may be any one of the remaining eight, which gives us 9 × 8 = 72 different ways for the two cards to lie. To find how many of these 72 will give us both cards in partner’s hand we must begin with the ace, which may be any one of his three cards. The Queen may be either of the other two, which gives us the numerator, 3 × 2 = 6; and the fraction of probability, 6/72, = 1/12; or 11 to 1 against both Ace and Queen.

If we wished to find the probability of his having the Ace, but not the Queen, our denominator would remain the same; but the numerator would be the three possible positions of the Ace, multiplied by the six possible positions of the Queen among the six other unknown cards, in the other hands, giving us the fraction 18/72. The same would be true of the Queen but not the Ace. To prove both these, we must find the probability that he has neither Ace nor Queen. There being six cards apart from his three, the Ace may be any one of them, and the Queen may be any one of the remaining five. This gives us 6 × 5 = 30, and the fraction 30/72. If we now add these four numerators together, we have:—for both cards in partner’s hand, 6; for Ace alone, 18; for Queen alone, 18; and for neither, 30; a total of 72, or unity, proving all the calculations correct.
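The whole 72-way enumeration can be performed mechanically; this Python sketch (a modern addition, with illustrative variable names) reproduces the four numerators:

```python
from itertools import permutations

# nine unknown cards: positions 0-2 are partner's three, 3-8 the other hands
partner = set(range(3))

# every ordered placement of (Ace position, Queen position): 9 * 8 = 72
placements = list(permutations(range(9), 2))
both_with_partner = sum(1 for a, q in placements
                        if a in partner and q in partner)
ace_only = sum(1 for a, q in placements
               if a in partner and q not in partner)
queen_only = sum(1 for a, q in placements
                 if a not in partner and q in partner)
neither = sum(1 for a, q in placements
              if a not in partner and q not in partner)
print(both_with_partner, ace_only, queen_only, neither)   # 6 18 18 30
```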

In some of the problems connected with Whist, it is important to know the probability of the suits being distributed in various ways among the four players at the table; or, what is the same thing, the probable distribution of the four suits in any one hand. The author is indebted to Dr. Pole’s “Philosophy of Whist” for these calculations. As an example of the use of this table, suppose it was required to find the probability of any other player at the table holding four or more trumps if you had six. Take all the combinations in which the figure 6 appears, and add together the number of times they will probably occur. That will be your denominator, 166. The numerator will be the number of times that the combinations occur which contain a figure larger than 3, in addition to the 6. This will be found to be 74, and the probability will therefore be 74/166.

DISTRIBUTIONS.   TIMES IN 1000.
8 2 2 1                2
8 3 1 1                1
8 3 2 0                1
8 4 1 0                ½
8 5 0 0                0
7 2 2 2                5
7 3 2 1               19
7 3 3 0                3
7 4 1 1                4
7 4 2 0                3
7 5 1 0                1
7 6 0 0                0
6 3 2 2               57
6 3 3 1               35
6 4 2 1               47
6 4 3 0               13
6 5 1 1                7
6 5 2 0                6
6 6 1 0                1
5 3 3 2              155
5 4 2 2              106
5 4 3 1              130
5 4 4 0               12
5 5 2 1               32
5 5 3 0                9
4 3 3 3              105
4 4 3 2              215
4 4 4 1               30
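The worked example above, 74/166 for another player holding four or more trumps, can be recomputed from the table with a few lines of Python (modern code; only the rows containing a 6 are needed):

```python
# the rows of Dr. Pole's table in which the figure 6 appears,
# with their probable occurrences in 1000 deals
table = {
    (6, 3, 2, 2): 57, (6, 3, 3, 1): 35, (6, 4, 2, 1): 47,
    (6, 4, 3, 0): 13, (6, 5, 1, 1): 7, (6, 5, 2, 0): 6, (6, 6, 1, 0): 1,
}
denom = sum(table.values())
# rows containing a figure larger than 3 in addition to the 6
numer = sum(v for k, v in table.items() if any(x > 3 for x in k[1:]))
print(numer, denom)   # 74 166
```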

MARTINGALES. Many gamblers believe that as the science of probabilities teaches us that events will equalise themselves in time, all that is necessary is to devise some system that will keep a person from guessing, so that he may catch the pendulum as it swings; and to add to it some system of betting, so that he will have the best of it in the long run. Some content themselves with playing a “system” against banking games, which is merely a guide to the placing of the bets, the simplest example of which would be to bet always on heads if a coin was tossed a thousand times, or to bet on nothing but red at Roulette. Others depend more on martingales, which are guides to the amount of the bets themselves, irrespective of what they are placed on.

The most common form of martingale is called doubling up, which proceeds upon the theory that if you lose the first time and bet double the amount the next time, and continue to double until you win, you must eventually win the original amount staked. If there was no end to your capital, and no betting limit to the game, this would be an easy way to make money; but all banking games have studied these systems, and have so arranged matters that they can extend their heartiest welcome to those who play them.

In the first place, by simply doubling up you are giving the bank the best of it, because you are not getting the proper odds. If you double up five times you are betting 16 to 1; but the odds against five successive events are 31 to 1, as we have already seen, and the bank should pay you 31 instead of 16. You should not only double, but add the original amount of the stake each time, betting 1, 3, 7, 15, 31, 63, and so on. If you do this, you will win the amount of your original stake for every bet you make, instead of only for every time you win. This looks well, but as a matter of fact doubling up is only another way of borrowing small sums which will have to be paid back in one large sum when you can probably least afford it.
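The staking plan described here follows the recurrence "double and add the original stake", so the nth stake is 2 to the nth power, less one, units. A short Python sketch (a modern addition) shows that winning any bet in the series recovers all previous losses plus one unit for every bet made:

```python
# build the 1, 3, 7, 15, ... series by doubling and adding the original stake
stakes = [1]
for _ in range(6):
    stakes.append(stakes[-1] * 2 + 1)
print(stakes)               # [1, 3, 7, 15, 31, 63, 127]

# lose the first six bets, win the seventh: the win covers every loss
# and leaves one unit of profit for each of the seven bets made
losses = sum(stakes[:-1])
print(stakes[-1] - losses)  # 7
```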

Suppose the game is Faro, the chips five dollars a stack of twenty, or twenty-five cents each, and the limit on cases a hundred dollars. The limit on cases will then be 400 chips. If eight successive events go against your “system,” which they will do about once in 255 times, your next bet will be beyond the limit, and the banker will not accept it. At Monte Carlo the smallest bet is a dollar, and the limit is $2,400. They roll about 4,000 coups a week, and if you were to bet on every one of them, doubling up, you would win about $1,865, one dollar at a time, and would lose $4,092 simply through being unable to follow your system beyond the limit of the game during the two or three occasions, in the 4,000 coups, that your system would go against you for eleven or more coups in succession. It is useless to say it would not go against you so often, for probabilities teach us that it would be more wonderful if it did not than if it did.

It must never be forgotten that the most wonderful things that happen are not more wonderful than those that don’t happen. If you tossed a coin a thousand times, and did not once toss heads eight times in succession, it would be four times more surprising than if you tossed heads ten times in succession.

Bets.
Won.  Lost.
 10     -
  9     -
  8     -
  -     7
  -     8
  9     -
  -     8
  -     9
 10     -
  -     9
 ---   ---
 46    41

Progression. This is a favourite martingale with those who have not the courage or the money to double up. It consists in starting with a certain amount for the first bet, say ten dollars, and adding a dollar every time the bet is lost, or taking off a dollar every time a bet is won. If the player wins as many bets as he loses, and there is no percentage against him, he gets a dollar for every bet he wins, no matter how many bets he makes, or in what order the bets are won and lost, so that the number won equals the number lost. That this is so may be easily demonstrated by setting down on a sheet of paper any imaginary order of bets, such as the ten shown in the margin, five of which are won, and five lost; the net profit on the five bets won being five dollars. No matter how correctly the player may be guessing, and how much the luck runs his way, he wins smaller and smaller amounts, until at last he is “pinched off.” But if a long series of events goes against him his bets become larger and larger, but he must keep up the progression until he gets even. If ten bets go his way he wins $55; if ten go against him he loses $145.
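The progression described here is easy to simulate; the following Python sketch (modern code, with an illustrative function name) reproduces the marginal example's net gain of five dollars, as well as the $55 and $145 figures:

```python
# progression martingale: start at 10, add a dollar after each loss,
# take off a dollar after each win
def progression(results, start=10):
    stake, balance = start, 0
    for won in results:
        if won:
            balance += stake
            stake -= 1
        else:
            balance -= stake
            stake += 1
    return balance

# the marginal example: five bets won, five lost, in the order shown
print(progression([1, 1, 1, 0, 0, 1, 0, 0, 1, 0]))  # 5
print(progression([1] * 10))                        # ten straight wins: 55
print(progression([0] * 10))                        # ten straight losses: -145
```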

It is said that Pettibone made a fortune playing progression at Faro, which is very likely, for among the thousands of men who play it the probabilities are that one will win all the time, just as the probabilities are that if a thousand men play ten games of Seven Up, some man will win all ten games. At the same time it is equally probable that some man will lose all ten.

Some players progress, but never pinch, keeping account on a piece of paper how many bets they are behind, and playing the maximum until they have won as many bets as they have lost. Against a perfectly fair game, with no percentage and no limit, and with capital enough to follow the system to the end, playing progression would pay a man about as much as he could make in any good business with the same capital and with half the worry; but as things really are in gambling houses and casinos, all martingales are a delusion and a snare. It is much better, if one must gamble, to trust to luck alone, and it is an old saying that the player without a system is seldom without a dollar. It is the men with systems who have to borrow a stake before they can begin to play.

Such matters as calculating the probability of a certain horse getting a place, the odds against all the horses at the post being given, would be out of place in a work of this kind; but those interested in such chances may find rules for ascertaining their probability in some of the following text books.

TEXT BOOKS.

  • Calcul des Probabilités, by Bertrand.
  • Philosophy of Whist, by Dr. Pole.
  • Winning Whist, by Emory Boardman.
  • Chance and Luck, by R.A. Proctor.
  • Complete Poker Player, by John Blackbridge.
  • Bohn’s Handbook of Games.
  • Betting and Gambling, by Major Churchill.
