Thursday, March 24, 2011

Two + Two Makes Fold


Old school purists say it’s all about the feel of the game. The new whiz kids on the block are all about the math. Like it or not, math plays an important role in poker. Card Player India gets Rishabh Jhunjhunwala, an avid poker player and Maths Hons. graduate from Oxford University, to explain some of the mathematical concepts related to poker.

Contrary to what a poker novice might think, mastering the math of poker is a fairly straightforward  exercise. To put it in perspective, let’s begin by stating David Sklansky’s Fundamental Theorem of Poker that, though named overzealously, gets to the heart of what poker is all about:

“Every time you play a hand differently from the way you would have played it if you could see all your opponents' cards, they gain; and every time you play your hand the same way you would have played it if you could see all their cards, they lose.”

The skill of reading opponents’ cards, then, is what it’s about. The closer you get to a perfect read – one is tempted to say a Phil Iveyan read, but desists out of deference to an omniscient power – the better you become at poker. But what of the betting pattern you should follow given a good read? That is dictated primarily by the underlying math of poker, and the Fundamental Theorem implicitly assumes – with good reason – that you can work the math perfectly. A good read and sound, internalized math basics are the foundational blocks of a good poker player, and math is the easy part.

I’ve seen players with respectable reading skills and strategy make shameful calls or laydowns because they either did not stop to work out the math properly or did not know how to. Many players would have a lot more moolah to show for their hours of poker (certainly I would) if they had done just a little bit of background reading on poker theory before playing. I went through tens of sessions of incessantly chasing flush and straight draws when the pot odds dictated otherwise and doing min raises when I should have been overbetting the pot, gifting away a lot of money in the process. Even a cursory study can remedy such a situation. 
In this write-up, I cover the essentials of poker math for novices with a few examples.

Odds and Probability
These concepts are best illustrated straight up with examples. The odds of hitting a 6 on a die are 1:5, which means that for every 1 time you expect to hit a 6, you expect not to hit it 5 times. It is identical to say that the odds against hitting a 6 on a die are 5:1, or that the probability of hitting a 6 on a die is 1/6th.

A little clarification on the common confusion between odds and probability: odds of x:y for an event mean that, on average over x+y trials, the event will occur x times and fail to occur y times. The probability of the event occurring is then x out of x+y times, or x/(x+y). These are two distinct ways of conveying the same information.
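For readers who prefer to see the conversion spelled out, here is a small Python sketch of the two directions (the function names are mine, purely for illustration):

def odds_to_probability(x, y):
    """Odds of x:y in favour of an event -> probability x / (x + y)."""
    return x / (x + y)

def probability_to_odds(p):
    """Probability p -> odds of 1 : (1 - p) / p in favour of the event."""
    return 1, (1 - p) / p

print(odds_to_probability(1, 5))   # 1/6, about 0.167: hitting a 6 on a die
print(probability_to_odds(1 / 6))  # roughly (1, 5.0), i.e. odds of 1:5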

Expected Value
Expected value, or EV, of a bet is the long-term average value you’d expect to get from it. Simply put, if you ran several trials of a bet, added up the outcomes, and divided the total by the number of trials, this average figure would be the EV (or pretty darn close to it).

EV is a simple and useful measure of good decision-making in any bet, whether in poker or otherwise. If the EV of a bet is positive, you should accept the bet, and if it is negative, you shouldn’t. The more positive it is, the better the bet for you. You can get lucky several times even if you go against EV calculations, but in the long run, the gods of probability will have their way and smile upon players who make the mathematically correct decisions more often. In a typical game of blackjack in a casino, the house has an EV of less than $1 for every $100 bet against it, yet this very small positive EV churns out impressive profits for the house in the long run.
To keep it simple, we will focus on how to make decisions in the midst of a poker game using EV and not introduce precise EV calculations here.

Let’s start with an elementary example. If you were offered a bet where you’d win $5 for hitting a 6 on a die and lose $1 if you didn’t, should you accept it? As intuition would dictate, this decision rests on whether the money you stand to win is enough to compensate for the small chance of hitting a 6. Mathematically, the odds of hitting a 6 are 1:5 and you stand to lose $1 for an opportunity to win $5. As it turns out, this bet is neither favorable nor unfavorable for you; the EV is zero. However, if anything more than $5 were offered in this bet, then the EV of the bet would be positive and accepting it would be the mathematically correct option, and vice versa.
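Spelling out the arithmetic behind that conclusion, here is a quick sketch (the helper function is just for illustration):

from fractions import Fraction

def expected_value(p_win, win_amount, lose_amount):
    """Long-run average result of a bet: win with probability p_win, lose otherwise."""
    return p_win * win_amount - (1 - p_win) * lose_amount

print(expected_value(Fraction(1, 6), 5, 1))  # 0: the $5 offer is exactly break-even
print(expected_value(Fraction(1, 6), 6, 1))  # 1/6: a $6 offer has positive EV, so take it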

Now we get to how these concepts apply to betting in poker. First, some basic terminology to help our discussion:    

Pot odds
Pot odds are the ratio of the current size of the pot to the size of the call a player must make to stay in the hand. For example, if the pot contains $100 and you must call a bet of $10 to stay in the hand, you are getting 100:10, or 10:1 pot odds.

Outs
An out is any unseen card that will improve a hand to a winning hand. What constitutes a winning hand depends on what cards you put the other players on.

Say you start a poker hand with Ah Kh, and the flop comes Qh 8h 2c, the turn 7s. You are on a nut flush draw, and there are nine unseen cards – all the remaining hearts – that make your flush. So you have nine outs to a flush. If you think a pair of aces or kings is good enough to win against the other players in the hand, then you have six additional outs – all the unseen aces and kings. However, if you put a player on a set, then 2h and 7h are not outs for you – either card would make that player a full house or better, blowing your nut flush out of the water.

Say that in the given situation you think you have exactly the 9 outs that make your flush. There are 46 unseen cards: 52 cards in the deck minus 2 in your hand and 4 on the board.  Then there are 37 cards that miss your flush (46 – 9). Therefore the odds against your making a winning hand are 37:9, or about 4:1.
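In code, the count looks like this (a trivial sketch; the function name is mine):

def odds_against_draw(outs, unseen=46):
    """Return (misses, hits): unseen cards that miss the draw vs. cards that complete it."""
    return unseen - outs, outs

misses, hits = odds_against_draw(9)  # (37, 9) for the nut flush draw on the turn
print(misses / hits)                 # about 4.1, i.e. roughly 4:1 against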

Say a player ahead of you bet $10 into the pot of $20 on the turn, making the pot size $30, and the action shifts to you. Then the pot odds you are laid are 30:10, or 3:1.

How do you decide if you should call or not? (We ignore the option of raising for now.)

You use the following simple principle: if the pot odds you are getting are better than the odds against making a winning hand, the expected value of the call must be positive, and therefore the call is correct. This comparison, really, in its various avatars, is the gist of the mathematics of poker.

In this case, since 3:1 is less than 4:1, it appears prima facie that this call is incorrect. Let us stick with that decision for the moment until we revisit this example shortly. However, if, say, the pot size were $50 instead of $30, the pot odds would be 50:10, or 5:1, making a call the right decision.
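Putting the comparison together in one place, here is a sketch of the decision (ignoring raises and implied odds for now; the function name is mine):

def call_is_profitable(pot, to_call, outs, unseen=46):
    """A call is +EV when the pot odds beat the odds against completing the draw."""
    pot_odds = pot / to_call                 # e.g. 30 / 10 = 3.0, i.e. 3:1
    odds_against = (unseen - outs) / outs    # e.g. 37 / 9 ~ 4.1, i.e. about 4:1
    return pot_odds > odds_against

print(call_is_profitable(30, 10, 9))  # False: 3:1 is worse than ~4:1
print(call_is_profitable(50, 10, 9))  # True: 5:1 is better than ~4:1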

If you’re thinking that you’d rather lose money than make such calculations mid-game and kill all the fun of playing poker, fear not. Accurate pot odds calculations can be really useful when it comes to drawing hands like flush and straight draws, and in all-in situations where there is no further betting to be done, but they are not required in most situations. With practice, you develop a familiarity with pot odds in standard situations and a sense of when to call with a hand, when to fold, and how much to bet so that you don’t concede favorable pot odds to a drawing opponent.

A quick and useful way of approximating the probability of making a winning hand in poker is provided by this simple method: multiply the number of outs you have by 2 to get an estimate of the percentage chance of hitting one of these outs on the next street. So, in our example, the probability of making a flush on the next street is 9 outs times 2, or about 18%. On the flop, with two cards to come, the same rule gives a 9 x 2 x 2 = 36% chance of making a flush by the river. The exact figures are roughly 19.6% with one card to come and 35% with two, so the estimate is fairly close to the actual probability.
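If you’d like to check the shortcut against the exact arithmetic yourself, a few lines like these do the job (47 unseen cards on the flop, 46 on the turn):

outs = 9  # nut flush draw

# One card to come (turn to river): 46 unseen cards.
exact_one_card = outs / 46              # about 0.196
estimate_one_card = outs * 2 / 100      # 0.18

# Two cards to come (flop to river): 47 unseen cards on the flop.
exact_two_cards = 1 - (47 - outs) / 47 * (46 - outs) / 46  # about 0.35
estimate_two_cards = outs * 4 / 100                        # 0.36

print(exact_one_card, estimate_one_card)    # ~0.196 vs 0.18
print(exact_two_cards, estimate_two_cards)  # ~0.35 vs 0.36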

This clarifies, I hope, the general method of determining whether a call has a positive EV and therefore whether it is mathematically correct. It is an easy exercise to verify that whenever you have a winning hand – I mean a hand that has more than a 50% probability of remaining the winning hand on the next street – the EV of a call will always be positive: the pot always contains at least the bet you are facing, so the pot odds can never be less than 1:1, while the odds against a favorite are always less than 1:1 (check this for yourself).

How do you use EV when you want to be the aggressor and bet? In general, you ensure that your opponent is not getting good pot odds to call your bet. For instance, if you have top pair or two pair on the flop and you’re putting your opponent on a drawing hand, you should make a sizeable bet, ideally one that gives her as low an EV for her call as possible without scaring her off. Every time your opponent makes a mistake by accepting a mathematically poor wager, you gain. Once in a while your opponent will hit her draw and win the pot, and you’ll curse yourself for not betting hard enough to chase her away, but in the long run you’ll be better off having her call with negative EV. The gods of probability would have it no other way.
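As a rough sketch of the sizing idea (ignoring implied odds, and with a made-up function name):

def min_bet_to_deny_odds(pot, outs, unseen=46):
    """Smallest bet B for which (pot + B) : B is worse than the odds against the draw."""
    odds_against = (unseen - outs) / outs  # e.g. 37 / 9 ~ 4.1 for a flush draw on the turn
    return pot / (odds_against - 1)        # solve (pot + B) / B < odds_against for B

print(min_bet_to_deny_odds(20, 9))  # ~6.43: any bet above that into a $20 pot denies correct odds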

Implied Odds
Let’s get back to our example: you have Ah Kh, and the board reads Qh 8h 2c 7s. Say, in addition, that you’re playing heads up and you think you need a flush to beat your opponent’s hand. Your opponent has bet $10 into a pot of $20, making the pot size $30, and the action is on you. Based on our earlier calculation, you’re not getting the pot odds to make a profitable call.

However, if you’ve played enough poker, you have sensed that something is not quite right about this decision. It is this: our calculation did not take into account what may happen on the next street. If you make the flush, you might be able to get enough of your opponent’s chips on the river to make calling on the turn worthwhile.

Where pot odds take into consideration only the money that’s in the pot right now, implied odds are an estimate that also accounts for how much money you can win on later streets if you hit one of your outs.
In this example, if you think your opponent might pay off a bet of $25 on the river if you hit your flush, you’re calling a bet of $10 on the turn to win $30 in the pot plus $25 on the river. If you don’t hit a flush, you won’t bet or call on the river, so you don’t stand to lose more than $10. Your implied odds are 55:10, or 5.5:1, which is better than 4:1, the approximate odds against making a flush. Therefore it is a profitable call. 
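The same arithmetic in sketch form (the $25 river payoff is the estimate from the example, not a sure thing):

pot = 30                    # pot after the opponent's $10 turn bet
to_call = 10
expected_river_payoff = 25  # your read of what the opponent pays off if the flush comes

implied_odds = (pot + expected_river_payoff) / to_call  # 55 / 10 = 5.5, i.e. 5.5:1
odds_against_flush = 37 / 9                             # about 4.1:1

print(implied_odds > odds_against_flush)  # True: the turn call is profitable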

A note of caution: it is a common tendency to chase draws hoping that the implied odds justify the chase. Consider that good opponents will often correctly put you on a draw and not pay you off if you hit the draw, making your implied odds no better than your pot odds. 

I hope this little introduction to poker math has helped you identify some gaps in your game or at least made you think about whether your play is mathematically sound. Once you’ve developed a good understanding of the themes presented here – through, I hope, further reading and observing examples rather than losing a lot of money – you’ll find it easy to make your play consistent with mathematical principles and exploit your opponents’ weaknesses.
Happy calculating!  


