The nature of poker
Understanding the nature of poker - By Mike Caro
Note: This entry first appeared in Card Player magazine. It was originally published in 1998 in two parts.

Let’s think together. Today’s column is going to be a little tougher on our brains than most. But it won’t hurt much, and we can use the exercise. What we’re going to think about is important, and I want to make sure that we understand.

Someone died a few months ago. His name was Andy Morton, and he had contributed greatly to an Internet newsgroup called rec.gambling.poker. RGP is a place where I hang out a lot and participate in the discussions, and it’s a place you should learn to visit – whether just to read messages or to add to the discussions. You’ll need newsreader software like Forte’s Agent, or you can use facilities built into America Online, Netscape Navigator, or Internet Explorer. Anyway, ask around if you don’t know how to access the newsgroups. It will be worth your trouble.

Back to Andy. He was young and died tragically in a motorcycle accident. He was a rising star among poker theorists, and he offered us something that has come to be known as Morton’s Theorem, eloquently stated and brilliantly explained. (You learned more about it in Lou Krieger’s column a few issues ago.) He had once contacted me to say that he agreed with 90 percent of what I said. But after Andy died, a friend of his posted something he had written – something I had never seen before. In it, Andy says he had totally agreed with a column I had written, then changed his mind. He goes on to partially challenge David Sklansky’s Fundamental Theorem of Poker and to describe and explain the foundation that makes up Morton’s Theorem.

Who’s right? I want to talk about these things today. First of all, I don’t disagree with Andy’s theory. But it simply doesn’t dispute what I said. And I think Sklansky’s theorem stands as gallantly as ever.
But, more than anything else, I’m going to show you today how a simple example I used for teaching poker 20 years ago really goes a long way toward bringing all these thoughts – and more – together.

First, here is Sklansky’s Fundamental Theorem of Poker, from the book Sklansky on Poker Theory, page 30. "Anytime you are playing an opponent who makes a mistake by playing his hand incorrectly based on what you have, you have gained. Anytime he plays his hand correctly based on what you have, you have lost." (David has stated that this is intended to apply to head-to-head situations, which involve only two players.)

The dispute began with a publicly posted message from Abdul Jalib (who is himself a great asset to the RGP community) to rec.gambling.poker. He said he was saddened by the loss of Andy Morton, who had once stayed at his residence and shared many powerful insights about poker. He included the words of Morton. And these are Morton’s words:

"I usually enjoy reading Mike Caro’s Card Player column. One from last June made a big impression on me. In it he says: Begin my quote: The real low-limit secret for today: The most important thing I can teach you about playing the lower limits is that you usually should *not* raise from early positions, no matter what you have… because all of those theories of thinning the field and driving out opponents who might draw out on you don’t hold true in these smaller games [where] you’re usually surrounded by players who often call with nearly hopeless hands…. Which is better, playing against a few strong and semi-strong players with possibly a small advantage for double stakes, or playing against a whole herd of players, mostly weak, for single stakes? Clearly, when you’re not likely to win the pot outright by chasing everyone out, you want to play against weak opponents, and the more the merrier. So, why raise? There, I’ve just described one of the costliest mistakes in low-limit poker.
The mistake is raising when many potential callers remain behind you, thus chasing away your profit. Don’t do that. End of my quote.

Until recently, this made a lot of sense to me. After all, the Fundamental Theorem of Poker states (roughly) that when your opponents make mistakes, you gain, and when they play correctly, you lose. In holdem, if all of those calling stations in the low-limit games want to chase me with their 5-out draws to make trips or two pair when I flop top pair best kicker, and they don’t have the pot odds to correctly do so, that sounds like a good situation for me. Yet, it seems like these players are drawing out so often that something must be wrong. Hang around the mid-limits, holdem or stud, for any length of time and you’re sure to hear players complain that the lower limit games can’t be beat. You can’t fight the huge number of callers, they say. You can’t protect your hand once the pot has grown so big, they say.

At first, I thought these players were wrong. They just don’t understand the increased variance of playing in such situations, I told myself. In one sense, these players are right, of course. The large number of calling stations combined with a raise or two early in a hand make the pots in these games very large relative to the bet size. This has the effect of reducing the magnitude of the errors made by each individual caller at each individual decision. Heck, the pot might get so big from all that calling that the callers _ought_ to chase. For lack of a better term, I call this behavior on the fishes’ part _schooling_.

Still, tight-aggressive players are on average wading into these pots with better than average hands, and in holdem when they flop top pair best kicker, for example, they should be taking the best of it against each of these long-shot draws (like second pair, random kicker).
In holdem, the schooling phenomenon increases the variance of the player who flops top pair holding AK, but probably also _increases_ his expectation in the long run, I thought, relative to a game where these players are correctly folding their weak draws. Thinking this way, I was delighted to follow Caro’s advice, and not try to run players with weak draws out of the pots where I thought I held the best hand on the flop or turn. This is contrary to a lot of advice from other poker strategists, as Caro points out, and I found myself (successfully, I think) trying to convince some of my poker playing buddies of Caro’s point of view in a discussion last week.

Well, some more thinking, rereading some old r.g.p. posts (thank you, dejanews), a long discussion with Abdul Jalib, and a little algebra have changed my mind: I think Caro’s advice is dead wrong (at least in many situations) and I think I can convince you of this, if you’ll follow me for a bit longer." End of Morton post.

Andy’s post goes on and becomes quite compelling. You should not be satisfied with the way I’m summarizing his thoughts. Instead, if you have access to the web, you should go to http://www.DejaNews.com and search for his article. In general, he says that I am wrong because there is a point in very many multi-way hands at which every additional caller gives you the worst of it. By making incorrect decisions, they are costing you money. And, Andy is right! I just wish we’d had the chance to discuss this, because I’ve actually said that – although I haven’t set forth a proof as elaborate, as example-filled, and as thought-provoking as he did.

David Sklansky’s theorem comes under question in the same post because – although Sklansky pointed out specifically (right under the theorem itself) that his concept may not always apply to multi-way pots – Morton thought that the theorem seldom applies to multi-way pots.
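Morton’s "schooling" observation is, at bottom, pot-odds arithmetic. Here is a minimal sketch of that arithmetic (my illustration, not Morton’s): a 5-out draw with one card to come and 47 unseen cards is an error in a small pot, but once the schooled pot grows big enough, the very same call becomes correct.

```python
# Pot-odds arithmetic behind the "schooling" effect: a 5-out draw
# (e.g. second pair improving to trips or two pair) with one card to
# come and 47 unseen cards. All values are in units of one bet.

def call_ev(pot_bets: float, outs: int, unseen: int = 47) -> float:
    """Expected value, in bets, of calling one bet to see the next card:
    win the pot plus the bet faced when you hit, lose the call when you miss."""
    p_hit = outs / unseen
    return p_hit * (pot_bets + 1) - (1 - p_hit)

# In a small pot, chasing 5 outs is the individual mistake Morton describes...
print(call_ev(pot_bets=4, outs=5))   # negative: folding is correct
# ...but schooling can inflate the pot past the break-even size,
# and then the very same call is right.
print(call_ev(pot_bets=12, outs=5))  # positive: chasing is now correct
```

The break-even pot here is (47 − 5)/5 − 1 = 7.4 bets, which is why a raise-pumped multi-way pot converts each caller’s individual error into a correct chase.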
I don’t know which way David believes in this regard, but I think that for most situations, even in multi-way pots, you will profit when an opponent makes a bad decision. True, the error can sometimes help another opponent more than you, and can even cost you money, but – in general – I still believe that loose callers are to your benefit in real poker games, with eight or fewer opponents. Fine. I have said all these things previously:

1. That when you raise with a strong hand with the intention of "thinning the field" of opponents, you are usually employing the wrong strategy. Why? First, because you’ll usually make more money if everyone calls (this is what’s in dispute). Second, if you do succeed in thinning the field, you are most likely going to chase away the players you would most like to have call you and be isolated against the players you wanted to fold. In other words, thinning the field usually results in you chasing out the weaker hands and failing to chase out the stronger ones. That’s a problem with the strategy.

2. There really are some hands that play better against fewer opponents.

3. If you have the second-best hand possible and people are drawing to beat it, there comes a point when there are too many callers and your profit dwindles.

4. There comes a point beyond that when your hand is not profitable at all.

5. There comes a point beyond that when your hand is almost worthless.

At a seminar in the late eighties, I explained that, in a real-life, eight-handed poker game, some hands play best against an exact number of opponents. More opponents and these hands fare worse. Fewer opponents and these hands fare worse. (Of course, the exact number of opponents that most hands – in fact, all but the very strongest hands – play best against is zero! This means that you will usually make more instant profit by winning the ante or blinds outright than you will make, on average, if you are pursued by opponents.)
In discussions on RGP, people have surmised that you can have too many pathetically loose callers – that you should prefer a game where there are one or two sensible opponents to one where all opponents are out to play every hand and destroy their bankrolls. This is, perhaps, a way of applying Morton’s Theorem, but it’s wrong. I guarantee you that in most ring games, consisting of the usual number of opponents, you will make more money if you are surrounded by the weakest foes you can find – and the more the better. Still, I’m about to show you why this isn’t so for large groups of opponents.

Nobody gets it. Not only have I theorized that there are hands in actual real-life poker games that play best against an exact number of opponents, I have taught something since the seventies that should help us to understand these seemingly mysterious poker principles. I posted this concept to RGP, but everyone seemed to skip right over it, as if it were trivial and inconsequential. It isn’t. It explains everything – and so, I’m going to share it with you now.

In draw poker, two small pair is a favorite against someone drawing cards, often against two players drawing cards. But beyond that the hand often becomes unprofitable. In poker, you see, a hand can be a favorite against each opponent individually and still lose money against a large field of opponents collectively. To explain this, I have – for more than 20 years – used the everyone-in-the-world-playing-poker analogy. It goes like this. If you were playing five-card draw against five billion opponents (everyone in the world back in 1976) and you were dealt a pat king-high straight flush, you’d be a huge favorite against each individual opponent. But if everyone in the world called, your hand would be almost worthless. How come? It’s because somebody out there is sure to end up with a royal flush – in fact, many opponents will share the pot with royal flushes. Suppose everyone bet $1. The pot is now $5 billion.
I know what you’re thinking. You’re thinking, "OK, I’m probably going to get this king-high straight flush beat, but it’s still profitable. If I lose, I lose $1. If I win, I win $5 billion. So, I’ll just stick with this hand and hope nobody beats it." But, it doesn’t work that way. Come close. Listen to me. Each additional opponent only adds $1 to the pot. But the cumulative likelihood of somebody beating you grows by a factor that quickly overwhelms the value of the dollar.

I know this is a tough concept, but look at it this way. I need you to think hard now. Suppose we played a game where you wagered $1 and everyone else wagered $2. This is a coin-flip game: each opponent flips a coin, and any opponent who flips tails beats you and wins (or shares) the pot. The game can’t get any simpler. Now, you’re thinking, "Hey, how can this be bad? My opponent has only a fifty-fifty chance of flipping tails. If he does, I’ll lose $1. But if he doesn’t, I’ll win $2." It’s clear that on average, you’ll gain $1 in profit every two flips (winning $2 once and losing $1 once). So, it’s worth 50 cents to you for that opponent to flip. Fine.

Now someone else wants to play. Is this good? Well, let’s think. In order to beat two players, you need to have both of them flip heads. If one flips tails, he will win the pot, and you will lose your dollar. If both flip tails, they will split the pot, and you will lose your dollar. But, let’s see. There are four ways the coins can fall: heads-heads, heads-tails, tails-heads, and tails-tails. If you played long enough, on average you would win only one out of four times (when you saw heads-heads). So, you’d lose $1 three times and win $4 once ($2 from each opponent) in four "deals." So, again, you win $1. But this time it took you four tries, on average, and the value was only 25 cents per try. Clearly, you were better off without the second player.

But, wait, it gets worse. What if a third opponent joins in? You are now going to win just one out of eight times.
The results can be heads-heads-heads, heads-heads-tails, heads-tails-heads, tails-heads-heads, tails-tails-tails, tails-tails-heads, tails-heads-tails, or heads-tails-tails. Of these eight possible outcomes, only heads-heads-heads is good for you. So, on average, for every eight "deals" against three opponents you will win $6 once ($2 from each of three opponents) and lose $1 seven times. Now, you’re actually averaging a loss of 12½ cents a try. And it gets worse and worse, my friends, as you add players, until your bet is virtually worthless. This same concept applies with your king-high straight flush, if you play against everyone in the world and deal from an endless deck. You will be getting five billion to one odds, but you won’t win once in five billion times. Not even close! What does this prove? Lots, like:

1. You can’t play poker rationally without an ante. Only a pat royal flush could wager, and it could only be rationally called by another pat royal flush.

2. Very strong hands fare well against a single caller.

3. Very strong hands fare better as you add a second, third, and fourth caller.

4. At some point, there can be so many callers that even the second-best possible hand loses money.

5. You really can be better off theoretically if there are sensible players in your game, limiting your field of opponents to the right number of weak callers. (This doesn’t often happen in real poker games with the usual number of players, though.)

6. There is such a thing as "implied collusion," which is a point Andy Morton makes. If everyone understands that they will split up the king-high straight flush’s money if they all call, they can profit by doing so.

Let’s probe deeper. Let’s say you had an ante-less game involving everyone in the world and you were awarded a pat king-high straight flush by the rules. That’s the good news. The bad news is that you must make a $1,000 blind bet. Picture it.
Everyone else, in turn, can either pass or match your $1,000 and be dealt five random cards in an attempt to beat your hand. Those are all the rules. No drawing. No raising. Call and get five cards for $1,000 or fold. So, now you think to yourself, "This is ridiculous! I’m just going to get my $1,000 back. Nobody’s going to be stupid enough to call." And for a while, it looks like you’re right. Ten opponents pass. Then 50. Then 100. But, finally, the 171st opponent calls. His wife is astonished, having already folded. "What the hell are you doing, Harry!" she shrieks. "I feel lucky," he mumbles.

"This is great!" you think. "I just made $1,000!" Then you think, "Well, almost $1,000. Of course, there’s one chance in 649,740 that he’ll be dealt a royal. Or he might tie me." (Remember, the cards are dealt randomly from an infinite deck, because there are so many people in the world, so the cards you hold don’t influence the probabilities of anyone catching a royal.) Eighty more opponents pass. Then someone else feels lucky, too. "Gosh," you think. "Now I’m almost certain to win TWO thousand dollars!"

The action goes on and on, everyone deciding what to do when it’s their turn. After several days, you begin to get nervous. You’ve got 450,000 callers, and the pot has grown to $450,001,000. You think, "Gee whiz, I’ve got only about a fifty-fifty chance of winning now." And your assessment is right (because of a formula I have previously discussed, but one that isn’t worth delving into now). But you’re still going to make a huge profit. As the days go by, though, it gets to the point where there are too many callers. The odds of your winning have diminished faster than the expected pot size has grown, and now each additional caller costs you more and more. It’s exactly the same as the coin-flip example. Exactly.
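Both thought experiments can be checked with a few lines of arithmetic. This is a sketch of mine, not Caro’s, and it follows the column’s own assumptions: independent fair coins, and pat royal flushes dealt 1 in 649,740 from an effectively infinite deck, with ties and split pots ignored for simplicity.

```python
# Verify the coin-flip numbers and the "too many callers" turning point
# in the blind-bet thought experiment, under the column's assumptions.

def flip_ev(opponents: int) -> float:
    """Average profit per game: you bet $1, each opponent bets $2, and you
    collect $2 from each opponent only if every one of them flips heads."""
    p_win = 0.5 ** opponents
    return p_win * (2 * opponents) - (1 - p_win) * 1

print(flip_ev(1))  # 0.5    -> worth 50 cents per try
print(flip_ev(2))  # 0.25   -> only 25 cents per try
print(flip_ev(3))  # -0.125 -> now losing 12.5 cents per try

Q = 1 - 1 / 649_740  # chance one random hand is NOT a pat royal flush

def blind_ev(callers: int, stake: float = 1_000.0) -> float:
    """Profit on the $1,000 blind with the king-high straight flush: win
    every caller's stake if nobody holds a royal, else lose the blind
    (ties and split pots ignored for simplicity)."""
    p_survive = Q ** callers
    return p_survive * callers * stake - (1 - p_survive) * stake

print(Q ** 450_000)                             # ~0.50, the story's fifty-fifty point
print(blind_ev(649_740) > blind_ev(5_000_000))  # True: the peak comes earlier
```

Maximizing blind_ev over the number of callers puts the peak near 649,740 callers; beyond that, every extra caller lowers expectation, and with all five billion in, the hand almost surely loses the full blind.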
Here is the post I made to rec.gambling.poker after Andy Morton died:

I don’t know whether I ever had the pleasure of meeting Andy Morton personally, although I’m told he sometimes played at Hollywood Park while I was there. He did, however, e-mail me (a copy appears at the end of this post). He said that he agreed with me about 90 percent of the time. Your post must be in reference to the other 10 percent. :-)

I, too, am saddened by his death, and the passages you quoted show much brilliance that — if expanded over the years — would have been welcomed by the ever-evolving family of poker analysts. It is simply one of the most interesting and unexpectedly thoughtful pieces of poker thinking I’ve read recently. Thank you for sharing it. I will have to study the words more closely, but he says that he did agree with me at first, then, later, he didn’t. One way or the other, he was bound to be right, and in this case, I believe it was the former.

There is nothing wrong with his argument. There are, of course, ways that a player can do the unprofitable thing and the benefit of this is directed toward a player other than yourself. I point this out in discussing draw poker (which is easy to understand). If you hold a pair of kings and four opponents are drawing to flushes of different suits, then the PRIMARY beneficiary (in rare cases the ONLY beneficiary) of the extra players is the person drawing to the BEST flush. The rest SHOULD BE getting good pot odds, but they aren’t, if you KNOW what the other hands are. Still, their problem isn’t with YOU and your pair of kings; it’s with simultaneously made flushes. If the weakest flush connects, YOU don’t care how many of the others make a flush (unless you make a "miracle" full house or better, in which case you hope they ALL connect). But the weakest flush, having connected, DOES care. The same is true of two small pair.
If a lot of players are drawing to beat the hand, it’s a favorite against EACH opponent individually, but is a loser against ALL of them collectively. In part, I’ve made the same point that Andy made in what will seem to be a dissimilar way. I may have even made it on this forum years ago.

Suppose there were no draw, just five-card poker, where you play what you get, and almost everyone in the world is dealt in from an infinite deck. There are over five billion contestants. You are dealt K-Q-J-10-9, all spades. You are more than a 600,000 to 1 favorite not to lose (win or tie) against any opponent you could randomly choose. And still you’d need to throw that nine away and try for a royal flush, if you could draw. But unfortunately, you’re stuck with the hand — and the loss — because there IS no draw. As I’m sure you understand, the pot goes to just one player, and that means the odds against your hand holding up grow disproportionately with the addition of each opponent. So, if I told you in advance that you would be dealt a king-high straight flush and I let you choose the ideal number of opponents, that number can be calculated for maximum profit.

There is possibly a simple way to resolve the larger argument about whether you want many weak callers, but not to everyone’s satisfaction. And I’ll leave it to others to actually do it. Start working down the list of best hold ’em hands: A-A, K-K, Q-Q, J-J, A-K suited, and so forth. Go as far as you want. You’ll need to write a simple match-up program to do this. Create a computer simulation to run each hand against nine random opponents. And, then, for each hand, examine the nine opponents and toss out the weakest five. Then, for each random match-up, also run the selected target hand against just the four remaining opponents. Then measure in which case the target hand did better, based on the actual share of the pots won versus the "fair share." You’ll have to select your own method, but I’m open to anything reasonable.
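The harness Caro describes can be sketched in a few lines. A serious version needs a real hold ’em equity evaluator; in this schematic stand-in of mine, each hand’s showdown strength is just a uniform random draw, and the premium "target" hand is modeled as the best of three uniforms (an arbitrary choice). Every name and number here is illustrative, not a real poker equity.

```python
# Schematic version of the proposed match-up experiment: compare a target
# hand's pot share to its "fair share" (1 / number of players) against
# nine opponents versus against only the strongest four of them.

import random

def run_matchups(trials: int = 100_000, seed: int = 7) -> tuple[float, float]:
    """Return (share / fair share) for the full nine-opponent field and
    for the trimmed field of the four strongest opponents."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        target = max(rng.random() for _ in range(3))     # toy premium hand
        strongest = max(rng.random() for _ in range(9))  # best of nine foes
        # Tossing out the five weakest opponents never removes the strongest
        # one, so in this toy the win probability is identical either way;
        # what changes is the field size, and with it the fair-share yardstick.
        if target > strongest:
            wins += 1
    share = wins / trials  # ~0.25: target holds 3 of the 12 uniform draws
    return share / (1 / 10), share / (1 / 5)

vs_nine, vs_four = run_matchups()
print(f"vs nine opponents: {vs_nine:.2f}x fair share")
print(f"vs four strongest: {vs_four:.2f}x fair share")
```

Even this crude stand-in shows the full field delivering the larger multiple of fair share, the direction Caro predicts; a real test would swap the uniform draws for an actual equity evaluator and account for dead money and the betting rounds.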
Naturally, there is bias against the four-opponent trial, because there is presumably some dead money to be considered, and with four opponents, it is "divided" fewer ways. Also, this doesn’t adequately take into consideration the fact that some players will pass along the way, and that they may pass either correctly or incorrectly at THOSE stages, too. We’re just trying to keep it simple to see if there’s an obvious truth. We can make reasonable adjustments now, and later (if necessary). And we can set up elaborate simulations that actually play through the hands (although that will lead to elaborate and understandable arguments about how the chosen strategy skewed the results).

Here’s what I think one will eventually discover by pursuing this analysis to its conclusion (assuming that’s possible):

1. There are specific situations in which you’d rather have fewer than many opponents.

2. There are some situations in which there is a "perfect" number of opponents. More hurts profit; fewer hurts profit. (I have written and spoken at length about this, so I don’t wish to be pinned TOO firmly to the quote Andy cites from my column. I use different pieces of poker theory to illustrate different points at different times.)

3. In MOST situations, you will increase your profit if your field of opponents is larger, holding weaker average hands, than you will if your field of opponents is smaller, holding stronger average hands.

4. The money has to go SOMEWHERE. Therefore (excluding house rakes for convenience), all money lost by poor play ends up in opposing stacks.

5. Sometimes, certain hands, especially the best speculative hands, get a greater benefit from the money lost on poor play than do other hands.

6. Andy’s concept of "Implicit Collusion" is counterbalanced to some degree by "Implicit Shared Profit."
What I mean is that you don’t usually enjoy perfect knowledge, when you enter a pot, about WHICH hand is the most likely to be punished disproportionately by the weak entrants. But you do know that the weak entrants themselves will eventually be punished. I could go on, but I won’t.

Abdul, I am very grateful that you published Andy’s post. And I am honored that he thought enough of my research to probe deeper. It is, overall, solid reasoning, and I’m not sure — if he and I had the chance to sit down and talk — that we would disagree much. Also, I would like to acknowledge that I have read many of your posts and often find them brilliant, as well. I appreciate your contributions to this forum.

Straight Flushes, Mike Caro

There is so much more I’d like to say on this topic. But we don’t have the time or the space. So, this is what I believe: Even if the too-many-opponents theory does frequently apply in regular poker games, the act of trying to take advantage of it by limiting the field is likely to cost you money. That’s because you are apt to chase out the weak opponents who could supply profit on future betting rounds and leave yourself against your least desirable opponents. The cure can become worse than the disease.

The bottom line is that Andy Morton is right. David Sklansky is also right. And I am always right. But, who didn’t know that? MC