When is it alright to go ALL-IN?
twitchySBR High Roller
- 04-24-08
- 215
#71 My job is to provide you with winning picks!
Forum Plays: 22-25-1 (+.45 Units)
Mattn3236SBR Wise Guy
- 04-21-08
- 841
#72 Go All-in whenever you want... screw the bankroll, the only way to win REAL MONEY is to bet big or go home. NO guts NO glory
square1SBR Rookie
- 01-11-08
- 37
#73 OK. Let's try to keep this tractable. Let's work out an example where utility is defined as a function of consumption rather than wealth.
Let's imagine a 2 period model where at the start of each period the player decides how much to invest and at the end of the period the investment return is realized and he then determines how much to consume. We'll further assume that each investment is a binary outcome event.
So this gives us the following variables:
Let B = Initial bankroll (we can set this to 1 unit without loss of generality)
Let I1 = Investment in period 1
Let Cw = Consumption in period 1 given period 1 investment was a win
Let Cl = Consumption in period 1 given period 1 investment was a loss
Let Iw2 = Investment in period 2 given period 1 investment was a win
Let Il2 = Investment in period 2 given period 1 investment was a loss
Note that period 2 consumption isn't a variable insofar as the player will simply consume his entire bankroll. Putting it another way: "You can't take it with you."
For simplicity we'll assume that both investments are identical, paying out at fractional odds f, and winning with probability p. We'll further assume that period 2 consumption is discounted at a rate of k (so 1 unit of utility in period 2 is worth k times as much as 1 unit of utility now -- for most players k < 1 ).
Period 2 starting bankroll given a win would be:
Bw = B + I1 * f - Cw
And given a loss:
Bl = B - I1 - Cl
So period 2 consumption (in other words, ending bankroll after the 2nd investment was realized) given a win/win would be:
Cww = Bw + Iw2 * f
Period 2 consumption given a win followed by a loss would be:
Cwl = Bw - Iw2
Period 2 consumption given a loss followed by a win would be:
Clw = Bl + Il2 * f
Period 2 consumption given a loss/loss would be:
Cll = Bl - Il2
So expected utility looks like this:
E(U) = p * ( U(Cw) + k * ( p * U(Cww) + (1-p) * U(Cwl) ) )
+ (1-p) * ( U(Cl) + k * ( p * U(Clw) + (1-p) * U(Cll) ) )
If we assume logarithmic utility then we know that period 2 investment will necessarily be the player's Kelly stake (you can't take it with you, remember?)
Iw2 = Bw/f * (p*f - (1-p))
Il2 = Bl/f * (p*f - (1-p))
So substituting in:
Cww = (B + I1 * f - Cw) * p * (f+1)
Cwl = (B + I1 * f - Cw) * (1-p) * (f+1)/f
Clw = (B - I1 - Cl) * p * (f+1)
Cll = (B - I1 - Cl) * (1-p) * (f+1)/f
So this gives us the following expected utility as a function of the decision variables Cw, Cl, and I1:
E(U) = p*( log(Cw) + k * ( p * log((1 + I1 * f - Cw) * p * (f+1)) + (1-p) * log((1 + I1 * f - Cw) * (1-p) * (f+1)/f) ) )
+ (1-p)*( log(Cl) + k * ( p * log((1 - I1 - Cl) * p * (f+1)) + (1-p) * log((1 - I1 - Cl) * (1-p) * (f+1)/f) ) )
Differentiating with respect to Cw, Cl, and I1 (writing W = Cw, L = Cl, and V = I1) and setting to zero gives us:
0 = p*( k*( -((1 - p)/(1 + f*V - W)) - p/(1 + f*V - W) ) + W^(-1) )
0 = (1 - p)*( L^(-1) + k*( -((1 - p)/(1 - L - V)) - p/(1 - L - V) ) )
0 = k*(1 - p)*( -((1 - p)/(1 - L - V)) - p/(1 - L - V) ) + k*p*( (f*(1 - p))/(1 + f*V - W) + (f*p)/(1 + f*V - W) )
Solving then yields:
I1 = ( f*p - (1-p) ) / f
Cw = ( f*I1 + 1 ) / (1+k)
Cl = ( 1 - I1 ) / (1+k)
(I'll leave it as an exercise for the motivated reader to verify that this is indeed a global maximum for f*p - (1-p) > 0, in other words for positive edge. I'll also note that this result is contingent on isoelastic utility, so partial Kelly would yield the same results, but another utility function would not.)
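As a quick sanity check of the closed-form solution above (this sketch is not from the original post; the parameter values are arbitrary assumptions), a brute-force grid search over Cw, Cl, and I1 under log utility should land on roughly the same optimum the first-order conditions give:

```python
import itertools
import math

# Hypothetical parameters (assumptions for illustration, not from the thread)
p, f, k = 0.55, 1.0, 0.9   # win probability, fractional odds, discount factor
B = 1.0                    # initial bankroll (1 unit, as in the post)

def expected_utility(I1, Cw, Cl):
    """Model G expected utility with log utility and Kelly staking in period 2."""
    Bw = B + I1 * f - Cw   # period-2 bankroll after a period-1 win
    Bl = B - I1 - Cl       # period-2 bankroll after a period-1 loss
    if min(I1, Cw, Cl, Bw, Bl) <= 0:
        return float("-inf")
    # Period-2 consumption when the period-2 Kelly bet wins/loses
    Cww, Cwl = Bw * p * (f + 1), Bw * (1 - p) * (f + 1) / f
    Clw, Cll = Bl * p * (f + 1), Bl * (1 - p) * (f + 1) / f
    return (p * (math.log(Cw) + k * (p * math.log(Cww) + (1 - p) * math.log(Cwl)))
            + (1 - p) * (math.log(Cl) + k * (p * math.log(Clw) + (1 - p) * math.log(Cll))))

grid = [i / 100 for i in range(1, 100)]
best = max(itertools.product(grid, grid, grid), key=lambda v: expected_utility(*v))
print("grid optimum (I1, Cw, Cl):", best)

# Closed-form solution from the post
I1 = (f * p - (1 - p)) / f
print("closed form  (I1, Cw, Cl):", (I1, (f * I1 + 1) / (1 + k), (1 - I1) / (1 + k)))
```

With these made-up numbers the closed form gives I1 = 0.10, Cw ≈ 0.579, Cl ≈ 0.474, and the grid search should agree to within its 0.01 step.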
Of particular interest is the variable I1 (the amount invested at the start of period 1), which you'll note is simply the Kelly stake based solely on wealth. So in other words, incorporating consumption into the Kelly framework leaves the investment solution completely unchanged! The investment amount is even independent of the discount rate. Now of course the more you discount future consumption (i.e., the lower the value of k) the more you'd choose to consume now, but the discounting won't affect how much you choose to invest.
Now granted this is a rather simplified general example (although you'd find the same results even if you went out an infinite number of periods for k < 1 -- in other words, even without the "You can't take it with you" assumption), but the point is clear. Kelly staking of the full bankroll is in this model completely consistent with maximizing utility of consumption and inconsistent with partial-wealth Kelly maximization.
Now, let's consider another model, S for Square1.
Again we will have two periods, and again the agent will eat everything at the end of period 2. But this time, the agent will need to simultaneously decide investment and consumption during period 1. Then he will observe the investment's outcome, and consume whatever he has left. Letting k, f, and p have the same interpretations and defining B=1 (I'll write d below for the discount factor k); then if c(1) and c(2) are consumption in the respective periods, and i is investment, we have:
c(2) = 1 - c(1) + fi with probability p (a win); and
c(2) = 1 - c(1) - i with probability 1-p (a loss)
and the expected utility to be maximized w.r.t. c;i is as follows:
ln c(1) + d[p * ln(1 - c(1) + fi) + (1 - p) * ln(1 - c(1) - i)]
s.t.
0 <= c <= 1
0 <= i <= 1
c + i <= 1 (no borrowing)
Let's start by taking c(1) as given and maximizing with respect to i. Since ln c(1) is an affine shift, and d>0, this is equivalent to maximizing w.r.t i the following:
p * ln(1 - c(1) + fi) + (1 - p) * ln(1 - c(1) - i)
But this looks oddly familiar - it's just the Kelly problem given a bankroll of {1 - c(1)}! Thus, optimal i is given by:
i* = (1 - c(1)) * (p - (1-p)/f)
and we can obtain c(1)* by plugging in and re-differentiating. But I think we agree that i* is the primary variable of interest here, so I will leave that to the interested reader. The important thing is that c(1) is NOT going to be zero, since that would result in infinitely negative utility in the first period. That means the agent will play full Kelly, but with a bankroll of {1 - c(1)} - which is partial wealth Kelly.
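A quick numerical illustration of the point (not part of the original post; the parameter values are made up): holding c(1) fixed and searching over i recovers the i* formula above.

```python
import math

# Hypothetical parameters (assumptions for illustration only)
p, f, d = 0.55, 1.0, 0.9     # win probability, fractional odds, discount factor
c1 = 0.30                    # take period-1 consumption as given

def eu(i):
    """Model S expected utility as a function of investment i, with c(1) fixed."""
    if not (0 < i < 1 - c1):
        return float("-inf")
    return (math.log(c1)
            + d * (p * math.log(1 - c1 + f * i)
                   + (1 - p) * math.log(1 - c1 - i)))

i_numeric = max((j / 10000 for j in range(1, 10000)), key=eu)
i_closed = (1 - c1) * (p - (1 - p) / f)   # Kelly on the post-consumption bankroll
print(i_numeric, i_closed)                # both should come out at about 0.07
```

Note that the optimal i scales with (1 - c1), not with the full wealth of 1 -- which is the partial-wealth point being made.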
So hopefully we agree that model S provides a theoretical justification for partial-wealth Kelly, just as model G provides a theoretical setting where partial-wealth Kelly is irrational.
If you're wondering why model S and model G are getting different results, the intuition is not complicated. Model G allows the agent to invest, then realize the gain/loss from investing, and then to consume. Model S requires the agent to invest and consume simultaneously first, and then realize the outcome of the investment. Thus, Model S prohibits the agent from investing and consuming the same resources in the same period, but Model G allows it. So Model S creates a tradeoff between consumption and investment that Model G lacks. Model G's agent used full-Kelly, since investment did not negatively impact current consumption. In Model S, investment does negatively impact current consumption, so the agent invests less.
Let me pre-emptively defend model S against two potential criticisms:
- Calling the results of model S partial-wealth Kelly is just an exercise in semantics. Once you've spent the money during period one, on c(1), all you have left is 1 - c(1). So, playing Kelly with 1 - c(1) IS full Kelly.
Wrong. At the point in time when you made the decision to invest the amount {1 - c(1)}, you could have invested any amount between 0 and 1. Clearly, your full wealth at that point was 1. You didn't "lose" the amount c(1) - you chose to do something else with it.
- Sporting events last about 3 hours, maybe 6 for certain baseball or hockey games. How realistic is it to assume you're making consumption decisions without knowing the outcome of your investments? Isn't model G better, where you invest, realize the results, then consume and repeat?
Yes and no, but more no than yes. If by consumption we mean buying the whole bar a round after a big win, then yes, model G is probably more accurate. But remember that consumption is very "sticky" - it's not that easy to adjust month-to-month, or even year-to-year. Nobody's going to move to a different house/apartment every time their bankroll undergoes a large swing. You have to buy cars, appliances, furniture, and other durable goods; and once you've bought them, you cannot liquidate them again without considerable annoyance and transaction costs. So I would argue on balance, it's the model where you observe the outcome first, and then are allowed to adjust that period's consumption that is the less realistic of the two.
We can further discuss the merits and shortcomings of model G against those of model S. But I do hope you're ready to retract your contention that a practitioner of partial-wealth Kelly lacks a valid theoretical micro-economic leg on which to stand. Of course it's not a perfect leg. But it's not a patently absurd leg either. It doesn't require crazy poorly-behaved preferences or depend on bizarre functional forms. It's really pretty reasonable.
DataSBR MVP
- 11-27-07
- 2236
#74 Kelly bankroll is that minus all expenses needed for supporting desired life-style. I think that is the source of the argument.
donjuanSBR MVP
- 08-29-07
- 3993
to be completetly honest NEVER.... you never wanna go all-in nothing is ever for sure.. i mean maybe 10 percent of your bankroll 1 time a year.. but never more man trust me... u might get lucky but youll also get greedy and lose the next time...
square1SBR Rookie
- 01-11-08
- 37
#76
Kelly utility is the utility function that both implies and is implied by the conclusions of John Kelly's paper. While it's true that Kelly himself never appealed to the notion of utility per se, he did however make the a priori assumption that an agent acts to maximize the expected rate of growth of bankroll. This both implies and is implied by log utility. You've fully agreed with this.
But the point (and I believe I addressed this in my earlier post) is that losing one's entire bankroll needs to be infinitely bad for the Kelly conclusions to have force of logic. Why else would a player with a readily replenishable bankroll never choose to invest 100% of said bankroll (minimally defined) unless said bet won with certainty? Why else would a player choose inaction over a 99.9999% probability of multiplying his bankroll 50-fold with a 0.0001% probability of losing everything? Indeed why else would a player actually be willing to forfeit half his bankroll to avoid making such a monstrously +EV bet? Answer? Because a lost bankroll is infinitely bad.
A player can define his goals any way he sees fit. But we should still be able to analyze these goals using traditional utility theory and determine what preferences a player would need to have for them to be rational. To this end, I'm still waiting to hear of a nonpathological example of how a player's preferences would need to look for partial wealth Kelly to be rational.
If a player's goal were solely to maximize some subset of his bankroll for no reason other than "that's what he wanted to do" then sure -- partial wealth Kelly would work. But without making that rather contrived assumption -- I just don't see how one reaches partial wealth any other way.
Let's say a man is married to an absolute tyrant. (Don't we all know at least one man like this?) And let's say the tyrant is aware that the man has an inclination to gamble, and that the man is actually a quite skilled gambler. And the tyrant says to the man "You may have $100 to gamble, but that's it. Ever. Even if you collect cans and redeem them for the deposit, you're not throwing that money away on gambling. If you lose your $100, you never gamble again. Period". And let's say the man values money, but he values domestic tranquility far more, and is a coward, so he is unwilling to disobey the tyrant's orders, no matter the financial gain. And let's say he has ethical problems with murdering her or divorcing her or even gambling behind her back (he's very worried about his eternal soul). So again, no financial incentive could ever motivate him to do these things.
So he puts his $100 into an online sportsbook (matchbook would be a pretty good plan, given their lack of minimums). He analyzes his advantage. And now he has to decide how much to bet. He has 10,000 dollars in his checking account. What is rational behavior for this man? How should he size his bets?
Everyone's real world preferences extend far beyond money. No univariate utility function can even begin to address the following question:
How much shall I use to gamble, and how much shall I use on other things?
Last edited by square1; 05-07-08, 07:53 PM. Reason: The last two sentences were poorly phrased originally
DataSBR MVP
- 11-27-07
- 2236
#77
Let's say a man is married to an absolute tyrant...
GanchrowSBR Hall of Famer
- 08-28-05
- 5011
#78Agreed. Let's call this model G for Ganchrow.
Now, let's consider another model, S for Square1.
Again we will have two periods, and again the agent will eat everything at the end of period 2. But this time, the agent will need to simultaneously decide investment and consumption during period 1. Then he will observe the investment's outcome, and consume whatever he has left. Letting k,f,and p have the same interpretations and defining B=1; then if c(1) and c(2) are consumption in the respective periods, and i is investment we have:
c(2) = 1 - c(1) + fi with probability p (a win); and
c(2) = 1 - c(1) - i with probability 1-p (a loss)
and the expected utility to be maximized w.r.t. c;i is as follows:
ln c(1) + d[p * ln(1 - c(1) + fi) + (1 - p) * ln(1 - c(1) - i)]
s.t.
0 <= c <= 1
0 <= i <= 1
c + i <= 1 (no borrowing)
Let's start by taking c(1) as given and maximizing with respect to i. Since ln c(1) is an affine shift, and d>0, this is equivalent to maximizing w.r.t i the following:
p * ln(1 - c(1) + fi) + (1 - p) * ln(1 - c(1) - i)
But this looks oddly familiar - it's just the Kelly problem given a bankroll of {1 - c(1)}! Thus, optimal i is given by:
i* = (1 - c(1)) * (p - (1-p)/f)
and we can obtain c(1)* by plugging in and re-differentiating. But I think we agree that i* is the primary variable of interest here, so I will leave that to the interested reader. The important thing is that c(1) is NOT going to be zero, since that would result in infinitely negative utility in the first period. That means the agent will play full Kelly, but with a bankroll of {1 - c(1)} - which is partial wealth Kelly.
So hopefully we agree that model S provides a theoretical justification for partial-wealth Kelly, just as model G provides a theoretical setting where partial-wealth Kelly is irrational.
If you're wondering why model S and model G are getting different results, the intuition is not complicated. Model G allows the agent to invest, then realize the gain/loss from investing, and then to consume. Model S requires the agent to invest and consume simultaneously first, and then realize the outcome of the investment. Thus, Model S prohibits the agent from investing and consuming the same resources in the same period, but Model G allows it. So Model S creates a tradeoff between consumption and investment that Model G lacks. Model G's agent used full-Kelly, since investment did not negatively impact current consumption. In Model S, investment does negatively impact current consumption, so the agent invests less.
The one problem I have with this is that it doesn't really seem to get you where you want to go. Sure, you'll be looking at a subset of bankroll each period ... but that subset will always be a constant fraction of total bankroll (isoelastic utility, remember).
So based on the S methodology, had you say a 1 unit bankroll of which you needed to consume 10% this period (for a functional BR of 90%), but then found yourself at 50% of bankroll, you'd only be able to consume 5% next period. Shoot up to 10,000% of initial and your consumption would be 1,000% of initial bankroll (although the percentages would vary based on the likely distribution of future betting opportunities).
The issue here is that this is not what most people mean when they refer to segmenting one's bankroll, where one sets aside X% of total wealth as "gambling money" and then eternally treats that as completely separate from "real money" (at least until next betting season). In no way does your above derivation serve to equate a Kelly bankroll with "what one can feel comfortable losing" nor with total wealth "minus all expenses needed for supporting desired life-style" as another post has claimed.
Nevertheless, I do read what you're saying but if the above was really what you were getting at I'd have to consider it begging the question a bit. Sure, if you decide how much to consume prior to determining your P&L (which is certainly not what I generally do) then you need to consider that amount as a sunk cost which serves to reduce bankroll by a fixed percentage. It should come as no surprise at all that if a player needs to consume and if he further needs to make his consumption decision prior to knowing the outcome of his bet he's going to have to set some dollar value aside (prepaying, if you will) that will be determined by the quality of his coming investment. Putting it another way, c(1) isn't a constant but rather also a function of f and p. Does this really sound like partial wealth Kelly? Not to me.
But is that really partial wealth Kelly? "OK, I know I need to buy groceries later so let me net that out before this next bet." No. I don't think so. I don't think that that's what most people have in mind when they think of partial-wealth Kelly ("my Kelly Bankroll is only $10,000 for the season and that's it!")
But I do think that there's probably a bit of middle ground that would involve weighting different segments of bankroll differently. (In other words attaching a likely discontinuous cost function to bankroll). The truth is that most real people do have kind of crazy cost functions associated with spending various bankroll segments. This is the nonlinearity of bankroll to which I've previously referred on this board and the nonlinearity to which you seem to be implying when you refer to "sticky consumption".
But I do hope you're ready to retract your contention that a practitioner of partial-wealth Kelly lacks a valid theoretical micro-economic leg on which to stand. Of course it's not a perfect leg. But it's not a patently absurd leg either. It doesn't require crazy poorly-behaved preferences or depend on bizarre functional forms. It's really pretty reasonable.
GanchrowSBR Hall of Famer
- 08-28-05
- 5011
#79Well, I will concede that logarithmic utility breaks down at the extremes. Much of economic theory does. But I'm still frustrated, because I don't think I've conveyed the idea behind what I'm trying to say yet.
Let's say a man is married to an absolute tyrant. (Don't we all know at least one man like this?) And let's say the tyrant is aware that the man has an inclination to gamble, and that the man is actually a quite skilled gambler. And the tyrant says to the man "You may have $100 to gamble, but that's it. Ever. Even if you collect cans and redeem them for the deposit, you're not throwing that money away on gambling. If you lose your $100, you never gamble again. Period". And let's say the man values money, but he values domestic tranquility far more, and is a coward, so he is unwilling to disobey the tyrant's orders, no matter the financial gain. And let's say he has ethical problems with murdering her or divorcing her or even gambling behind her back (he's very worried about his eternal soul). So again, no financial incentive could ever motivate him to do these things.
So he puts his $100 into an online sportsbook (matchbook would be a pretty good plan, given their lack of minimums). He analyzes his advantage. And now he has to decide how much to bet. He has 10,000 dollars in his checking account. What is rational behavior for this man? How should he size his bets?
I believe I brought up a very similar example either earlier in a different thread or perhaps in a different one.
square1SBR Rookie
- 01-11-08
- 37
#80
The one problem I have with this is that it doesn't really seem to get you where you want to go. Sure, you'll be looking at a subset of bankroll each period ... but that subset will always be a constant fraction of total bankroll (isoelastic utility, remember).
So based on the S methodology, had you say a 1 unit bankroll of which you needed to consume 10% this period (for a functional BR of 90%), but then found yourself at 50% of bankroll, you'd only be able to consume 5% next period. Shoot up to 10,000% of initial and your consumption would be 1,000% of initial bankroll.
The issue here is that this is not what most people mean when they refer to segmenting one's bankroll, where one sets aside X% of total wealth as "gambling money" and then treats that as completely separate from "real money".
Like I said, I don't think I've shown that perfectly, nor could I if I wanted to. I do think S is a reasonable framework that shows how and why that might be true.
So I do read what you're saying but I'd have to consider it begging the question a bit. Sure, if you decide how much to consume prior to determining your P&L (which is not what I did on Wall Street or what I do now -- on WS, I waited to find out my bonus and then figured out what I was going to buy. Now, I wait to see how well I do each month or quarter or year and then, at least for the most part, make unplanned purchasing decisions based on that) then you need to consider that amount as a sunk cost which serves to reduce bankroll by a fixed percentage.
But is that really partial wealth Kelly? "OK, I know I need to buy groceries later so let me net that out before this next bet." No. I don't think so. I don't think that that's what most people have in mind when they think of partial-wealth Kelly ("my Kelly Bankroll is only $10,000 for the season and that's it!")
But I think that there's probably a bit of middle ground that would involve weighting different segments of bankroll differently. The truth is that most real people do have kind of crazy cost functions associated with spending various bankroll segments. This is the nonlinearity of bankroll to which I've previously referred on this board and the nonlinearity to which you seem to be implying when you refer to "sticky consumption".
Again it depends on how we choose to define partial-wealth Kelly. Certainly I'm prepared to grant that if partial wealth Kelly refers to full wealth Kelly only adjusted by a fixed percent after each period for consumption in that period, then under the circumstances you've outlined it would be reasonable. A segmented bankroll, however, I'm not yet willing to accept.
Seriously, to go any further we'd have to get more technical about "sticky consumption". (I'm not familiar with what you've written about bankroll nonlinearity, although it sounds like we're getting at the same underlying principle). All this means is that to go from consumption level c to another consumption level c' incurs some transaction cost t. (Think of moving into a nicer house. Takes enormous amounts of time, and energy, and expense). But it creates discontinuities in the boundary conditions of the maximization problem, and solving those types of problems just isn't my idea of a good time.
square1SBR Rookie
- 01-11-08
- 37
#81
So the critic says "yeah, well, that just means log utility's not real" and I get that; only the thing is, both of the guys could have a log utility of wealth. It's their utility function concerning other stuff that differentiates them. But you're right; I don't have a coherent model to illustrate that, and if I did, it might well point to some other solution than partial-wealth Kelly.
I believe I brought up a very similar example either earlier in a different thread or perhaps in a different one.
GanchrowSBR Hall of Famer
- 08-28-05
- 5011
#82Well, I guess I'd phrase it slightly differently. The real issue is whether there is a trade-off between investment and consumption. But in order for that trade-off to exist, you almost have to be making the consumption/investment decisions at the same time, which implies you're choosing consumption before the investment outcome is realized.
Recall that in G, after a win, consumption looks like:
Cw = ( f*I1 + 1 ) / (1+k)
And after a loss:
Cl = ( 1 - I1 ) / (1+k)
So after a win, investment (which I call 1st period, but you're calling 2nd period -- six of one) looks like:
I1 = ( (1+k)*Cw - 1 ) / f
And after a loss:
I1 = 1 - (1+k)*Cl
But the point is that even in G an investment/consumption trade-off still exists. The more you choose to consume at the end of period 1 (i.e., the more you discounted future consumption) the less you'd have available to invest at the beginning of period 2. If you consumed nothing at the end of period 1 ( k = ∞) your Kelly Bankroll would be unchanged from the previous period's value.
Both S and G address the tradeoff and both serve to reduce investment for the sake of consumption (the lack of a pre-investment first-period consumption selection is but a minor wrinkle with G. You can start the model off with a C0 if you like). Remember that in S (just like in G) consumption is not a constant but rather a function of the quality of bets available.
Well, I'm not sure what I said to give the idea that the plan was to justify a "set-aside" bankroll. My interpretation of your post was that one should include in the Kelly bankroll the revenue one could potentially generate by, for example, forcing one's children into prostitution. I don't believe the theory calls for that, and that's what I hoped to show.
I'm not saying that in real life we shouldn't adjust for consumption, but rather that the way to do so (as in both G&S) is by maximally construing bankroll, searching out investment opportunities, and then, based on the availability of these betting opportunities, making a decision both on how much to invest and how much to consume. The better our slate of investment opportunities the more we'll put off consumption into the future.
While it's clear as I read back that you never put forth the notion of a "set-aside" bankroll, that was in reality the actual concept against which I was arguing with other members of the forum.
That you seem to agree that this is suboptimal for a nonpathological Kelly bettor leads me to believe we're probably fundamentally in agreement (although a G & S richer for the experience).
Seriously, to go any further we'd have to get more technical about "sticky consumption". (I'm not familiar with what you've written about bankroll nonlinearity, although it sounds like we're getting at the same underlying principle). All this means is that to go from consumption level c to another consumption level c' incurs some transaction cost t. (Think of moving into a nicer house. Takes enormous amounts of time, and energy, and expense). But it creates discontinuities in the boundary conditions of the maximization problem, and solving those types of problems just isn't my idea of a good time.
But again that's not really the central point. Generally speaking the cost of transforming some wide swath of a player's wealth into bankroll is essentially constant (and even more so for a professional bettor). Yeah, he probably should forgo mortgaging his house (except in the most extreme of examples), but I'm really just trying to illustrate the concept of maximally construing a bankroll, perhaps as a limiting case.
But if you eliminate the frictional costs from consideration then the boundary conditions disappear and a Kelly bettor is right back at G&S only with a maximally construed bankroll.
square1SBR Rookie
- 01-11-08
- 37
#84A tradeoff exists between consumption and investment not only in S but also in G.
Recall that in G, after a win, consumption looks like:
Cw = ( f*I1 + 1 ) / (1+k)
And after a loss:
Cl = ( 1 - I1 ) / (1+k)
So after a win, investment (which I call 1st period, but you're calling 2nd period -- six of one) looks like:
I1 = ( (1+k)*Cw - 1 ) / f
And after a loss:
I1 = 1 - (1+k)*Cl
But the point is that even in G an investment/consumption trade-off still exists. The more you choose to consume at the end of period 1 (i.e., the more you discounted future consumption) the less you'd have available to invest at the beginning of period 2. If you consumed nothing at the end of period 1 ( k = ∞) your Kelly Bankroll would be unchanged from the previous period's value.
Both S and G address the tradeoff and both serve to reduce investment for the sake of consumption (the lack of a pre-investment first-period consumption selection is but a minor wrinkle with G. You can start the model off with a C0 if you like). Remember that in S (just like in G) consumption is not a constant but rather a function of the quality of bets available.
Let's consider the timelines, where I = invest, R = realize, C = consume and || is a period divider.
G: I - R - C || I - R - C
S: C/I - R || C
What do they look like without the period divider?
G: I - R - C - I - R - C
S: C/I - R - C
But let's think about the third and fourth elements of the G-list from above. After the second element R, the agent decides what to consume in period 1 and then what to invest in period 2. But really, he's making the decisions simultaneously regardless of whether the consumption takes place before, during, or after the investment - but it must take place before the fifth element R. No new information is gained between C and I. So, we could write as follows:
G: I - R - C/I - R - C
S: C/I - R - C
And if for some reason, G's first period's investment opportunity is cancelled:
G: C/I - R - C
S: C/I - R - C
So how did we obtain different results? Because we both cleverly chose our notation to support the point we were trying to make. In G, period 2's investment bankroll (though not investment itself) is considered to be total period 2 wealth, because period 1 consumption is not included in period 2 wealth. In S, I put the consumption/investment tradeoff in the same period - so calculating period 1's wealth includes the first C term, which makes investment bankroll different from wealth.
The multi-period model would look thusly:
C/I - R - C/I - R - C/I - R - C/I - R - C
where you eat everything during the last period.
So the key difference is: If the consumption bundle that was chosen simultaneously with the investment bundle is included in wealth, the investor appears to using partial-wealth Kelly. If wealth is determined only after excluding consumption by resetting the period, as in G, the investor appears to be using full-wealth Kelly.
You have to realize that the initial post to which you replied had been written in response to ideas from other posters (such as "Kelly bankroll is just what you have on deposit at books" or "Kelly bankroll is just that portion of your wealth that you could comfortably lose" -- these might make for good maxims, but they have little to do with the reality of Kelly). My point was that generally speaking Kelly bankroll should be maximally construed. Obviously, I don't really believe a player should be selling his or her progeny (or anyone else's progeny for that matter) into sexual slavery to cover a bet, but that was obviously meant to be illustrative of a point.
I'm not saying that in real life we shouldn't adjust for consumption, but rather that the way to do so (as in both G&S) is by maximally construing bankroll, searching out investment opportunities, and then, based on the availability of these betting opportunities, making a decision both on how much to invest and how much to consume. The better our slate of investment opportunities the more we'll put off consumption into the future.
While it's clear as I read back that you never put forth the notion of a "set-aside" bankroll, that was in reality the actual concept against which I was arguing with other members of the forum.
That you seem to agree that this is suboptimal for a nonpathological Kelly bettor leads me to believe we're probably fundamentally in agreement (although a G & S richer for the experience).
As far as the "set-aside" bankroll goes, it would be quite impossible to justify using anything remotely resembling standard economic theory. If we have two identical guys who decide to invest in the amount of 10k, and one guy wins $5000 as a result of sports "investing" and the other guy gets a $5000 bonus at work but breaks even on his sports investment; they are 100% interchangeable and must behave the same way in a utility-maximizing framework. Each one should make his future investment decisions assuming $5000 additional wealth relative to what he had before (subtracting any consumption, of course). Since the "set-aside" theory, as I understand it, involves them making different decisions, (the guy who won it will add all of it to his bankroll, but the guy who got the bonus will add none to his bankroll) trying to justify it via standard utility theory is hopeless.
WileOutSBR MVP
- 02-04-07
- 3844
#85
donjuanSBR MVP
- 08-29-07
- 3993
#86"The NFIB estimates that over the lifetime of a business, 39% are profitable, 30% break even, and 30% lose money, with 1% falling in the "unable to determine" category."Comment -
CannonRestricted User
- 01-03-08
- 3329
#87 When you have $54 in your account.
BrUno0SBR Wise Guy
- 03-30-08
- 574
#88 Is it just me, or is this Kelly shit really hard to understand, i mean i have a general idea, but i'm still clueless, and i've read the whole post by Ganch and the wiki page......
WileOutSBR MVP
- 02-04-07
- 3844
#89Actually I disagree, I think it has a lot to do with what he said.
Just like sports gamblers, the majority of business owners either lose money (which usually means going bankrupt within your business, i.e. losing your allotted bankroll), or break even in the long run.
calmSBR Hustler
- 01-04-08
- 82
#90Nice posts Ganch.
If you're (truly) a professional bettor with a $1,000,000 betting bankroll and $3,000,000 in savings you're probably better off assuming a $4MM bankroll with around a 5% Kelly fraction than a $1MM bankroll with around a 20% Kelly fraction (and in this simplified analysis we're admittedly neglecting the costs of transferring money from savings to betting accounts, as well as the opportunity costs (in terms of interest, dividends, capital appreciation, etc.) of removing capital from savings). Most of the time the difference between the two will be minimal, however where the contrast may well become apparent would be in times of several exceptionally good bets or during a period where a huge number of fairly good bets come along (such as, perhaps, during the Super Bowl or March Madness).
Also, a question:
Let's say I'm normally a 1/2 kelly bettor. But, if I found a true 100% guaranteed winner, I'd gladly bet everything I could get my hands on (full kelly). Then should I be varying my kelly fraction (between 1/2 and 1) based on each bet's probability of winning? Is there any way to model the optimal kelly fraction?
GanchrowSBR Hall of Famer
- 08-28-05
- 5011
#91I think you're off base here. There is absolutely no difference between the two, at least in terms of bet sizing and expected value. Yes the theoretical expected growth will be slightly different based on the different bankrolls, but the two bankrolls are truly identical, regardless of how you actually view them.
While it is approximately correct that in most typical cases a bettor with a Kelly multiplier of κ will wager κ * the full Kelly stake, this is not true in general, and in certain situations (especially when edge is large compared to odds, or when a bettor is placing many simultaneous bets) this approximation can be off substantially.
For example, given odds of -2000 and an edge of 4.99%, a bettor with a $1,000,000 bankroll and a 20% Kelly multiplier would choose to wager $704,196 for an EV of $35,139. A bettor with a Kelly multiplier of 5% and a $4,000,000 bankroll, OTOH, would choose to wager $1,037,554 for an EV of $51,757.
Admittedly this is a rather extreme example, but I think the point is clear: while bet size is proportional to bankroll it is not (exactly) proportional to the Kelly multiplier and as such there is a difference between a bankroll of $4,000,000 with a multiplier of 5% and a bankroll $1,000,000 with a multiplier of 20%.
Also, a question:
Let's say I'm normally a 1/2 kelly bettor. But, if I found a true 100% guaranteed winner, I'd gladly bet everything I could get my hands on (full kelly). Then should I be varying my kelly fraction (between 1/2 and 1) based on each bet's probability of winning? Is there any way to model the optimal kelly fraction?
I'll point out that implicit within the Kelly methodology is the assumption that a bettor's Kelly multiplier always remains fixed regardless of bankroll size or bet flavor. There's no real reason to believe that this is true in reality and similarly no reason to believe that isoelastic utility functions in general represent any more than a convenient approximation of real world preferences.
calmSBR Hustler
- 01-04-08
- 82
#93No, this is untrue.
While it is approximately correct that in most typical cases a bettor with a Kelly multiplier of κ will wager κ * the full Kelly stake, this is not true in general, and in certain situations (especially when edge is large compared to odds, or when a bettor is placing many simultaneous bets) this approximation can be off substantially.
For example, given odds of -2000 and an edge of 4.99%, a bettor with a $1,000,000 bankroll and a 20% Kelly multiplier would choose to wager $704,196 for an EV of $35,139. A bettor with a Kelly multiplier of 5% and a $4,000,000 bankroll, OTOH, would choose to wager $1,037,554 for an EV of $51,757.
Admittedly this is a rather extreme example, but I think the point is clear: while bet size is proportional to bankroll it is not (exactly) proportional to the Kelly multiplier and as such there is a difference between a bankroll of $4,000,000 with a multiplier of 5% and a bankroll $1,000,000 with a multiplier of 20%.
A bettor's Kelly multiplier corresponds to that bettor's preferences (specifically to his relative aversion to risk). You can't model a bettor's optimal Kelly fraction any more readily than you can model a bettor's optimal appreciation for chocolate versus vanilla.
I'll point out that implicit within the Kelly methodology is the assumption that a bettor's Kelly multiplier always remains fixed regardless of bankroll size or bet flavor. There's no real reason to believe that this is true in reality and similarly no reason to believe that isoelastic utility functions in general represent any more than a convenient approximation of real world preferences.
I was using: Bet Size as Percentage of Roll = (Kelly Fraction)*(Win Prob*Decimal Odds-1)/(Decimal Odds-1)
Can you explain why this is wrong, and how the correct formula looks?
Also, using your kelly calculator, I see that no matter the kelly fraction, a 100% bet calls for betting one's entire bankroll. That makes sense. So once I get the correct formula I guess that will be one less thing to worry about.
GanchrowSBR Hall of Famer
- 08-28-05
- 5011
#94
U(x;κ) = ( κ/(κ-1) ) * x^(1 - 1/κ) for κ ≠ 1
U(x;κ) = loge(x) for κ = 1
This implies dU/dx = x^(-1/κ) for all κ > 0.
These are the complete set of what are known as "isoelastic utility functions". Isoelastic utility functions exhibit the property of "constant relative risk aversion", meaning that risk tolerance is independent of absolute bankroll size. This means that one's aversion to a possible loss of some fraction f of bankroll would be the same irrespective of whether the player had a bankroll of $1,000 or $1,000,000. It is important to note that the isoelastic utility functions are the only such functions to exhibit this property.
Relative risk aversion is encapsulated in a parameter known as the coefficient of relative risk aversion and is calculated as the negative of the ratio of the second derivative to the first derivative times the decision variable. In the case of κ-Kelly utility this coefficient takes on a value of 1/κ. This means that the higher a bettor's κ, the lower the (relative) aversion to risk.
For the solution to the κ-Kelly maximization problem to equal κ * the full Kelly stake, the partial Kelly utility would need to be:
log(x/κ + B*(1 - 1/κ))
where B refers to the current bankroll. This would then imply a coefficient of relative risk aversion equal to x/(B*(κ-1) + x), which, due to the variable B appearing, would mean relative risk aversion varied with bankroll for κ≠1.
Understand that there's no reason to necessarily assume relative risk aversion independent of bankroll, but probably because the concept is intuitively appealing (think of how many recreational bettors quote bet sizes in terms of "units") and allows for computational flexibility it's often taken as axiomatic.
Anyway, the solution to the isoelastic single-bet κ-Kelly utility function is given by:
( (wp)^κ - (1-p)^κ ) / ( (wp)^κ + w*(1-p)^κ )
where:
w = units won off a 1-unit bet (so in other words, decimal odds - 1 or "fractional odds"),
p = win probability, and
κ = Kelly multiplier.
You'll note that for κ=1, this reduces to the familiar full-Kelly solution: edge/(fractional odds).
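A short sketch (not from the thread) of that single-bet κ-Kelly solution, reproducing the dollar figures Ganchrow quotes in post #91 and contrasting them with the naive "κ times the full Kelly stake" rule asked about in post #93. The helper function name and the exact win probability backed out of the 4.99% edge are assumptions; the dollar amounts are the thread's own example.

```python
def kelly_stake_fraction(w, p, kappa):
    """Fraction of bankroll to stake on a single bet under kappa-Kelly (isoelastic) utility.
    w = fractional odds (decimal odds - 1), p = win probability, kappa = Kelly multiplier."""
    num = (w * p) ** kappa - (1 - p) ** kappa
    den = (w * p) ** kappa + w * (1 - p) ** kappa
    return max(num / den, 0.0)

# Thread example: odds of -2000 (decimal 1.05, so w = 0.05) and a 4.99% edge
w = 0.05
p = 1.0499 / 1.05             # assumes edge = p * decimal_odds - 1, so p ~ 0.99990
edge = p * w - (1 - p)        # ~ 0.0499 per unit staked

for bankroll, kappa in [(1_000_000, 0.20), (4_000_000, 0.05)]:
    stake = bankroll * kelly_stake_fraction(w, p, kappa)
    naive = bankroll * kappa * (edge / w)   # kappa * full-Kelly approximation
    print(f"kappa={kappa}: stake ~ ${stake:,.0f}, EV ~ ${stake * edge:,.0f}, "
          f"naive kappa*full-Kelly ~ ${naive:,.0f}")
# The stakes should come out near the ~$704,196 and ~$1,037,554 figures from post #91,
# while the naive rule gives about $199,600 in both cases -- illustrating how far off
# the approximation can be when the edge is large relative to the odds.
```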