The problem is how to find the odds in the markets. Say I'm an options trader betting on a spread that pays 30% of my stake, with a 70% probability of winning and a 30% probability of losing the entire bet.
@@antonhelsgaun Well, basically he says that "risk aversion" is the wrong explanation, since it can be shown with Kelly-criterion modelling that, over many runs, certain combinations of probability and payoff lead to ruin. The sweet spots are the red dots. Behavioral economists say, for example, that people wouldn't take a 70/30 bet in their favor for 80% of their money because they are risk averse. Here NNT says basically that it's not because of that, but because in a multi-period model (meaning not just one bet but a series of them), if you repeatedly bet 80% of your money at 70/30 odds in your favor, in the long run you are going to lose all your money. So it's a shift of perspective: behavioral economists talk about a single bet, NNT talks about a series over multiple periods. So it's a game they play against each other, because behavioral finance still says that people wouldn't take a single bet for all their money even as slight favorites, because of loss aversion; but bets and payoffs are statistical, so their characteristics are derived from a series, not a single instance.
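For what it's worth, the multi-period claim can be checked with a one-line calculation. A minimal sketch (the 70/30 odds and the 80% stake come from the comment above; everything else is illustrative):

```python
import math

p, q = 0.7, 0.3   # probabilities of winning and losing
l = 0.8           # fraction of current wealth staked each period

# Wealth is multiplied by (1+l) on a win and (1-l) on a loss, so the
# typical (geometric-mean) growth factor per period is (1+l)^p * (1-l)^q
g = (1 + l) ** p * (1 - l) ** q
print(round(g, 4))  # about 0.93, i.e. wealth shrinks roughly 7% per bet
```

Since the factor is below 1, repeating the bet drives wealth toward zero, even though each individual bet has positive expected value.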
@@themomentcollector5402 You're wrong, and the video is misleading. Behavioral finance is not necessarily BS. The reason people will refuse a 1-dollar bet that is in their favor is exactly risk aversion, so the video is wrong to call risk-aversion theory BS. In fact, the majority of people in a survey who would refuse such a bet can afford to run that 1-dollar bet at least 1,000 times, and 1 dollar is not 80% of their savings; it's not even 1% of their savings. So the reason they're not taking the bet is lack of financial understanding. The only exception is that some people are smart enough to refuse the bet if they are told they can run it only once, because increasing the number of bets is the only way to guarantee profit. I wouldn't take a 10-dollar bet with a 70% chance of winning if I'm told the bet can happen only once, but if I could make that bet 100 times, I would absolutely take the offer.
Laying out the math and the code for the simulation not only proves the point, it makes it clearer intuitively, you can see (experience) how it works. Cheers!
My conclusion from learning this and then exploring it further over the past few weeks: Throughout history, there exist at least a few bets where we, as a society, are all in. Therefore, we're all screwed eventually.
THANK YOU NNT!!! I recently read the revised version of The Black Swan. I'd never thought about investing until then; now I have a better understanding of managing risk and unforeseen situations. Let me just say that this video brought to my mind, with such clarity, how risk and betting work with probability. I am not a statistician and my math skill is dismal. But as I'm sure you would say, I am one of your friends, as I read for knowledge and to expand what I know, not just for reading's sake.
Thanks for prefacing the video with the analogy. EDIT: Are there any studies on the phenomenon of math becoming more interesting and understandable when it is taught to a student who is a trader?
Probably such studies exist somewhere. But most traders will have learned to avoid these kinds of bets from age-old heuristics handed down from mentor to apprentice over the decades... or, more rarely, because they or someone they knew encountered the ruin problem themselves, learned from it, and managed to secure enough capital to begin trading again. Few academics teach this.
Nassim, a year ago I was asked about taking a bet in an interview. I did not get the job, because I took the bet blindly, as anyone would given the odds. The feedback was that I did not understand risk. I racked my brains over it for days on end. I did not realize Kelly was the answer.
It is weird to come to trading (of which I've done very little) after reading N N Taleb, as I'm sure many of his ideas are so obvious to traders, which is to say, real-world risk-takers.
Thanks for the vid! What is the formula to find out what sample of trades I need to be certain (with levels of confidence) of my average profit and win%?
Thanks for the video, I learnt a lot. Had to spend 15 minutes on Wikipedia to understand what you were saying. You make a very confusing and shocking remark about the $1 bet in the beginning without talking about how much money you have. It did a good job of piquing interest, but confuses the listener.
I'd like to read the book on Kelly Criterion since I read Mr. Taleb's books but I only know the 4 basic mathematic operations, interests and rule of 3. I also have a very basic statistics knowledge from a Marketing related crappy BA. Would someone please suggest basic material/courses I can take on math so I can correctly understand the book? Thank you.
Calculus 1-3, Linear Algebra, Calculus based Probability and Differential equations is minimum entry barrier. You can pick an Advanced Engineering Mathematics by Kreyszig which will cover all the basics.
The (oversimplified) gist of his point: Losing streaks exist. Even at only a 30% chance of losing, with 80% of your capital invested you can get wiped out very quickly by one or more bad strings of luck.
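That gist can be demonstrated with a quick Monte Carlo run. A sketch only: the 70/30 odds and 80% stake come from the video's example, while the ruin threshold, seed, and trial counts are arbitrary choices of mine:

```python
import random

random.seed(0)
p, stake = 0.7, 0.8          # win probability, fraction of wealth bet each round
trials, rounds = 1000, 100
ruined = 0

for _ in range(trials):
    wealth = 1.0
    for _ in range(rounds):
        # multiplicative bet: wealth grows by the stake on a win, shrinks on a loss
        wealth *= (1 + stake) if random.random() < p else (1 - stake)
    if wealth < 0.01:        # down to under 1% of starting capital
        ruined += 1

print(ruined / trials)       # typically a majority of paths are effectively wiped out
```

Even after only 100 rounds of a favorable bet, most paths end up effectively broke, exactly the "bad strings of luck" effect described above.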
You will only go bust if you manage your bankroll poorly and reinvest too much of your winnings though (like a talented pro poker player who wins multiple tournaments but goes broke), right? If you take some winnings off the table, you can make profits in the long run? Thanks if someone can answer this!
Correct, if you quit while you're ahead, and never gamble after that for ever and ever, then you're ahead for ever and ever. But then you're not a persistent gambler, which is what "eventually" means. Persistence is key.
The most visible public figure on financial statistics and probabilities explains the whole point of diversification and gets 30,000 views, while random wrong fools on the internet get more. Thank you Nassim!! Fooled by Randomness sold me on diversification, AND you have made me big money many times on my constant 1% dedicated to short positions. Downturns like we had this March are NOT black swans. They are rare, but since we know they are there, they are NOT black swans. It amazes me how many educated pundits don't get it. After a year, I am still absorbing the concept of Antifragility. It has kept me from investing in dinosaur businesses like GE, BA, and VNQ in the name of diversity. I am still searching for antifragile investments to offset my allocations to TSLA, SQ, BTC... Thank you and keep publishing your knowledge!!!!!!! You are a godsend to those of us who try to understand what you clearly know.
The chart and the code do not match. The chart's y-axis seems to be the odds and its x-axis the payoff size (l), whereas the code plots average return versus period. The explanation fits the chart, but the code produces a different chart.
Interesting, and it explains why so many hedge funds blow up. Here is a question I have, though: how do the odds change if you don't go "all in" every time, but only bet the original amount while pocketing any wins? Say you bet $1 the first time, strike it lucky, and win $1, but you pull the $1 you won off the table and only bet the original $1 again, so you have $1 on the table and $1 in the bank. Then you win another $1 and also put that dollar in the bank, but on the third roll of the dice you lose the original $1 on the table while still having $2 in the bank. Does that change your odds? And what happens if you then take another $1 out of the bank and bet that, having lost the original dollar but still having $2 left? Of course, I know what happens if you lose the original $1 on the first bet, which can also happen 30% of the time in the above example. I think this is the psychology casinos use to take all your money. Even if they have to pay out 97% of the time, they know you will keep betting it all, and eventually the 3% comes around. That's also why, if you walk out of there with a million in winnings, you get a free penthouse suite for your next visit. They know that if you keep betting, they'll get it back: 3% a roll on average, but generally all at once. So I guess a takeaway would be that whoever has the most cash that isn't on the table will eventually win.
@@borisibrahimovic5970 Thanks for the response. I think you are correct for a portfolio but I was thinking more the example where you might be betting a friend a dollar on something so that would be the increment. Or maybe it's poker night and you have a good hand but have to stretch to keep with the pot. Once you've won a bunch of your friend's money you wouldn't have to keep going "all in", but on the first hand you risk going bust.
I don't think enough information was given at the start of the video, since for the 70% bet of winning one dollar to be unfavourable, that one dollar would have to be more than 40% of one's portfolio.
@@aldean5494 Performing the Kelly calculation with a probability of 0.70, odds of 1-to-1, and a total portfolio of $10, the optimal Kelly ratio is 40%. Betting more than this lowers your long-run growth rate, and betting far enough above it risks ruin in the long run. Since you are asking about betting 10%, the bet is favorable; in fact, betting 40% would be even better.
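For reference, that calculation can be sketched in a few lines; the function and its name are mine, but the formula is the standard Kelly criterion for binary bets:

```python
def kelly_fraction(p, b):
    """Optimal fraction of bankroll for win probability p and net odds b-to-1."""
    q = 1 - p
    return (b * p - q) / b

# 70% chance at even (1-to-1) odds, as in the example above
print(round(kelly_fraction(0.70, 1.0), 4))  # 0.4, i.e. bet 40% of the bankroll
```

Note the two regimes: above 40% the growth rate is merely lower, while far enough above it (at these odds, stakes beyond roughly 72% of the bankroll) the expected log-growth turns negative and long-run ruin follows.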
But this mainly applies to traders who have a profitable trading strategy yet don't use a proper stop loss, right? Because if you're going for a risk-to-reward of 1:5 with a stop loss at 3% of your account, the chance of this happening to you is practically none, right?
Interesting conclusions. Is that why it is often easier to make money in the stock market during an uptrend? The variability in the payoff seems to be one of the major keys to understanding Kelly, as in the presentation? From what I gathered, me not being an expert.
The loss and the win both being l, i.e. identical, seems to be the issue. The implication of an identical l means this is quite limited, because the loss and the win are not identical in most cases. Even if a company does go bust, it gets to borrow and play another day, because the win value over time is magnitudes greater than the loss value. So if it only covers equal-size wins and losses, it's less broadly applicable than I might have feared. Obviously I could be wrong in my understanding here.
This begs the question: if the risk of ruin using the Kelly fraction is high, then what would the correct fraction be? To make the discussion productive, we should assume that the risk is defined; that is to say, the amount you can win and the amount you can lose are known. This might occur in an iron condor, for example. The probability is almost never known, at least with any certainty. However, the Kelly formula assumes probabilities are known and correct. Perhaps it is time to adjust the Kelly criterion for the case where probabilities are not known with 100% certainty.
Casinos have a tiny stake (as a percentage of their portfolio) in every bet they are involved in. They are not risking bankruptcy (except for a very remote probability).
Hi Nassim, what does the sensitivity analysis look like when the payouts change from 70% = +$1 / 30% = -$1 to:
70% = +$1 / 30% = -$0.99
70% = +$1 / 30% = -$0.95
70% = +$1 / 30% = -$0.90
Does your conclusion that we "should not take the bet" still hold, or is it the possibility of total capital loss that introduces the counterintuitive result?
There is no counterintuitive result. Just a failure of some to grasp the correct strategy. Kelly strategy bets don't lead to ruin as long as the underlying assumptions are met (they typically aren't, but in most cases you can compensate). He is straw-manning.
I see the applications for a gamble or bet, but would this really apply to investing in finance? I mean, it assumes we have some way of determining the probability of "going bust" from the start. I get why this is mathematically significant, but if the input variables are that biased I just don't see this being any more applicable in finance. But I am not a genius like this guy, so please explain why I'm wrong haha
In my opinion it is the opposite. For betting it really doesn't make too much sense since no one bets 10,000 times, but investing and finance occurs over long periods of time with a lot of variability. The input variables are not biased. If you have a 70% chance of doubling your portfolio and a 30% chance of losing it all, it appears favourable even though in the long term it is guaranteed to go bust.
What an interesting insight demonstrated by rigorous mathematics! Bet a larger portion of your total wealth if the odds of winning are higher, and less if lower. Can you modify this model for asymmetric returns on a winning or losing bet? The (1+l)(1-l) portion of this model assumes you gain l when you win and lose l when you lose. Sometimes the bets are +1000% if you win and -100% if you lose (short-term options), or +10% when you win and -30% when you lose (US 10-yr). Maybe there could be a 3D graph to visualize that too? I think that would be really interesting to understand.
It looks like your model has every round as just one bet with 80% of your money, but wouldn't it be pretty safe if every dollar was a separate bet? Because then large amounts are less likely to fail than small amounts, and it will tend towards growing by 40% every turn, with less and less variation (relatively speaking) each round.
The math for inevitably going bust only checks out for single, large bets. If you are at $100 and you bet 80% every time, then 3 losses in a row will bring you below a $1 balance, so you can lose almost everything pretty fast. But if every dollar is an individual bet, then with $100 the probability of breaking even or losing money (if you bet everything) is 0.0022%. With $200 it's < 0.0001%
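The figure quoted can be reproduced with an exact binomial tail sum. A sketch, assuming 100 independent even-money $1 bets at a 70% win rate, so "not profitable" means 50 or fewer wins:

```python
from math import comb

def prob_not_profitable(n, p):
    """P(wins <= n // 2): probability of breaking even or losing money
    over n independent even-money $1 bets with win probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2 + 1))

pr = prob_not_profitable(100, 0.7)
print(f"{pr:.4%}")  # on the order of 0.002%, matching the figure above
```

Spreading the bankroll across many small independent bets turns the ruin-prone multiplicative gamble into a law-of-large-numbers situation, which is the commenter's point.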
@@kalebbruwer Did you understand the video? I don't feel like Nassim cared about being clear, and that's precisely the issue, right? I don't think he ever mentioned that his player was making stupidly big bets completely unnecessarily. Kelly's criterion gives exactly the advised size of that bet. Does Nassim ever actually mention Kelly's criterion? The fact that your bet is constant could be problematic, but I feel like when your portfolio starts big, small bets are not that risky. Or are they? Is that what he's trying to show?
I think I am not getting the problem here. So what he is telling us is that if someone offers us a 70/30 bet with $1 on the line every time, and we repeat it 100,000 times, we will go bankrupt? What does it even mean that the graph is below 0 on the right side?
If you bet your whole dollar, you bet 100%, and with 70/30 odds you will eventually lose it all. If you bet only 40 cents of your dollar, and keep future bets proportional (the same bet-to-capital ratio), you are meant to survive in the long run, all things kept constant, with a CAGR of about 7% or so. This is what I am understanding.
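That CAGR can be computed directly from the expected log-growth. A minimal sketch with the same 70/30 even-money bet and a 40% stake:

```python
import math

p, q, f = 0.7, 0.3, 0.4   # win prob, loss prob, fraction of capital staked

# Expected log-growth per bet; exponentiating gives the compound growth rate
g = p * math.log(1 + f) + q * math.log(1 - f)
cagr = math.exp(g) - 1
print(f"{cagr:.2%}")  # about 8.6% per bet, the same ballpark as the ~7% above
```

The exact figure depends on rounding, but the sign is what matters: at the 40% Kelly stake the growth rate is positive, while at an 80% stake the same formula goes negative.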
It's only BS if the PROPORTIONS of the bets (with respect to one's bankroll) stay the same as you described. However, if it's always the same absolute amounts 110$/100$ with 0.5/0.5 probabilities, then it's not BS. The probability of going bust in the latter case with a deep enough roll is virtually 0. In other words, Thaler's point at the end is not necessarily incorrect. In fact, the whole description in the beginning is sloppy because it should emphasize that your point is only correct if someone always bets a given percentage of his bankroll, not some fixed dollar amount. The distinction is crucial.
You are missing the point of the smoking analogy. Taleb argues that human beings don't think about risk in discrete, isolated events. We think about it as a process or a series of events. Your grandmother would not care if you had one cigarette, she cares that one cigarette is the first iteration in a process that will eventually give you cancer. So, when a person with a limited bankroll passes on a bet with favorable odds, it doesn't necessarily mean they are loss averse. It means, potentially, that they are considering how a series of such bets over their lifetime may well eventually ruin them. In the Thaler world, even caring about the size of the bet relative to the bank roll is seen as loss aversion; if the bet is favorable, why worry about the bankroll size? Instead, Taleb would argue if you are caring to optimally size the bet, you are (perhaps intuitively) aware of the tail risk of this process of betting and want to rationally reduce that risk.
btw... it is natural to assume that a bet size would tend towards a consistent proportion of the bankroll. If you kept winning this bet with favorable odds and your bankroll grew larger and larger AND you didn't adjust your bet size upward, the potential return on the bet relative to your bankroll would get smaller and smaller, to the point where it's no longer worth your time to even make the bet. So, if your goal is to continue generating meaningful long-term returns, you will adjust your bet size to a given percentage of your bankroll.
@@SpindicateAudio you're just patching up his incomplete description of the problem. Someone might want to always bet the same amount and accept a linear return. He's trash talking others but didn't describe the problem properly himself.
Actually, from 6:10 on, you get 0.93 (twice) and 0 (twice) for the 10,000 rolls, and 0 (four times) for the 100,000 rolls, not because it is actually zero, but because of numerical round-off. Your T-value is the average loss/gain per coin toss, and for large values of n it should be about 0.93. That's what you should also get for 100,000 tosses; it just means you lose 7% per toss. What you meant to say is that it is basically 0 when you take 0.93 to the n-th power.
@@pjauthur9869 The function K(N = L + W) = (1-l)^L * (1+l)^W may become very, very small in the large-N limit (N = 100,000, for instance) if the parameters are chosen as such (p = 0.3 and l = 0.8), so Mathematica probably sets it to zero; only afterwards does he take the N-th root. In theory (by the law of large numbers) L/N should converge to p in probability, and the function r = (1-l)^p * (1+l)^(1-p) - 1 is the one you see in the chart: r = 0.93 - 1 = -0.07 for the parameters p = 0.3 and l = 0.8. And K(100,000) = 0.93^100,000 ≈ 2×10^(-3152).
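The underflow can be sidestepped by working in log space. A sketch using the comment's parameters (p here is the loss probability):

```python
import math

p, l, N = 0.3, 0.8, 100_000   # loss probability, stake fraction, number of bets
L = round(p * N)              # typical number of losses by the law of large numbers
W = N - L

# (1-l)^L * (1+l)^W underflows double precision, so compute its base-10 log instead
log10_K = (L * math.log(1 - l) + W * math.log(1 + l)) / math.log(10)
print(round(log10_K))  # about -3100: K ~ 10^-3100, far below the smallest double
```

(The 10^-3152 figure above comes from rounding the per-bet factor to 0.93 before exponentiating; the unrounded factor gives roughly 10^-3100. Either way, it is unrepresentably small.)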
He should've done the calculation for the example in Thaler's tweet. Reformulating that example, you are paying $100 for a 50% chance of winning $210. If $100 is greater than x% of your total wealth, this is not a bet worth taking. My understanding of Taleb's position (for Thaler's experiment to be incorrect) is that any bet that is bad over the long term is necessarily bad over the short term. And judging from the red line in the graph, you really want to put only a tiny portion of your total wealth into a bet with a 50% chance of failure. It would have been nice if he had done the calculation, because as far as I can tell, whether or not you should take that bet depends on your wealth; e.g., a homeless man with $100 probably shouldn't, while Warren Buffett probably should. Please let me know if my interpretation is wrong, just trying to understand the logic here.
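For what it's worth, the calculation the comment asks for can be sketched with the Kelly formula, assuming "winning $210" means a $110 net gain on the $100 risked:

```python
p, q = 0.5, 0.5   # Thaler's coin-flip odds
b = 1.1           # net payoff per dollar risked: win $110 on a $100 stake

f = (b * p - q) / b   # Kelly fraction of total wealth
print(f"{f:.2%}")     # about 4.55%
```

Under this reading, the repeated bet is worth taking only while $100 stays below a few percent of your wealth, which is consistent with the homeless-man vs. Buffett intuition above.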
@@moss_yass Yes, I am talking about Thaler's tweet, where it's a fixed amount of wealth you are betting every time. Taleb himself said to consider the repetition of the event and not just a single event alone. Now, the $100 bet is a fixed amount that you're betting every time; assuming it's below the fixed portion of wealth which leads to ruin, it's perfectly fine to keep taking the bet until you get to that threshold. The only thing I didn't initially consider is that in his case the amount you bet is always proportional to your wealth, not some fixed value. However, my point still stands: according to his logic, it may well be the case that you should take the bet.
@@bingbong2179 oh okay sorry about that I totally misinterpreted your comment. Yeah for sure there are definitely situations where that bet would be favorable. The fact that there are “optimal” solutions to this problem suggest that there are situations where the bet is favorable even in the long term.
I'm going to debate the principle here. If we come up with 1/4 of our net worth as the perfect size for the bet, we obviously should adjust before applying it to the next bet. So if we lose, our new max bet should be the newly adjusted 1/4 of our net worth. Then the probability of going bust approaches zero, and going negative is impossible. Am I missing something, or do I not understand the principle? In my mind, the argumentation in this video is flawed.
At 3:30 you say "...below a certain level you will go bust," but I think you meant to say 'above'. These values of 50%, 80% are funny to sports bettors; 2% of bankroll is a more plausible figure if you have a slight advantage.
Sad that I can't understand this math yet, but as an amateur trader, I think this means the profit factor, or expected risk-reward ratio, is more important than the win rate?
It is about bet size and win rate. If you have a 50% win rate you will go bust in the long run even if you only risk 5% of your account, and that's why many traders recommend putting 1% at risk max. If you have a 70% win rate you will go bust in the long run if you risk 80% of your account per trade. Win rate is very important.
@@joaorosas9598 I don't think so. You can make general assumptions like "trading with the trend has an above 50% winrate in the long run" but you can't tell by how much. And it really depends if you are short term or long term investor. I think the only people that know their winrate, are the ones that traded the same index or currency pair for 5+ years and got at least 1000+ trades. Everyone else is just guessing. I can for example toss a coin 20 times and get 5 tails and 15 heads but that doesn't mean I have a 75% winrate.
@@joaorosas9598 I think you could... You'd want to work out what the market thinks the win rate is; you'd need to decompose the yield into risk-free return (by looking at risk-free assets), and other spreads such as default risk, illiquidity premium, hype etc. This is probably the tricky part and a lot of it is finger-in-the-air stuff. You _should_ then be able to convert the default spread into a probability. I'm no expert so I may well be wrong.
Basically he's saying that if you keep betting big percentages of your bankroll, you will eventually go bust, even if the wagers are continually high-probability bets.
@@matta5749 @matta5749 If you have a better answer to Jonny's question, why not post that instead of trolling me with your condescension. If you don't have a better answer, then sit back and let the grown-ups discuss things maturely.
Couldn't it be possible that 1% would never be the sweet spot? Let's say the hypothetical sweet spot for double-or-nothing (or ever so slightly better than double) is a 2% bet size, as an example. How is anything below a 2% bet size then ever justified?
On a wager with a 1-to-1 payoff, your expected value is equal to the difference between your win probability and your loss probability. Your ideal wager is going to be equal to your expected value, since the variance is 1. Thus, for the "sweet spot" (Kelly wager) to be 1%, you would need a 50.5% chance to win and a 49.5% chance to lose. If you're talking about a non-1-to-1 payoff, then it depends on both the probability of winning AND the payoff. In other words, there exists an infinite number of combinations such that the Kelly "sweet spot" is 1% of assets per wager. If you want to make the calculations yourself, the formula is: Expected_Value / Payoff.
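A sketch of that "infinite number of combinations" point, using the same Expected_Value / Payoff formula stated above (the function names are mine):

```python
def kelly(p, b):
    """Kelly fraction = expected value / payoff, for net odds b-to-1."""
    return (p * b - (1 - p)) / b

def win_prob_for_fraction(f, b):
    """Invert the formula: the win probability whose Kelly sweet spot equals f."""
    return (f * b + 1) / (b + 1)

# Different payoffs, each tuned so the sweet spot is exactly 1% of assets
for b in (1.0, 2.0, 5.0):
    p = win_prob_for_fraction(0.01, b)
    print(f"b={b}: p={p:.4f}, kelly={kelly(p, b):.4f}")
```

At even money this recovers the 50.5%/49.5% split from the comment; at 2-to-1 a 34% win chance gives the same 1% sweet spot, and so on for every payoff.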
@@pretendcampus5410 The "sweet spots" are continuously distributed. This is easy to recognize as a pure coin flip has a 0% optimal sweet spot, and anything that's positive EV is greater than 0, with no two different odds sharing the same sweet spot.
So, a simple betting/allocation strategy is to not continuously bet on anything with less than a 55% chance of winning, and then bet no more of a percentage of your portfolio than the red dot, e.g., no more than 50% of your bank on a bet with a .75 probability of winning, no more than 40% of your bank on a bet with a .7 probability of winning. Betting less than this reduces risk significantly at the cost of not maximizing returns. Is that correct?
But the example explained here only works if you win or lose a percentage of your money. The particular case he is talking about is losing an absolute amount of money, not a percentage of it. If I gain $110 when I win but lose $100 when I lose, and there's a 50% chance, over time I will win money:

V_n = V_0 + 110·W - 100·L -> (V_n - V_0)/N = 110·(W/N) - 100·(L/N)

If we approximate W/N ≈ (1-p) and L/N ≈ p (with p the loss probability), then we get:

V_n - V_0 ≈ N·(110·(1-p) - 100·p) = N·(110 - 110p - 100p) = N·(110 - 210p)

With this we can see that the amount of money we gain (as a function of the number of trials N) trends upwards if p < 110/210 ≈ 0.524.
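A quick simulation of this additive (fixed-dollar) case, consistent with the algebra above. One assumption, as in the formula itself: the bankroll is large enough that busting mid-sequence is ignored; the seed and starting bankroll are arbitrary:

```python
import random

random.seed(1)
bankroll = 10_000.0
n = 10_000                       # number of independent 50/50 bets

for _ in range(n):
    if random.random() < 0.5:
        bankroll += 110          # fixed-dollar win
    else:
        bankroll -= 100          # fixed-dollar loss

print(round(bankroll))           # drifts upward: expected gain is $5 per bet
```

Unlike the proportional-stake case, the fixed-dollar bet has linear (additive) dynamics, so the positive expected value per bet really does compound into near-certain profit.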
I think, kind of. It appears the main point is optimizing bet or investment size relative to bankroll, depending on the true odds and specifically the risk of loss, while minimizing or eliminating the chance of a 100% loss of bankroll over time, which is where tail risk probably comes in. I don't feel the explanation here is that great. For instance, if you have true 70/30 odds and only bet 1% of bankroll, risk is mainly irrelevant. You may have drawdowns, even severe drawdowns, but over time you must make money. What Kelly says is only that a 1% bet is not the size that will generate the optimal profit, not that it won't make money. I think the hierarchy of concepts here (% of bank at risk, then true odds, then tail risk) could have been made clearer.
@Jimmy Two Times There's no logistical symmetry between a casino vs players. The casino process does not at all fit the model of Taleb's demonstration. Extinction, for example. Or selective bankroll replenishment.
@Jimmy Two Times Taleb proves in this video that you will "go bust". You going bust is equivalent to the house (casino) winning your bankroll and ending the process by your extinction. "You always lose" is the converse of "the house always wins", so Taleb has proved both statements. The outcome of persistence is singular (bettor always loses) because the game is not symmetric between the parties. The betting party and the bet-taking party are not in the same positions. One is the player and one is the system of play. Favorable odds are irrelevant to the outcome under unbounded persistence, because an infinite bankroll always beats a finite bankroll, given enough persistence. What Kelly optimizes is not avoiding ruin, but the peak expected handle on the way to the inevitable ruin state. Kelly is like Keynes: in the long run we're all dead, but here is a strategy to maximize the action along the way to ruin.
Mathematically speaking he is correct, but I do think the way he presented it is misleading. From what I understood, this is the equivalent of saying that if you flip a coin a million times, at some point you may flip 20 heads in a row (unlikely but possible). Also, the Kelly criterion is not a theorem, so no proof is required; it's just an optimal approach.
Thank you, Mr. Taleb, for elucidating this. I am not well-versed in academic finance, but I know that the theories being taught are theoretical hogwash, just judging by the number of market busts that have occurred in the last 100 years due to severe market distortions. I don't understand why people don't get your insights. They should eat their grandmother's recipes more often.
Is the point here that the probability of consecutive losses that break you increase with more periods? I’m going to have to watch this a few more times. I’m not getting the difference.
I think the point of Kelly here is that the combination of the chance of loss and the % of total bankroll bet increases the chance of going bust, either quickly or especially over time. Like if you consistently bet 100% of your bank at 90/10 odds but must play 15 times, versus 1% of your bank at 60/40 odds but must play 100k times. The whole Kelly goes further and says there is a % of bankroll bet size, for each true-odds level, that generates the optimal return, which apparently the graph shows but doesn't seem to be discussed.
In the Ole Peters / Thaler tweet example, Thaler is wrong because he didn't state what % of his bankroll $100 represents, no? If it represents too high a percentage, then he will go bust with repeated bets; surely Thaler is willing to admit that much?
Takeaway: don't repeatedly invest all (or a lot of) your money in events where you have a chance of losing everything, because you will eventually lose. How to help with this: diversify (i.e., don't put all your eggs in one basket), and once you win the first bet (if you're lucky), keep your cost. Then reinvest only what you win; that way, worst case, you are left where you started. It's sensible money management. I think this example proves that repeatedly taking big bets on outcomes where you can lose everything is a mug's game: eventually, you will lose it all. If these events are independent, I still think the behavioral-finance single time horizon can be useful, because a lot of individuals make decisions with a one-time horizon and then re-evaluate their position in the next time period. Kind of like bad chess players; the smart ones are looking one step ahead.
Do I understand correctly that this implies that dollar cost averaging might be advantageous to lump sum investing asap, which is often considered 'better' albeit more volatile?
I didn't exactly get the model. For instance, I don't get how you can ever go broke in this model. If I risk 80% of my account each time, then as long as I have more than 0 dollars in my account I will still have money left for the next round. For instance, consider a bet with 51% odds of doubling and 49% odds of halving. Then you should engage all your money in every round (at least for a very large time discount factor).
Jimmy Two Times Not necessarily true. Many casinos have many games where they do NOT take the favourable bet, and still win out due to ruin theory. I’m an actuary btw
var pocket = 10;
var probWin = 0.7;
var rounds = 100;

for (var i = 0; i < rounds; i++) {
    if (pocket < 3) break;        // stop when we can't cover the worst-case loss
    if (Math.random() < probWin) {
        pocket += 1;              // win: net +$1 (70% of the time)
    } else {
        pocket -= 3;              // lose: net -$3 (30% of the time)
    }
}
console.log(pocket);
// EV per round is 0.7*1 - 0.3*3 = -0.2, so despite the 70% win rate it's super
// hard to get a positive result, but just by looking at the code you might not think so
It's not at all counterintuitive that a 30% chance of losing everything will wipe you out quickly. Few people would board a plane with a 30% risk of crashing no matter the potential upside. But the loss doesn't have to be all or none. If the loss is a smaller proportion of the amount at risk, then it makes sense to risk a greater share of the total endowment. When losses and gains are down to a few percent of the amount risked per unit time, it's pretty safe to go all in.
1 unit is the total amount you invest in the thing. Not related to any currency. Can be anything. If you put 1000USD to invest in stocks, those 1000USD are your unit.
A friend of mine had a good way of winning in Las Vegas, and it illustrates the importance of factors outside the mathematical model. (There's a twist, so please read the whole thing.) Here it is: first, bet a dollar at the roulette wheel on red or black. If you win, keep your dollar win, and bet one dollar again. If you lose, bet two dollars. Double your bet after each loss. There are two catches: the table limit, and the "0"/"00", which are neither red nor black and are the house's win. Of course, mathematically the expected value is negative, so how did my friend win? Simple: the drinks are free or very cheap (or they were in 1998). The value of the alcohol he expected to drink was far more than his expected loss. And he expected to lose 100% of his gambling budget.
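A sketch of the doubling system above, just to confirm the expected value stays negative. Assumptions are mine: an American wheel (18 of 38 pockets win on red/black), a $200 budget, a $500 table limit, and a stop once $50 ahead:

```python
import random

random.seed(2)

def martingale_session(budget, table_limit, p_win=18/38):
    """Bet on red, doubling after each loss; pocket $1 and reset after each win."""
    bankroll, bet = budget, 1
    while 0 < bet <= bankroll and bet <= table_limit:
        if random.random() < p_win:
            bankroll += bet
            bet = 1              # win: bank the $1 profit, start the ladder over
        else:
            bankroll -= bet
            bet *= 2             # loss: double the stake
        if bankroll >= budget + 50:
            break                # quit while ahead
    return bankroll

results = [martingale_session(200, 500) for _ in range(2000)]
print(round(sum(results) / len(results), 2))  # averages below the $200 budget
```

As expected, the average session ends below the starting budget: the doubling only reshapes the distribution into many small wins and occasional large losses, so the friend's real "edge" was indeed the free drinks.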
How are these results different from, or based on, the normal bankroll management known in poker, for example? I couldn't quite follow the math, so I was wondering if there are new insights that could be applied to something like poker or investments in general.
You say 70/30 always loses in the long run, no matter what percentage of your fund you stake. Did you have a minimum stake in these calcs (something like x% of the original pot, or 1 dollar/unit)? If you don't, you can never reach exactly zero. Are you implying that all risk eventually loses? If that is the case, then we had better never cross the road again, because eventually you will get run over. The odds of crossing the road safely might be 99.9%, but if you do enough trials, you will eventually get run over.
Yes, it is the same mathematical proof as gambler's ruin or the impossibility of a gambling system. But Kelly strategy is typically misunderstood as a magic system to prevent ruin, so it is useful to debunk that specific betting system. Academic and finance types are typically doing the misunderstanding, hence Taleb punctures their pseudo-intellectualism.
@@RichardKinch Thanks for the response, I think you're right I focused on the math and pretty graph and missed the context of the talk. I'm oblivious that there is even a world of academic risk assessment. Terrific beautiful subject and little wonder it needs to be repeatedly hammered home. On a few occasions I've tried to discuss where a strategy is mistaken (financial stuff) and I found my clarity and charisma were not sufficient to overcome their optimism. Can't make a horse drink!
@@RichardKinch No, this video is not disputing Kelly at all - it is reinforcing it. Taleb shows that with some non-optimal (over-) allocation you can experience ruin on a favourable bet.
So basically, know when to quit. All bets lead to eventual ruin over a long enough interval, because our assets are finite and we might hit a losing streak long enough to end us. Did I get it right?
No. The problem is that if you always allocate too much for the level of risk, you will eventually be ruined. If you allocate too little, you won't be ruined, but you will make a lower return than you could have. There's a sweet spot in between. The counter intuitive part is that taking excessive risk will ruin you even if each bet has favourable odds.
@@Beach_comber You're wrong. There is no sweet spot. Ruin is inevitable when you persist. If you reduce wagers to avoid risking the last dollar, your bankroll will wither to trivial levels and stay there.
@@RichardKinch I think the video is very unclear. You go bust if you continuously bet MORE than the optimal Kelly fraction, even if the odds are in your favour on each bet.