Originally <a href='http://www.sportsbookreview.com/forum/showthread.php?p=26258044'>posted</a> on 09/15/2016:

Proof for you naysayers. Here is a sample of the notes from just ONE of his presentations. Now you see what I mean about the boot camp/graduate level seminar analogy. Actually I think boot camp would have been easier! Everyone literally dropped from exhaustion at the end of the day.

Another look at one of the steps leading to the Fundamental Formula of Gambling:
1 – DC = (1 – p)^N

We can express the probability as p = 1 / N; e.g. the probability of getting one point face when rolling a die is 1 in 6 or p = 1 / 6; the probability of getting one roulette number is 1 in 38 or p = 1 / 38. It is common sense that if we repeat the event N times we expect one success. That might be true for an extraordinarily large number of trials. If we repeat the event N times, we are NOT guaranteed to win. If we play roulette 38 consecutive spins, the chance to win is significantly less than 1!
1 – DC = (1 – 1 / N)^N

I noticed a mathematical limit. I saw clearly: lim((1 – 1 / N)^N), as N tends to infinity, is equal to 1 / e (e represents the base of the natural logarithm, approximately 2.71828182845904...). Therefore:

1 – DC = 1 / e and
DC = 1 – (1 / e)

This limit, 1 – (1 / e), is approximately equal to 0.632120558828558...
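For anyone who wants the intermediate step spelled out, one standard way to obtain that limit uses the natural logarithm (a sketch, not the demonstration from the original notes or the PDF mentioned below):

```latex
% Sketch: why (1 - 1/N)^N tends to 1/e as N grows.
\ln\!\left[\left(1 - \frac{1}{N}\right)^{N}\right]
  = N \ln\!\left(1 - \frac{1}{N}\right)
  = N \left(-\frac{1}{N} - \frac{1}{2N^{2}} - \frac{1}{3N^{3}} - \cdots\right)
  \;\longrightarrow\; -1 \quad (N \to \infty),
\qquad \text{so} \qquad
\lim_{N \to \infty}\left(1 - \frac{1}{N}\right)^{N} = e^{-1} = \frac{1}{e}.
```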
I tested for N = 100,000,000... N = 500,000,000 ... N = 1,000,000,000 (one billion) trials. The results decrease ever so slightly, approaching the limit from above ... but never crossing it!
When N = 100,000,000, then DC = .632120560667764...
When N = 1,000,000,000, then DC = .63212055901829...
(Calculations performed by SuperFormula.exe, function C = Degree of Certainty (DC), then option 1 = Degree of Certainty (DC), then option 2 = The program calculates p.)
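The same figures are easy to reproduce outside SuperFormula.exe; here is a minimal Python sketch (an independent check, not the author's program; note that ordinary double precision may wobble around the 8th decimal or so for N this large):

```python
# Independent check of Ion Saliu's Paradox of N Trials:
# DC = 1 - (1 - 1/N)^N approaches 1 - 1/e as N grows.
import math

LIMIT = 1 - 1 / math.e   # ~0.632120558828558

for n in (38, 1_000, 100_000_000, 1_000_000_000):
    dc = 1 - (1 - 1 / n) ** n    # degree of certainty after N trials at p = 1/N
    print(f"N = {n:>13,}   DC = {dc:.15f}   DC - limit = {dc - LIMIT:.2e}")
```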
If the probability is p = 1 / N and we repeat the event N times, the degree of certainty DC is 1 – (1 / e), as N tends to infinity. I named this relation Ion Saliu's Paradox of N Trials.


  • Soon after I published my page on Theory of Probability, Ion Saliu Paradox (in 2004), the adverse reactions were instantaneous. I even received multiple hostile emails from the same individual! Basically, they dismissed my (1 / e) discovery as idiocy! “You are mathematically challenged”, they cursed! Guess what? In 2012 I saw an edited Wikipedia page (on the constant e) where my (1 / e) discovery is treated as correct mathematics. Of course, they do not give me credit for it. Nor do they demonstrate the (1 / e) relation mathematically, because they don't know the demonstration (as of March 21, 2012)! You can see the mathematical proof right here, for the first time. I created a PDF file with nicely formatted equations:
  • Mathematics of Ion Saliu Paradox.

How long is the long run? Or, how big is the law of BIG numbers? Ion Saliu's Paradox of N Trials makes it easy and clear. Let's repeat the trials in M multiples of N; e.g. play one roulette number in two series of 38 spins each. The formula becomes:

1 – DC = (1 – 1 / N)^(N·M) = {(1 – 1 / N)^N}^M = (1 / e)^M

Therefore, the degree of certainty becomes:

DC = 1 – (1 / e)^M

If M tends to infinity, (1 / e)^M tends to zero, therefore the degree of certainty tends to 1 (certainty, yes, but not in a philosophical sense!)
Actually, relatively low values of M make the degree of certainty very, very nearly 100%. For example, if M = 20, DC is already about 99.9999998%. If M = 50, the PCs of the day calculate DC = 100%. Of course, they can't approximate more than 18 decimal positions! Let's say we want to know how long it will take for all pick-3 lottery combinations to be drawn. The computers say that all 1000 pick-3 sets will come out within 50,000 drawings with a degree of certainty virtually equal to 100%.
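For the M-series version, a short Python sketch makes both claims concrete: it evaluates DC = 1 – (1/e)^M for a few values of M, and it simulates how many pick-3 drawings it actually takes before all 1,000 straight sets have appeared (an illustrative simulation, not the computation the notes refer to; the 200-run sample size is arbitrary):

```python
# Degree of certainty after M series of N trials each (N large): DC = 1 - (1/e)^M.
# Plus a Monte Carlo look at the pick-3 claim (all 1,000 sets within 50,000 drawings).
import math
import random

for m in (1, 2, 5, 10, 20, 50):
    dc = 1 - math.exp(-m)               # (1/e)^M written as e^(-M)
    print(f"M = {m:>2}   DC = {dc:.12f}")

def drawings_until_all_seen(n=1000):
    """Count pick-3 drawings until every combination 000..999 has appeared."""
    seen, drawings = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))   # one random pick-3 drawing
        drawings += 1
    return drawings

worst = max(drawings_until_all_seen() for _ in range(200))
print("worst case over 200 simulated runs:", worst, "drawings (well under 50,000)")
```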


  • Ion Saliu's Paradox of N Trials refers to randomly generating one element at a time from a set of N elements. There is a set of N distinct elements (e.g. lotto balls numbered from 1 to 49). We randomly generate (or draw) 1 element at a time, each time from the full set (i.e. with replacement), for a total of N drawings (number of trials). The result will be around 63% unique elements and around 37% duplicates (more precisely named repeats). Let's look at the probability situation from a different angle. What is the probability of randomly generating N elements at a time with ALL N elements unique?
    Let's say we have 6 dice (since a die has 6 faces or elements); we throw all 6 dice at the same time (a perfectly random draw). What is the probability that all 6 faces will be unique (i.e. 1 to 6 in any order)? The total number of possible cases is calculated by the Saliusian sets (or exponents): 6^6 (6 to the power of 6) = 46656. The total number of favorable cases is represented by permutations, calculated by the factorial: 6! = 720. We calculate the probability of 6 unique point-faces on all 6 dice by dividing permutations by exponents: 720 / 46656 = 1 in 64.8.
    We can generalize to N elements randomly drawn N at a time. The probability of all N elements being unique is equal to permutations over exponents. A precise formula reads (a short numerical check follows below):

    Probability of N unique elements = N! / N^N
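A few lines of Python confirm both the 1-in-64.8 figure for the dice and the roughly 63% / 37% split of unique elements versus repeats (a quick sketch; the 10,000-run simulation size is arbitrary):

```python
# Probability that N draws (with replacement) from N distinct elements are all unique:
# permutations over exponents, P = N! / N^N.
import math
import random

def prob_all_unique(n):
    return math.factorial(n) / n ** n

print("6 dice, all faces distinct: 1 in", round(1 / prob_all_unique(6), 1))   # 1 in 64.8

# Empirical side of the paradox: draw N times from N elements, count distinct results.
def unique_fraction(n=49, runs=10_000):
    total = sum(len({random.randrange(n) for _ in range(n)}) for _ in range(runs))
    return total / (runs * n)

print("average share of unique elements:", f"{unique_fraction():.1%}")   # about 63%
```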