November 30th, 2024

Square Roots and Maxima

The article shows that the maximum of two independent uniform random variables and the square root of a third have the same CDF, emphasizing how numerical simulation and Python code can validate the result when executed carefully.


The article discusses a mathematical concept involving the cumulative distribution functions (CDFs) of the maximum of two independent random variables and the square root of a uniformly distributed random variable. It references a YouTube video by Grant Sanderson of 3Blue1Brown, which explains that for three independent random variables uniformly distributed between 0 and 1, the CDF of the maximum of two of the variables equals the CDF of the square root of the third. The author illustrates this concept by generating random numbers, applying the maximum and square root functions, and plotting the resulting CDFs. The results from simulations with varying sample sizes (20, 100, and 10,000) show that the sample CDFs converge towards the analytical solution. The author also emphasizes the utility of numerical checks in validating analytical results and provides Python code for generating the plots. The article concludes by highlighting the importance of computational tools in statistical analysis and the potential pitfalls of numerical methods if not executed carefully.

- The CDF of the maximum of two uniform random variables equals the CDF of the square root of a uniform random variable.

- Numerical simulations can effectively validate analytical results in probability theory.

- The article provides Python code for generating CDF plots using random samples.

- Increasing sample sizes leads to convergence of sample CDFs to the analytical solution.

- The author emphasizes the importance of careful execution in numerical methods to avoid errors.
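The simulation approach the summary describes can be sketched in a few lines of Python. This is a minimal reconstruction, not the article's own code: the sample size, seed, and use of the standard-library `random` module are assumptions, and the article's plotting step is replaced by printing a few CDF values.

```python
import random

def empirical_cdf(samples, x):
    """Fraction of samples <= x: the sample CDF evaluated at x."""
    return sum(1 for s in samples if s <= x) / len(samples)

random.seed(0)
n = 10_000

# Max of two independent uniforms on [0, 1].
max_samples = [max(random.random(), random.random()) for _ in range(n)]
# Square root of a single uniform on [0, 1].
sqrt_samples = [random.random() ** 0.5 for _ in range(n)]

# Both empirical CDFs should lie close to the analytical F(x) = x^2.
for x in (0.25, 0.5, 0.75):
    print(x, empirical_cdf(max_samples, x), empirical_cdf(sqrt_samples, x), x ** 2)
```

With 10,000 samples both empirical curves sit within sampling noise of x², consistent with the convergence behavior the article reports as the sample size grows.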

5 comments
By @dahart - 2 months
Either I haven’t seen this before, or forgot it, but it’s surprising because I use the sum of independent uniform variables every once in a while — the sum of two vars is a tent function, the sum of three is a smooth piecewise quadratic lump, and the sum of many tends toward a normal distribution. And the distribution is easily calculated as the convolution of the input box functions (uniform variables). Looking it up just now I learned the sum of uniform variables is called an Irwin-Hall distribution (aka uniform sum distribution).

The min of two random vars has the opposite effect to the max in this video. And now I’m curious - if we use the function definition of min/max — the nth root of the sum of the nth powers of the arguments — there is a continuum from min to sum to max, right? Are there useful applications of this generalized distribution? Does it already have a name?
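The continuum the comment describes can be illustrated directly. `power_combine` is a hypothetical name for the nth-root-of-sum-of-nth-powers operation, not a standard API; this is just a sketch of the limiting behavior:

```python
def power_combine(x, y, n):
    """(x^n + y^n)^(1/n): the plain sum at n = 1,
    tends to max(x, y) as n -> +inf and to min(x, y) as n -> -inf."""
    return (x ** n + y ** n) ** (1.0 / n)

x, y = 0.3, 0.8
print(power_combine(x, y, 1))    # the plain sum, 1.1
print(power_combine(x, y, 50))   # approaches max(x, y) = 0.8
print(power_combine(x, y, -50))  # approaches min(x, y) = 0.3
```

For positive arguments and n >= 1 this is the L^p norm of (x, y), so the interpolating family is essentially the p-norm family.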

By @prof-dr-ir - 2 months
If X1...Xn are independently uniformly distributed between 0 and 1 then:

P(max(X1 ... Xn) < x) =

P(X1 < x and X2 < x ... and Xn < x) =

P(X1 < x) P(X2 < x) ... P(Xn < x) =

x^n

Also,

P(X^{1/n} < x) = P(X < x^n) = x^n

I guess I am just an old man yelling at clouds, but it seems so strange to me that one would bother checking this with a numerical simulation. Is this a common way to think about, or teach, mathematics to computer scientists?
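For what it's worth, the derivation above is quick to spot-check numerically. This sketch (sample size, seed, and the choice n = 3, x = 0.7 are all arbitrary) compares the empirical frequency against the analytical x^n:

```python
import random

random.seed(1)
trials, n, x = 100_000, 3, 0.7

# Empirical P(max(X1, ..., Xn) <= x) for iid uniforms on [0, 1].
hits = sum(1 for _ in range(trials)
           if max(random.random() for _ in range(n)) <= x)
print(hits / trials, x ** n)  # empirical frequency vs. analytical x^n (~0.343)
```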

By @keithalewis - 2 months
Front page material? P(max{X_1, X_2} <= x) = P(X_1 <= x, X_2 <= x) = P(X_1 <= x) P(X_2 <= x) = x^2. P(sqrt(X_3) <= x) = P(X_3 <= x^2) = x^2. It is late in the day when midgets cast long shadows.
By @gxs - 2 months
Just a side comment on what a great little video.

Short, to the point, and the illustrations/animations actually helped convey the message.

Would be super cool if someone could recommend some social media account/channel with collections of similar quality videos (for any field).

By @ndsipa_pomu - 2 months
Matt Parker's video on Square Roots and Maxima: https://www.youtube.com/watch?v=ga9Qk38FaHM