June 25th, 2024

The Magic of Participatory Randomness

Randomness is vital in cryptography, gaming, and civic processes. Techniques like "Finger Dice" enable fair outcomes through participatory randomness, ensuring transparency and trust in provably fair games.

Randomness plays a crucial role in cryptography, gaming, civic processes, and algorithms. Participatory randomness involves multiple parties collaborating to generate random outcomes, as in the "Finger Dice" technique by game designer Ben Robbins, which yields fair, reliable results with equal probability for every outcome. It is useful in situations where traditional random number generators are unavailable or untrustworthy. The concept of provably fair games, rooted in cryptographic principles, emphasizes transparency and trust among participants: while casinos hold the edge in traditional gambling, provably fair games let players verify the fairness of the system for themselves. Collaborative random number generation methods like Finger Dice show how interactive proofs can ensure fairness without relying on blind trust, and the same techniques extend to applications of randomness well beyond gaming.

Related

Reconstructing Public Keys from Signatures

The blog delves into reconstructing public keys from signatures in cryptographic schemes like ECDSA, RSA, Schnorr, and Dilithium. It highlights challenges, design choices, and security considerations, emphasizing the complexity and importance of robust security measures.

Do not confuse a random variable with its distribution

In probability theory, a random variable and its distribution are distinct concepts. The random variable depends on random outcomes, while the distribution shows variation patterns. Changing variables or probabilities alters distributions. Examples clarify shared distributions with different outcomes, crucial for interpreting probabilities and simulations.

Tracing garbage collection for arenas

Tracing garbage collection in systems programming languages like C++, Rust, and Ada is compared to reference counting. A simplified tracing garbage collection approach inspired by Mark-and-Sweep is proposed for languages like Zig or Odin.

Is Everything BS?

Rory Sutherland emphasizes combining behavioral science and creativity for effective problem-solving. He advocates for a balanced approach, highlighting the significance of psychological insights alongside traditional methods to address various challenges successfully.

Real Chaos, Today – Randomised controlled trials in economics

Randomized Controlled Trials (RCTs) in economics face controversy due to limitations like external validity challenges, ethical concerns, and potential diversion from structural issues. Critics question their effectiveness and cost impact on policymaking.

2 comments
By @philsnow - 4 months
> Reducing the set of options works all the way down to a d2, which is equivalent to flipping a coin.

I have seen this framed as "evens or odds": instead of asking "heads or tails", you ask "evens or odds". Each person reveals either one or two fingers, and the caller wins if the sum of fingers is in the class of numbers they called.

It's like rock-paper-scissors, but it takes a deterministic amount of time (one round), whereas RPS can take arbitrarily long if the participants are deep in each other's meta.
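
A minimal simulation of one evens-or-odds round, as a sketch of mine in Python (not from the comment):

    import random

    def evens_or_odds(call):
        # Each player secretly shows one or two fingers.
        a = random.randint(1, 2)
        b = random.randint(1, 2)
        parity = "evens" if (a + b) % 2 == 0 else "odds"
        return parity == call

    # Over many rounds the caller wins about half the time.
    wins = sum(evens_or_odds("evens") for _ in range(100_000))
    print(wins / 100_000)   # ~0.5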

By @hoytech - 4 months
Great article! I have some miscellaneous comments, in no particular order.

The digital commitment scheme as described has a subtle vulnerability. Suppose there are two participants, A and B. B decides to wait until A's hash is shared. Once A shares it, B shares that same hash as their commitment. Then, after A reveals their number, B also "reveals" that same number.

Since the described protocol is to XOR these revealed numbers together, the resulting output will be 0, and B will have known that in advance, meaning it was not fair. With N honest participants and 2N (or more) dishonest colluding participants it gets worse. Consider participants A (honest) and B and C (dishonest/colluding): B "cancels out" A's result as described above, and then C's result is XORed in, allowing the dishonest participants to choose whatever output bits they like.
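
A toy sketch of the copy attack in Python (my illustration, not from the article; sha256 standing in for the commitment hash):

    import hashlib, secrets

    def commit(n):
        # Commit to a number by publishing its hash.
        return hashlib.sha256(n.to_bytes(16, "big")).hexdigest()

    a = secrets.randbelow(2**64)   # honest participant A's secret
    commit_a = commit(a)
    commit_b = commit_a            # dishonest B simply echoes A's commitment
    b = a                          # ...and later "reveals" A's own number
    assert commit(b) == commit_b   # B's reveal verifies
    print(a ^ b)                   # XOR combine: always 0, known to B in advance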

There are several solutions, of course. You could specifically check for and disallow duplicated commitments, or use something other than XOR to combine. Note that addition is not a great choice here: since n + n is even for any n, a duplicated value forces the least-significant bit of the sum to 0. Better would be to sort the numbers and then hash their concatenation.
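
A sort-and-hash combiner might look like this (a sketch, assuming the reveals are non-negative integers of bounded size):

    import hashlib

    def combine(reveals, nbytes=16):
        # Sort for order-independence; hashing means a duplicated reveal
        # no longer cancels another participant's contribution.
        data = b"".join(r.to_bytes(nbytes, "big") for r in sorted(reveals))
        return hashlib.sha256(data).digest()

    print(combine([7, 7]).hex())    # duplicates no longer force a zero output
    print(combine([3, 12]).hex())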

Another serious problem with commitment schemes in practice is last-revealer bias. After everyone has committed, and everyone has revealed except for the last participant, that last participant knows the final outcome and nobody else does. If the resulting random number is disadvantageous to the last revealer for any reason, they may simply choose not to reveal it. Then what does the protocol do? After a timeout period, use all the values except for the unrevealed one? If so, the last revealer gets "another chance", and has managed to bias the result in their favour.

This also gets worse with colluding participants. After everyone has revealed except a dishonest cartel of N participants, the cartel has 2^N possible outcomes to choose from, by choosing which combination of their N numbers to reveal.

The only general solution to this is to have some sort of punishment for failure to reveal (slashing your stake, in blockchain lingo).

> A random stream's fairness can also be compromised if it continues to be used beyond the point of a player's decision. For example, if you decide to play a game of Monopoly over the internet then you will probably have to re-seed the RNG on each player's turn.

An interesting solution to this is a "progressive reveal scheme". Each participant generates a random number and hashes it. Then they hash that hash, and keep doing this N times. Each commits to the Nth hash with the usual commit-reveal scheme. Afterwards, at every turn in the game, each player reveals their previous pre-image, which they're locked into (assuming a collision-resistant hash function).
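
A rough sketch of a hash chain along those lines (my own naming; sha256 as the collision-resistant hash):

    import hashlib, secrets

    def make_chain(turns):
        # Start from a random seed and hash it once per turn.
        chain = [secrets.token_bytes(32)]
        for _ in range(turns):
            chain.append(hashlib.sha256(chain[-1]).digest())
        return chain

    chain = make_chain(turns=3)
    commitment = chain[-1]          # commit to the final hash up front
    for value in reversed(chain[:-1]):
        # Each turn, reveal the previous pre-image; it must hash to
        # whatever was revealed (or committed) the turn before.
        assert hashlib.sha256(value).digest() == commitment
        commitment = value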

> the rejection sampling method described earlier will sometimes require that you "re-roll" a die

The average number of attempts you'll have to make is 1/P, where P is the probability of any one being accepted. The Wikipedia article shows this using calculus, but I wrote up a much simpler derivation here: https://metacpan.org/pod/Session::Token#EFFICIENCY-OF-RE-ROL...
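
The 1/P figure is easy to check empirically; here is a toy of mine (not the linked derivation) that builds a d6 from three coin flips, so P = 6/8:

    import random

    def roll_d6():
        attempts = 0
        while True:
            attempts += 1
            n = random.getrandbits(3)   # three coin flips: uniform on 0..7
            if n < 6:                   # reject 6 and 7 and re-flip
                return n + 1, attempts

    mean = sum(roll_d6()[1] for _ in range(100_000)) / 100_000
    print(mean)   # ~1.33, i.e. 1/P with P = 6/8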

> The Fisher-Yates shuffle [...] guarantees equal probability of all possible orderings.

A somewhat counter-intuitive observation is that the number of possible orderings often exceeds the number of possible seeds for your PRNG. For example, even if you're using 256 bits of seed, lists of size 58 or greater cannot possibly have all possible orderings equally probable, simply because 58! > 2^256.
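
For reference, the shuffle itself is tiny (a standard textbook sketch, not the article's code), and the seed-space point is a one-liner:

    import math, random

    def fisher_yates(items, rng=random):
        # Swap each position with one chosen uniformly at or after it.
        for i in range(len(items) - 1):
            j = rng.randrange(i, len(items))
            items[i], items[j] = items[j], items[i]
        return items

    print(fisher_yates(list(range(10))))
    # With a 256-bit seed, lists of length >= 58 can't reach all orderings:
    print(math.factorial(58) > 2**256)   # True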

> If you're comfortable with the idea that physical or analog finger dice can simulate a coin flip by holding up either one or two fingers

I found it strange the author feels the need to introduce and describe "finger dice" in such an elementary way. This game is universally known as odds and evens, and has been played continuously at least since the days of ancient Greece: https://en.wikipedia.org/wiki/Odds_and_evens_(hand_game)