Re: Adding entropy from external source into random number generator - how?

From: Mark Murray <markm@FreeBSD.org>
Date: Sun, 27 Mar 2022 10:01:08 UTC

> On 26 Mar 2022, at 17:29, freebsd-lists@sensation.net.au wrote:
> 
> Hi all. I was pointed to this mailing list, so I hope my query is reasonably on topic.
> 
> I've developed simple firmware on a microcontroller which uses the values of multiple floating analog inputs to generate random numbers. I'd like to use this as an external source to add entropy into a FreeBSD system.

OK. Good.

> I think the best way to do it would be to call random_harvest_queue(...), but what do I use as the source enum (see /usr/include/sys/random.h)? ENTROPYSOURCE, I guess?

Add a new one for your source.
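
(ENTROPYSOURCE itself is just the terminator/count at the end of the enum, not a usable source.) Very roughly - and with the caveat that the name RANDOM_PURE_MYBOARD is invented here, and that both the contents of the enum and the random_harvest_queue() signature have changed between releases, so check sys/sys/random.h and random_harvest(9) on your branch - the in-kernel side would look something like:

/* sys/sys/random.h: add the new source just before the terminator. */
enum random_entropy_source {
        /* ... existing RANDOM_* and RANDOM_PURE_* entries ... */
        RANDOM_PURE_MYBOARD,    /* hypothetical: floating-ADC board */
        ENTROPYSOURCE           /* terminator/count, not a real source */
};

/* In your driver, hand buffers of samples to the harvester: */
#include <sys/types.h>
#include <sys/random.h>

static void
myboard_harvest(const void *buf, u_int len)
{
        /*
         * Recent branches take (buffer, size, source); some older ones
         * also took an explicit bit-count argument - see random_harvest(9).
         */
        random_harvest_queue(buf, len, RANDOM_PURE_MYBOARD);
}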

> I believe it's also possible to open /dev/random for write to inject entropy, and I'm sure I saw mention of this being available around 12.0R, but I cannot find any mention of that scenario in the man pages.

This is for userland sources. If you are in-kernel, use random_harvest_queue(9), and be careful that you don't run at a high rate - i.e. if your harvester spends a lot of time waiting for its source, then good; otherwise sleep to keep the rate down to a trickle. You don't need more than maybe a few tens of harvested events per second, maximum. If your source is good, even ten events per second would be excessive.
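
For the userland route, a rough sketch of the sort of trickle-feeder I mean (the serial device path /dev/cuaU0, the 32-byte chunk size and the one-second sleep are all made-up numbers; the point is just read-from-the-board, write-to-/dev/random, sleep):

#include <sys/types.h>
#include <err.h>
#include <fcntl.h>
#include <unistd.h>

int
main(void)
{
        char buf[32];
        ssize_t n;
        int src, rnd;

        if ((src = open("/dev/cuaU0", O_RDONLY)) == -1) /* hypothetical device */
                err(1, "open source");
        if ((rnd = open("/dev/random", O_WRONLY)) == -1)
                err(1, "open /dev/random");

        for (;;) {
                if ((n = read(src, buf, sizeof(buf))) <= 0)
                        err(1, "read source");
                if (write(rnd, buf, (size_t)n) == -1)
                        err(1, "write /dev/random");
                sleep(1);       /* keep it down to a trickle */
        }
}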

> I guess the other question to ask is whether ~45 kilobytes per second of additional entropy is even useful in a typical situation? There's no strict security requirement or anything like that, it's really just a fun project that I'm hoping to actually use. :) All entropy is good entropy, right?

What's your threat model?

Guessing 256 bits by brute force alone is such a good approximation to impossible in human timeframes that even a demigod would not bother trying. Supplying that much entropy per second would only be useful for generating "true" randomness if you believed the accumulator and generator to be cryptographically broken; for everyday use it is excessive by very many orders of magnitude.

Having an idea of how good your source is would be a useful exercise. A basic and easy measurement would be to calculate the Shannon entropy of your source. This will give an estimate of the equivalent number of bits of entropy that it supplies, under the conditions of your measurement. See https://en.wikipedia.org/wiki/Entropy_(information_theory) - H(X) is the Shannon entropy, measured in bits if b = 2 (see lower down on that page for the definition).
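
As a purely illustrative way of doing that measurement: capture a decent chunk of raw output from the board into a file, then count byte frequencies and compute H(X) = -sum p(x) * log2(p(x)). Note this treats each byte as an independent symbol, so any correlation between successive samples will make the true figure lower than what it reports. Something like the following (compile with cc -o shannon shannon.c -lm, run as ./shannon < sample.bin; the file name is just an example):

#include <math.h>
#include <stdio.h>

int
main(void)
{
        unsigned long long count[256] = { 0 };
        unsigned long long total = 0;
        int c;

        /* Tally how often each byte value appears in the sample. */
        while ((c = getchar()) != EOF) {
                count[c]++;
                total++;
        }
        if (total == 0) {
                fprintf(stderr, "no input\n");
                return (1);
        }

        /* H(X) = -sum p(x) log2 p(x), in bits per byte. */
        double h = 0.0;
        for (int i = 0; i < 256; i++) {
                if (count[i] == 0)
                        continue;
                double p = (double)count[i] / (double)total;
                h -= p * log2(p);
        }
        printf("Shannon entropy: %.4f bits per byte (%llu bytes sampled)\n",
            h, total);
        return (0);
}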

M
--
Mark R V Murray