cvs commit: src/sys/modules/random Makefile src/sys/dev/random
randomdev.h randomdev_soft.c randomdev_soft.h yar
Richard Coleman
richardcoleman at mindspring.com
Mon Apr 12 16:58:09 PDT 2004
David Malone wrote:
>>I think the old /dev/random caused more problems than it solved. Most
>>apps just used /dev/urandom to avoid all the end-user questions about
>>the blocking.
>
> I largely agree.
>
>>And the beauty of the Yarrow PRNG is that as long as you have enough
>>initial entropy to get started, you can pull as many bytes as you want
>>and still remain cryptographically strong (within some very high limit
>>of like 2^120 bytes before the next re-keying).
>
> It is still no good for generating keys that have more unpredictable
> bits than Yarrow's internal state, unless you can be sure that it
> has reseeded. For example, the Yarrow paper notes that there is no
> point using Yarrow-160 for generating 256-bit block cipher keys and
> that using it for things like one-time pads is right out.
>
> David.
Well, the original Yarrow-160 paper was based on using 3DES and SHA-1
(hence the 160). But the version of Yarrow called Fortuna (chapter 10
of Practical Cryptography) uses AES with a 256-bit key and SHA-256. A
quick check of the /dev/random code seems to indicate that Mark is using
something similar (although it looks like he is using Yarrow-style
entropy estimation).
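For reference, the overall shape of a Fortuna-style generator is roughly
the following. This is only a hypothetical sketch, not the kernel code:
SHA-256 in counter mode stands in for AES-256 (the standard library has
no AES), since the point is the rekey-after-each-request structure, not
the particular cipher.

```python
import hashlib

class FortunaSketch:
    """Fortuna-style generator sketch: a keyed block function in counter
    mode, rekeyed after every request so that a later state compromise
    cannot reconstruct earlier outputs."""

    def __init__(self, seed: bytes):
        self.key = hashlib.sha256(seed).digest()   # 256-bit internal key
        self.counter = 0

    def _block(self) -> bytes:
        # One 32-byte output block; SHA-256(key || counter) stands in
        # for AES-256 encrypting the counter.
        out = hashlib.sha256(self.key + self.counter.to_bytes(16, "big")).digest()
        self.counter += 1
        return out

    def read(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            out += self._block()
        # Rekey with one extra block (256 bits) so previous outputs
        # cannot be recovered from the new state.
        self.key = self._block()
        return out[:n]
```

The output is deterministic in the seed, which is why everything hinges
on gathering enough entropy before the generator is first used.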
But I see what you are saying. The internal state of the generator
never holds more than 256 bits of entropy, so you cannot produce
output with more entropy than that.
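A toy example makes that bound concrete. The generator below is
hypothetical (nothing like the kernel code) and has only an 8-bit
state, so however many output bytes you draw, at most 2^8 distinct
output streams can ever occur; the same counting argument caps a
generator with 256 bits of state at 2^256 possible streams.

```python
import hashlib

def stream(state: int, nbytes: int = 8) -> bytes:
    # Deterministic toy "PRNG": the output is purely a function of the
    # 8-bit state, so distinct outputs <= distinct states.
    return hashlib.sha256(bytes([state])).digest()[:nbytes]

# Enumerate every possible 8-bit state: even though each stream is
# 64 bits long, at most 256 of the 2^64 possible values can occur.
outputs = {stream(s) for s in range(256)}
assert len(outputs) <= 256
```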
There are probably ways you could raise this limit by keeping multiple
key schedules (think of it as interleaving multiple OFB streams). But
this is overkill for most practical situations.
As to the question of how to integrate high-speed entropy sources, I
can't really give any suggestions there. I'm interested to see what
the final outcome is.
Richard Coleman
richardcoleman at mindspring.com