[keyanalyze-discuss] The web of trust tightens over time
Werner Koch
wk at gnupg.org
Tue Aug 5 10:37:02 CEST 2003
On Mon, 4 Aug 2003 16:11:19 -0400, Jason Harris said:
> David, Werner, would you care to hazard a guess? 20,814 keys are in the
> 2003-07-27 strong set. Assume a single 2.x GHz x86 CPU and ~250,000
> signatures.
Don't try it. I won't give an estimate, because I know the algorithm
is far too inefficient.
> Would splitting such a large keyring help any? Are there any optimizations
> that could be done?
No. The obvious optimization is to allow random access to the keyring
and to have an index. Years ago I tried this with a GDBM-based key
storage, but it was too hard to maintain. gpgsm uses a new key storage
format which allows for an index and random access to each keyblock
without parsing all the preceding keyblocks. It currently holds only
X.509 certificates, though.
Part of the gpg 1.9 work is to replace the keyrings with that new
system. I am pretty sure this will boost performance enough that you
can run a --check-trustdb on 20000 keys.
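
To make the idea concrete, here is a minimal sketch of such an indexed
lookup (the record layout and all names are invented for illustration;
this is not the actual keybox code). A sorted in-memory index of
(fingerprint, offset) pairs lets you find a keyblock with a binary
search and read it with a single seek, instead of parsing every
preceding keyblock:

  #include <stdio.h>
  #include <stdint.h>
  #include <string.h>

  struct index_entry
  {
    unsigned char fpr[20];  /* v4 key fingerprint (SHA-1) */
    uint64_t offset;        /* byte offset of the keyblock in the file */
  };

  /* Binary search over the sorted index: O(log n) lookups instead of
     the O(n) sequential parse a classic keyring requires. */
  static const struct index_entry *
  find_entry (const struct index_entry *idx, size_t n,
              const unsigned char fpr[20])
  {
    size_t lo = 0, hi = n;
    while (lo < hi)
      {
        size_t mid = lo + (hi - lo) / 2;
        int cmp = memcmp (fpr, idx[mid].fpr, 20);
        if (!cmp)
          return idx + mid;
        if (cmp < 0)
          hi = mid;
        else
          lo = mid + 1;
      }
    return NULL;
  }

  /* Jump straight to the keyblock; nothing before it is parsed. */
  static int
  seek_keyblock (FILE *fp, const struct index_entry *ent)
  {
    return fseek (fp, (long)ent->offset, SEEK_SET);
  }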
> Would the best-case scenario be 20,814 individual keyrings with the key
> being checked located first in the keyring and the signing keys located
> in order based on their keyid?
No, the algorithm is simply:

  for all-keyrings do
    for all-keys-in-keyring do
      foo
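
In C-like terms, that scan amounts to the following toy model (all
names here are invented for illustration; this is not the actual GnuPG
code). Every lookup walks all keyrings from the start, so checking S
signatures against K keys costs on the order of S*K key comparisons;
with ~250,000 signatures and ~20,000 keys that is billions of compares:

  #include <stddef.h>

  struct key { unsigned long keyid; /* ... */ };
  struct keyring { const struct key *keys; size_t nkeys; };

  /* Linear scan over every keyring; no index, no random access. */
  static const struct key *
  lookup_key (const struct keyring *rings, size_t nrings,
              unsigned long keyid)
  {
    for (size_t r = 0; r < nrings; r++)            /* for all-keyrings */
      for (size_t k = 0; k < rings[r].nkeys; k++)  /* for all-keys-in-keyring */
        if (rings[r].keys[k].keyid == keyid)       /* foo */
          return &rings[r].keys[k];
    return NULL;
  }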
Salam-Shalom,
Werner
--
Werner Koch <wk at gnupg.org>
The GnuPG Experts http://g10code.com
Free Software Foundation Europe http://fsfeurope.org