Security through obscurity (was: OpenPGP Smartcard recommendations)

Peter Lebbing peter at digitalbrains.com
Tue Aug 23 14:36:23 CEST 2016


On 23/08/16 12:51, Karol Babioch wrote:
> However, for me this mostly applies to the cryptographic concepts
> themselves and maybe the software implementing them, not necessarily to
> physical devices that have to withstand various forms of physical
> attacks. When it comes to the real world, I'm not sure this concept
> holds completely true, though.

My main argument is this: white-hat security researchers would have a
difficult time assessing your design when it is closed.

But that /you/ can't just go and download the design from Yubico does
not mean that nobody can access (steal) or reverse-engineer it. And the
people who do this are often the people you least want to have the
design. They're not the white-hats[1].

People might think that "not available to me == not available to
anyone". I don't think that's accurate.

Lastly, depending on where the company is situated, governments can play
a nefarious role: inspecting the design, or mandating backdoors. An open
design is better equipped against government mandates than a closed design.

> If the revelations of Snowden taught us anything, it is that it is hard
> to implement crypto correctly in the real world. Yes, RSA, AES and such
> are probably computationally secure enough that even the NSA cannot
> break them. However, they don't have to, because in the real world there
> are easier ways.
> 
> For the most part, the weakness is not the crypto system itself, but
> various side channels and other indirect vectors.

So you don't just want to know "this device does RSA", and "RSA is open
and secure". You want to know "it is correctly implemented". When you
can get good security experts to just download your code and design,
they can verify that. When you close the design, your only option is to
approach a handful of researchers and have them look at it under contract.
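
To give a concrete (and entirely hypothetical) flavour of what those
experts verify, take the side channels Karol mentions: a classic
implementation flaw is a comparison that exits early, leaking through
its timing where the first mismatch occurred. A minimal Python sketch,
not taken from any real product:

    import hmac

    def leaky_verify(expected: bytes, received: bytes) -> bool:
        # Early exit on the first mismatch: an attacker who can measure
        # the running time learns how many leading bytes matched, and
        # can guess the correct value one byte at a time.
        if len(expected) != len(received):
            return False
        for a, b in zip(expected, received):
            if a != b:
                return False
        return True

    def constant_time_verify(expected: bytes, received: bytes) -> bool:
        # hmac.compare_digest takes time independent of where the bytes
        # differ, closing that side channel.
        return hmac.compare_digest(expected, received)

The two functions compute the same answer; only review of the source,
or careful timing measurements, tells them apart.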

Bugs might still go unnoticed. Open access is not a panacea. But
"security through obscurity" IMHO pretty much means there are many more
bad guys looking into it than there are good guys.

> This costs them time and
> money, and hence reduces the potential revenue.

But they can sell their knowledge and expertise on a black market...

> [...], but in the end it always comes down to trust, e.g. in this
> case: do you trust Yubico to have done everything that is reasonably
> possible to protect your keys?

I think they did it to the best of their ability. But I'd like it
independently verified that they actually didn't introduce a bad bug.
All software has bugs.

> Once again I'm not sure if the real world is as black-and-white as
> this. Just making something "open source" does not make it secure. Take
> the recent PRNG vulnerability in GnuPG as an example (CVE-2016-6313).
> It was there for nearly two decades, and nobody spotted it.

At least, none of the good guys did. I'm not trying to scare anyone. I'm
just saying that if the source were closed, there would be a lot fewer
good guys to ever do the spotting, and it would be harder for them to
spot. Not having the source makes it harder for the bad guys to find the
bugs, but it also makes it harder for the good guys.
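
To make the flavour of such a bug concrete, here is a toy Python sketch
of a pool-based RNG with a state-handling bug. It is my own invented
stand-in, emphatically not the actual Libgcrypt flaw behind
CVE-2016-6313, but it shows how every output can look perfectly random
while someone who reads the mixing step, or who captures one output,
can predict everything that follows:

    import hashlib, os

    pool = bytearray(os.urandom(64))  # entropy pool, seeded once

    def read_random(n: int) -> bytes:
        out = bytearray()
        while len(out) < n:
            digest = hashlib.sha256(pool).digest()
            out += digest
            # Intended: fold the digest back into the pool together
            # with the old pool contents. Buggy: the old contents are
            # simply overwritten, so the entire internal state is now a
            # value the caller was just handed -- anyone who saw this
            # output can compute all future output.
            pool[:] = digest + digest  # should mix digest INTO old state
        return bytes(out[:n])

Each digest would pass any statistical test; only eyes on the source
catch the overwrite. Fewer eyes, longer-lived bug.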

> Obviously the same is true for all certifications: a hardware device
> is not secure just because it is certified. Unfortunately these
> certifications (e.g. things like Common Criteria) are basically the
> only thing we have to make sure a product is designed with security in
> mind.

This is what I was actually referring to by "a certification stating
some party thinks you're secure"; I just put it a bit starkly for
effect. I'm not convinced CC EAL-blah actually means much. I get this
itchy "designed by committee" feeling, and a feeling of imposed
bureaucracy. So if Common Criteria are actually promoting security
through obscurity, which is what I think the blog post is alluding to...
meh. Stuff your criteria. Just look at the ways in which card payments
are broken. Pick any recent Chaos Communication Congress and you'll find
a talk about it, I think. These systems are Common Criteria certified,
yet you can buy a second-hand payment terminal on eBay, enter a store
identifier that is printed on every one of that store's receipts, and
start reimbursing payments to cards from that store, payments that were
never made in the first place. It's the reverse of paying at the store:
you put in your card and the store pays you, into your bank account.
Methinks Common Criteria missed a criterion.

> Once again, I'm playing the devil's advocate here. I'm in no way, shape
> or form connected with Yubico and do not want to defend them, but I
> think arguments can be made for both sides here.

Oh, definitely. But I'm convinced that in security, for both algorithms
and implementations, an open design is always better than a closed
design.[2] I don't see the algorithm and the implementation as
completely disjoint in this respect. Everything that protects the
secret knowledge, whether it is the algorithm (hiding the two secret
primes of RSA behind their product) or the implementation (which must,
for instance, not leak knowledge through its random padding), is vital
in keeping secrets secret. Any part failing is catastrophic.
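
To put toy numbers to that: the algorithmic half hides the primes
behind their product, and the implementation half must randomise the
encryption, because textbook RSA without padding is deterministic and
leaks through repetition. A minimal sketch with throwaway parameters
(never use RSA like this):

    p, q = 61, 53      # toy "secret primes"; real ones are ~1024 bits
    n = p * q          # public modulus 3233: factoring it reveals p, q
    e = 17             # public exponent

    def textbook_encrypt(m: int) -> int:
        # No random padding: identical messages give identical
        # ciphertexts, so an eavesdropper can spot repeats, or try
        # every message in a small space ("yes"/"no", a PIN) and
        # compare against what was sent.
        return pow(m, e, n)

    assert textbook_encrypt(42) == textbook_encrypt(42)  # determinism leaks

If either half fails, recoverable primes or predictable padding, the
secret is gone.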

I've read about way too many goofs to still trust that something is
"secure" just because the manufacturer claims it is. I just grudgingly
have to accept that there is quite a limit to what is openly available
at the moment, or at what price in comfort and features. You always have
to compromise, and everybody needs to decide for themselves where that
compromise lies. It would be far preferable if that decision were based
on an accurate understanding of the matter, though... :( (I'm quite
worried about all these so-called "free" services on the internet. I
mean the "gratis" ones.)

Peter.

[1] White-hats do reverse-engineer. For instance with voting systems,
which often turn out to have pretty lousy protection, even though the
manufacturer claimed all kinds of features.

[2] But I'm certainly not stating "any open design is better than a
closed design". That would be silly. There are certainly worse open
designs available than, say, a multiple-thousand-dollar Hardware
Security Module ;).

-- 
I use the GNU Privacy Guard (GnuPG) in combination with Enigmail.
You can send me encrypted mail if you want some privacy.
My key is available at <http://digitalbrains.com/2012/openpgp-key-peter>


