I finished Bruce Schneier's Secrets
and Lies: Digital Security in a Networked World today. Although it's getting
a little dated (2000) for a book dealing largely with computer and network
security, it was a fun and clear introduction to the fundamentals of topics to
which I needed a fun and clear introduction.
I'm much more familiar with basic security concepts and buzzwords —
cryptography primitives, entropy, unicity distance, one-way hash functions —
for having read the book. Readers get a moderate dose of Alice and Bob, and
plenty of funny anecdotes to explain different concepts, like observing from
outside the Pentagon a massive spike in late-night pizza deliveries shortly
before the United States began bombing Iraq as an example of traffic analysis.
Schneier is very quotable, and there were plenty of gems in this one. Two of my
favorites were, "In its defense Microsoft has claimed that it spent 500
people-years to make Windows 2000 reliable. I only reprint this number because
it serves to illustrate how inadequate 500 people-years are," and, in reference
to stupid laws like the Digital Millennium Copyright Act (DMCA) and the Uniform
Computer Information Transactions Act (UCITA) that try to prohibit reverse
engineering, "It's certainly easier to implement bad security and make it
illegal for anyone to notice than it is to implement good security."
Secrets and Lies is a very readable book, but I think a few more rounds of
editing would have improved it a great deal. Admittedly, being an editor I
pretty much always think this, but there are some particular clues in this one.
Schneier seems to be a strong proponent of something my music teacher always
used to tell me, "make your mistakes in time":
I have never before missed a publication deadline: books, articles, or
essays. I pride myself on timeliness: A piece of writing is finished when
it's due, not when it's done.
The lack of editing shows mainly in the amount of repetition — and I
read this book slowly over the course of almost two years, so I'm sure there
was quite a bit more than what I noticed. I wouldn't be surprised if he
emphasized at least a dozen times that 128-bit keys are big enough for everyone,
and that beyond that size the user's passphrase and other attack vectors become
the weak links, making larger keys pointless.
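His repeated point is easy to check with quick arithmetic. Here's a minimal sketch of the idea (my own illustration, not from the book; the wordlist size and word count are assumptions): a key unlocked by a passphrase is only as strong as the passphrase, so an attacker guesses passphrases rather than keys.

```python
import math

def passphrase_entropy_bits(num_words, wordlist_size):
    """Entropy of a passphrase of num_words drawn uniformly from a wordlist."""
    return num_words * math.log2(wordlist_size)

key_bits = 256  # a "bigger must be better" key
# Assume a 6-word Diceware-style passphrase (7776-word list) guards it:
pp_bits = passphrase_entropy_bits(6, 7776)

# The attacker takes the cheaper path, so effective security is the minimum.
effective_bits = min(key_bits, pp_bits)
print(f"key space: 2^{key_bits}")
print(f"passphrase space: 2^{pp_bits:.0f}")
print(f"effective security: about 2^{effective_bits:.0f} guesses")
```

Under these assumptions the passphrase, at roughly 78 bits, is the weak link; swapping the 256-bit key for a 4096-bit one wouldn't change the effective security at all.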
There is also an unfortunate amount of self-promotion in the book (and possibly
promotion of some friends/colleagues) — at times it reads like a very
high-level marketing brochure. But at least Schneier is up front about it:
So, if this book seems a little self-serving, that's why. Both the book and
the new company grew from the same epiphany, that expert human detection
and response provides the best possible security. The book tracks my
thinking in reforming my company, and explains the service that we offer.
You can learn more about us at www.counterpane.com.
I'm not familiar enough with security to know whether he's right or wrong about
much of anything, but there were some statements that seem obviously wrong, or
at least unclear. For example, in discussing the importance of detection and
response, he says, "There's no way to detect the eavesdropping, so no response
is possible." Huh? He's talking about encrypted digital communications, so
things like hearing clicks on the phone or locating bugging devices in a room
don't apply here. But even then, there are ways to detect eavesdropping, such
as watching for leaks: people who were not supposed to be party to the
encrypted communication turning up with information that was never shared
outside it. It's like what police do when they deliberately withhold certain
details about a crime, so that when they interrogate a suspect they can ask
about those hidden details to learn whether the suspect was actually present
at the scene.
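The same trick generalizes: plant a unique "canary" detail in each copy of a message, and if one of those details surfaces where it shouldn't, you know both that a leak happened and which channel it came through. A toy sketch (my own illustration; the names and planted details are made up):

```python
# Each recipient gets a slightly different planted detail.
canaries = {
    "alice": "meeting at the north gate",
    "bob": "meeting at the east gate",
    "carol": "meeting at the south gate",
}

def identify_leak(overheard_text, canaries):
    """Return the recipients whose planted detail appears in the leaked text."""
    return [who for who, detail in canaries.items() if detail in overheard_text]

leaked = "sources say there will be a meeting at the east gate"
print(identify_leak(leaked, canaries))  # -> ['bob']
```

Detecting the planted detail in the wild is exactly the kind of response Schneier claims is impossible for eavesdropping on encrypted traffic.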
I appreciate Schneier's defense of free and open source software as more likely
to be secure, and more consistent with the principles of how science should be
done. But I'm also a little disturbed by his eager defense of the idea of
warranties being applied to software, so that companies who write programs with
security holes that are exploited in the wild can be held accountable. These
two positions seem inconsistent to me. When it comes down to it, I'm a terrible
programmer, and if I have to be financially liable for damages caused by bugs
in my software, well, I'm just not going to share it with anyone. Mandatory
warranties would encourage more corporate proprietary software development, and
discourage grassroots development.
I'd recommend the book for anyone wanting an overview of the principles that
underlie security and insecurity on the internet. It's whetted my appetite for
reading more about the security systems that I use on a regular basis,
primarily public-key encryption. It's even made me wish I knew anything at all
about math. I look forward to reading his next book, Beyond Fear.