CRYPTO-GRAM
by Bruce Schneier
Founder and CTO
BT Counterpane
schneier@schneier.com
http://www.schneier.com
http://www.counterpane.com
In this issue:
Third Annual Movie-Plot Threat Contest
The Security Mindset
News
The Feeling and Reality of Security
Web Entrapment
Schneier/BT Counterpane News
Speeding Tickets and Agenda
Seat Belts and Compensating Behavior
Internet Censorship
Comments from Readers
Third Annual Movie-Plot Threat Contest

For this contest, the goal is to create fear. Not just any fear, but a
fear that you can alleviate through the sale of your new product idea.
There are lots of risks out there, some of them serious, some of them so
unlikely that we shouldn't worry about them, and some of them completely
made up. And there are lots of products out there that provide security
against those risks.
Your job is to invent one. First, find a risk or create one. It can be
a terrorism risk, a criminal risk, a natural-disaster risk, a common
household risk -- whatever. The weirder the better. Then, create a
product that everyone simply *has to* buy to protect him- or herself
from that risk. And finally, write a catalog ad for that product.
Here's an example, pulled from page 25 of the Late Spring 2008 Skymall
catalog I'm reading on my airplane as I write this:
"A Turtle is Safe in Water, A Child is Not! Even with the most vigilant
supervision a child can disappear in seconds and not be missed until
it's too late. Our new wireless pool safety alarm system is a must for
pool owners and parents of young children. The Turtle Wristband locks
on the child's wrist (a special key is required to remove it) and
instantly detects immersion in water and sounds a shrill alarm at the
Base Station located in the house or within 100 feet of the pool, spa,
or backyard pond. Keep extra wristbands on hand for guests or to
protect the family dog."
Entries are limited to 150 words -- the example above had 97 words --
because fear doesn't require a whole lot of explaining. Tell us why we
should be afraid, and why we should buy your product.
Entries are due by May 1. Submit them as comments to the blog post. And
even if you don't want to enter, go read some of the submissions. You
people are frighteningly creative.
Blog post:
http://www.schneier.com/blog/archives/2008/04/third_annual_mo.html
The Security Mindset

Uncle Milton Industries has been selling ant farms to children since
1956. Some years ago, I remember opening one up with a friend. There
were no actual ants included in the box. Instead, there was a card that
you filled in with your address, and the company would mail you some
ants. My friend expressed surprise that you could get ants sent to you
in the mail.
This kind of thinking is not natural for most people. It's not natural
for engineers. Good engineering involves thinking about how things can
be made to work; the security mindset involves thinking about how things
can be made to fail. It involves thinking like an attacker, an adversary
or a criminal. You don't have to exploit the vulnerabilities you find,
but if you don't see the world that way, you'll never notice most
security problems.
I've often speculated about how much of this is innate, and how much is
teachable. In general, I think it's a particular way of looking at the
world, and that it's far easier to teach someone domain expertise --
cryptography or software security or safecracking or document forgery --
than it is to teach someone a security mindset.
You can see the results in the blog kept by students in the University
of Washington's CSE484 computer security course. They're
encouraged to post security reviews about random things: smart pill
boxes, Quiet Care Elder Care monitors, Apple's Time Capsule, GM's
OnStar, traffic lights, safe deposit boxes, and dorm room security.
One of those reviews covers a Toyota dealership's service center. The
post speculates on how someone could steal a car by exploiting the
vulnerability it identifies, and asks whether it makes sense for the
dealership to have such lax security. You can quibble with the
analysis -- I'm curious about the liability that the dealership has, and
whether their insurance would cover any losses -- but that's all domain
expertise. The important point is to notice, and then question, the
security in the first place.
The lack of a security mindset explains a lot of bad security out there:
voting machines, electronic payment cards, medical devices, ID cards,
internet protocols. The designers are so busy making these systems work
that they don't stop to notice how they might fail or be made to fail,
and then how those failures might be exploited. Teaching designers a
security mindset will go a long way toward making future technological
systems more secure.
The security mindset is a valuable skill that everyone can benefit from,
regardless of career path.
SmartWater
http://www.smartwater.com/products/securitySolutions.html
http://www.schneier.com/blog/archives/2005/02/smart_water.html
CSE484:
http://www.cs.washington.edu/education/courses/484/08wi/
http://cubist.cs.washington.edu/Security/2007/11/22/why-a-computer-security-course-
blog/
or http://tinyurl.com/3m94ag
CSE484 blog:
http://cubist.cs.washington.edu/Security/
http://cubist.cs.washington.edu/Security/category/security-reviews/
http://cubist.cs.washington.edu/Security/2008/03/14/security-review-michaels-
toyota-service-center/
or http://tinyurl.com/456b5y
Comments:
http://www.freedom-to-tinker.com/?p=1268
http://blog.ungullible.com/2008/03/hacking-yourself-to-ungullibility-part.html
or http://tinyurl.com/3fl9np
http://www.daemonology.net/blog/2008-03-21-security-is-mathematics.html
or http://tinyurl.com/34y2en
News
Bomb squad defuses turnip. Props to the writer who came up with the
first sentence of the story: "A raw turnip was at the root of a bomb
scare that lasted for hours at a law office."
http://ap.google.com/article/ALeqM5g5qxveGlCNPGT6iLRlEhEUbZcepAD8VDF0AO0
or http://tinyurl.com/37km5m
http://www.journalgazette.net/apps/pbcs.dll/article?AID=/20080315/LOCAL07/803150407
/1002/LOCAL
or http://tinyurl.com/2jug84
Comment on my blog from someone claiming to be the turnip mailer:
http://www.schneier.com/blog/archives/2008/03/bomb_squad_defu.html#c256420
or http://tinyurl.com/4z7nko
This sort of credit card fraud is nothing new, but it's rare to see
statistics of actual fraud.
http://www.schneier.com/blog/archives/2008/03/fraud_due_to_a.html
My guess is that it's an inside job.
Really good blog post on the future potential of quantum computing and
its effects on cryptography.
http://www.emergentchaos.com/archives/2008/03/quantum_progress.html
A quantum computer scientist responds:
http://scienceblogs.com/pontiff/2008/03/shor_calculations.php
If you're fearful, you think you're more at risk than if you're angry:
http://www.hks.harvard.edu/news-events/publications/insight/management/jennifer-
lerner
or http://tinyurl.com/3gflds
http://content.ksg.harvard.edu/lernerlab/pdfs/Lerner_2003_PS_Paper.pdf
This article from The Wall Street Journal outlines how the NSA is
increasingly engaging in domestic surveillance, data collection, and
data mining. The result is essentially the same as Total Information
Awareness.
http://online.wsj.com/article/SB120511973377523845.html
Barry Steinhardt of the ACLU comments.
http://www.dailykos.com/storyonly/2008/3/11/14380/5939/606/474351
More commentary:
http://blogs.zdnet.com/Ratcliffe/?p=334&tag=nl.e622
The U.S. has a new cyber-security czar, Rod A. Beckstrom, who has no
cyber-security experience.
http://www.washingtonpost.com/wp-
dyn/content/article/2008/03/19/AR2008031903125.html
or http://tinyurl.com/2yh2qv
http://arstechnica.com/news.ars/post/20080328-meet-the-new-us-cybersecurity-
czar.html
or http://tinyurl.com/2h53u6
Got an idea for how to build a liquid bottle scanner? The TSA wants to
give you money.
http://www.gsnmagazine.com/cms/resources/business-opportunities/624.html
or http://tinyurl.com/2bo5c9
The Quantum Sleeper Unit: fearmongering and security theater at its finest.
http://www.qsleeper.com/
Australia may outlaw laser pointers, because they were used against
planes last month. I'm sure criminals also used cars in Australia last
week. Will the country ban them next? On the other hand, I'm sick and
tired of laser pointers myself. But the cats of Australia will be
terribly disappointed.
http://www.smh.com.au/news/national/lasers-face-import-
ban/2008/03/30/1206850709183.html
or http://tinyurl.com/4v3kzk
An eerily prescient article from The Atlantic in 1967 about the future
of data privacy and security. It presents all of the basic arguments
for strict controls on the collection of personal information, and it's
remarkably accurate in its predictions of the future development and
importance of computers, as well as of all the ways the government
would abuse them. Well worth reading.
http://blog.modernmechanix.com/2008/03/31/the-national-data-center-and-personal-
privacy/
or http://tinyurl.com/2rg864
This labyrinth security lock is an April Fool's joke, but I want one.
http://www.thinkgeek.com/stuff/41/titaniumlabyrinth.html?cpg=70H
We finally have some actual information about the "liquid bomb" that was
planned by that London group arrested in 2006: "The court heard the
bombers intended to use hydrogen peroxide and mix it with a product
called Tang, used in soft drinks, to turn it into an explosive. They
intended to carry it on board disguised as 500ml bottles of Oasis or
Lucozade by using food dye to recreate the drinks' distinctive colour.
The detonator would have been disguised as AA 1.5-volt batteries. The
contents of the batteries would have been removed and an electric
element such as a light bulb or wiring would have been inserted. A
disposable camera would have provided a power source."
http://www.dailymail.co.uk/pages/live/articles/news/news.html?in_article_id=555465&
in_page_id=1770&ct=5
or http://tinyurl.com/2xnabh
Much commentary on my blog about the feasibility of this:
http://www.schneier.com/blog/archives/2008/04/the_liquid_bomb.html
The KeeLoq keyless entry system is used by Chrysler, Daewoo, Fiat,
General Motors, Honda, Toyota, Lexus, Volvo, Volkswagen, Jaguar, and
probably others. It's broken.
http://www.crypto.rub.de/keeloq/index.html
http://www.theregister.co.uk/2008/04/03/keeloq_master_key_found/
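KeeLoq is a rolling-code system: each button press encrypts an
incrementing counter, and the receiver accepts codes within a small
look-ahead window so the two sides stay synchronized. A minimal sketch
of that synchronization idea -- using HMAC-SHA256 in place of KeeLoq's
actual cipher, with a made-up key and window size -- looks like this:

```python
# Illustrative rolling-code scheme; NOT KeeLoq's real cipher.
import hmac
import hashlib

SECRET = b"shared-device-key"   # hypothetical key shared by fob and car
WINDOW = 16                     # receiver's look-ahead window

def code_for(counter: int) -> str:
    """Derive the transmitted code from the press counter."""
    mac = hmac.new(SECRET, counter.to_bytes(4, "big"), hashlib.sha256)
    return mac.hexdigest()[:8]

class Receiver:
    def __init__(self):
        self.counter = 0  # last counter value accepted

    def accept(self, code: str) -> bool:
        # Try each counter value in the look-ahead window; on a match,
        # resynchronize so old (replayed) codes are rejected.
        for c in range(self.counter + 1, self.counter + 1 + WINDOW):
            if hmac.compare_digest(code, code_for(c)):
                self.counter = c
                return True
        return False

rx = Receiver()
print(rx.accept(code_for(3)))   # True: counter 3 is within the window
print(rx.accept(code_for(3)))   # False: a replayed code is rejected
```

The attack against KeeLoq doesn't break this counter logic; it recovers
the cipher key itself, after which an eavesdropper can generate valid
codes at will.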
NSA has released its version of Linux. So, do you trust it?
http://www.upi.com/International_Security/Emerging_Threats/Briefing/2008/03/24/nsa_
releases_new_version_of_linux_software/9918/
or http://tinyurl.com/6bzc2f
NSA's guide to securing Linux:
http://www.nsa.gov/snac/os/redhat/rhel5-guide-i731.pdf
This is a great essay by a mom who let her 9-year-old son ride the New
York City subway alone, and the whole discussion is illustrative of how
we overestimate threats against children:
http://www.schneier.com/blog/archives/2008/04/overestimating.html
The Feeling and Reality of Security

Security is both a feeling and a reality, and they're different. You can
feel secure even though you're not, and you can be secure even though
you don't feel it. There are two different concepts mapped onto the same
word -- the English language isn't working very well for us here -- and
it can be hard to know which one we're talking about when we use the word.
We may or may not have the expertise to make security trade-offs
intelligently, but we make them anyway. All of us. People have a natural
intuition about security trade-offs, and we make them, large and small,
dozens of times throughout the day. We can't help it: It's part of being
alive.
People make most of those trade-offs based on the *feeling* of security
and not the reality.
I've written a lot about how people get security trade-offs wrong, and
the cognitive biases that cause us to make mistakes. Humans have
developed these biases because they make evolutionary sense. And most of
the time, they work.
The key here is whether we notice. The feeling and reality of security
tend to converge when we take notice, and diverge when we don't. People
notice when 1) there are enough positive and negative examples to draw a
conclusion, and 2) there isn't too much emotion clouding the issue.
Both elements are important. If someone tries to convince us to spend
money on a new type of home burglar alarm, we as a society will know
pretty quickly if he's got a clever security device or if he's a
charlatan; we can monitor crime rates. But if that same person advocates
a new national antiterrorism system, and there weren't any terrorist
attacks before it was implemented, and there weren't any after it was
implemented, how do we know if his system was effective?
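The arithmetic behind that problem is simple. A back-of-the-envelope
sketch -- the attack probability and the claimed risk reduction are
invented purely for illustration:

```python
# Why "no attacks" tells us almost nothing about rare risks.
# Hypothetical numbers: a 1-in-a-million chance of attack per year,
# and a system that supposedly halves that risk.
p_without = 1e-6
p_with = p_without / 2

years = 10

# Probability of observing zero attacks over the whole period:
p_zero_without = (1 - p_without) ** years
p_zero_with = (1 - p_with) ** years

print(p_zero_without)  # roughly 0.99999
print(p_zero_with)     # roughly 0.999995
# Both scenarios predict "no attacks" with near-certainty, so that
# observation can't distinguish a working system from a useless one.
```

For common crimes like burglary there are enough positive and negative
examples that the two probabilities diverge quickly; for terrorism they
never do.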
The other thing that matters is agenda. There are lots of people --
politicians, companies, and so on -- who deliberately try to manipulate
your feeling of security for their own gain. They try to cause fear. They
invent threats. They take minor threats and make them major. And when
they talk about rare risks with only a few incidents to base an
assessment on -- terrorism is the big example here -- they are more
likely to succeed.
There are some complex feedback loops going on here, between emotion and
reason, between reality and our knowledge of it, between feeling and
familiarity, and between the understanding of how we reason and feel
about security and our analyses and feelings. We're never going to stop
making security trade-offs based on the feeling of security, and we're
never going to completely prevent those with specific agendas from
trying to take advantage of us. But the more we know, the better trade-offs
we'll make.
Web Entrapment
The FBI has been posting fake hyperlinks that purport to lead to
illegal images, and then obtaining search warrants against anyone who
clicks on them. This seems like incredibly flimsy evidence. Someone
could post the link as an embedded image, or send out e-mail with the
link embedded, and completely mess with the FBI's data -- and the poor
innocents' lives. Such are the problems when merely clicking on a link
is justification for a warrant.
http://www.news.com/8301-13578_3-9899151-38.html?tag=nefd.pop
http://yro.slashdot.org/yro/08/03/20/2323247.shtml
http://arstechnica.com/news.ars/post/20080323-rick-rolled-to-child-porn-youre-a-
pedophile-says-fbi.html
or http://tinyurl.com/2ffhs2
http://www.msnbc.msn.com/id/23710970
Seat Belts and Compensating Behavior

The compensating-behavior theory holds that drivers who wear seat belts
feel safer and therefore drive more carelessly. A new paper presents
data that contradicts that thesis:
"This paper investigates the effects of mandatory seat belt laws on
driver behavior and traffic fatalities. Using a unique panel data set on
seat belt usage in all U.S. jurisdictions, we analyze how such laws, by
influencing seat belt use, affect the incidence of traffic fatalities.
Allowing for the endogeneity of seat belt usage, we find that such usage
decreases overall traffic fatalities. The magnitude of this effect,
however, is significantly smaller than the estimate used by the National
Highway Traffic Safety Administration. In addition, we do not find
significant support for the compensating-behavior theory, which suggests
that seat belt use also has an indirect adverse effect on fatalities by
encouraging careless driving. Finally, we identify factors, especially
the type of enforcement used, that make seat belt laws more effective in
increasing seat belt usage."
http://www.stanford.edu/~leinav/pubs/RESTAT2003.pdf
John Adams:
http://www.cato.org/pubs/pas/pa-335es.html
Internet Censorship
The first half of the OpenNet Initiative's book comprises essays
written by ONI researchers on the politics, practice, technology,
legality, and social effects of Internet filtering. There are three
basic rationales for Internet censorship: politics and power; social
norms, morals, and religion; and security concerns.
Some countries, such as India, filter only a few sites; others, such as
Iran, extensively filter the Internet. Saudi Arabia tries to block all
pornography (social norms and morals). Syria blocks everything from the
Israeli domain ".il" (politics and power). Some countries filter only at
certain times. During the 2006 elections in Belarus, for example, the
website of the main opposition candidate disappeared from the Internet.
In 1996, John Perry Barlow said: "You are trying to ward off the virus of liberty
by erecting guard posts at the frontiers of cyberspace. These may keep
out the contagion for some time, but they will not work in a world that
will soon be blanketed in bit-bearing media."
OpenNet Initiative:
http://www.opennet.net
http://www.schneier.com/blog
** *** ***** ******* *********** *************