Ross Anderson's Home Page
Many of my papers are available in html and/or pdf, but some of the
older technical ones are in postscript, which was the standard for
many years. You can download a postscript viewer from here. Also, by default, when I
post a paper here I license it under the relevant Creative Commons
license, so you may redistribute it but not modify it. I may
subsequently assign the residual copyright to an academic publisher.
Over the last few years, it's become clear that many systems fail not
for technical reasons so much as from misplaced incentives - often the
people who could protect them are not the people who suffer the costs
of failure. There are also many questions with an economic dimension
as well as a technical one. For example, will digital signatures make
electronic commerce more secure? Is so-called `trusted computing' a
good idea, or just another way for Microsoft to make money? And what
about all the press stories about `Internet hacking' - is this threat
serious, or is it mostly just scaremongering by equipment vendors?
It's not enough for security engineers to understand ciphers; we have
to understand incentives as well. This has led to a rapidly growing
interest in `security economics', a discipline which I helped to found.
I maintain the Economics and
Security Resource Page, and my research contributions include the
following.
The
topology of covert conflict is rather topical - how can the police
best target an underground organisation given some knowledge of its
patterns of communication? And how might they in turn react to various
law-enforcement strategies? We present a framework combining ideas
from network analysis and evolutionary game theory to explore the
interaction of attack and defence strategies in networks. Although we
started out thinking about computer viruses, our work suggests
explanations of a number of aspects of modern conflict generally.
Why
Information Security is Hard - An Economic Perspective was the
paper that got information security people thinking about the subject.
It applies economic analysis to explain many phenomena that security
people had found to be pervasive but perplexing. Why do mass-market
software products such as Windows contain so many security bugs? Why
are their security mechanisms so difficult to manage? Why are
government evaluation schemes, such as the Orange Book and the Common
Criteria, so bad? This paper was presented at the Applications Security conference in
December 2001, and also as an invited talk at SOSP 2001.
The hot political issue is `Trusted Computing'. My `Trusted
Computing' FAQ analysed this Intel/Microsoft initiative to install
digital rights management hardware in every computer, PDA and mobile
phone. `TC' will please Hollywood by making it hard to pirate music
and videos; and it will please Microsoft by making it harder to pirate
software. But TC could have disturbing consequences for privacy,
censorship, and innovation. Cryptography
and Competition Policy - Issues with `Trusted Computing' is an
economic analysis I gave at WEIS2003 and
also as an invited talk at PODC 2003. TC will enable
Microsoft to lock
in its customers more tightly, so it can charge you more. The
proposed mechanisms could also have some disturbing consequences for
privacy, censorship, and innovation. There is also a shortened
version of the paper that has appeared in a special issue of Upgrade,
and a French
translation. I spoke about TC at the "Trusted
Computing Group" Symposium, at PODC, and at the Helsinki IPR
workshop. There's also a neat video clip. TC
is not just an isolated engineering and policy issue; it is related to the IP
Enforcement Directive on the policy front, and new content
standards such as DTCP, which will
be built into consumer electronics and also into PC motherboards.
The row about `Trusted Computing' was ignited by a paper I gave on the security
issues relating to open source and free software at a conference
on Open Source Software
Economics in Toulouse in June 2002. This paper has two parts, the
second of which is about TC and got press coverage in the New York Times, slashdot, news.com and The Register.
In the first part of my Toulouse
paper, I show that the usual argument about open source security -
whether source access makes it easier for the
defenders to find and fix bugs, or makes it easier for the
attackers to find and exploit them - is misdirected. Under
standard assumptions used by the reliability growth modelling
community, the two will exactly cancel each other out. That means that
whether open or closed systems are more secure in a given situation
will depend on whether, and how, the application deviates from the
standard assumptions. The ways in which this can happen, and in which
either open or closed systems can be better in a specific application, are
explored in Open
and Closed Systems are Equivalent (that is, in an ideal world),
which appears as a chapter in Perspectives
on Free and Open Source Software. There is also some press
coverage.
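The symmetry argument can be written down in a couple of lines. What follows is a simplified paraphrase under the standard reliability-growth assumptions, not the paper's full treatment:

```latex
\[
  p(t) \;\approx\; \frac{k}{t}
  \qquad \text{probability that the next period of testing turns up a bug, after $t$ periods of testing}
\]
\[
  \text{Opening the source rescales time by the same factor $\lambda$ for attacker and defender alike:}\quad
  t \mapsto \lambda t,
  \qquad\text{so}\qquad
  p_{\mathrm{open}}(t) \;\approx\; \frac{k}{\lambda t}\ \text{ for both sides.}
\]
```

The relative cost of finding the next bug is therefore unchanged by disclosure; openness helps or hurts only where this symmetry is broken.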
On
Dealing with Adversaries Fairly applies election theory (also
known as social choice theory) to the problem of shared control in
distributed systems. It shows how a number of reputation systems
proposed for use in peer-to-peer applications might be improved. It
appeared at WEIS
2004.
The
Economics of Censorship Resistance examines when it is better for
defenders to aggregate or disperse. Should file-sharers build one huge
system like gnutella and hope for safety in numbers, or would a loose
federation of fan clubs for different bands work better? More
generally, what are the tradeoffs between diversity and solidarity
when conflict threatens? (This is a live topic in social policy at the
moment - see David
Goodhart's essay, and a response in the Economist.)
This paper also appeared at WEIS 2004.
Our annual bash is the Workshop on Economics and Information
Security; the 2006 workshop will be here in Cambridge from June
26-28. The 2005 event was at Harvard and the papers are online. My Economics and Security
Resource Page provides a guide to the literature and to what's on. There is
also a web page on the economics of
privacy, maintained by Alessandro Acquisti.
Since about the middle of 2000, there has been an explosion of
interest in peer-to-peer networking - the business of building useful
systems out of large numbers of intermittently connected machines,
with virtual infrastructures that are tailored to the application. One
of the seminal papers in the field was The
Eternity Service, which I presented at Pragocrypt 96. I had been
alarmed by the Scientologists' success at closing down the penet
remailer in Finland, and had been personally threatened by bank
lawyers who wanted to suppress knowledge of the vulnerabilities of ATM
systems (see here
for a later incident). This taught me that electronic publications can
be easy for the rich and the ruthless to suppress. They are usually
kept on just a few servers, whose owners can be sued or coerced. To
me, this seemed uncomfortably like books in the Dark Ages: the modern
era only started once the printing press enabled seditious thoughts to
be spread too widely to ban. The Eternity Service was conceived as a
means of putting electronic documents as far outwith the censor's grasp as possible.
(The concern that motivated me has since materialised; a UK
court judgment has found that a newspaper's online archives can be
altered by order of a court to remove a libel.)
But history never repeats itself exactly, and the real fulcrum of
censorship in cyberspace turned out to be not sedition, or
vulnerability disclosure, or even pornography, but copyright.
Hollywood's action against Napster led to my Eternity Service
ideas being adopted by many systems including Publius and Freenet. Many of these
developments were described in an important book,
and the first
academic conference on peer-to-peer systems was held in March 2002
at MIT. The field has since become very active: here is a web
page of peer-to-peer conferences. See also Richard Stallman's
classic, The Right to
Read.
My contributions since the Eternity paper include:
I've been helping upgrade
the security of Homeplug, an industry standard for broadband communication over
the power mains. A paper
on what we did and why has been accepted for SOUPS. This is a good worked example
of how to do key establishment in a real peer-to-peer system. The core problem
is this: how can you be sure you're recruiting the right device to your
network, rather than a similar one nearby?
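As a rough illustration of the kind of check involved (a toy sketch, not the actual Homeplug mechanism or the scheme in our paper): if each device ships with a secret printed on its label, the controller can derive a key from the secret the user typed in and confirm that the device answering on the wire is the one whose label was read. The label string below is hypothetical.

```python
import hmac, hashlib, secrets

def derive_device_key(label_secret: str) -> bytes:
    """Derive a per-device key from the secret printed on the device label.
    (Toy parameters; a real scheme would use a proper salt and cost factor.)"""
    return hashlib.pbkdf2_hmac("sha256", label_secret.encode(), b"toy-salt", 10_000)

def confirmation_tag(device_key: bytes, nonce_a: bytes, nonce_b: bytes) -> bytes:
    """MAC over both sides' nonces, keyed with the label-derived key."""
    return hmac.new(device_key, nonce_a + nonce_b, hashlib.sha256).digest()

# Controller side: the user types in the label secret of the device being recruited.
typed_secret = "HB7Q-2KLM-99XD"                    # hypothetical label string
k_controller = derive_device_key(typed_secret)
nonce_a = secrets.token_bytes(16)

# Device side: only the genuine device knows its own label secret;
# a similar device next door does not.
k_device = derive_device_key("HB7Q-2KLM-99XD")
nonce_b = secrets.token_bytes(16)
tag = confirmation_tag(k_device, nonce_a, nonce_b)

# The controller admits the device only if the tag verifies, i.e. only if the
# responder really holds the secret from the label the user typed in.
assert hmac.compare_digest(tag, confirmation_tag(k_controller, nonce_a, nonce_b))
```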
Sybil-resistant DHT
routing appeared at ESORICS 2005 and showed how we
can make peer-to-peer systems more robust against disruptive attacks if
we know which nodes introduced which other nodes. The convergence of
computer science and social network theory is an interesting recent
phenomenon, and not limited to search and recommender systems.
Key
Infection - Smart trust for Smart Dust appeared at ICNP 2004 and presents a radically
new approach to key management in sensor and peer-to-peer networks. Peers
establish keys opportunistically and use resilience mechanisms to fortify
the system against later node compromise. This work challenges the old
assumption that authentication is largely a bootstrapping problem.
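The flavour of the approach can be sketched in a few lines (a toy illustration, much simplified from the paper): a key is first whispered to a neighbour in the clear, then reinforced by folding in fresh secrets sent along several disjoint paths, so an attacker who overheard only some of the exchanges still does not learn the final key.

```python
import secrets
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Initial "key infection": node A whispers a key to neighbour B in the clear at
# low power, gambling that no attacker is listening nearby at deployment time.
initial_key = secrets.token_bytes(16)

# Multipath reinforcement: A later sends fresh secrets to B along several
# disjoint paths through other neighbours; each path may or may not be observed.
path_secrets = [secrets.token_bytes(16) for _ in range(3)]

# Both ends fold all the path secrets into the link key.
final_key = reduce(xor, path_secrets, initial_key)

# An eavesdropper who missed even one of the exchanges cannot reconstruct
# final_key, as the missing 16-byte secret is uniformly random.
```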
The Economics
of Censorship Resistance examines when it is better for defenders
to aggregate or disperse. Should file-sharers build one huge system
like gnutella and hope for safety in numbers, or would a loose
federation of fan clubs for different bands work better?
A New
Family of Authentication Protocols presented our `Guy Fawkes Protocol',
which enables users to sign messages using only two computations of
a hash function and one reference to a timestamping service. This led to
protocols for signing digital streams, used in systems like Freenet. Our paper also raises
foundational questions about the definition of a digital signature.
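The general flavour can be shown in a few lines (a simplified sketch of the commit-then-reveal idea, not the paper's exact message formats): to sign, publish a hash commitment binding the message to a fresh secret and have it timestamped; later reveal the message and secret, and verification is just recomputing the hash and checking the commitment predates the reveal.

```python
import hashlib, secrets, time

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

# --- Signing (one hash computation, plus one call to a timestamping service) ---
message = b"pay Bob 10 pounds"
secret  = secrets.token_bytes(32)          # fresh, never used before
commitment = h(message, secret)            # published and timestamped now
timestamp  = time.time()                   # stand-in for the timestamping service

# --- Later: reveal ---
revealed_message, revealed_secret = message, secret

# --- Verification (the second hash computation) ---
assert h(revealed_message, revealed_secret) == commitment
# ...plus a check with the timestamping service that `commitment` existed before
# the reveal. The binding to the signer comes from an initial authenticated
# commitment to their first secret, chained forward from message to message.
```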
Peer-to-peer techniques are not just about creating virtual machines out
of many distributed PCs on the Internet, but apply also to other environments
where communication is intermittent. Mobile communications, personal area
networks and piconets are another rapidly developing field. The Resurrecting
Duckling: Security Issues for Ad-hoc Wireless Networks describes how to do
key management between low-cost devices that can talk to each other using
radio or infrared, and without either the costs or privacy problems of
centralised trusted third parties (there is also a later journal version of the
paper here).
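The core of the imprinting idea fits in a few lines (a toy model of the policy, not the paper's full treatment): a newborn device accepts as its `mother' whoever first sends it a key, ideally over a physical contact channel, and obeys only her until it is killed and reborn.

```python
class Duckling:
    """Toy model of the resurrecting-duckling imprinting policy."""
    def __init__(self):
        self.mother_key = None                 # newborn: imprintable

    def imprint(self, key: bytes) -> bool:
        if self.mother_key is None:            # only the first key, ideally received
            self.mother_key = key              # over physical contact, is accepted
            return True
        return False                           # already imprinted: ignore strangers

    def command(self, key: bytes, action: str) -> bool:
        return key == self.mother_key          # obey only the imprinted mother

    def die(self):
        self.mother_key = None                 # death returns it to the imprintable state

d = Duckling()
owner_key, attacker_key = b"owner", b"attacker"
assert d.imprint(owner_key) and not d.imprint(attacker_key)
assert d.command(owner_key, "unlock") and not d.command(attacker_key, "unlock")
```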
The study of distributed systems which are hidden, deniable or difficult
to censor might be described as `subversive group computing'. Our seminal
publication in this thread was The Cocaine Auction
Protocol which explored how commercial transactions can be conducted
between mutually mistrustful principals with no trusted arbitrator, while
giving a high degree of privacy against traffic analysis.
The
Eternal Resource Locator: An Alternative Means of Establishing Trust
on the World Wide Web investigated how to protect naming and
indexing information and showed how to embed trust mechanisms in html
documents. It was motivated by a project to protect the electronic
version of the British
National Formulary, developed by colleagues at our medical
school. It followed work reported in Secure Books:
Protecting the Distribution of Knowledge, which describes a
project to protect the authenticity and integrity of electronically
distributed treatment protocols. Later work included Jikzi, an
authentication framework for electronic publishing, which works by
integrating ERL-type ideas into XML. There are both general
and technical
papers on Jikzi, and it led to products sold by a startup called Filonet.
The
XenoService - A Distributed Defeat for Distributed Denial of Service
described countermeasures to distributed denial of service attacks. The
XenoService is a network of web hosts that can respond to an attack on a site
by replicating it rapidly and widely. It uses Xenoservers,
developed at Cambridge for the distributed hosting of latency- and
bandwidth-critical network services. This technique now appears to be used by
hosting companies like Akamai.
I am now running a CMI project with
Frans Kaashoek and Robert Morris on building a
next-generation peer-to-peer system. I gave a keynote talk about this at the Wizards of OS
conference in Berlin; the slides are here.
Very many security system failures can be attributed to poorly designed
protocols, and this has been of interest to our team for many years. Some
relevant papers follow.
Here is a survey of
cryptographic processors, a shortened version of which appeared in the
February 2006 Proceedings of the IEEE. It describes much of our recent work in
API security and physical tamper-resistance.
API
Level Attacks on Embedded Systems provides more detail on this
aspect of our work. Even if a device is physically tamper-proof, it can
often be defeated by sending it a suitable sequence of transactions
which causes it to leak the key. We've broken pretty well every
security processor we've looked at, at least once. This line of
research originated at Protocols 2000 with my paper The
Correctness of Crypto Transaction Sets. There's more in my book. A recent paper, Robbing
the bank with a theorem prover, shows how to apply some of
the tools of theoretical computer science to the problem, and some ideas
for future research can be found in Protocol
Analysis, Composability and Computation.
Programming
Satan's Computer is a phrase Roger Needham and I coined to express
the difficulty of designing cryptographic protocols; it has recently been
popularised by Bruce Schneier (see, for example, his foreword to my book). The problem of
designing programs which run robustly on a network containing a malicious
adversary is rather like trying to program a computer which gives subtly wrong
answers at the worst possible moments.
Robustness principles for public key protocols gives a number of attacks on
protocols based on public key primitives. It also puts forward some principles
which can help us to design robust protocols, and to find attacks on other
people's designs. It appeared at Crypto 95.
The Cocaine Auction
Protocol explores how transactions can be conducted between mutually
mistrustful principals with no trusted arbitrator, even in environments where
anonymous communications make most of the principals untraceable;
NetCard -
A Practical Electronic Cash Scheme presents research on micropayment
protocols for use in electronic commerce. We invented tick payments
simultaneously with Torben Pedersen and with Ron Rivest and Adi Shamir; we all
presented our work at Protocols 96. Our paper discusses how tick payments can
be made robust against attacks on either the legacy credit card infrastructure
or next generation PKIs.
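The basic mechanism of tick payments can be sketched in a few lines (this is the generic hash-chain idea, much simplified from the NetCard scheme): the payer commits to the end of a hash chain, then releases successive preimages, each worth one tick, and the merchant need only keep the latest value accepted.

```python
import hashlib, secrets

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

# Payer: build a chain of n ticks and commit (e.g. in a signed set-up message) to w0.
n = 100
seed = secrets.token_bytes(32)
chain = [seed]
for _ in range(n):
    chain.append(h(chain[-1]))
w0 = chain[-1]                        # the commitment the merchant starts from

# Paying tick i means revealing chain[n - i]; the merchant checks it hashes back
# to the last value it accepted, so each preimage is worth exactly one more tick.
def merchant_accepts(prev: bytes, tick: bytes) -> bool:
    return h(tick) == prev

accepted = w0
for i in range(1, 6):                 # pay five ticks
    tick = chain[n - i]
    assert merchant_accepts(accepted, tick)
    accepted = tick
```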
The
GCHQ Protocol and its Problems points out a number of flaws in a key
management protocol widely used in the UK government, and in the French health
service. It was promoted by GCHQ as a European alternative to Clipper, until
we shot it down with this paper at Eurocrypt 97. Its vulnerabilities allow
traceless forgery of government documents and other bad stuff. Many of the
criticisms we developed here also apply to the more recent, pairing-based
cryptosystems.
The
Formal Verification of a Payment System describes the first use of formal
methods to verify an actual payment protocol, one that was (and still is) used in
an electronic purse product (VISA's COPAC card). This is a teaching example I
use to get the ideas of the BAN logic across to undergraduates. There is
further information on the actual system in a technical
report, which combines papers that appeared at ESORICS 92 and Cardis 94.
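Two of the standard BAN rules used in such an analysis give the flavour (this is the usual BAN notation, not anything specific to the COPAC analysis):

```latex
\[
\text{Message meaning:}\quad
\frac{P \mid\equiv P \stackrel{K}{\leftrightarrow} Q, \qquad P \triangleleft \{X\}_K}
     {P \mid\equiv Q \mid\sim X}
\qquad\qquad
\text{Nonce verification:}\quad
\frac{P \mid\equiv \#(X), \qquad P \mid\equiv Q \mid\sim X}
     {P \mid\equiv Q \mid\equiv X}
\]
```

In words: if P believes K is a good key shared with Q and receives X encrypted under K, then P believes Q once said X; and if P also believes X is fresh, then P believes Q currently believes X.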
Protocols have occasionally been the stuff of high drama. Citibank
asked the High Court to gag
the disclosure of certain crypto
API vulnerabilities that affect a number of cryptographic
processors used in banking. I wrote to the judge opposing
the application. A gag
order was nonetheless imposed, although in slightly less severe
terms than those requested by Citibank. The trial was in camera, and
new information revealed about these vulnerabilities in the course of
the trial may not be disclosed in England or Wales. (Citi had wanted a
global ban.) Information
already in the public domain was unaffected. The vulnerabilities
were discovered by Mike
Bond and me while acting as the defence experts in a phantom
withdrawal court case, and independently discovered by the other
side's expert, Jolyon
Clulow, who has since joined us as a research student. They are of
significant scientific
interest, as well as being of great relevance to the rights of the
growing number of people who seem to be suffering phantom withdrawals
from their bank accounts worldwide. If Citi thought that this would
prevent knowledge of the problem spreading, they reckoned without the
New
Scientist, the Register,
Slashdot,
news.com, and
Zdnet.
I have been interested for many years in how security systems fail in real
life. This is a prerequisite for building robust secure systems; many security
designs are poor because they are based on unrealistic threat models. This work
began with a study of automatic teller machine fraud, and then expanded to
other applications as well. It now provides the central theme of my book.
Why Cryptosystems
Fail has probably been more widely cited than anything else I've written.
This version appeared at ACMCCS 93 and explains how ATM fraud was done in the
early 1990s. Liability and
Computer Security - Nine Principles took this work further, and examines
the problems with relying on cryptographic evidence. The recent introduction of
EMV ('chip and PIN') was supposed to fix the problem, but hasn't: Phish and
Chips documents protocol weaknesses in EMV, while The Man-in-the-Middle Defence shows how to turn protocol weaknesses to
advantage. For example, a bank customer can take an electronic attorney along
to a chip-and-PIN transaction to help ensure that neither the bank nor the
merchant rips him off. This story will run and run.
Here is a paper on combining
cryptography with biometrics, which shows that in those applications
where you can benefit from biometrics, you don't need a large
central database (as proposed in the ID
card Bill). There are smarter, more resilient, and less
privacy-invasive ways to arrange things.
On a New
Way to Read Data from Memory describes techniques we developed that use
lasers to read out memory contents directly from a chip, without using the
read-out circuits provided by the vendor. This can defeat access controls and
even recover data from damaged devices. Collaborators at Louvain have developed
ways to do this using electromagnetic induction, which are also described. The
work builds on methods described in an earlier paper, on Optical
Fault Induction Attacks. This showed how laser pulses could be used to
induce faults in smartcards that would leak secret information; we can write
arbitrary values into registers or memory, reset protection bits, break out of
loops, and cause all sorts of mayhem. That paper made the front page of the New
York Times; it also got covered by the New
Scientist, slashdot
and Tech
TV. It was presented at CHES
2002.
After we discovered the above attacks, we developed a new, more secure,
CPU technology for use in smartcards and similar products. It uses
redundant failure-evident logic to thwart attacks based on fault induction or
power analysis. Our paper
on this technology won the best presentation award in April at Async 2002.
The latest journal paper on this technology, with recent test results, is here.
Our classic paper on hardware security, Tamper Resistance
- A Cautionary Note, describes how to penetrate the smartcards and secure
microcontrollers of the mid-1990s. It won the Best Paper award at the 1996
Usenix Electronic Commerce Workshop and caused a lot of controversy. Our second
paper on this subject was
Low Cost
Attacks on Tamper Resistant Devices, which describes a number of techniques
that low budget attackers can use. See also the home page of our hardware security
laboratory which brings together our smartcard and Tempest work, and our
page of links to
relevant off-site resources.
On the
Reliability of Electronic Payment Systems is another of the papers that
follow naturally from working on ATMs. It looks at the reliability of
prepayment electricity meters, and appeared in the May 1996 issue of the IEEE
Transactions on Software Engineering. An earlier version, entitled
Cryptographic
Credit Control in Pre-Payment Metering Systems, appeared at the 1995 IEEE
Symposium on Security and Privacy. Another paper on this subject is The design of future
pre-payment systems, which appeared at MATES 96 and discussed how we could
build a robust payment infrastructure to support utility networking in the UK
after deregulation.
On the
Security of Digital Tachographs looks at the techniques used to manipulate
the tachographs that are used in Europe to police truck and bus drivers' hours,
and tries to predict the effect of the planned introduction of smartcard-based
digital tachographs throughout Europe from the year 2000. This work was done
for the Department of Transport.
How to
Cheat at the Lottery is a paper reporting a novel and, I hope, entertaining
experiment in software requirements engineering. The lessons it teaches have
the potential to cut the cost of developing safety critical and security
critical software, and also to reduce the likelihood that specification errors
will lead to disastrous failures.
The Grenade Timer
describes a novel way to protect low-cost processors against denial of service
attacks, by limiting the number of processing cycles which an application
program can consume.
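A minimal sketch of the idea (illustrative only; the paper's construction is a hardware mechanism, not an interpreter): every unit of work an application performs burns one unit from a pre-armed budget, and once the budget is exhausted the processor simply stops serving it.

```python
class GrenadeBudget:
    """Toy model: a pre-armed, strictly decreasing cycle budget."""
    def __init__(self, cycles: int):
        self._cycles = cycles          # set once when the 'grenade' is armed

    def tick(self) -> bool:
        if self._cycles <= 0:
            return False               # budget exhausted: refuse further work
        self._cycles -= 1
        return True

def run(program_steps, budget: GrenadeBudget):
    for step in program_steps:
        if not budget.tick():
            raise RuntimeError("cycle budget exhausted")
        step()

# A hostile or runaway application cannot consume more than its allotted cycles.
run([lambda: None] * 10, GrenadeBudget(cycles=100))      # completes
try:
    run([lambda: None] * 200, GrenadeBudget(cycles=100))
except RuntimeError:
    pass                                                  # cut off after 100 steps
```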
The Millennium
Bug - Reasons Not to Panic describes our experience in coping with the bug
at Cambridge University and elsewhere. This paper correctly predicted that the
bug wouldn't bite very hard. (Journalists were not interested, despite a major
press
release by the University.)
Murphy's law,
the fitness of evolving species, and the limits of software reliability
shows how we can apply the techniques of statistical thermodynamics to the
failure modes of any complex logical system that evolves under testing. It
provides a common mathematical model for the reliability growth of complex
computer systems and for biological evolution. Its findings are in close
agreement with empirical data. This paper inspired later
work in security economics.
Security
Policies play a central role in secure systems engineering. They provide a
concise statement of the kind of protection a system is supposed to achieve. A
security policy should be driven by a realistic threat model, and should in
turn be used as the foundation for the design and testing of protection
mechanisms. This article is a security policy tutorial.
Recent reports
of attacks on the standard hash function SHA have left Tiger, which Eli Biham and I designed in
1995, as the obvious choice of cryptographic hash function. I also worked with
Eli, and with Lars Knudsen, to develop Serpent - a candidate
block cipher for the Advanced Encryption
Standard. Serpent won through to the final of the competition and got the
second largest number of votes. Another of my contributions was founding the
series of workshops on Fast Software Encryption.
Other papers on cryptography and cryptanalysis include the following.
The Dancing
Bear - A New Way of Composing Ciphers presents a new way to
combine crypto primitives. Previously, to decrypt using any three out
of five keys, the keys all had to be of the same type (such as RSA
keys). With my new construction, you can mix and match - RSA, AES,
even one-time pad. The paper appeared at the 2004 Protocols Workshop;
an earlier version came out at the FSE 2004 rump session.
Two
Remarks on Public Key Cryptology is a note on two ideas I floated
at talks I gave in 1997-98, concerning forward-secure signatures and
compatible weak keys. The first of these has inspired later research
by others; the second gives a new attack on public key encryption
systems.
Two
Practical and Provably Secure Block Ciphers: BEAR and LION shows how to
construct a provably secure block cipher from a stream cipher and a hash
function. It had previously been known how to construct stream ciphers and hash
functions from block ciphers, and hash functions from stream ciphers; so our
constructions complete the set of elementary reductions. They also led to the
`Dancing Bear' paper above.
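A sketch of the general shape of such a construction follows (an illustrative toy, using SHA-256 as the hash and SHA-256 in counter mode as a stand-in for the stream cipher; it is not the exact BEAR or LION specification, and the key and block sizes are arbitrary):

```python
import hashlib

def H(data: bytes) -> bytes:
    """Hash-function component (32-byte output)."""
    return hashlib.sha256(data).digest()

def S(key: bytes, n: int) -> bytes:
    """Stream-cipher stand-in: SHA-256 in counter mode producing n keystream bytes."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(block: bytes, k1: bytes, k2: bytes) -> bytes:
    """Three unbalanced Feistel rounds over a large block split as L (32 bytes) || R."""
    L, R = block[:32], block[32:]
    R = xor(R, S(xor(L, k1), len(R)))
    L = xor(L, H(R))
    R = xor(R, S(xor(L, k2), len(R)))
    return L + R

def decrypt(block: bytes, k1: bytes, k2: bytes) -> bytes:
    L, R = block[:32], block[32:]
    R = xor(R, S(xor(L, k2), len(R)))
    L = xor(L, H(R))
    R = xor(R, S(xor(L, k1), len(R)))
    return L + R

k1, k2 = b"\x01" * 32, b"\x02" * 32
msg = b"A" * 32 + b"a large block of plaintext, much bigger than the hash size"
assert decrypt(encrypt(msg, k1, k2), k1, k2) == msg
```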
Tiger - A
Fast New Hash Function defines a new hash function, which we designed
following Hans Dobbertin's attack on MD4. This was designed to run extremely
fast on the new 64-bit processors such as DEC Alpha and IA64, while still
running reasonably quickly on existing hardware such as Intel 80486 and
Pentium (the above link is to the Tiger home page, maintained in Haifa by Eli
Biham; if the network is slow, see my UK mirrors of the Tiger paper, new and old reference
implementations (the change fixes a padding bug) and S-box generation
documents. There are also third-party crypto toolkits supporting Tiger,
such as that from Bouncy Castle).
Minding
your p's and q's points out a number of things that can go wrong with the
choice of modulus and generator in public key systems based on discrete log. It
elucidated much of the previously classified reasoning behind the design of the
US Digital Signature Algorithm, and appeared at Asiacrypt 96.
Chameleon - A New Kind of Stream Cipher shows how to do traitor tracing
using symmetric rather than public key cryptology. The idea is to turn a stream
cipher into one with reduced key diffusion, but without compromising security.
The effect is that a single broadcast ciphertext is decrypted to slightly
different plaintexts by users with slightly different keys. Thus users who
re-sell their copy of the plaintext in contravention of a licence agreement can
be traced. This paper appeared at the fourth workshop on Fast Software
Encryption in Haifa in January 1997.
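The effect described above can be illustrated in a toy sketch (this shows the tracing idea in general terms, not the Chameleon design itself): the keystream is built from words selected out of a long key table by a public index sequence, and each user's copy of the table has a bit flipped at positions only the distributor knows, so each user's decryption carries a personal fingerprint.

```python
import hashlib, secrets

WORD, TABLE_WORDS = 8, 256                                   # toy parameters

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def keystream(table, indices) -> bytes:
    """Keystream = concatenation of table words selected by a public index sequence."""
    return b"".join(table[i] for i in indices)

# Distributor's master key table.
master = [secrets.token_bytes(WORD) for _ in range(TABLE_WORDS)]

def fingerprint_table(table, user_id: int):
    """User's copy: one bit flipped at a position known only to the distributor."""
    t = list(table)
    marked = hashlib.sha256(str(user_id).encode()).digest()[0] % TABLE_WORDS
    word = bytearray(t[marked]); word[0] ^= 0x01
    t[marked] = bytes(word)
    return t

indices = list(range(TABLE_WORDS))                           # public index sequence
plaintext = secrets.token_bytes(TABLE_WORDS * WORD)
broadcast = xor(plaintext, keystream(master, indices))       # one ciphertext for everyone

# Each user decrypts with their own table; the result differs from the true
# plaintext only at the user's marked bit positions, which traces a leaked copy.
user_plain = xor(broadcast, keystream(fingerprint_table(master, user_id=42), indices))
diff_bits = sum(bin(a ^ b).count("1") for a, b in zip(plaintext, user_plain))
assert diff_bits >= 1
```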
Searching
for the Optimum Correlation Attack appeared at the second workshop on fast
software encryption. It shows that nonlinear combining functions used in
nonlinear filter generators can react with shifted copies of themselves in a
way that opens up a new and powerful attack on many cipher systems.
The
Classification of Hash Functions appeared at Cryptography and Coding 93. It
proves that correlation freedom is strictly stronger than collision freedom,
and shows that there are many pseudorandomness properties other than collision
freedom which hash functions may need.
A Faster
Attack on Certain Stream Ciphers shows how to break the multiplex shift
register generator, which is used in satellite TV systems. I found a simple
divide-and-conquer attack on this system in the mid-1980s, a discovery that
got me `hooked' on cryptology. This paper is a recent refinement of that work.
On
Fibonacci Keystream Generators appeared at FSE3, and shows how to break
`FISH', a stream cipher proposed by Siemens. It also proposes an improved
cipher, `PIKE', based on the same general mechanisms.
From the mid- to late-1990s, I did a lot of work on information hiding.
Soft Tempest: Hidden
Data Transmission Using Electromagnetic Emanations must be one of the more
unexpected and newsworthy papers I've published. It is well known that
eavesdroppers can reconstruct video screen content from radio frequency
emanations; up till now, such `Tempest attacks' were prevented by shielding,
jammers and so on. Our innovation was a set of techniques that enable the
software on a computer to control the electromagnetic radiation it emanates.
This can be used for both attack and defence. To attack a system, malicious
code can hide stolen information in the machine's Tempest emanations and
optimise them for some combination of reception range, receiver cost and
covertness. To defend a system, a screen driver can display sensitive
information using fonts which minimise the energy of RF emanations. This
technology is now fielded in PGP and elsewhere. You can download Tempest fonts
from here.
There is a followup
paper on the costs and benefits of Soft Tempest in military environments,
which appeared at NATO's 1999 RTO meeting on infosec, while an earlier version
of our main paper, which received considerable publicity, is
available here.
Finally, there's some attack software here, software you can use to
play your MP3s over the radio here, a press article
here and
information on more recent optical tempest attacks here.
Hollywood hopes that copyright-marking systems will help control the
copying of videos, music and computer games. This became high drama when a paper that showed how to break
the DVD/SDMI copyright marking scheme was pulled by its authors from the Information Hiding 2001 workshop in
Pittsburgh, following legal threats from Hollywood. In fact, the chosen
technique - echo hiding - was among a number that we broke in 1997. The attack
is reported in our paper Attacks
on copyright marking systems, which we published at Info Hiding 1998. We
also wrote a survey
paper on information hiding, which is a good place to start if you're new
to the field. For the policy aspects, you might read Pam
Samuelson. There is much more about the technology on the web page of my
former student Fabien Petitcolas.
Another novel application of information hiding is the Steganographic File
System. It will give you any file whose name and password you know, but if
you do not know the correct password, you cannot even tell that a file of that
name exists in the system! This is much stronger than conventional multilevel
security, and its main function is to protect users against coercion. Two of
our students implemented SFS for Linux: a paper describing the details is here, while the code
is available here.
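A toy sketch of the principle (much simplified from the Linux implementation): the block where a file lives, and the pad that encrypts it, are both derived from the file name and password, so without the password the file's blocks are indistinguishable from the random cover data filling the rest of the disk.

```python
import hashlib, secrets

BLOCK, NBLOCKS = 32, 1024
disk = [secrets.token_bytes(BLOCK) for _ in range(NBLOCKS)]   # cover: random-looking blocks

def block_and_pad(name: str, password: str):
    """Derive a block position and an encryption pad from (name, password)."""
    d = hashlib.sha256(f"{name}|{password}".encode()).digest()
    return int.from_bytes(d[:4], "big") % NBLOCKS, hashlib.sha256(d).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def store(name: str, password: str, data: bytes):
    loc, pad = block_and_pad(name, password)
    disk[loc] = xor(data.ljust(BLOCK, b"\0"), pad)

def retrieve(name: str, password: str) -> bytes:
    loc, pad = block_and_pad(name, password)
    return xor(disk[loc], pad)

store("diary", "correct horse", b"meet at dawn")
assert retrieve("diary", "correct horse").startswith(b"meet at dawn")
# With the wrong password you simply read some random-looking block: you cannot
# even tell whether a file called "diary" exists on the disk at all.
```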
The threat by some governments to ban cryptography has led to a surge of
interest in steganography - the art of hiding messages in other messages. Our
paper On The
Limits of Steganography explores what can and can't be done; it appeared in
a special issue of IEEE JSAC. It is an extended version of Stretching the
Limits of Steganography, which appeared at the first international workshop
on Information Hiding, whose proceedings are here. I also started a bibliography
of the subject which is now maintained by Fabien Petitcolas.
The Newton
Channel settles a conjecture of Simmons by exhibiting a high bandwidth
subliminal channel in the ElGamal signature scheme. It appeared at Info Hiding
96.
Reliability leads naturally to medical informatics, a subject in which I've
worked off and on over the years. The UK government is building a national
database to hold everyone's medical records, which doctors
oppose. Ministers recently gave a guarantee of
patient privacy, about which GPs, NGOs and commentators are sceptical. There are radio pieces on the problems
here and here, comments here,
and earlier material here
and here.
An example of likely problems comes from a report that
the Real IRA penetrated the Royal Victoria Hospital in Northern Ireland and
used its electronic medical records to gather information on policemen to
target them and their families for murder. A particularly shocking case was
that of Helen Wilkinson, who needed to organise a debate in Parliament to get ministers to agree
to remove defamatory and untrue information about her from NHS computers.
Civil servants started pushing for online access to everyone's records in 1992
and I got involved in 1995, when I started consulting for the British Medical
Association on the safety and privacy of clinical information systems. Back
then, the police were given access to all drug prescriptions in the UK, after
the government argued that they needed it to catch the occasional doctor who
misprescribed heroin. The police got their data, they didn't catch Harold Shipman, and
no-one was held accountable.
The NHS slogan was initially `a unified electronic patient record, accessible
to all in the NHS'. The slogan has changed several times, and the strategy now
contains some words on confidentiality, but the goal remains the same. The
Health and Social Care
(Community Health and Standards) Act allowed the Government access to all
medical records in the UK, for the purposes of `Health Improvement'. It
removed many of the patient privacy safeguards in previous legislation. In
addition, the new contract
offered to GPs since 2003 moves ownership of family doctor computers to Primary
Care Trusts (that's health authorities, in oldspeak). There was a token
consultation on confidentiality; the Foundation
for Information Policy Research, which I chair, published a response to it (which was of course ignored).
The last time people pointed out that NHS administrators were helping
themselves illegally to confidential personal health information, Parliament
passed some regulations
on patient privacy to legalise those illegal practices that had been
brought to public attention. For example, the regulations compel doctors to
give the government copies of all records relating to infectious disease and
cancer. The regulations were made under an Act
that was rushed through in the shadow of the last election and that gives
ministers broad powers to nationalise personal health information.
Some relevant papers of my own follow. They are mostly from the 1995-6 period,
when the government last tried to centralise all medical records - and we saw
them off.
Security in Clinical Information Systems was published by the British
Medical Association in January 1996. It sets out rules that can be used to
uphold the principle of patient consent independently of the details of
specific systems. It was the medical profession's initial response to creeping
infringement of patient privacy by NHS computer systems.
An
Update on the BMA Security Policy appeared in June 1996 and tells the story
of the struggle between the BMA and the government, including the origins and
development of the BMA security policy and guidelines.
There are comments made
at NISSC 98 on the healthcare protection profiles being developed by NIST for
the DHHS to use in regulating health information systems privacy. The
protection profiles make a number of mistaken assumptions about the threats to
medical systems and about the kinds of protection mechanisms that are
appropriate.
Remarks
on the Caldicott Report raises a number of issues about policy as it was
settled in the late 1990s. It notes particular problems with the NHS number
tracing service, which is open to large numbers of people in the NHS and can be
used to re-identify the poorly de-identified data used in medical research
and administration;
The
DeCODE Proposal for an Icelandic Health Database analyses a proposal to
collect all Icelanders' medical records into a single central database to
support genetic research and health service management. I evaluated this for
the Icelandic Medical Association and concluded in my report that the proposed
controls were inadequate. The company running it has since hit financial
problems but the
ethical issues remain, and Iceland's
Supreme Court recently allowed a woman to block access to her father's
records because of the information they may reveal about her. (These issues may
recur in the UK with the proposed biobank database.) I also wrote an analysis
of security targets prepared under the Common Criteria for the evaluation of
this database. For more, see BMJ correspondence,
the Icelandic
organisation leading opposition to the database, and an article by Einar Arnason.
Clinical
System Security - Interim Guidelines appeared in the British Medical
Journal on 13th January 1996. It advises healthcare professionals on prudent
security measures for clinical data. The most common threat is that private
investigators use false-pretext telephone calls to elicit personal health
information from assistant staff.
A
Security Policy Model for Clinical Information Systems appeared at the 1996
IEEE Symposium on Security and Privacy. It presents the BMA policy model to the
computer security community and had some influence on the formation of current
US health privacy legislation (the Kennedy-Kassebaum Bill, now HIPAA).
Problems
with the NHS Cryptography Strategy points out a number of errors in, and
ethically unacceptable consequences of, a report on
cryptography produced for the Department of Health. These comments formed the
BMA's response to that report.
An important paper is Privacy in clinical
information systems in secondary care which describes a hospital system
that implements the BMA security policy. The main government objection to our
policy was `it'll never work in hospitals'; this system, which is now running
at a number of sites, shows that hospital systems can indeed be made secure. It
is described in more detail in a special issue of the Health Informatics Journal on data security,
confidentiality and safety (v 4 nos 3-4, Dec 1998) which I edited. (This system
is due to be ripped out in October 2007 and replaced by a less capable system
that will give ministers access to everything.)
The same issue also contains a paper on Protecting Doctors'
Identity in Drug Prescription Analysis which describes a system designed to
de-identify prescription data properly for commercial use. This system led to
the `Source Informatics' court case, in which the UK government tried to
discourage its owner, now called IMS Health, from promoting it - as it would
have competed with much less privacy-friendly government systems. The
government lost: the Court of Appeal decided that personal health
information can be used for research and other secondary purposes without the
informed consent of patients, provided that the de-identification is done
competently.
A first-class collection of links to papers on the protection of
de-identified data is maintained by the American Statistical
Association. Bill Lowrance wrote a good survey for the US
Department of Health and Human Services of the potential for using
de-identified data to protect patient privacy in medical research, while a report by the
US General Accounting Office shows how de-identified records are handled much
better by Medicare than by the NHS. For information on what's happening in the
German speaking world, see Andreas
von Heydwolff's web site and Gerrit
Bleumer's European project links. Resources on what's happening in the USA - where
medical privacy is a much more live issue - include EPIC, the med-privacy mailing list
archives; the web sites run by Citizens for Choice in Health Care and Georgetown University (the latter has
a comprehensive survey of US
health privacy laws); and a report from the US National Academy of Sciences
entitled For the Record:
Protecting Electronic Health Information. Other resources include a report by
the US Office of Technology Assessment, and web pages by CPT and the Institute for Health Freedom.
John Curran said in 1790: ``The condition upon which God hath given liberty to
man is eternal vigilance; which condition if he break, servitude is at once the
consequence of his crime, and the punishment of his guilt''. After the crypto
wars of the 1990s, this is something we are all aware of!
I chair the Foundation for Information Policy
Research, which I helped set up in 1998. This body is concerned with
promoting research and educating the public in such topics as the interaction
between computing and the law, and the social effects of IT. We are not a lobby
group; our enemy is ignorance rather than the government of the day, and one of
our main activities is providing accurate and neutral briefing for politicians
and members of the press.
Our top priority in late 2003-early 2004 was the
IPR
enforcement directive, which has been succinctly
described as `DMCA
on steroids'. Thanks to lobbying by FIPR and others, there were amendments
with a positive effect - notably, by removing criminal sanctions and
legal protection for devices such as RFID tags - but other amendments extend its
scope still further. Previously, it would have forced Member
States to criminalise any serious commercial infringement of
intellectual property; now it will only apply vigorous civil remedies,
but will cover all infringements. So it looks like in future all
Member States will have to make it easy for record companies to
harass children who swap a tune or a mobile phone ring-tone. This is
already a contentious
issue in the USA; it will now come here too. Here is a critical
article
on the original proposal by a number of distinguished lawyers. This
Directive is also likely to have unpleasant
effects on the communications industry, on universities, on
libraries, on software compatibility, and maybe even on the single market -
the right to free trade within Europe, which is the very reason for
the EU's existence. This horrible law was supported by Microsoft
(which is about to be convicted
by the EU of anticompetitive behaviour), the music
industry and the owners of luxury brands such as Yves Saint
Laurent, while it was opposed by phone companies,
supermarkets, smaller software firms and the free software
community. Lawyers were sceptical,
as is the press - in Britain,
France and even America. Civil
liberties organisations were opposed, and the issue
is linked to a boycott of
Gillette. For the outcome of the plenary vote, and links to the
resulting press coverage, see my blog.
This issue became live again recently, with an attempt by the
government to wrest back using
regulations much of what they conceded in parliament. FIPR fought
back, and extracted assurances
from Lord Sainsbury about the interpretation of regulations made
under the Export Control Act. This may seem boring and technical, but
is of considerable importance to British science and to academic
freedom in general. Without our campaign, much scientific
collaboration would have become technically illegal, leaving
scientists open to arbitrary harassment by the state. Much credit
also goes to the Conservative frontbencher Doreen
Miller, Liberal Democrat frontbencher Margaret
Sharp, and President of the Royal Society Bob
May, who marshalled the crossbenchers in the Lords. We are very
grateful to them for their efforts.
FIPR also ran a successful campaign to limit the scope of the Regulation of Investigatory Powers
Act. Originally this would have allowed the police to obtain,
without warrant, a complete history of everyone's web browsing
activity (under the rubric of `communications data'); a FIPR amendment
limited this to the identity of the machines involved in a
communication, rather than the actual web pages.
Another example of first-class work by FIPR is a research project that
brought together legal and computing skills to deconstruct the fashionable
notion that `digital certificates' would solve the problems of e-commerce and
e-government. Anyone who thinks of buying such a beast, other than for purposes
of research or ridicule, should have a look at this
article
first.
My pro-bono work also includes sitting on Council, our University's
governing body. I stood for election because I was concerned about the
erosion of academic freedom under the previous administration. See,
for example a truly
shocking speech by Mike Clark at a recent discussion on IPR. Mike
tells how our administration promised a research sponsor that he would
submit all his relevant papers to them for prior review - without even
asking him! It was to prevent abuses like this that we founded the Campaign for Cambridge
Freedoms. Its goal was to defeat a proposal by the former Vice
Chancellor that most of the intellectual property generated by faculty
members - from patents on bright ideas to books written up from
lecture notes - would belong to the university rather than to the
person who created them. If this had passed, Cambridge would have
swapped one of the most liberal rules on intellectual property of any
British university, for one of the most oppressive anywhere. Over
almost four years of campaigning we managed to draw many of the teeth
of this proposal. A recent vote approved a
policy in which academics keep copyright but the University gets 85% of
patent royalties. The policy is however defective in many ways: for
example, it allows the University to do IPR deals without the consent
of affected staff and students. The authorities have undertaken to
introduce amendments: the detail will be fought over.
Finally, my freedom-oriented work includes a number of technical writings:
The Risks of Key
Recovery, Key Escrow, and Trusted Third-Party Encryption has
become perhaps the most widely cited publication on the topic of key
escrow. It examines the fundamental properties of current government
requirements for access to keys and attempts to outline the technical
risks, costs, and implications of deploying systems that would satisfy
them. It was originally presented as testimony to the US Senate, and
then also to the Trade
and Industry Committee of the UK House of Commons, together with
some
further testimony.
Comments on
Terrorism presents a brief critique of why many of the technical measures
that various people have been trying to sell since the 11th September
attacks are unlikely to work as promised;
The
Global Trust Register is a book which contains the fingerprints of
the world's most important public keys. It thus implements a top-level
certification authority (CA) using paper and ink rather than in an
electronic system. If the DTI had pushed through their original policy on mandatory licensing of
cryptographic services, this book would have been banned in the UK.
At a critical point in the lobbying, it enabled me to visit the
Culture Secretary and ask why his government wanted to ban my book.
This got crypto policy referred to Cabinet when otherwise it would
have been pushed through by the civil servants;
The Steganographic
File System will give you any file whose name and password you
know, but if you do not know the correct password, you cannot even
tell that a file of that name exists in the system. It is designed to
give a high level of protection against seizure of keys and data as
envisaged by the RIP bill.
Download the code from here.
The GCHQ
Protocol and its Problems points out a number of serious defects in
the protocol
that the British government uses to secure its electronic mail, and
which it is trying to arm-twist other organisations into using too.
This paper appeared at Eurocrypt 97 and it incorporates our replies to
GCHQ's response
to an earlier
version of our paper. Our analysis prevented the protocol from
being widely adopted throughout Europe, as the forces of darkness hoped;
as far as I know, its only use outside the UK public sector is in the
French health service. Its use even in the UK is now under
attack as its escrow of signing keys makes the retrospective
forgery of government documents easy, thus undermining the Freedom of
Information Act;
Crypto
in Europe - Markets, Law and Policy surveys the uses of
cryptography in Europe, looks at the technical and legal threats, and
discusses the shortcomings of public policy. It appeared at the
Conference on Cryptographic Policy and Algorithms, Queensland, July
1995. In it, I first pointed out that law enforcement communications
intelligence was mostly about traffic analysis - finding out who was
talking to whom - and criminal communications security was mostly
traffic security. This was considered heretical at the time but has
been confirmed since by the emergence of the prepaid mobile phone as
the main threat to police communications intelligence.
There is a page of material on the main policy
issues as they were in 1999, when I decided to stop maintaining my own web
pages on information policy and simply contribute to FIPR's instead. There's
also a leaked copy of the NSA Security
Manual that you can download (there is also latex source
for it).
Finally, here is my PGP
key. If I revoke this key, I will always be willing to explain why I have
done so provided that the giving of such an explanation is lawful. (For
more, see FIPR.)
Security engineering is about building systems to remain dependable in
the face of malice, error or mischance. As a discipline, it focuses on
the tools, processes and methods needed to design, implement and test
complete systems, and to adapt existing systems as their environment
evolves.
Security engineering is not just concerned with `infrastructure'
matters such as firewalls and PKI. It's also about specific
applications, such as banking and medical record-keeping, and about
embedded systems such as automatic teller machines and burglar alarms.
It's usually done badly: it often takes several attempts to get a
design right. It is also hard to learn: although there are good books
on a number of the component technologies, such as cryptography and
operating systems security, there's little about how to use them
effectively, and even less about how to make them work together. It's
hardly surprising that most systems fail not because the mechanisms
are weak, but because they're used wrongly.
My book is an attempt to help the working engineer to do better. As well
as the basic science, it contains details of many typical applications
- and a lot of case histories of how their protection mechanisms failed.
(Some of these are available in the research papers listed below, but
I've added many more.) It contains a fair amount of new material, as
well as accounts of a number of technologies (such as hardware
tamper-resistance) which aren't well described in the accessible
literature. There was a very
nice review in Information Security Magazine, and the other reviews have so
far been positive. Even the usually cynical Slashdot
crowd liked it. I hope you'll also enjoy it - and find it seriously useful!
I don't execute programs sent by strangers without good reason. So I don't
read attachments in formats such as Word, unless by prior arrangement. I also
discard html-only emails, as most of them are spam; and emails asking for
`summer research positions' or `internships', which we don't do.
If you're contacting me about coming to Cambridge to do a PhD,
please read the relevant web
pages first.