Ross Anderson's Home Page


University of Cambridge Computer Laboratory

Ross Anderson

[Research] [Blog] [Politics] [My Book] [Music] [Contact Details]

What's New

I am hosting the Fifth Workshop on the Economics of Information Security (WEIS 2006) at Cambridge from June 26-28. See also my Economics and Security Resource Page.

We now have a Security Group Blog at www.lightbluetouchpaper.org. It is slowly replacing my personal blog.

2005 highlights included research papers on The topology of covert conflict, on combining cryptography with biometrics, on Sybil-resistant DHT routing, and on Robbing the bank with a theorem prover; and a survey paper on cryptographic processors, a shortened version of which appeared in the February 2006 Proceedings of the IEEE.

2004 highlights included papers on cipher composition, key establishment in ad-hoc networks and the economics of censorship resistance. I also lobbied for amendments to the EU IP Enforcement Directive and organised a workshop on copyright which led to a common position adopted by many European NGOs. Finally, I started a web page for out-of-copyright recordings and manuscripts of traditional music.


Research

I am Professor of Security Engineering at the Computer Laboratory. I supervise a number of research students - Jolyon Clulow, Hao Feng, Stephen Lewis, Tyler Moore, Shishir Nagaraja and Andy Ozment. Richard Clayton and Sergei Skorobogatov are postdocs. Mike Bond, Vashek Matyas and Andrei Serjantov are former postdocs. Jong-Hyeon Lee, Frank Stajano, Fabien Petitcolas, Harry Manifavas, Markus Kuhn, Ulrich Lang, Jeff Yan, Susan Pancho, Mike Bond, George Danezis, Sergei Skorobogatov, Hyun-Jin Choi and Richard Clayton have earned PhDs.

My other personal research interests include:

Many of my papers are available in html and/or pdf, but some of the older technical ones are in postscript, which was the standard for many years. You can download a postscript viewer from here. Also, by default, when I post a paper here I license it under the relevant Creative Commons license, so you may redistribute it but not modify it. I may subsequently assign the residual copyright to an academic publisher.

Economics of information security

Over the last few years, it's become clear that many systems fail not for technical reasons so much as from misplaced incentives - often the people who could protect them are not the people who suffer the costs of failure. There are also many questions with an economic dimension as well as a technical one. For example, will digital signatures make electronic commerce more secure? Is so-called `trusted computing' a good idea, or just another way for Microsoft to make money? And what about all the press stories about `Internet hacking' - is this threat serious, or is it mostly just scaremongering by equipment vendors? It's not enough for security engineers to understand ciphers; we have to understand incentives as well. This has led to a rapidly growing interest in `security economics', a discipline which I helped to found. I maintain the Economics and Security Resource Page, and my research contributions include the following.

Our annual bash is the Workshop on Economics and Information Security; the 2006 workshop will be here in Cambridge from June 26-28. The 2005 event was at Harvard and the papers are online. My Economics and Security Resource Page provides a guide to the literature and to what's on. There is also a web page on the economics of privacy, maintained by Alessandro Acquisti.

Peer-to-Peer systems

Since about the middle of 2000, there has been an explosion of interest in peer-to-peer networking - the business of building useful systems out of large numbers of intermittently connected machines, with virtual infrastructures that are tailored to the application. One of the seminal papers in the field was The Eternity Service, which I presented at Pragocrypt 96. I had been alarmed by the Scientologists' success at closing down the penet remailer in Finland, and had been personally threatened by bank lawyers who wanted to suppress knowledge of the vulnerabilities of ATM systems (see here for a later incident). This taught me that electronic publications can be easy for the rich and the ruthless to suppress. They are usually kept on just a few servers, whose owners can be sued or coerced. To me, this seemed uncomfortably like books in the Dark Ages: the modern era only started once the printing press enabled seditious thoughts to be spread too widely to ban. The Eternity Service was conceived as a means of putting electronic documents as far outwith the censor's grasp as possible. (The concern that motivated me has since materialised; a UK court judgment has found that a newspaper's online archives can be altered by order of a court to remove a libel.)
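The core Eternity idea - replicate a document across many independently administered servers, so that suppressing it means coercing them all, and let readers verify any copy by its cryptographic hash - can be sketched roughly as follows. This is a hypothetical illustration, not the actual Eternity protocol; the in-memory `servers` dictionaries stand in for remote hosts.

```python
import hashlib

def publish(document: bytes, servers: list) -> str:
    """Store the document on every server, keyed by its content hash."""
    digest = hashlib.sha256(document).hexdigest()
    for server in servers:
        server[digest] = document
    return digest

def retrieve(digest: str, servers: list) -> bytes:
    """Fetch from any surviving server, rejecting tampered copies."""
    for server in servers:
        doc = server.get(digest)
        if doc is not None and hashlib.sha256(doc).hexdigest() == digest:
            return doc
    raise LookupError("document suppressed on all reachable servers")

# Even if a censor coerces most of the hosts, one honest replica
# suffices for retrieval - and the hash check defeats substitution.
servers = [{} for _ in range(5)]
digest = publish(b"seditious pamphlet", servers)
for s in servers[:4]:
    s.clear()                    # censor removes four of five copies
assert retrieve(digest, servers) == b"seditious pamphlet"
```

The point of keying storage by content hash is that a censor cannot quietly substitute an altered version: any surviving honest copy is self-authenticating.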

But history never repeats itself exactly, and the real fulcrum of censorship in cyberspace turned out to be not sedition, or vulnerability disclosure, or even pornography, but copyright. Hollywood's action against Napster led to my Eternity Service ideas being adopted by many systems including Publius and Freenet. Many of these developments were described in an important book, and the first academic conference on peer-to-peer systems was held in March 2002 at MIT. The field has since become very active: here is a web page of peer-to-peer conferences. See also Richard Stallman's classic, The Right to Read.

My contributions since the Eternity paper include:

I am now running a CMI project with Frans Kaashoek and Robert Morris on building a next-generation peer-to-peer system. I gave a keynote talk about this at the Wizards of OS conference in Berlin; the slides are here.

Robustness of cryptographic protocols

Very many security system failures can be attributed to poorly designed protocols, and this has been of interest to our team for many years. Some relevant papers follow.
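A classic illustration of the kind of design failure involved - not drawn from any specific paper listed here - is a replay attack on a protocol that authenticates with a MAC over a fixed message. The shared key and message formats below are illustrative assumptions.

```python
import hmac, hashlib, secrets

KEY = b"shared-key"   # hypothetical secret shared by client and server

def mac(message: bytes) -> str:
    """Authentication tag over a message under the shared key."""
    return hmac.new(KEY, message, hashlib.sha256).hexdigest()

# Broken variant: the `proof' never changes, so an eavesdropper who
# records it once can replay it forever.
static_proof = mac(b"LOGIN alice")
replayed = static_proof                      # attacker simply resends it
assert replayed == mac(b"LOGIN alice")       # server accepts the replay

# Repaired variant: the server issues a fresh random nonce per run, so
# a recorded transcript is useless in any later run.
nonce = secrets.token_bytes(16)
fresh_proof = mac(b"LOGIN alice" + nonce)
new_nonce = secrets.token_bytes(16)
assert fresh_proof != mac(b"LOGIN alice" + new_nonce)   # replay fails
```

The fix costs one random value per run; omitting such freshness checks is among the most common protocol failures in fielded systems.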

Protocols have occasionally been the stuff of high drama. Citibank asked the High Court to gag the disclosure of certain crypto API vulnerabilities that affect a number of cryptographic processors used in banking. I wrote to the judge opposing the application. A gag order was nonetheless imposed, although in slightly less severe terms than those requested by Citibank. The trial was in camera, and new information revealed about these vulnerabilities in the course of the trial may not be disclosed in England or Wales. (Citi had wanted a global ban.) Information already in the public domain was unaffected. The vulnerabilities were discovered by Mike Bond and me while acting as the defence experts in a phantom withdrawal court case, and independently discovered by the other side's expert, Jolyon Clulow, who has since joined us as a research student. They are of significant scientific interest, as well as being of great relevance to the rights of the growing number of people who seem to be suffering phantom withdrawals from their bank accounts worldwide. If Citi thought that this would prevent knowledge of the problem spreading, they reckoned without the New Scientist, the Register, Slashdot, news.com, and Zdnet.


Reliability of security systems

I have been interested for many years in how security systems fail in real life. This is a prerequisite for building robust secure systems; many security designs are poor because they are based on unrealistic threat models. This work began with a study of automatic teller machine fraud, and then expanded to other applications as well. It now provides the central theme of my book.


Analysis and design of cryptographic algorithms

Recent reports of attacks on the standard hash function SHA have left Tiger, which Eli Biham and I designed in 1995, as the obvious choice of cryptographic hash function. I also worked with Eli, and with Lars Knudsen, to develop Serpent - a candidate block cipher for the Advanced Encryption Standard. Serpent won through to the final of the competition and got the second largest number of votes. Another of my contributions was founding the series of workshops on Fast Software Encryption.

Other papers on cryptography and cryptanalysis include the following.


Information hiding (including Soft Tempest)

From the mid- to late-1990s, I did a lot of work on information hiding.


Security of Medical Information Systems

Reliability leads naturally to medical informatics, a subject in which I've worked off and on over the years. The UK government is building a national database to hold everyone's medical records, which doctors oppose. Ministers recently gave a guarantee of patient privacy, about which GPs, NGOs and commentators are sceptical. There are radio pieces on the problems here and here, comments here, and earlier material here and here. An example of likely problems comes from a report that the Real IRA penetrated the Royal Victoria Hospital in Northern Ireland and used its electronic medical records to gather information on policemen to target them and their families for murder. A particularly shocking case was that of Helen Wilkinson, who needed to organise a debate in Parliament to get ministers to agree to remove defamatory and untrue information about her from NHS computers.

Civil servants started pushing for online access to everyone's records in 1992 and I got involved in 1995, when I started consulting for the British Medical Association on the safety and privacy of clinical information systems. Back then, the police were given access to all drug prescriptions in the UK, after the government argued that they needed it to catch the occasional doctor who misprescribed heroin. The police got their data, they didn't catch Harold Shipman, and no-one was held accountable.

The NHS slogan was initially `a unified electronic patient record, accessible to all in the NHS'. The slogan has changed several times, and the strategy now contains some words on confidentiality, but the goal remains the same. The Health and Social Care (Community Health and Standards) Act allowed the Government access to all medical records in the UK, for the purposes of `Health Improvement'. It removed many of the patient privacy safeguards in previous legislation. In addition, the new contract offered to GPs since 2003 moves ownership of family doctor computers to Primary Care Trusts (that's health authorities, in oldspeak). There was a token consultation on confidentiality; the Foundation for Information Policy Research, which I chair, published a response to it (which was of course ignored).

The last time people pointed out that NHS administrators were helping themselves illegally to confidential personal health information, Parliament passed some regulations on patient privacy to legalise those illegal practices that had been brought to public attention. For example, the regulations compel doctors to give the government copies of all records relating to infectious disease and cancer. The regulations were made under an Act that was rushed through in the shadow of the last election and that gives ministers broad powers to nationalise personal health information.

In the end, perhaps only a European law challenge can halt the slide toward surveillance. The regulations appear to breach the Declaration of Helsinki on ethical principles for medical research, and contravene the Council of Europe recommendation no R(97)5 on the protection of medical data, to which Britain is a signatory. There is a list of some more of the problems here, and a letter we've written to the BMJ here.

For deeper historical background, the best source may be an editorial from the British Medical Journal. There is a discussion paper on the problems that the bill could cause for medical and other researchers, and an impact analysis commissioned by the Nuffield Trust. The government claimed the records were needed for cancer registries: yet cancer researchers in many other countries work with anonymised data (there are papers on German cancer registries in Germany here and here, and some links from the website of the Canadian Privacy Commissioner.) See also the article in the Observer that brought this issue to public attention; a leader in the New Statesman; an article in The Register; a letter to the editor of the Times written by senior doctors; and the reports of the Parliamentary debate on the original bill in the Commons and the Lords.

Some relevant papers of my own follow. They are mostly from the 1995-6 period, when the government last tried to centralise all medical records - and we saw them off.

An important paper is Privacy in clinical information systems in secondary care which describes a hospital system that implements the BMA security policy. The main government objection to our policy was `it'll never work in hospitals'; this system, which is now running at a number of sites, shows that hospital systems can indeed be made secure. It is described in more detail in a special issue of the Health Informatics Journal on data security, confidentiality and safety (v 4 nos 3-4, Dec 1998) which I edited. (This system is due to be ripped out in October 2007 and replaced by a less capable system that will give ministers access to everything.)

The same issue also contains a paper on Protecting Doctors' Identity in Drug Prescription Analysis which describes a system designed to de-identify prescription data properly for commercial use. This system led to the `Source Informatics' court case, in which the UK government tried to discourage its owner, now called IMS Health, from promoting it - as it would have competed with much less privacy-friendly government systems. The government lost: the Court of Appeal decided that personal health information can be used for research and other secondary purposes without the informed consent of patients, provided that the de-identification is done competently.
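Doing de-identification `competently' means, at minimum, replacing patient identifiers with pseudonyms that cannot be reversed or linked without a secret key held away from the data users. A minimal sketch using a keyed hash follows; the field names and key handling are illustrative assumptions, not the Source Informatics design.

```python
import hmac, hashlib

# Hypothetical key, held only by the de-identification service, never
# by the researchers or marketers who receive the output records.
SECRET_KEY = b"held-by-the-de-identification-service"

def pseudonymise(record: dict) -> dict:
    """Replace the raw patient identifier with a keyed-hash pseudonym."""
    out = dict(record)
    nhs_number = out.pop("nhs_number")          # drop the raw identifier
    out["pseudonym"] = hmac.new(SECRET_KEY, nhs_number.encode(),
                                hashlib.sha256).hexdigest()
    return out

r1 = pseudonymise({"nhs_number": "943 476 5919", "drug": "diamorphine"})
r2 = pseudonymise({"nhs_number": "943 476 5919", "drug": "morphine"})
assert "nhs_number" not in r1                   # identifier is gone
assert r1["pseudonym"] == r2["pseudonym"]       # same patient still links
```

A keyed hash (rather than a plain one) matters: without the key, an outsider cannot test candidate identifiers against the pseudonyms by brute force.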

A first-class collection of links to papers on the protection of de-identified data is maintained by the American Statistical Association. Bill Lowrance wrote a good survey for the US Department of Health and Human Services of the potential for using de-identified data to protect patient privacy in medical research, while a report by the US General Accounting Office shows how de-identified records are handled much better by Medicare than by the NHS. For information on what's happening in the German speaking world, see Andreas von Heydwolff's web site and Gerrit Bleumer's European project links. Resources on what's happening in the USA - where medical privacy is a much more live issue - include EPIC, the med-privacy mailing list archives; the web sites run by Citizens for Choice in Health Care and Georgetown University (the latter has a comprehensive survey of US health privacy laws); and a report from the US National Academy of Sciences entitled For the Record: Protecting Electronic Health Information. Other resources include a report by the US Office of Technology Assessment, and web pages by CPT and the Institute for Health Freedom.


Public policy issues

John Curran said in 1790: ``The condition upon which God hath given liberty to man is eternal vigilance; which condition if he break, servitude is at once the consequence of his crime, and the punishment of his guilt''. After the crypto wars of the 1990s, this is something we are all aware of!

I chair the Foundation for Information Policy Research, which I helped set up in 1998. This body is concerned with promoting research and educating the public in such topics as the interaction between computing and the law, and the social effects of IT. We are not a lobby group; our enemy is ignorance rather than the government of the day, and one of our main activities is providing accurate and neutral briefing for politicians and members of the press.

My pro-bono work also includes sitting on Council, our University's governing body. I stood for election because I was concerned about the erosion of academic freedom under the previous administration. See, for example, a truly shocking speech by Mike Clark at a recent discussion on IPR. Mike tells how our administration promised a research sponsor that he would submit all his relevant papers to them for prior review - without even asking him! It was to prevent abuses like this that we founded the Campaign for Cambridge Freedoms. Its goal was to defeat a proposal by the former Vice Chancellor that most of the intellectual property generated by faculty members - from patents on bright ideas to books written up from lecture notes - would belong to the university rather than to the person who created them. If this had passed, Cambridge would have swapped one of the most liberal rules on intellectual property of any British university for one of the most oppressive anywhere. Over almost four years of campaigning we managed to draw many of the teeth of this proposal. A recent vote approved a policy in which academics keep copyright but the University gets 85% of patent royalties. The policy is however defective in many ways: for example, it allows the University to do IPR deals without the consent of affected staff and students. The authorities have undertaken to introduce amendments: the detail will be fought over. Finally, my freedom-oriented work includes a number of technical writings:

There is a page of material on the main policy issues as they were in 1999, when I decided to stop maintaining my own web pages on information policy and simply contribute to FIPR's instead. There's also a leaked copy of the NSA Security Manual that you can download (there is also latex source for it).

Finally, here is my PGP key. If I revoke this key, I will always be willing to explain why I have done so provided that the giving of such an explanation is lawful. (For more, see FIPR.)


My Book on Security Engineering


Now also available in Japanese and Chinese!

Security engineering is about building systems to remain dependable in the face of malice, error or mischance. As a discipline, it focuses on the tools, processes and methods needed to design, implement and test complete systems, and to adapt existing systems as their environment evolves.

Security engineering is not just concerned with `infrastructure' matters such as firewalls and PKI. It's also about specific applications, such as banking and medical record-keeping, and about embedded systems such as automatic teller machines and burglar alarms. It's usually done badly: it often takes several attempts to get a design right. It is also hard to learn: although there are good books on a number of the component technologies, such as cryptography and operating systems security, there's little about how to use them effectively, and even less about how to make them work together. It's hardly surprising that most systems fail not because the mechanisms are weak, but because they're used wrongly.

My book is an attempt to help the working engineer to do better. As well as the basic science, it contains details of many typical applications - and a lot of case histories of how their protection mechanisms failed. (Some of these are available in the research papers listed below, but I've added many more.) It contains a fair amount of new material, as well as accounts of a number of technologies (such as hardware tamper-resistance) which aren't well described in the accessible literature. There was a very nice review in Information Security Magazine, and the other reviews have so far been positive. Even the usually cynical Slashdot crowd liked it. I hope you'll also enjoy it - and find it seriously useful!

More ...


Contact details

University of Cambridge Computer Laboratory
JJ Thomson Avenue
Cambridge CB3 0FD, England

E-mail: Ross.Anderson@cl.cam.ac.uk
Tel: +44 1223 33 47 33
Fax: +44 1223 33 46 78

I don't execute programs sent by strangers without good reason. So I don't read attachments in formats such as Word, unless by prior arrangement. I also discard html-only emails, as most of them are spam; and emails asking for `summer research positions' or `internships', which we don't do.

If you're contacting me about coming to Cambridge to do a PhD, please read the relevant web pages first.