Economics and Security Resource Page
Do we spend enough on keeping `hackers' out of our computer systems? Do we not
spend enough? Or do we spend too much? For that matter, do we spend too little
on the police and the army, or too much? And do we spend our security budgets
on the right things?
The economics of security is a hot and rapidly growing field of research. More
and more people are coming to realise that security failures are often due to
perverse incentives rather than to the lack of suitable technical protection
mechanisms. (Indeed, the former often explain the latter.) While much recent
research has been on `cyberspace' security issues - from hacking through fraud
to copyright policy - it is expanding to throw light on `everyday' security
issues at one end, and to provide new insights and new problems for `normal'
computer scientists and economists at the other. In the commercial world, as in
the world of diplomacy, there can be complex linkages between security
arguments and economic ends.
Our annual bash is the Workshop on the Economics of Information Security: the 2009 event will be in London on 24-25
June. See below for links to the workshops from
2002-8, for all the workshop papers to date, and for other conferences with
some security economics content.
The Security and Human
Behaviour workshop brings security engineers together with psychologists,
behavioral economists and others. This is great fun and hugely productive. See
the papers, liveblog and audio for 2009; and the papers, liveblog and audio for the first meeting in
2008.
Managing
Online Security Risks was one of the early pieces, and is still a good
introduction. Hal Varian shows how a range of problems, from bank fraud to
distributed denial-of-service attacks, result when the incentives to avoid
abuse are poorly allocated. An analysis of cash machine
fraud, for example, showed that banks in countries with strong customer
rights suffered less fraud; complaints could not be ignored or brushed aside,
so they took more care than in countries where it was harder for fraud victims
to complain.
Why Information
Security is Hard - An Economic Perspective was the paper that got
information security people thinking about the subject. I showed how economic
analysis explains many phenomena that security researchers had previously found
to be pervasive but perplexing. Why do mass-market software products such as
Windows contain so many security bugs? Why are their security mechanisms so
difficult to manage? Why for that matter are so many specialist security
products second-rate, with bad ones driving good ones out of the market? Why is
it hard for people to use security for competitive advantage - and how might
they? Why are government evaluation schemes, such as the Orange Book and the
Common Criteria, so bad? For that matter, why do government agencies concerned
with information warfare concentrate on offence rather than defence, even now
that the Cold War is over? (There is also an Italian
translation.)
Cryptographic
abundance and pervasive computing by Andrew Odlyzko was an early paper to
point out the economic and social limits on security technology - if a boss's
secretary cannot forge his signature, a digital security system is as likely to
subtract value as add it.
Cars, Cholera and Cows:
The Management of Risk and Uncertainty is a classic paper by John Adams on
why organisations (and in particular governments) tend to be more risk-averse
than rational economic considerations would dictate. One of the mechanisms is
adverse selection: the people who end up in risk management jobs tend to be
more risk-averse than average.
Electronic
Commerce: Who Carries the Risk of Fraud? (by Nick Bohm, Ian Brown and Brian
Gladman) documents how many banks have seen online banking, and information
security mechanisms such as cryptography and digital signatures, as a means of
dumping on their customers many of the transaction risks that they previously
bore themselves in the days of cheque-based and even telephone banking.
Deworming
the Internet looks at the incentives facing virus writers,
software vendors and computer users. Its author Douglas Barnes asks
what policy initiatives might make computers less liable to infection.
Economics,
Psychology and Sociology of Security by Andrew Odlyzko discusses a
number of ways in which cultural factors undermine the formal
assumptions underlying many security systems, and gives some insights
from evolutionary psychology; for example, we have specialised neural
circuits to detect cheating in social situations.
Adverse Selection
in Online 'Trust' Certifications by Ben Edelman shows that websites with
the TRUSTe seal of approval are much
more likely to be malicious than uncertified websites. Crooks have a greater
incentive to buy certification than honest merchants, so if the vetting
process isn't strict enough your certification scheme can easily end up
certifying the reverse of what it seems to.
Privacy,
Economics and Price Discrimination tackles one of the thorniest
market-failure problems. Why is privacy being eroded so rapidly, despite many
people saying they care about it? Andrew Odlyzko's analysis puts much of the
blame on differential pricing. Technology is increasing both the incentives and
the opportunities for this. From airline yield management to complex and
constantly changing software and telecomms prices, differential pricing is
economically efficient - but increasingly resented by consumers. His paper The
Unsolvable Privacy Problem and its Implications for Security Technologies
develops the argument to personalised pricing. Conditioning
prices on purchase history, by Alessandro Acquisti and Hal Varian, analyses
the market conditions under which first-degree price discrimination will
actually be profitable for a firm.
Privacy and
Rationality: Preliminary Evidence from Pilot Data, by Alessandro Acquisti
and Jens Grossklags, studies the specific problem of why people express a high
preference for privacy when interviewed but reveal a much lower preference
through their behaviour both online and offline.
In Opt In Versus Opt
Out: A Free-Entry Analysis of Privacy Policies, Jan Bouckaert and Hans
Degryse compare the competitive effects of three customer privacy policies -
anonymity, opt-in and opt-out. Under certain assumptions, opt-out is the
socially preferred privacy regime: the availability in the market of
information about the buying habits of most customers, rather than a few
customers, helps competitors to enter the market.
Who Signed Up for the
Do-Not-Call List?, by Hal Varian, Fredrik Wallenberg and Glenn Woroch,
analyses the FCC's telephone-sales blacklist by district. Privacy means
different things to different population groups, but this raises further
questions. For example, educated people are more likely to sign up, as one
would expect: but is that because rich households get more calls, because they
value their time more, or because they understand the risks better? In Financial Privacy for
Free?, Alessandro Acquisti and Bin Zhang apply a similar analysis to credit
reporting. In Is There a
Cost to Privacy Breaches? Alessandro Acquisti, Allan Friedman and Rahul
Telang look at the effect on companies' stock prices of reported breaches of
their customers' privacy.
Privacy,
Property Rights & Efficiency: The Economics of Privacy as Secrecy, by
Benjamin Hermalin and Michael Katz, criticises the Chicago school view that
more information is better (if collected costlessly), and argues that privacy
can be efficient even when there is no `taste' for privacy per se. The authors
develop a general model which also challenges the Varian view that privacy
could be achieved by simply giving individuals property rights in information
about themselves. In the Hermalin-Katz model, an effective privacy policy may
need to ban information transmission or use. The flow of information between
trading partners can reduce ex-post trade efficiency when the increase in
information does not lead to symmetrically or fully informed parties.
On the Economics
of Anonymity studies why anonymity systems are hard to sell, and points out
some of their novel aspects. For example, honest players want some level of
free-riding, in order to provide cover traffic. So equilibria can also be
novel, and the ways in which they break down can be complex. We also have to
consider a wider range of principals - dishonest, lazy, strategic, sensitive,
and myopic - than in most of the markets that economists try to model. Anonymity Loves Company:
Usability and the Network Effect continues this analysis to show when a
user will prefer a weak but popular anonymity system over a strong but
rarely-used one.
The Effect of
Online Privacy Information on Purchasing Behavior: An Experimental Study by
Janice Tsai, Serge Egelman, Lorrie Cranor and Alessandro Acquisti shows that by
making information about website privacy policies more accessible and salient
it is possible to get shoppers to pay more attention to it and even to pay a
premium for privacy.
Conformity or
Diversity: Social Implications of Transparency in Personal Data Processing
by Rainer Boehme studies whether making information widely available about the
bases on which decisions are taken about individuals will lead to more
conformity (because, in the absence of information asymmetries and strategic
interaction with others, the optimal behaviour becomes mainstream) or diversity
(as in the absence of transparency, individuals are herded together by
uncertainty and fear). He presents a model of how preferences and signaling
behaviour might interact.
In Information Governance:
Flexibility and Control through Escalation and Incentives, Xia Zhao and
Eric Johnson model the over- and under-entitlement to information that arise in
firms due to agency and path-dependence. They argue that to align employees'
interests with those of the firm, employers should use rewards as well as
penalties, and allow staff to escalate their own access rights when needed
within this framework.
Will Outsourcing IT
Security Lead to a Higher Social Level of Security? reports a study done by
Brent Rowe for the DHS on whether outsourcing improves or undermines security.
He concludes that it depends on what's outsourced (auditing, vulnerability
testing, monitoring, insurance, implementation or even system management). Most
firms outsource at least one service, and how much they buy in is sector
dependent; also, the more they outsource, the less their overall security
spend. Decisions also depend on scale economies and network effects.
Annual CSI-FBI surveys are
often cited by practitioners in the field. Survey results are
generally recognised to be unsatisfactory, but unfortunately we don't
have anything better at present. There are also various link farms, and an awful lot of hype.
In Models and Measures for
Correlation in Cyber-Insurance, Rainer Boehme and Gaurav Kataria
examine the effects of local versus global correlation on insurance markets.
They show that in many economically important cases (such as globally
correlated risks from the worldwide spread of a worm or virus) there may be no
market solution, as the insurer's cost of safety capital becomes too high.
The Economic
Impact of Role-Based Access Control is a study commissioned by the US
National Institute of Standards and Technology to assess the economic
impact of an investment they made in promoting role-based access control. It
appears to be the first serious study that uses the return on investment to
assess research in the field.
Two papers, Economic Consequences
of Sharing Security Information (by Esther Gal-Or and Anindya Ghose) and An
Economics Perspective on the Sharing of Information Related to Security
Breaches (by Larry Gordon), analyse the incentives that firms have to share
information on security breaches within the context of the ISACs set up recently by the US government.
Theoretical tools developed to model trade associations and research joint
ventures can be applied to work out optimal membership fees and other
incentives. There are interesting results on the type of firms that benefit,
and questions as to whether the associations act as social planners or joint
profit maximisers.
Kevin Soo
Hoo's thesis was an interesting first attempt to bring some
econometrics to the field. It looks at what countermeasures might be
most cost-effective, given the FBI data. He also has an article
analysing the return on security investment, which he puts at an
unexciting 17-21 percent. (See press coverage here.)
There is also a US
government guide to doing risk assessment and cost-benefit
analysis.
The economic
cost of publicly announced information security breaches: empirical
evidence from the stock market, by Katherine Campbell, Larry Gordon,
Marty Loeb and Lei Zhou, provides an analysis of the effect of security
scares on share prices. There is a highly significant negative market
reaction for information security breaches involving unauthorized access
to confidential data, but no significant reaction when the breach does
not involve confidential information. Thus stock market participants
appear to discriminate across types of breach.
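The underlying methodology is a standard event study. The following toy sketch (with entirely synthetic returns and an invented announcement date, not the paper's data) shows the mechanics: estimate a market model over a pre-event window, then sum the abnormal returns over a short window around the announcement.

```python
# Minimal event-study sketch with synthetic data: fit a market model on a
# pre-event estimation window, then compute the cumulative abnormal return
# (CAR) over a short window around a hypothetical breach announcement.
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 120)               # daily market returns
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.01, 120)
stock[100:103] -= 0.02                               # pretend a breach is announced on day 100

beta, alpha = np.polyfit(market[:90], stock[:90], 1) # market-model parameters
event = slice(99, 103)                               # day -1 to day +2
abnormal = stock[event] - (alpha + beta * market[event])
print(f"cumulative abnormal return: {abnormal.sum():.2%}")
```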
Economics of vulnerabilities
In Is finding security
holes a good idea?, Eric Rescorla argues that since large software
products such as Windows contain many security bugs, the removal of an
individual bug makes little difference to the likelihood that an
attacker will find another one later. But many exploits are based on
vulnerability information disclosed explicitly by researchers, or
implicitly when manufacturers ship patches. He therefore argues that,
unless discovered vulnerabilities are correlated, it is best to avoid
vulnerability disclosure and minimise patching.
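His argument rests on simple arithmetic. A back-of-envelope sketch, with invented figures for the number of latent bugs and the chance of an attacker finding any one of them, illustrates how little the quiet removal of a single bug changes the attacker's odds:

```python
# Back-of-envelope sketch of Rescorla's argument, using invented numbers: with
# many latent bugs found independently, fixing one barely reduces the chance
# that an attacker finds some exploitable bug.
n_bugs = 1000                     # latent vulnerabilities (assumed)
p_find = 0.001                    # chance an attacker finds any given bug (assumed)

def p_compromise(n):
    """Probability the attacker finds at least one of n bugs."""
    return 1 - (1 - p_find) ** n

before, after = p_compromise(n_bugs), p_compromise(n_bugs - 1)
print(f"risk before fix: {before:.3f}, after: {after:.3f}, "
      f"reduction: {before - after:.6f}")
```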
In Optimal Policy for Software
Vulnerability Disclosure, Ashish Arora, Rahul Telang and Hao Xu
argue to the contrary. They produce a model in which neither instant
disclosure nor non-disclosure is optimal; without disclosure, software
firms will have little incentive to fix bugs in later versions of
their products. Their model is based on a representative vulnerability
rather than on vulnerability statistics.
In Impact of Vulnerability
Disclosure and Patch Availability - An Empirical Analysis, Ashish
Arora, Ramayya Krishnan, Anand Nandkumar, Rahul Telang, and Yubao Yang
present empirical data to support the model of the above paper. While
vendors respond quickly to rapid disclosure, disclosure does increase the
number of attacks; and the number of reported vulnerabilities does
decline over time. They also find that open source projects patch more
quickly than proprietary vendors, and large companies patch more quickly
than small ones.
In Network
Security: Vulnerabilities and Disclosure Policy, Jay Pil Choi, Chaim
Fershtman and Neil Gandal model the conditions under which a company would
voluntarily disclose vulnerabilities in the absence of regulation, and in
which a mandatory disclosure policy might not necessarily be welfare
improving: it all depends on the proportion of customers who install
updates.
Timing
the Application of Security Patches for Optimal Uptime by Steve Beattie,
Seth Arnold, Crispin Cowan, Perry Wagle, and Chris Wright, provides a
quantitative analysis of a practical security management problem - how
long should you wait before you apply a security patch? Pioneers end
up discovering problems with patches that cause their systems to
break, but laggards are more vulnerable to attack. In a typical case,
a wait of between ten and thirty days seems about right.
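The trade-off lends itself to a simple cost model. The sketch below uses invented parameters (not the paper's data) purely to show how a cost-minimising delay of a couple of weeks can emerge from the two opposing risks:

```python
# Illustrative patch-timing sketch with invented parameters: balance the falling
# risk that the patch itself is faulty against the rising risk of being
# exploited while unpatched, and pick the delay that minimises expected cost.
import math

def expected_cost(delay_days,
                  p_bad_patch=0.25, bad_decay=10.0, cost_bad=200_000,
                  exploit_rate=0.003, cost_exploit=300_000):
    p_bad = p_bad_patch * math.exp(-delay_days / bad_decay)   # patch still broken
    p_exploit = 1 - math.exp(-exploit_rate * delay_days)      # hit while waiting
    return p_bad * cost_bad + p_exploit * cost_exploit

best = min(range(61), key=expected_cost)
print(f"cost-minimising delay with these assumptions: {best} days")
```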
Economics of Security
Patch Management, by Huseyin Cavusoglu, Hasan Cavusoglu and Jun Zhang,
compares liability and cost-sharing as mechanisms for incentivising vendors to
work harder at patching their software. It turns out that liability helps where
vendors release less often than optimal, while cost-sharing helps where they
release more often. If you want to achieve better coordination at minimum
additional cost to the vendor, they should not be used together. Meanwhile, Competitive and Strategic
Effects in the Timing of Patch Release by Ashish Arora, Christopher Forman,
Anand Nandkumar and Rahul Telang shows that competition hastens patch release
even more than disclosure threat in two out of three studied strategies.
Open and
Closed Systems are Equivalent (that is, in an ideal world) is a paper by
Ross Anderson that examines whether openness helps the attacker or the defender
more. He shows that under standard assumptions used in reliability growth
models, openness helps both equally. There remain many factors that can break
symmetry and cause one or the other to be better in practice, but one should
look for them in the ways a system departs from the standard assumptions.
In Bug Auctions: Vulnerability
Markets Reconsidered, Andy Ozment applies auction theory to analyse how
vulnerability markets might be run better, and how they might be exploited by
the unscrupulous. Then Michael Sutton and Frank Nagle's paper, Emerging Economic Models for
Vulnerability Research, describes the operation of iDefense and Tipping
Point, two companies set up to purchase vulnerabilities on the market. Vulnerability
markets by Rainer Boehme provides a short survey of the whole field.
In System
Reliability and Free Riding, Hal Varian discusses ways in which the defence
of a system can depend on the efforts of the defenders. Programming, for
example, might be down to the weakest link (the most careless programmer
introducing the fatal vulnerability) while the effectiveness of testing might
depend on the sum of everyone's efforts. There can also be cases where the
security depends on the efforts of an individual champion. These different
models have interesting effects on whether an appropriate level of defence can
be provided, and what policy measures are advisable.
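A toy comparison of the three aggregation rules, with invented effort levels, makes the distinction concrete:

```python
# Toy comparison (illustrative numbers) of Varian's three ways of aggregating
# individual defence efforts into overall system security.
efforts = [0.9, 0.7, 0.2]   # hypothetical care taken by three defenders, in [0, 1]

weakest_link = min(efforts)                 # e.g. the most careless programmer
total_effort = sum(efforts) / len(efforts)  # e.g. everyone contributes to testing
best_shot = max(efforts)                    # e.g. a single security champion

print(f"weakest link: {weakest_link:.2f}, total effort: {total_effort:.2f}, "
      f"best shot: {best_shot:.2f}")
# One careless defender drags the weakest-link figure down to 0.20, while the
# best-shot figure is carried by the strongest defender at 0.90.
```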
The economics of
information security investment, by Larry Gordon and Marty Loeb, suggests
that a firm may often prefer to protect those information sets with middling
vulnerability, rather than the most vulnerable (as that may be too expensive);
and that to maximise the expected benefit, a firm might only spend a small
fraction of the expected loss.
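The flavour of the result can be reproduced with a small numerical sketch. The breach-probability function below has the diminishing-returns shape used in such models, but the parameters are invented for illustration:

```python
# Gordon-Loeb-style sketch with illustrative parameters: choose the security
# spend z that maximises expected benefit, given a breach probability that
# falls with diminishing returns in z.
import numpy as np

L = 1_000_000              # loss if the information set is breached (assumed)
v = 0.6                    # baseline breach probability (assumed)
alpha = 1e-5               # productivity of security spending (assumed)

def remaining_vuln(z):
    return v / (alpha * z + 1)      # breach probability after spending z

z = np.linspace(0, L, 100_001)
net_benefit = (v - remaining_vuln(z)) * L - z
best = z[np.argmax(net_benefit)]
print(f"optimal spend: {best:,.0f} against an expected loss of {v * L:,.0f} "
      f"({best / (v * L):.0%} of it)")
```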
On the Evolution of
Attitudes toward Risk in Winner-Take-All Games by Eddie Dekel and Suzanne
Scotchmer presents an evolutionary model of how winner-take-all conflicts such
as patent races (or for that matter battles for control of software standards)
select for risk-takers and lead to the extinction of risk-avoiders.
A BGP-based Mechanism
for Lowest-Cost Routing, by Joan Feigenbaum, Christos Papadimitriou, Rahul
Sami and Scott Shenker, shows how combinatorial auction techniques can be used
(at least in theory) to provide distributed routing mechanisms that are proof
against strategic behaviour by one or more of the participants.
Lawrence Ausubel's Ascending
Auctions with Package Bidding shows that certain types of combinatorial
auction can be solved efficiently if bidding is conducted through a trusted
proxy - a system that can be relied on to bid according to an agreed strategy.
The Communication
Complexity of Efficient Allocation Problems, by Noam Nisan and Ilya Segal,
shows that although one can solve the allocation problem using strategy-proof
mechanisms, the number of bits that must be communicated grows exponentially;
thus in many cases the best practical mechanism will be a simple bundled
auction. The paper also suggests that if arbitrary valuations are allowed,
players can submit bids that will cause communications complexity problems for
all but the smallest auctions.
Noam Nisan and Amir Ronen's seminal paper Algorithmic Mechanism
Design shows how distributed mechanisms can be designed that are
strategyproof, that is, participants cannot hope to gain an advantage by
cheating. This paper sparked off much recent research at the boundary between
theoretical computer science and economics.
There are two influential related papers by Geoffrey Heal and Howard
Kunreuther on security externalities, which extended ideas from information
security economics to much more general applications. Interdependent
Security discusses the many cases where my security depends on my
neighbour's - where worms can spread from one part of a company to another,
fire from one apartment to another, and infection from one person to
another. In some cases there will be a temptation to free-ride off the efforts
of others, so it is hard to make security investment a dominant strategy. You
Can Only Die Once: Managing Discrete Interdependent Risks examines the more
general case and analyses the conditions under which various security problems
have equilibria that are not socially optimal.
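A toy two-firm version of the interdependence story, with invented payoffs, shows how protection can fail to be a dominant strategy:

```python
# Toy two-firm sketch (invented numbers) of interdependent security: each firm
# can spend c on protection; a loss L arrives directly with probability p unless
# you protect, and can also spread from an unprotected neighbour with
# probability q. You can only die once, so contagion only matters if you were
# not already hit directly.
c, L, p, q = 30, 100, 0.5, 0.5

def expected_cost(i_invests, j_invests):
    p_direct = 0 if i_invests else p
    contagion = (1 - p_direct) * (0 if j_invests else q * L)
    return (c if i_invests else 0) + p_direct * L + contagion

for j in (True, False):
    print(f"neighbour invests={j}: my cost if I invest "
          f"{expected_cost(True, j):.0f}, if I don't {expected_cost(False, j):.0f}")
# With these numbers, investing pays only when the neighbour also invests, so
# `invest' is not a dominant strategy and an under-protected equilibrium can persist.
```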
Interactions of Security with Copyright and Digital Rights Management
Felix Oberholzer-Gee
and Koleman Strumpf's File-Sharing and Copyright
argues that file-sharing doesn't seem to harm overall social welfare in that
concert ticket sales have gone up by more than sales of recorded music have
fallen. The paper summarises much of the research and controversy kicked off by
an earlier paper of theirs, The Effect of
File Sharing on Record Sales -- An Empirical Analysis. That argued that
downloads do not do significant harm to the music industry: five thousand
downloads are needed to displace a single album sale, while high-selling albums
actually benefit from file sharing.
A Cost
Analysis of Windows Vista Content Protection asks some hard questions about
whether the new security mechanisms in Vista are worth it, and to whom. It
suggests Microsoft is imposing large costs on hardware suppliers, under cover
of protecting Hollywood content, but in reality as a lock-in play to control
content distribution.
It follows logically from the `Trusted Computing'
Frequently Asked Questions, which provided the first critical survey of
Trusted Computing, and Cryptography and
Competition Policy - Issues with `Trusted Computing' which developed an
economic analysis that first suggested that Microsoft stood to gain much more
than Hollywood - with the quick win being to lock in users of Microsoft Office
more tightly, thus enabling its price to be raised (or cut less) in the face of
competition.
Fetscherin and Vlietstra's DRM and
music: How do rights affect the download price? shows that the prices of
music tracks sold online are mostly determined by the rights granted to the
purchaser - including the right to burn, copy or export the music - and also by
the label and the location.
Ivan Png's Copyright:
A Plea for Empirical Research attacks Oberholzer and Strumpf, citing six
other studies that did indeed show a negative correlation between downloads and
CD sales. It also examines the Eldred case and looks at the incentive effects
of copyright law on the production of movies.
Yooki Park and Suzanne Scotchmer's Digital Rights
Management and the Pricing of Digital Products argues that DRM does not
have to be perfect - the cost of circumvention needn't be raised above the
monopoly price; that technical protection may still yield more revenue than
legal protection, as it may never expire; and that separate DRM systems may
yield higher prices than a shared system, because of the greater incentives
for, and effects of, circumvention. It also looks at how the structure of a DRM
consortium such as the TCG might promote, or inhibit, collusive behaviour among
content vendors.
Hal Varian's New
Chips Can Keep a Tight Rein on Consumers provides a concise introduction to
the problems that strict usage control mechanisms create for innovation
policy. A certain level of reverse engineering for compatibility is an
important brake on the abuse of monopoly power, especially in information goods
and services markets whose incumbents try hard to manipulate switching costs by
controlling compatibility.
In Cruel, Mean or
Lavish?: Economic Analysis, Price Discrimination and Digital Intellectual
Property Jamie Boyle argues that the next target of the copyright lobby,
after cracking down on fair use, will logically be the doctrine of first sale:
the right to resell, lend, or even criticise a book (or film or software
product) will be increasingly limited by contract and by technical
means. Publishers may try to control their aftermarkets using arguments about
the economics of price discrimination.
In The Law and
Economics of Reverse Engineering, Pam Samuelson and Suzanne Scotchmer
describe what may go wrong if some combination of technical and legal
restraints can be made to undermine the right to reverse engineer software
products so as to make other products compatible with them. It provides the
theoretical and scholarly underpinnings for much of the work on the
anti-competitive effects of the DMCA, copyright control mechanisms, and
information security mechanisms applied to accessory control
applications. There is also a shorter
paper that applies the lessons of the main paper to the DeCSS case.
Open
Source Software Projects as User Innovation Networks expands on this. Eric
von Hippel shows how most of the innovations that spur economic growth are not
anticipated by the manufacturers of the platforms on which they are based; the
PC, for example, was conceived as an engine for running spreadsheets. If IBM
had been able to limit it to doing that, a huge opportunity would have been
lost. Furthermore, technological change in the IT goods and services markets is
usually cumulative. If security technology can be abused by incumbent firms to
make life harder for people trying to develop novel uses for their products,
this will create all sorts of traps and perverse incentives.
In Security
and Lock-In: The Case of the U.S. Cable Industry, Tom Lookabaugh and Doug
Sicker discuss an existing case history of an industry's development being
affected by security-related technical lock-in. US cable industry operators are
locked in to their set-top-box vendors; and although they can largely negotiate
to offset the direct costs of this when committing to a supplier, the indirect
costs are large and unmanageable. In particular, innovation suffers. Cable is
falling behind other platforms, such as the internet, as the two platform
vendors don't individually have the incentives to invest in improving their
platforms.
Trusted
Computing, Peer-To-Peer Distribution, and the Economics of Pirated
Entertainment, by Stuart Schechter, Rachel Greenstadt and Mike Smith, shows
how trusted computing technology can aid the pirates as well as the Hollywood
guys. TC platforms will, if they perform as advertised, provide much more
robust platforms for hosting peer-to-peer file-swapping services; they will be
very much less vulnerable to the service denial attacks currently deployed by
the content industry against services such as Gnutella, Grokster and
Kazaa.
In Privacy
Engineering for Digital Rights Management Systems, Joan Feigenbaum, Michael
Freedman, Tomas Sander and Adam Shostack discuss why the economic motivations
of the various players lead to serious difficulties in deploying privacy
technology for DRM.
The
underground economy: priceless by Rob Thomas and Jerry Martin of Team Cymru
was the first paper to explore the underground economy from studying it
directly by monitoring IRC chat rooms. In recent years online criminals have
established an efficient division of labour, just like in Adam Smith's pin
factory. This paper explains how the villains' pin factory works.
In An
Inquiry into the Nature and Causes of the Wealth of Internet Miscreants,
Jason Franklin, Vern Paxson, Adrian Perrig, and Stefan Savage provide a more
systematic analysis of the underground economy by studying IRC channels and
collecting a lot of data about online criminals' trade in social security
numbers, credit card numbers and other goodies.
The Impact of
Incentives on Notice and Take-down by Tyler Moore and Richard Clayton
compares a variety of notice and take-down regimes for removing content on the
Internet. They find that phishing is removed fastest, but the banks are
much slower to remove mule-recruitment websites. It turns out that child sexual
abuse images are slowest of all to be removed, due to the division of
responsibility for removal along national lines.
In Examining the Impact of
Website Take-down on Phishing, Tyler Moore and Richard Clayton find wide
variation in the effectiveness of the responses of different actors to
phishing, and empirically demonstrate the impact of attacker innovation in the
form of longer website lifetimes for rock-phish and fast-flux attacks.
In Crime Online: Cybercrime and illegal
innovation, Howard Rush, Chris Smith, Erika Kraemer-Mbula and Puay Tang
describe the specialisation that has accelerated the development of online
crime since about 2004, just as Adam Smith's pin factory epitomised the same
tendency in the late 18th century.
A study
of security economics in Europe has recently been published by the
European Network and Information Security Agency. It applies security economics
research to synthesise a series of policy options for dealing with cyber risk
and online policy issues in Europe.
Do Data Breach
Disclosure Laws Reduce Identity Theft? by Sasha Romanosky, Rahul Telang and
Alessandro Acquisti studies the effects of the security breach disclosure laws
now in force in many US states, and concludes that the case for their
effectiveness has not been proven.
In Reinterpreting the
Disclosure Debate for Web Infections, Oliver Day, Rachel Greenstadt and
Brandon Palmen provide another analysis of data on electronic crime, in this
case the distribution of malware on infected web hosts. They show a high
concentration of infected hosts at poor-performing ISPs, and find evidence of
attackers moving to previously untargeted ISPs as others clean up their act.
Why the Security
Market has Not Worked Well is a chapter from a 1990 study by the NAS
Computer Science and Technology Board which provides an early analysis of the
`computer security problem'. It blames the rapid pace of technological (and
particularly architectural) change, the comparatively slow pace of government
market interventions (through procurement and evaluation programs), export
controls, a lack of consumer understanding of the risks, and the very limited
recourse that US customers have against vendors of faulty software.
Improving
Information Flow in the Information Security Market describes the efforts
of the US government over the last couple of decades to tackle a perceived
market failure in the security business - the lemons problem, whereby bad
products drove out good ones. The attempted fix was a government-sponsored
evaluation scheme (the Orange Book), but that was not without its own
problems.
In The Economic
Impact of Regulatory Information Disclosure on Information Security
Investments, Competition, and Social Welfare, Anindya Ghose and Uday Rajan
discuss how the implementation of US legislation such as Sarbanes-Oxley,
Gramm-Leach-Bliley and HIPAA has placed a disproportionate burden on small and
medium sized businesses, largely through a one-model-fits-all approach to
compliance by the big accounting firms. They show how mandatory investment in
security compliance can have a number of unintended consequences including
distorting security markets and reducing competition.
The European Union has proposed a Network
Security Policy that sets out a common European response to
attacks on information systems. This starts using economic arguments
about market failure to justify government action in this sector. The
proposed solutions are rather familiar, involving everything from
consciousness raising to Common Criteria evaluations; but the use of
economic analysis could be significant for the future.
The Center for Strategic and International Studies has a very good study of the risks of
cyber-terrorism which goes a long way to debunk the scaremongering and hype
about the vulnerability of critical infrastructures to digital attack.
The Brookings Institution has published a short paper
on the economic effects of security interdependency, and a longer book
chapter on the economics of homeland security - what should be the roles of
government and the private sector in financing precautions against
terrorism?
Economics
and Security in Statecraft and Scholarship explains why a web search on
`economics' and `security' turns up few interesting documents on international
affairs. The two were considered closely linked until 1945; thereafter nuclear
weapons were thought to decouple national survival from economic power, while
the USA established a pattern of confronting the USSR over security, and Japan
and the EU over trade. This caused Washington bureaucrats to split into a
`security' camp and a `political economy' camp; academics studying
international relations followed suit. Bill Clinton started to get the
bureaucrats working together again from about 1995, but the academics are still
lagging somewhat.
Closing the Phishing
Hole - Fraud, Risk and Nonbanks reports research commissioned by the US
Federal Reserve for their biennial Santa
Fe Conference on bank regulation. This paper identified speedy asset
recovery as the most effective deterrent to online fraud; fraud is made easier
by payment systems such as Western Union that make the recovery of stolen funds
more difficult.
Nonbanks and Risk in
Retail Payments by Stuart Weiner, Richard Sullivan and Simonetta Rosati
followed up with an analysis of the role played by nonbanks in US payment
systems more generally; a very large part of the infrastructure is now
outsourced.
The
topology of covert conflict by Shishir Nagaraja and Ross Anderson examines
how the police can best target an underground organisation given some knowledge
of its patterns of communication, and how it in turn might react, using a
framework combining ideas from network analysis and evolutionary game
theory. Nagaraja's The Economics of
Covert Community Detection and Hiding extended this work from active
attacks on networks to passive surveillance, and studied best strategies for
both surveillance and countersurveillance in networks where the adversaries
have bounded resources.
In The Economics
of Mass Surveillance, George Danezis and Bettina Wittneben apply these
network analysis ideas to privacy policy; traffic analysis conducted
against just a few well-connected militant organisers can draw a surprising
number of members of a subversive organisation into the surveillance net.
In The Economics of
Digital Forensics, Tyler Moore explains how the interests of vendors
diverge from those of law enforcement. For example, mobile phone vendors prefer
proprietary interfaces, which makes data recovery from handsets difficult;
recovery tools exist only for the most common models. Criminals should buy
unfashionable phones, while the police should prefer open standards.
"Proof-of-Work"
Proves Not to Work by Ben Laurie and Richard Clayton shows that the
spam-blocking schemes that rely on getting mail senders to perform some
computational task are unlikely to solve the spam problem: there are many
legitimate senders with less available compute power per message than many
spammers can obtain from the compromised hosts they use.
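Their point is essentially arithmetic. A back-of-envelope sketch, with hypothetical figures for stamp cost and machine counts, shows the imbalance:

```python
# Back-of-envelope sketch (hypothetical numbers) of the Laurie-Clayton argument:
# a proof-of-work stamp expensive enough to throttle a botnet-equipped spammer
# also throttles legitimate senders on modest hardware.
stamp_seconds = 30            # CPU time to compute one message stamp (assumed)
seconds_per_day = 86_400

legit_machines = 1            # a small organisation's mail server (assumed)
botnet_machines = 10_000      # compromised hosts available to a spammer (assumed)

legit_per_day = legit_machines * seconds_per_day // stamp_seconds
spam_per_day = botnet_machines * seconds_per_day // stamp_seconds
print(f"legitimate sender: {legit_per_day:,} messages/day")
print(f"spammer with botnet: {spam_per_day:,} messages/day")
```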
In Modelling
Incentives for Email Blocking Strategies, Andrei Serjantov and Richard
Clayton analyse the incentives on ISPs to block traffic from other ISPs with
many infected machines, and back this up with data. They also show how a number
of existing spam-blocking strategies are irrational and counterproductive.
In Inadvertent
Disclosure - Information Leaks in the Extended Enterprise, Eric Johnson and
Scott Dynes study inadvertent data leaks of sensitive information (personal and
corporate) through P2P file sharing, and also find that some users are
explicitly searching for sensitive documents leaked through such mechanisms.
In Mental Models of
Computer Security Risks, Farzaneh Asgharpour, Debin Liu and Jean Camp show
that people's mental models of computer security risks vary substantially
according to their expertise in the subject. The models implicit in much of the
literature are different again. This diversity has implications for risk
communication.
Evaluating the Wisdom of
Crowds in Assessing Phishing Websites by Tyler Moore and Richard Clayton
challenges the fashionable approach of turning decisions over to end
users on the Internet. Letting users vote on what websites are evil creates
many opportunities for abuse because of the huge variance in participation
rates.
The event to aim for if you want to keep up with research in this field and get
to know people is WEIS - the Workshop on the Economics of Information
Security. WEIS
2010 will be at Harvard on June 7-8.
The first of these workshops, WEIS
2002, took place at UC Berkeley;
WEIS 2008 was held at
Dartmouth; and WEIS 2009 was held at
UCL in London.
These links give you access to all the conference papers.
Other relevant conferences include:
The Security and Human
Behaviour workshop brings security engineers together with psychologists,
behavioral economists and others. See the papers, liveblog and audio for 2009; and the papers, liveblog and audio for the first meeting in
2008.
NetEcon
looks at general problems in the economics of networks, but some papers examine
dependability. It started off as the workshop on the economics of peer-to-peer
systems in 2003 at
Berkeley, then 2004 at Harvard,
then 2005 in Philadelphia. It then
became NetEcon
2006 and was followed by NetEcon 2007.
Some relevant papers turn up in Toulouse at SoftInt, the biennial
Conference on the Economics of the Software and Internet Industries. See also
the proceedings of predecessor conferences in 2002 (on Open Source Software
Economics), 2005 and 2007. There are also
occasional security-economics papers at the NET Institute Conference, while the Society for Economic
Research on Copyright Issues holds annual workshops.
Information
Rules, by Carl Shapiro and Hal Varian, is a good introduction to economics
for computer scientists. It focuses on the specific problems and opportunities
of IT goods and services markets, and the characteristics that tend to make
them different from the market for potatoes - such as the combination of high
fixed costs and low marginal costs, network externalities, technical lock-in
and standards wars. It is pitched at the level of an educated general
reader. If you want the mathematical detail too, read Varian's Intermediate
Microeconomics.
Security Engineering
by Ross Anderson is a good introduction for economists (and others) to secure
systems engineering. It covers not just technologies such as crypto and
`infrastructure' matters such as firewalls and PKI, but a number of specific
applications, such as banking and medical record-keeping, and embedded systems
such as automatic teller machines and burglar alarms. It brings out the fact
that most systems don't fail because the mechanisms are weak, but because
they're used wrong, and provides economic explanations for a number of these
failures.
Secrets
and Lies by Bruce Schneier is a more populist book in the same theme. It
discusses how things go wrong and what sort of organisational measures are
advisable to contain them. It debunks the idea that security problems can be
fixed by focussing on purely technical measures such as cryptography.
Economic
Behavior in Adversity by Jack Hirshleifer is a set of essays
from the early days of conflict theory. It starts off from early work at
Rand on how societies and economies recover from disaster; in an attempt
to plan for World War 3, Rand economists looked at the aftermath of
tragedies from World War 2 to the Black Death. This led to work on a
broader front from evolutionary game theory through the interplay of law
and economics to hindrance strategies in general. (These are where a
competitor concentrates not on running faster, but on making its
adversaries run slower.)
The
Dark Side of the Force: Economic Foundations of Conflict Theory is a more
recent set of essays by Jack Hirshleifer, looking at such topics as the causes
of war, why it is not always true that the rich get richer and the poor poorer,
and why the technology of conflict is absolutely essential to such
questions. The decisiveness of conflict matters; so does whether its outcome
depends on the absolute or relative difference of effort between the
combatants. The evolution of strategies, for both conflict and cooperation, is
growing in its perceived importance.
Risk
by John Adams is the classic study of why people and organisations are
sometimes more risk-averse than would seem rational, and sometimes more
risk-loving. For example, mandatory seat-belt laws did not reduce road traffic
casualties overall, but merely shifted them from vehicle occupants to
pedestrians and cyclists. Adams explains this by a `risk thermostat': people
compensate for an increased feeling of safety by driving faster. In general,
behaviour is governed by the probable costs and benefits of possible actions as
perceived through filters formed from experience and culture. This work exposes
the rather shaky foundations of much current risk assessment work.
The
Future of Ideas by Larry Lessig is an important and influential description
of the effects that increasing technical protection of copyright is likely to
have on a range of fields, from academic and intellectual life through the
competitiveness of markets and the level of innovation. He argues that the
overprotection of digital rights is an error: private land is more valuable if
it is separated from other private land by public roads, sewers and other
utility rights-of-way. Its value is also enhanced by the existence of public
parks.
Paul Resnick's web
page on reputation systems has links to a lot of research that bears on
incentives and their relationship with dependability in many systems.
An article in Wired
describes the low ratio of vulnerabilities to exploits and the
attention-seeking nature of many vulnerability reports prior to about 2004.
NATO has been running annual colloquia on the interaction
between economics and national security, with a particular emphasis on Eastern
Europe. There's a summary by
Martin Spechler of the 1999 workshop.