Ross Anderson passed away in March 2024. (Obituaries)

We preserve here the content of his personal web space. If you notice any problems, please contact pagemaster@cl.cam.ac.uk.


University of Cambridge Computer Laboratory University of Edinburgh

Ross Anderson

[Research] [Blog] [Videos] [Politics] [My Book] [Music] [Seminars] [Contact Details]

Machine Learning needs Better Randomness Standards: Randomised Smoothing and PRNG-based attacks shows that the randomness tests long used to check random number generators for use in cryptographic key generation are inadequate for machine learning, where some applications make heavy use of random inputs about which very specific assumptions are made (accepted for Usenix 2024)

Defacement Attacks on Israeli Websites is a measurement study of attacks by Palestinian sympathisers on Israeli websites since the Hamas attack on Israel (CW blog). Getting Bored of Cyberwar is a similar study of how pro-Ukrainian hackers responded to the Russian invasion of their country by attacking Russian websites, and pro-Russian hackers then responded (AP SC Magazine The Record)

No Easy Way Out: the Effectiveness of Deplatforming an Extremist Forum to Suppress Hate and Harassment is a measurement study of the industry attempt to take down Kiwi Farms in 2022-23. This holds a number of practical lessons for people interested in online censorship, as well as raising legal and philosophical issues with the approach taken by the UK's Online Safety Bill (The Register; accepted for Oakland 2024)

The Curse of Recursion: Training on Generated Data Makes Models Forget asks what will happen to GPT-{n} once most of the content online is generated by previous models. We show that the use of model-generated content in training leads to irreversible defects in subsequent model generations as the tails of the original distributions disappear, leading to model collapse (The Atlantic, Wall Street Journal, New Scientist, Venture Beat, Business Insider blog)
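The mechanism can be illustrated with a toy simulation (my own illustrative sketch, not the paper's code): each "generation" fits a Gaussian to the previous generation's output and draws its training data from that fitted model. Because finite samples under-represent the tails, estimation error compounds across generations and the fitted distribution drifts away from the original; with heavier-tailed data or smaller samples the effect is more dramatic.

```python
import random
import statistics

random.seed(0)

# Generation 0: a finite sample from the "real" data distribution, N(0, 1).
data = [random.gauss(0.0, 1.0) for _ in range(1000)]

history = []
for gen in range(10):
    # "Train" a model on the current data: here, simply fit a Gaussian.
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    history.append(sigma)
    # The next generation is trained only on data sampled from the fitted
    # model, never on the original distribution.
    data = [random.gauss(mu, sigma) for _ in range(1000)]
```

The fitted parameters perform a random walk driven purely by sampling error, so the chain forgets the original distribution rather than tracking it; this is the one-dimensional analogue of model collapse.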

One Protocol to Rule Them All? On Securing Interoperable Messaging analyses the EU DMA mandate for messaging systems interoperability. This will vastly increase the attack surface at every level in the stack (blog Register Schneier).

Threat Models over Space and Time: A Case Study of E2EE Messaging Applications shows how Signal Desktop and WhatsApp Desktop are insecure; an opponent with temporary access to your laptop, such as a border guard or an intimate partner, can make this access persistent.

Chat Control or Child Protection debunks the arguments used by the intelligence community that "because children" we needed the Online Safety Bill which gave Ofcom the power to mandate snooping software in your phone (blog). The same arguments were used to support the so-called Child Sex Abuse Regulation which thankfully failed in the European Parliament (blog evidence video) – our big policy win of 2023.

Cambridge forced me to retire in September 2023 when I turned 67, a policy of unlawful age discrimination against which we are campaigning. I am now 20% at Edinburgh and (officially) 20% at Cambridge. I'm teaching a course in Security Engineering at Edinburgh to masters students and fourth-year undergrads, and the lecture videos are now all online (as are the lecture videos and notes for my first-year undergrad course on Software and Security Engineering at Cambridge).

timeline ...

Research

The research students I advise are Bill Marino, Eleanor Clifford, Lawrence Piao, Jenny Blessing, Nicholas Boucher, Anh Viet Vu, and David Khachaturov. My RAs are Richard Clayton and Hridoy Dutta. I also work with Robert Brady. My former RAs are Sergei Skorobogatov, Lydia Wilson, Franck Courbon, Maria Bada, Yi Ting Chua, Ben Collier, Helen Oliver, Ildiko Pete, Daniel Thomas, Alice Hutchings, Sergio Pastrana, David Modic, Sven Übelacker, Julia Powles, Ramsey Faragher, Sophie van der Zee, Mike Bond, Vashek Matyas, Steven Murdoch, Andrei Serjantov and Alex Vetterl. My former students Jong-Hyeon Lee, Frank Stajano, Fabien Petitcolas, Harry Manifavas, Markus Kuhn, Ulrich Lang, Jeff Yan, Susan Pancho-Festin, Mike Bond, George Danezis, Sergei Skorobogatov, Hyun-Jin Choi, Richard Clayton, Jolyon Clulow, Hao Feng, Andy Ozment, Tyler Moore, Shishir Nagaraja, Robert Watson, Hyoungshick Kim, Shailendra Fuloria, Joe Bonneau, Wei-Ming Khoo, Rubin Xu, Laurent Simon, Kumar Sharad, Shehar Bano, Dongting Yu, Khaled Baqer, Alex Vetterl, Mansoor Ahmed and Ilia Shumailov have earned PhDs.

I'm teaching three Cambridge courses in 2023-24: the undergraduate course in Software and Security Engineering and graduate courses in Computer Security and Cybercrime. I also organise our security seminars and help run the Cambridge Cybercrime Centre.

My research topics include:


Machine learning and signal processing

The detection and manipulation of patterns, both overt and covert, has many applications, and the field is being refreshed by the recent revolution in neural networks.


Sustainability of security

Our computers and communications use several percent of global energy, and have secondary costs too – particularly if you have to throw things away for lack of software updates. I also have a long-standing interest in energy management and have more recently been looking at the energy wasted by cryptocurrency mining and at the prevention of wildlife crime. (Incidentally, this website is entirely static – no ads, trackers, javascript or even cookies. The estimated carbon cost per page view is 0.07g compared with over 2g for a typical commercial web page.)

  • Making security sustainable is the new grand challenge for computer science: designing software so that durable goods such as cars can last longer (video of talk at 36C3 blog).
  • Standardisation and Certification in the Internet of Things is an analysis of what happens to safety regulation once we get software everywhere. It informed EU directive 2019/771 which requires firms selling goods with digital components to maintain the software for at least two years, or for the reasonable expectation of the customer if longer. This will probably mean ten years for cars and white goods (blog).
  • Privacy for Tigers describes work we did to stop wildlife aggregation sites being exploited by poachers.
  • Bitcoin Redux examines what’s gone wrong in the world of cryptocurrencies, whose mining wastes colossal amounts of energy; financial regulators bear some of the blame for failing to enforce existing laws that would have prevented some of the worst abuses (blog). It follows on from Making Bitcoin Legal, where we presented a better way of tracing stolen bitcoin (blog video).
  • What you get is what you C describes a compiler plugin we wrote to make it easier to maintain crypto code by expressing programmer intent.
  • DigiTally is a prototype payment system we built to extend mobile phone payments to areas of less developed countries with no phone service.
  • The UK smart meter project looks set to waste £20bn without saving any energy. Here are papers on the technical security and security economics of smart meters, on their privacy, and on their deployment.
  • On the Reliability of Electronic Payment Systems describes work I did to help develop prepayment utility metering, which made possible the electrification of millions of homes in less developed countries. The STS standard we developed is now used in 400m meters in over 100 countries.


Economics, psychology and criminology of information security

Incentives matter as much as technology for the security of large-scale systems. Systems break when the people who could fix them are not the people who suffer the costs of failure. So it's not enough for security engineers to understand cryptomathematics and the theory of operating systems; we have to understand game theory and microeconomics too. I pioneered the discipline of security economics which is starting to embrace privacy economics, security psychology and criminology too.

I helped establish two relevant workshops: the Security and Human Behaviour workshop, which brings together security engineers and psychologists, and the Workshop on Economics and Information Security, where you meet everyone working in security economics.


Peer-to-Peer and social network systems

One of the seminal papers in peer-to-peer systems was The Eternity Service, which I invented in response to growing Internet censorship. The modern era only started once the printing press enabled seditious thoughts to be spread too widely to ban. But when books no longer exist as tens of thousands of paper copies, but as a file on a single server, will courts be able to order them unpublished once more? (This has since happened to newspaper archives in Britain.) So I invented the Eternity Service as a means of putting electronic documents beyond the censor's grasp. It inspired second-generation censorship-resistant systems such as Publius and Freenet; one descendant is wikileaks. But the killer app turned out to be not sedition, or even pornography, but copyright. Hollywood's action against Napster led to our ideas being adopted in filesharing systems; they are now re-emerging in the Internet of Things.

Work since the Eternity paper includes the following.


Reliability of security systems

I have been interested for many years in how security systems fail in real life; many security designs are poor because they are based on unrealistic threat models. I started with a study of ATM fraud, and expanded to other applications one after another. This provides a central theme of my book. I also have a separate page on bank security which gathers together all our papers on fraud in payment systems.

The papers on physical security by Roger Johnston's team are also definitely worth a look; see also an old leaked copy of the NSA Security Manual.


Robustness of cryptographic protocols

Many security system failures are due to poorly designed protocols, and this has been a Cambridge interest for many years. Some relevant papers follow.

Protocols have been the stuff of high drama. Citibank asked the High Court to gag the disclosure of certain crypto API vulnerabilities that affect a number of systems used in banking. I wrote to the judge opposing this; a gagging order was still imposed, although in slightly less severe terms than Citibank had requested. The trial was in camera, the banks' witnesses didn't have to answer questions about vulnerabilities, and new information revealed about these vulnerabilities in the course of the trial may not be disclosed in England or Wales. Information already in the public domain was unaffected. The vulnerabilities were discovered by Mike Bond and me while acting as the defence experts in a phantom withdrawal court case, and independently discovered by the other side's expert, Jolyon Clulow, who later joined us as a research student. They are of significant scientific interest, as well as being relevant to the rights of the growing number of people who suffer phantom withdrawals from their bank accounts worldwide. Undermining the fairness of trials and forbidding discussion of vulnerabilities isn't the way forward (press coverage by the Register).


Cryptography, including quantum cryptography

Lots of people don't believe quantum crypto is practical. I also don't believe the security proofs offered for entanglement-based quantum cryptosystems, because they assume that the strange behaviour observed in the Bell tests must result from nonlocal action. But it can also emerge from pre-existing long-range order. One explanation, advocated by Nobel prizewinner Gerard 't Hooft, is the cellular automaton interpretation of quantum mechanics; see his keynote talk at EMQM 2015. I have done some work with Robert Brady to develop another line of inquiry.

  • Maxwell's fluid model of magnetism shows that a wavepacket travelling along a phase vortex in an Eulerian fluid obeys Maxwell's equations, is emitted and absorbed discretely, and can have linear or circular polarisation. What's more, the measured correlation between the polarisation of two cogenerated wavepackets is exactly the same as predicted by quantum mechanics, and observed in the Bell tests (blog press).
  • If you're new to this subject, a good starting point is to watch the video of Yves Couder's beautiful bouncing-droplet experiments, and then read our paper Why bouncing droplets are a pretty good model of quantum mechanics. This shows how droplets bouncing on a vibrating fluid bath obey two-dimensional analogues of Maxwell's equations and a version of Schrödinger's equation.
  • For the hard math, which explains how fermionic quasiparticles obeying Dirac's equation can arise in a bosonic fluid, see this paper; another paper that may be relevant is here. And here's a video of my talk at the 2015 Crossing conference, and another video on the various ways in which provable security fails (including the quantum case).

In the 1990s I worked with Eli Biham and Lars Knudsen to develop Serpent – a candidate block cipher for the Advanced Encryption Standard. Serpent got the second largest number of votes.

Other papers on cryptography and cryptanalysis include the following.

  • The Dancing Bear – A New Way of Composing Ciphers presents a new way to combine crypto primitives. Previously, to decrypt using (say) any three out of five keys, the keys all had to be of the same type (such as RSA keys). With my new construction, you can mix and match: RSA, AES, even one-time pad. The paper appeared at the 2004 Protocols Workshop; an earlier version came out at the FSE 2004 rump session.
  • Two Remarks on Public Key Cryptology is a note on two ideas I floated at talks I gave in 1997-98, concerning forward-secure signatures and compatible weak keys. The first of these has inspired later research by others; the second gives a new attack on public key encryption.
  • Two Practical and Provably Secure Block Ciphers: BEAR and LION shows how to construct a block cipher from a stream cipher and a hash function. We already knew how to construct stream ciphers and hash functions from block ciphers, and hash functions from stream ciphers; so this paper completed the set of elementary reductions. It also led to the "Dancing Bear" above.
  • Tiger – A Fast New Hash Function defines a new hash function, which we designed following Hans Dobbertin's attack on MD4. This was designed to run extremely fast on the new 64-bit processors such as DEC Alpha and IA64, while still running reasonably quickly on existing hardware such as Intel 80486 and Pentium (the above link is to the Tiger home page, maintained in Haifa by Eli Biham; if the network is slow, see my UK mirrors of the Tiger paper, new and old reference implementations (the change fixes a padding bug) and S-box generation documents. There are also third-party crypto toolkits supporting Tiger, such as that from Bouncy Castle).
  • Minding your p's and q's points out a number of things that can go wrong with the choice of modulus and generator in public key systems based on discrete log. It elucidated some of the previously classified reasoning behind the design of the US Digital Signature Algorithm, and appeared at Asiacrypt 96.
  • Chameleon – A New Kind of Stream Cipher shows how to do traitor tracing using symmetric rather than public-key cryptology. The idea is to turn a stream cipher into one with reduced key diffusion, but without compromising security. A single broadcast ciphertext is decrypted to slightly different plaintexts by users with slightly different keys. This paper appeared at Fast Software Encryption in Haifa in January 1997.
  • Searching for the Optimum Correlation Attack shows that nonlinear combining functions used in nonlinear filter generators can react with shifted copies of themselves in a way that opens up a new and powerful attack on many cipher systems. It appeared at the second workshop on fast software encryption.
  • The Classification of Hash Functions shows that correlation freedom is strictly stronger than collision freedom, and that there are many pseudorandomness properties other than collision freedom which hash functions may need. It appeared at Cryptography and Coding 93.
  • A Faster Attack on Certain Stream Ciphers shows how to break the multiplex shift register generator, which is used in satellite TV systems. I found a simple divide-and-conquer attack on this system in the mid-1980s, a discovery that got me "hooked" on cryptology. This paper is a refinement of that work.
  • On Fibonacci Keystream Generators appeared at FSE3, and shows how to break "FISH", a stream cipher proposed by Siemens. It also proposes an improved cipher, "PIKE", based on the same general mechanisms.
  • Tree Functions and Cipher Systems appeared in 1991; it points out a weakness in a proprietary cipher that was later developed into this.
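The BEAR construction above can be sketched as a three-round unbalanced Feistel cipher. In this toy Python sketch (my own illustration, not the paper's specification: HMAC-SHA256 stands in for the keyed hash, SHA-256 in counter mode for the stream cipher, and the paper's key schedule and padding details are omitted), the round structure means decryption simply runs the rounds in reverse:

```python
import hashlib
import hmac

HASH_LEN = 32  # bytes; the left half of the block is one hash output wide


def _keyed_hash(key: bytes, msg: bytes) -> bytes:
    # Stand-in keyed hash: HMAC-SHA256.
    return hmac.new(key, msg, hashlib.sha256).digest()


def _keystream(seed: bytes, n: int) -> bytes:
    # Stand-in stream cipher: SHA-256 in counter mode, keyed by `seed`.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(seed + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]


def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


def bear_encrypt(k1: bytes, k2: bytes, block: bytes) -> bytes:
    # Three-round unbalanced Feistel: block = L || R, with |L| = HASH_LEN
    # and the block assumed longer than HASH_LEN.
    L, R = block[:HASH_LEN], block[HASH_LEN:]
    L = _xor(L, _keyed_hash(k1, R))   # round 1: hash keyed by k1
    R = _xor(R, _keystream(L, len(R)))  # round 2: stream cipher keyed by L
    L = _xor(L, _keyed_hash(k2, R))   # round 3: hash keyed by k2
    return L + R


def bear_decrypt(k1: bytes, k2: bytes, block: bytes) -> bytes:
    # Decryption runs the same rounds in reverse order.
    L, R = block[:HASH_LEN], block[HASH_LEN:]
    L = _xor(L, _keyed_hash(k2, R))
    R = _xor(R, _keystream(L, len(R)))
    L = _xor(L, _keyed_hash(k1, R))
    return L + R
```

Each round is an involution given the other half, so the cipher inverts without either primitive being invertible itself; that is the point of the reduction.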

Another of my contributions was founding the series of workshops on Fast Software Encryption.


Security of Clinical Information Systems

The safety and privacy of clinical systems have been a problem for years. Recent scandals include the Google DeepMind case (exposed by my then postdoc Julia Powles) where the Royal Free Hospital gave Google a million patients' records that they shouldn't have; and the care.data affair where a billion records – basically all hospital care episodes since 1998 – were sold to 1200 firms worldwide, in a format that enabled many patients to be re-identified. It wasn't much better under the previous Labour government, which had a series of rows over thoughtless and wasteful centralisation. There is now an NGO, MedConfidential, which monitors and campaigns for health privacy.

The NHS has a long history of privacy abuses. Gordon Brown's own medical records were compromised while he was prime minister, but the miscreant got off scot-free as it was "not in the public interest" to prosecute him. In another famous case, Helen Wilkinson had to organise a debate in Parliament to get ministers to agree to remove defamatory and untrue information about her from NHS computers. The minister assured the House that the libels had been removed; months later, they still had not been.

Here are my most recent papers on the subject.

Civil servants started pushing for online access to everyone's records in 1992 and I got involved in 1995, when I started consulting for the British Medical Association on the safety and privacy of clinical information systems. Back then, the police were given access to all drug prescriptions, after the government argued that they needed it to catch doctors who misprescribed heroin. The police got their data, but they didn't catch Harold Shipman, and no-one was held accountable. The NHS slogan in 1995 was 'a unified electronic patient record, accessible to all in the NHS'. The BMA campaigned against this, arguing that it would destroy patient privacy:

  • Security in Clinical Information Systems was published by the BMA in January 1996. It sets out rules that uphold the principle of patient consent independently of the details of specific systems. It was the medical profession's initial response to the safety and privacy problems posed by centralised NHS computer systems.
  • An Update on the BMA Security Policy appeared in June 1996 and tells the story of the struggle between the BMA and the government, including the origins and development of the BMA security policy and guidelines.
  • Comments at NISSC 98 discuss the healthcare protection profiles being developed by NIST for the DHHS to use in regulating health information systems privacy; these profiles made a number of mistaken assumptions about threats and protection mechanisms.
  • Remarks on the Caldicott Report raises a number of issues about the report of the Caldicott Committee, which was set up by the Major government to kick the medical privacy issue into touch until after the 1997 election. Its members failed to understand that medical records from which the names have been removed, but where NHS numbers remain, are not anonymous – as large numbers of NHS staff need to map names to numbers in order to do their jobs.
  • Information technology in medical practice: safety and privacy lessons from the United Kingdom provided an overview of the safety and privacy problems we encountered in UK healthcare computing in the mid-90s for readers of the Australian Medical Journal.
  • The DeCODE Proposal for an Icelandic Health Database analyses a proposal to collect all Icelanders' medical records into a single database. I evaluated this for the Icelandic Medical Association and concluded that the proposed security wouldn't work. The company running it soon hit financial problems and later filed for bankruptcy. The ethical issues were a factor: Iceland's Supreme Court allowed a woman to block access to her father's records because of the information they may reveal about her (see analysis). This effectively killed the vision of having the whole population on a database. I also wrote an analysis of security targets prepared under the Common Criteria for the evaluation of this database. See also BMJ correspondence and an article by Einar Arnason.
  • Clinical System Security – Interim Guidelines appeared in the British Medical Journal on 13th January 1996. It advises healthcare professionals on prudent security measures for clinical data. The most common threat is that private investigators use false-pretext telephone calls to elicit personal health information from assistant staff.
  • A Security Policy Model for Clinical Information Systems appeared at the 1996 IEEE Symposium on Security and Privacy. It presents the BMA policy model to the computer security community in a format comparable to policies such as Bell-LaPadula and Clark-Wilson. It had some influence on later US health privacy legislation (the Kennedy-Kassebaum Bill, now HIPAA).
  • NHS Wide Networking and Patient Confidentiality appeared in the British Medical Journal in July 1995 and set out some early objections to the government's health network proposals.
  • Patient Confidentiality – At Risk from NHS Wide Networking went into somewhat more detail, particularly on the security policy aspects. It was presented at Health Care 96.
  • Problems with the NHS Cryptography Strategy points out a number of errors in, and ethically unacceptable consequences of, a report on cryptography produced for the Department of Health. These comments formed the BMA's response to that report.
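The re-identification point in the Caldicott remarks above is easy to demonstrate. In this toy sketch (entirely hypothetical data and identifiers of my own invention), records stripped of names but still keyed by NHS number are trivially re-identified by anyone holding the name-to-number mapping, which large numbers of NHS staff legitimately need to do their jobs:

```python
# Hypothetical administrative index mapping patient names to NHS numbers;
# many staff hold such a mapping as part of routine work.
admin_index = {
    "9434765919": "Alice Smith",
    "9434765870": "Bob Jones",
}

# "De-identified" clinical episodes: names removed, NHS numbers retained.
deidentified_episodes = [
    {"nhs_number": "9434765919", "diagnosis": "F32 depressive episode"},
    {"nhs_number": "9434765870", "diagnosis": "E11 type 2 diabetes"},
]

# A simple dictionary join restores every identity.
reidentified = [
    {"name": admin_index[ep["nhs_number"]], **ep}
    for ep in deidentified_episodes
    if ep["nhs_number"] in admin_index
]

for row in reidentified:
    print(row["name"], "-", row["diagnosis"])
```

Removing the name field therefore provides no anonymity at all while a persistent identifier remains; any insider with the index, or anyone who obtains it, can perform the join.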

In 1996, the Government set up the Caldicott Committee to study the matter. Their report made clear that the NHS was already breaking confidentiality law by sharing data without consent; but the next Government just legislated (and regulated, and again) to give itself the power to share health data as the Secretary of State saw fit. (We objected and pointed out the problems the bill could cause; similar sentiments were expressed in a BMJ editorial, and a Nuffield Trust impact analysis, and BMJ letters here and here. Ministers claimed the records were needed for cancer registries: yet cancer researchers work with anonymised data in other countries – see papers here and here.) There was a storm of protest in the press: see the Observer, the New Statesman, and The Register. But that died down; the measure has now been consolidated as sections 251 and 252 of the NHS Act 2006, and the Thomas-Walport review blessed nonconsensual access to health records (despite FIPR pointing out that this was illegal – a view later supported by the European Court). A government committee, the NHS Information Governance Board, was set up to oversee this lawbreaking, and Dame Fiona is being wheeled out once more. Centralised, nonconsensual health records contravene not only the I v Finland judgement but also the Declaration of Helsinki on ethical principles for medical research and the Council of Europe recommendation no R(97)5 on the protection of medical data.

Two health IT papers by colleagues deserve special mention. Privacy in clinical information systems in secondary care describes a hospital system implementing something close to the BMA security policy (it is described in more detail in a special issue of the Health Informatics Journal, v 4 nos 3-4, Dec 1998, which I edited). Second, Protecting Doctors' Identity in Drug Prescription Analysis describes a system designed to de-identify prescription data for commercial use; although de-identification usually does not protect patient privacy very well, there are exceptions, such as here. This system led to a court case, in which the government tried to stop its owner promoting it – as it would have competed with their (less privacy-friendly) offerings. The government lost: the Court of Appeal decided that personal health information can be used for research without patient consent, so long as the de-identification is done competently.

Resources on what's happening in the USA include many NGOs: Patient Privacy Rights may have been the most influential, but see also EPIC, the Privacy Rights Clearinghouse, the Citizens' Council on Health Care, the Institute for Health Freedom, and CDT. Older resources include an NAS report entitled For the Record: Protecting Electronic Health Information, a report by the Office of Technology Assessment, a survey of the uses of de-identified records for the DHHS, and a GAO report on their use in Medicare. As for the basic science, see my book chapters on Boundaries and on Inference Control.


Public policy issues

I've been involved over the years with academic freedom, and with digital rights more generally.

I chair the Foundation for Information Policy Research, the UK's leading Internet policy think tank, which I helped set up in 1998. We are not a lobby group; our enemy is ignorance rather than the government of the day, and our mission is to understand IT policy issues and explain them to policy makers and the press. We had a conference for our 25th anniversary in 2023 (blog), another for our 20th in 2018 (blog), and here are the issues as we saw them in 2008 and 1999. Some highlights of our work follow.

  • Thirty Years of Crypto Wars: the great result of 2023 was that we beat the Chat Control proposal in the European Parliament. This involved dozens of NGOs lobbying for over a year backed by academics from a number of countries. One of my contributions was Chat Control or Child Protection, which analyses the arguments used by GCHQ that they should circumvent the end-to-end crypto in messenger apps "to protect children" and shows that they are not consistent with the evidence; and Bugs in our Pockets: The Risks of Client-Side Scanning, a technical study of the risks involved in mandatory scanning of people's phones and other devices for illegal materials, as proposed in various forms by the US and UK governments, the EU and originally Apple, who have at least had the sense to recant (blog). But the fight continues. One Protocol to Rule Them All? On Securing Interoperable Messaging analyses the EU DMA mandate for messaging systems interoperability. This will vastly increase the attack surface at every level in the stack (blog).
  • That in turn updated a 2015 paper on the same topic, Keys Under Doormats, which argues that the push by the UK and US governments for exceptional access to all computer and communications data is wrong in principle and unworkable in practice (see also this video and this followup).
  • In 2016, we organised the tenth Scrambling for Safety workshop on the Investigatory Powers Bill while it was on its way through Parliament. The chaos after the Brexit vote, plus May's appointment as Prime Minister, allowed this bill to get through Parliament unscathed. The European Court of Justice has already found that its data retention provisions contravene human rights, but the government ignored this, and the Australian government followed suit.
  • What Goes Around Comes Around is a chapter I wrote for a book by EPIC, on whose advisory board I sit.
  • I first got engaged in technology policy thanks to attempts in the 1990s by governments to control the use of cryptography. In 1995, I wrote Crypto in Europe – Markets, Law and Policy, the first paper to point out that law enforcement communications intelligence was mostly about traffic analysis and criminal communications security was mostly traffic security. The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption became the most widely-cited publication on key escrow; it was originally presented as testimony to the US Senate, and then also to the Trade and Industry Committee of the UK House of Commons, together with a further piece I wrote, The Risks and Costs of UK Escrow Policy.
  • The GCHQ Protocol and its Problems pointed out a number of serious defects in the protocol that the British government used to secure its electronic mail. Our analysis stopped the protocol being more widely adopted; the government is still trying to push its successor, which still suffers much the same problems. The government also proposed mandatory licensing of certification authorities, so we compiled The Global Trust Register – a certification authority implemented in paper and ink rather than electronics. Our book would have been banned by the new law – which enabled us to visit Culture Secretary Chris Smith at a critical point and get it on the Cabinet agenda. What we achieved with this campaign was to limit the scope of the Regulation of Investigatory Powers Act. Originally this would have allowed the police to obtain, without warrant, a complete history of everyone's web browsing activity, as ‘communications data’. Our ‘Big Browser amendment’ got the House of Lords to limit this to the identity of the machines involved in a communication, rather than the full URLs. But the RIP Act still made it into law and has had a number of the bad effects we predicted.
  • These issues revived in the 2000s with GCHQ's Interception Modernisation Programme, a plan to centralise all traffic data first in a central database (under Blair and Brown) and then in a system of federated databases maintained by communications service providers. FIPR wrote various papers on related matters, and when the Coalition Government brought its Communications Data Bill, we organised resistance. The bill was dropped after the Lib Dems finally vetoed it.
  • Privacy has come under attack not just from the spooks but from the world of Big Data. The collection, linking and use of data in biomedical research and health care: ethical issues is a report we wrote for the Nuffield Council on Bioethics: what happens to health privacy in a world with cloud-based medical records and pervasive genomics? (blog Guardian Indy Science).

    In 2009, our Database State report on the failings of public-sector IT in Britain, and how to fix them, got massive press coverage: the BBC, the Guardian (also here), the Mail (also here), the Independent, the Telegraph and Liberty Central. This followed an earlier ICO report on children's databases. Both the Lib Dems and the Conservatives promised to kill or change at least some of these systems; after they won power in the 2010 election their coalition agreement spelled the end of the ContactPoint children's database, and of ID cards. The subsequent review by my FIPR colleague Eileen Munro also sealed the fate of eCAF, another central children's database system.

  • Sustainability interacts in various ways with information security, notably in the sustainability of software; but see also my talk on Privacy for Tigers.
  • Brexit affects us in numerous ways. Brexit and technology explained how the Brexit debate largely ignored network externalities, which could make the damage worse. Brexit and Cambridge assesses the likely costs to the University (blog posts).
  • Waste of Public Money is another objection to the bad government systems that undermine our privacy. Other wasteful systems include smart meters which look set to cost billions without achieving anything useful (blog).
  • Identity Cards were a clever political move by Blair; they divided the Conservatives, so Blair promised to do them for almost a decade and never got round to it. I testified to the Home Affairs committee in 2004 that they would not work as advertised, and contributed to the LSE Report that spelled this out in detail. I wrote various previous pieces in response to government identity consultations, on aspects such as smartcards and PKI.
  • Internet Censorship is a growing problem, and not just in developing countries; I've been on the receiving end more than once. In 1995, I invented the first censorship-resistant system, the Eternity Service; this was a precursor of later file-sharing systems (see above), and we've also written on the economics of censorship resistance. But despite the technical difficulties and collateral costs of content filtering, governments aren't giving up. From 2006 to 2008, I was a principal investigator for the OpenNet Initiative, which monitored Internet filtering worldwide. Shifting Borders reviewed the state of play in late 2007, and appeared in Index on Censorship. Tools and Technology of Internet Filtering goes into more technical detail. The political action now is about Internet blocking.
  • Consumer Protection: FIPR also brought together legal and computing experts to deconstruct the fashionable late-1990s notion that ‘digital certificates’ would solve all the problems of e-commerce and e-government. Anyone inclined to believe such nonsense should read Electronic Commerce – Who Carries the Risk of Fraud?. Other work in this thread includes FIPR's responses to consultations on smartcards, the electronic signature directive and the ecommerce bill.

    More recently we have seen the erosion of consumer rights as a result of the introduction of chip and PIN cards. The technical sections above describe how frauds happen; the flip side of the story is how the banks escape liability. Our analysis of the failings of the Financial Ombudsman Service remains unanswered; see also FIPR's submission on Personal Internet Security (with which the House of Lords basically agreed) and the National Payments Plan. FIPR now takes the view that the only way to fix consumer protection is to replace public action with private action, by changing the rules on costs so that consumers can enforce their rights in court without risking horrendous costs orders if they lose.

  • Export Control: In 2001-02, FIPR persuaded the Lords to amend the Export Control Bill. This bill was designed to give ministers the power to license intangible exports. It was the result of US lobbying of Tony Blair in 1997; back then, UK crypto researchers could put source code on our web pages while our US colleagues weren't allowed to. In its original form, its provisions were so broad that it would have given ministers the power of pre-publication review of scientific papers. We defeated the Government in the House of Lords by 150-108, following a hard campaign – see press coverage in the BBC, the New Scientist, the Guardian and the Economist, and an article on free speech I wrote for IEEE Computing. But the best quote I have is also the earliest. The first book written on cryptology in English, by Bishop John Wilkins in 1641, remarked that ‘If all those useful Inventions that are liable to abuse, should therefore be concealed, there is not any Art or Science which might be lawfully profest’.

    This issue revived in 2003, with a government attempt to wrest back by regulation much of what they conceded in parliament. FIPR fought back and extracted assurances from Lord Sainsbury about the interpretation of regulations made under the Act. Without our campaign, much scientific collaboration would have become technically illegal, leaving scientists open to arbitrary harassment. Much credit goes to the Conservative frontbencher Doreen Miller, Liberal Democrat frontbencher Margaret Sharp, and the then President of the Royal Society Bob May, who made his maiden speech in the Lords on the issue and marshalled the crossbenchers. We are very grateful for their efforts.

  • Trusted Computing was a focus in 2002-03. I wrote a Trusted Computing FAQ, followed by a study of the competition policy aspects which led inter alia to a symposium organised by the German government that pushed the Trusted Computing Group into incorporating. Microsoft couldn't get remote attestation to work; Intel abandoned trusted computing; and its only direct descendants were BitLocker and Arm's TrustZone.
  • IP Enforcement: Our lobbying priority in 2003-04 was the EU IPR enforcement directive, which has been criticised by distinguished lawyers. Our lobbying got it amended to remove criminal sanctions for patent infringement and legal protection for devices such as RFID tags. This law was supported by the music industry, the luxury brands, and (initially) Microsoft, while the coalition that we put together to oppose it included the phone companies, the supermarkets, the generic drugmakers, the car parts industry, smaller software firms and the free software community. The press was sceptical – in Britain, France and even America. The issue was even linked to a boycott of Gillette. There is more on my blog.

    This was a watershed in copyright history: the IP lobby was never going to be stopped by fine words, only by another lobby pushing in the other direction, and the Enforcement Directive was when that first came together. It also led to the birth of EDRI, European Digital Rights, a confederation of European digital-rights NGOs, whose establishment was one of FIPR's significant achievements. EDRI's first campaign was against the IP Enforcement Directive; afterwards FIPR and EDRI established a common position on intellectual property. Since then I have given evidence to the Gowers Review of IP and a parliamentary committee on DRM. The lead UK NGO on IP nowadays is the Open Rights Group.

  • Terrorism: Here are Comments on Terrorism I wrote after the 11th September attacks. The resulting hysteria made me work harder at developing security economics to enable policymakers and others to think more rationally about such things, once they calmed down. In the dark years that followed, I testified against police attempts to increase pre-charge detention to ninety days; and here is a video I did on the effects of 9/11. We must constantly push back on the scaremongers.

I served on Council, Cambridge University's governing body, from 2003–10 and again from 2015–18. I stood for election because of a proposal that most of the intellectual property generated by faculty members – from patents on bright ideas to books written up from lecture notes – would belong to the university rather than to its creator. To stop this, and to prevent further incidents like this one, we founded the Campaign for Cambridge Freedoms. The final vote approved a policy according to which academics keep copyright but the University gets a share of patent royalties. I got re-elected in 2006, and in my second term we won an important vote to protect academic freedom. For more, see my article from the Oxford Magazine. From 2013–14 I was on our Board of Scrutiny. In my third term my main contribution was investigating the delays and cost overruns in a large construction project.

Since then the culture wars came to Cambridge. Should our university require us to treat foolish or obnoxious colleagues with "respect", or just with "tolerance"? Our VC demanded "respect" but we called a free speech vote and academics voted decisively for tolerance instead. See Varsity, Newsweek, the FT, the Spectator, the Mail, the Sunday Times, the Times Higher Education Supplement, the Cambridge Student, the Cambridge Radical Feminist Network, Stephen Fry – and the Minister of State for Universities.

Our latest campaign is against Cambridge's policy of forcing academics to retire at 67, an outdated policy to which only Cambridge and Oxford cling; Oxford's version was found illegal in March 2023. Our campaign page is here.

My CV is here while my h-index is tracked here. I'm a Fellow of Churchill College, the Royal Society, the Royal Academy of Engineering, the Institution of Engineering and Technology, the Institute of Mathematics and its Applications, and the Institute of Physics. I won the 2015 Lovelace medal; the interviews I did for that award are here, while my oral history interview transcript is here and an Academy video is here. As for my academic genealogy, my thesis adviser was Roger Needham; his was Maurice Wilkes; then it runs back through Jack Ratcliffe, Edward Appleton, Ernest Rutherford, JJ Thomson, Lord Rayleigh, Edward Routh, William Hopkins, Adam Sedgwick, Thomas Jones, Thomas Postlethwaite, Stephen Whisson, Walter Taylor, Roger Smith, Roger Cotes, Isaac Newton, Isaac Barrow and Vincenzo Viviani to Galileo Galilei. For context, see my Unauthorised History of Cambridge University.

Finally, here is my PGP key. If I revoke this key, I will always be willing to explain why I have done so provided that the giving of such an explanation is lawful. (For more, see FIPR.)



The third edition is now on sale – you can read sample chapters on my book page.

Security engineering is about building systems to remain dependable in the face of malice, error or mischance. As a discipline, it focuses on the tools, processes and methods needed to design, implement and test complete systems, and to adapt existing systems as their environment evolves. My book has become the standard textbook and reference since it was published in 2001. You can download both the first and second editions without charge here; the third edition will become free from 2024.

Security engineering is not just concerned with infrastructure matters such as firewalls and PKI. It's also about specific applications, such as banking and medical record-keeping, and increasingly about embedded systems such as payment terminals and burglar alarms. It's usually done badly, so it often takes several attempts to get a design right. It's also hard to learn: although there were good books on a number of the component technologies, such as cryptography and operating systems, there was little about how to use them effectively, and even less about how to make them work together. Most systems don't fail because the mechanisms are weak, but because they're used wrong.

My book was an attempt to help the working engineer to do better. As well as the basic science, it contains details of many applications – and a lot of case histories of how their protection failed. It describes a number of technologies which aren't well described elsewhere. The first edition was pivotal in founding the now-flourishing field of information security economics: I realised that the narrative had to do with incentives and organisation at least as often as with the technology. The second edition incorporated the economic perspectives we've developed since then, and new perspectives from the psychology of security, as well as updating the technological side of things. The third edition is an update for the new world of phones, cloud services and social media; it tackles the problems raised by cars and medical devices such as the interaction of security with safety, and the costs of long-term patching; it also adds a huge amount about modern threat actors, from the cybercrime ecosystem to what we learned about state capabilities from the Snowden leaks and elsewhere.

More ...


Highlights by year

2022 highlights include ExtremeBB, a database of extremist postings that we collect to support research by political scientists, criminologists and others; CoverDrop, which lets a newspaper build an end-to-end encrypted messenger into its app for whistleblowers; a paper on Chat Control or Child Protection for the latest round of the crypto wars; a study of the failures of security proofs; and two developments of Bad Characters and Trojan Source – one showing how these techniques easily mislead search engines, while the other maps the impulse response of the vulnerability disclosure ecosystem.

2021 highlights include Bad characters and Trojan source, the first of which broke all large language models and the second all computer languages; two adversarial machine-learning papers, on data ordering attacks and markpainting; an analysis of cybercrime ventures as startups; and Bugs in our Pockets, the latest round in the Crypto Wars.

2020 highlights include sponge attacks and nudge attacks on machine-learning systems, along with work on adversarial reinforcement learning and on decoding smartphone sounds with a voice assistant. But my main project in 2020 was writing a third edition of my Security Engineering textbook.

2019 highlights include an acoustic side channel on smartphones, one paper on whistleblowing and two papers on blocking adversarial machine learning. The big paper was on Measuring the Changing Cost of Cybercrime; since we did the first systematic study seven years ago, the patterns changed surprisingly little despite a huge change in technology. Finally I gave an invited talk at 36C3 on the sustainability of safety, security and privacy.

2018 highlights include papers on what's wrong with bitcoin exchanges and how to trace stolen bitcoin; on making security sustainable; controlling side effects in mainstream C compilers; how protocols evolve and a gullibility metric. There's also an invited talk on privacy for tigers.

2017 highlights include Standardisation and Certification in the Internet of Things, an analysis for the EU of what happens when we get software everywhere, and which informed EU directive 2019/771 on the sale of goods; DigiTally, a prototype payment system we built to extend mobile phone payments to areas of less developed countries with no phone service, using a novel protocol; and a book chapter I wrote for EPIC.

2016 highlights include a new Android side channel; an investigation of the social externalities of trust; studies of when lying feels the right thing to do, of taking down websites to prevent crime and bank fraud reimbursement; and finally two papers on Brexit.

2015 highlights included Keys Under Doormats, on what's wrong with government attempts to mandate exceptional access to all our data; a Nuffield report on what happens to health privacy in a world of cloud-based medical records and pervasive genomics; a report on the emotional impact of Internet fraud; two papers on how to do lie detection using analysis of body motion; severe flaws in Android factory reset and mobile anti-virus apps; and a novel demonstration that the Bell test results can come from pre-existing long-range order as easily as from nonlocal action.

2014 highlights included papers on Chip and Skim describing pre-play frauds against EMV bank cards; Security protocols and evidence which explains how the systems needed to support proper dispute resolution just don't get built; Experimental Measurement of Attitudes Regarding Cybercrime, on how prosecutors and public opinion are out of step; The psychology of malware warnings, on how to get users to pay attention to risk; Privacy versus government surveillance, on network economics and international relations; and Why bouncing droplets are a pretty good model of quantum mechanics, which solves an outstanding mystery in physics.

2013 highlights included Rendezvous, a prototype search engine for code; a demonstration that we could steal your PIN via your phone camera and microphone; an analysis of SDN Authentication; and papers on quantum computing and Bell's inequality.

2012 highlights included a big report on Measuring the Cost of Cybercrime and a history of security economics; an attempt to kill the government's smart metering project; three papers on dynamic networks; and four papers on payment protocols: Chip and Skim: cloning EMV cards with the pre-play attack, How Certification Systems Fail, A birthday present every eleven wallets? and Social Authentication – harder than it looks. Finally, Risk and privacy implications of consumer payment innovation discusses both payment and economic issues.

2011 highlights included a major report on the Resilience of the Internet Interconnection Ecosystem which studies how an attacker might bring down the Internet; an updated survey paper on Economics and Internet Security which covers recent analytical, empirical and behavioral research; and Can We Fix the Security Economics of Federated Authentication? which explores how we can deal with a world in which your mobile phone contains your credit cards, your driving license and even your car key. What happens when it gets stolen or infected? (blog)

2010 highlights included a paper on why Chip and PIN is broken for which we got coverage on Newsnight and a best paper award (later, the banks tried to suppress this research). Other bank security work included a paper on Verified by VISA and another on the unwisdom of banks adopting proprietary standards. On the control systems front, we published papers on the technical security and security economics of smart meters, on their privacy, on their deployment and on key management for substations. I created a psychology and security web page and wrote a paper on putting context and emotion back in security decisions.

2009 highlights included Database State, a report we wrote about the failings of public-sector IT – many of whose recommendations were adopted by the government elected in 2010; The snooping dragon, which explains how the Chinese spooks hacked the Dalai Lama in the run-up to the Peking Olympics; Eight Friends are Enough, which shows how little privacy you have on Facebook; and The Economics of Online Crime. There's a video of a talk I gave on dependability at the IET, as well as a survey paper, the slides, and a podcast. Finally, I wrote an Unauthorised History of Cambridge University.

2008 highlights included a major study of Security Economics and European Policy for the European Commission; the second edition of my book "Security Engineering"; the discovery of serious vulnerabilities in Chip and PIN payment systems; an analysis of the failings of the Financial Ombudsman Service (see also a video from the World Economic Forum in November 2008); the FIPR submission to the Thomas-Walport Review; a piece on confidentiality in the British Journal of General Practice; three videos on privacy made by ARCH; and a video on surveillance. I started a Workshop on Security and Human Behaviour to bring together psychologists with economists and security engineers to work on deception and risk.

2007 highlights included technical papers on RFID and on New Strategies for Revocation in Ad-Hoc Networks (which explores when suicide attacks are effective); a paper on fraud, risk and nonbank payment systems I wrote for the Fed; and a survey paper on Information Security Economics (of which a shortened version appeared in Science). I was a special adviser to House of Commons Health Committee for their Report on the Electronic Patient Record. Finally, following the HMRC data loss, I appeared in the debate on Newsnight.

2006 highlights included technical papers on topics from protecting power-line communications to the Man-in-the-Middle Defence, as well as a major report on the safety and privacy of children's databases for the Information Commissioner. I ended the year debating health privacy with health minister Lord Warner.

2005 highlights included research papers on The topology of covert conflict, on combining cryptography with biometrics, on Sybil-resistant DHT routing, and on Robbing the bank with a theorem prover; and a big survey paper on cryptographic processors.

2004 highlights included papers on cipher composition, key establishment in ad-hoc networks and the economics of censorship resistance. I also lobbied for amendments to the EU IP Enforcement Directive and organised a workshop on copyright which led to a common position adopted by many European NGOs.


Contact details

I only write and referee for open publications, so I discard emails asking for reports for journals that sit behind a paywall.

By default, when I post a paper here I license it under the relevant Creative Commons license; you may redistribute it with attribution but not modify it.

I can no longer admit PhD students for Cambridge, because of forthcoming mandatory retirement; so if you want to do a PhD, please read the relevant web pages. I still admit PhD students at Edinburgh.