The Royal Academy of Engineering, located in London, recently released a report, “Dilemmas of Privacy and Surveillance: Challenges of Technological Change.”
I just ran across it and haven’t had a chance to review it in depth yet, but a quick scan of the executive summary turned up some interesting thoughts.
The report stresses the responsibility of systems engineers to design systems with security built in, so that they enhance data protection and protect privacy.
As the Foreword states:
“The report includes a number of recommendations for improving the benefits and reducing the problems likely to stem from advances in information technology that could affect privacy and surveillance. Some of these recommendations are directed at engineers, especially those involved in specifying and designing systems. Some of them are aimed at policymakers, including the Government and the Information Commissioner, who need to be aware of and pro-active in relation to technological developments. Some of the recommendations are for commercial organisations that collect data about individuals as part of their everyday business. In addition, the report notes that there are some important areas where we do not yet know enough and where additional research needs to be commissioned by, for example, the Research Councils.
The report and its recommendations focus on the United Kingdom. However, issues of privacy and surveillance are not confined within national borders. The Internet and its disregard for state boundaries means that policies to protect privacy have to be made with an understanding of the limits to what can be achieved through national legislation. Policy in the UK has to be made within the context of the directives of the European Union and the laws of our trading partners. It is therefore important that the report and its recommendations should influence policy not just within the UK, but also internationally. I am glad that the increasing interchange of information and opinion between Information Commissioners and between academics and policy makers in different countries is making that more likely to happen.”
The report covers a very wide range of privacy topics, such as surveillance, e-passports, biometrics, database vulnerabilities, the Semantic Web and the Internet, privacy and the law, mobile phones, authentication, identification, digital identities, identity misuse (think identity fraud and theft), and profiling.
I look forward to reading the report in detail. Their recommendations are directed specifically at the U.K. but are almost universally applicable. They are sound, and I’m looking forward to getting a chance to read the research they provide to support each:
“Recommendations
R1 Systems that involve the collection, checking and processing of personal information should be designed in order to diminish the risk of failure as far as reasonably practicable. Development of such systems should make the best use of engineering expertise in assessing and managing vulnerabilities and risks. Public sector organisations should take the lead in this area, as they collect and process a great deal of sensitive personal data, often on a non-voluntary basis.
R2 Many failures can be foreseen. It is essential to have procedures in place to deal with the consequences of failure in systems used to collect, store or process personal information. These should include processes for aiding and compensating individuals who are affected.
R3 Human rights law already requires that everyone should have their reasonable expectation of privacy respected and protected. Clarification of what counts as a reasonable expectation of privacy is necessary in order to protect this right and a public debate, including the legal, technical and political communities, should be encouraged in order to work towards a consensus on the definition of what is a ‘reasonable expectation’. This debate should take into account the effect of an easily searchable Internet when deciding what counts as a reasonable expectation of privacy.
R4 The powers of the Information Commissioner should be extended. Significant penalties – including custodial sentences – should be imposed on individuals or organisations that misuse data. The Information Commissioner should also have the power to perform audits and to direct that audits be performed by approved auditors in order to encourage organisations to always process data in accordance with the Data Protection Act. A public debate should be held on whether the primary control should be on the collection of data, or whether it is the processing and use of data that should be controlled, with penalties for improper use.
R5 Organisations should not seek to identify the individuals with whom they have dealings if all they require is authentication of rightful access to goods or services. Systems that allow automated access to a service such as public transport should be developed to use only the minimal authenticating information necessary. When organisations do desire identification, they should be required to justify why identification, rather than authentication, is needed. In such circumstances, a minimum of identifying information should be expected.
R6 Research into the effectiveness of camera surveillance is necessary, to judge whether its potential intrusion into people’s privacy is outweighed by its benefits. Effort should be put into researching ways of monitoring public spaces that minimise the impact on privacy – for example, pursuing engineering research into developing effective means of automated surveillance which ignore law-abiding activities.
R7 Information technology services should be designed to maintain privacy. Research should be pursued into the possibility of ‘designing for privacy’ and a concern for privacy should be encouraged amongst practising engineers and engineering teachers. Possibilities include designing methods of payment for travel and other goods and services without revealing identity and protecting electronic personal information by using similar methods to those used for protecting copyrighted electronic material.
R8 There is need for clarity on the rights and expectations that individuals have over their personal information. A digital charter outlining an individual’s rights and expectations over how their data are managed, shared and protected would deliver that clarity. Access by individuals to their personal data should also be made easier; for example, by automatically providing free copies of credit reports annually. There should be debate on how personal data are protected – how it can be ensured that the data are accurate, secure and private. Companies, or other trusted, third-party organisations, could have the role of data banks – trusted guardians of personal data. Research into innovative business models for such companies should be encouraged.
R9 Commercial organisations that select their customers or vary their offers to individuals on the basis of profiling should be required, on request, to divulge to the data subjects that profiling has been used. Profiling will always be used to differentiate between customers, but unfair or excessively discriminating profiling systems should not be permitted.
R10 Data collection and use systems should be designed so that there is reciprocity between data subjects and owners of the system. This includes transparency about the kinds of data collected and the uses intended for it; and data subjects having the right to receive clear explanations and justifications for data requests. In the case of camera surveillance, there should be debate on and research into ways to allow the public some level of access to the images captured by surveillance cameras.”
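The authentication-versus-identification distinction in R5 is the one recommendation that translates directly into a design pattern, so here is a minimal sketch of the idea. This is purely illustrative and not from the report; all names are hypothetical. A transit operator issues an opaque, signed pass containing only a random serial and an expiry, and the gate verifies that the pass itself is genuine and unexpired without learning anything about who holds it.

```python
# Sketch of "authentication without identification" (recommendation R5):
# the service verifies a pass is valid without embedding any identity in it.
import hashlib
import hmac
import secrets
import time

SECRET_KEY = secrets.token_bytes(32)  # held by the service that issues passes


def issue_pass(valid_until: int) -> str:
    """Issue an opaque pass: a random serial plus an expiry time, signed
    by the service. No name, account number, or travel history is embedded."""
    serial = secrets.token_hex(8)  # random; carries no identity
    payload = f"{serial}:{valid_until}"
    tag = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{tag}"


def check_pass(token: str, now: int) -> bool:
    """Authenticate the pass itself; the gate learns only 'valid or not'."""
    serial, valid_until, tag = token.rsplit(":", 2)
    payload = f"{serial}:{valid_until}"
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected) and now <= int(valid_until)


token = issue_pass(valid_until=int(time.time()) + 86400)  # 24-hour pass
print(check_pass(token, now=int(time.time())))  # True
```

The design choice matches R5's wording: the system collects "only the minimal authenticating information necessary" (proof of entitlement and an expiry), and identification would require a separate, explicitly justified mechanism.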
Tags: awareness and training, corporate governance, Information Security, IT compliance, privacy, privacy study, security incident, surveillance, trust