ETICA European Commission – Ethical Evaluation – Seventh Framework Programme
Ethics in Science and New Technologies
Because ICT implants in the human body go along with the tendency to commercialize the human body and to treat humans as objects or as biomechanical platforms, such implants are considered a potential threat to human dignity in some contexts.
This document deals with the evaluation of the ethical analysis carried out in WP 2, with the aid of the overview of computer and information ethics and biomedical analysis.
Our approach is based on official documents at the European level, as suggested in the "Description of Work", allowing a comparison between the ethical issues addressed in academic research and the issues likely to be addressed at the level of the European Union.
Among these core values of European institutions we highlight, for instance: human dignity, freedom (which includes autonomy, responsibility, persuasion and coercion, and informed consent), freedom of research, privacy, and justice (which includes autonomy, consumer protection, cultural diversity, environmental protection, safety, ownership, and social inclusion). We also take into consideration the principle of proportionality, the precautionary principle, and the principle of transparency as key principles of an "Ethics of European Institutions".
Since the focus of the ETICA project is on research funded within the FP7 programme, one may assume that no such conflicts could be identified. However, conflicts may arise only in certain areas of application, or issues may arise that are not regarded as serious enough to exclude the respective research. It also has to be noted that ethics in FP7 concentrates on the research process: control mechanisms are not in force when it comes to the products of research or to possible ethical implications of their use, misuse, or the unintended consequences of mass use (Stahl et al. 2009, p. 7).
Of course, there are differences in what kind of ICT implant is used in what context and in how it is connected to which part of the human body. While the research on and development of such implants appears to be central to the vision of some technologies, such as Bio- and Neuroelectronics, implants seem to play a less prominent role in other perspectives, such as Ambient Intelligence.
We assume that all of the technologies mentioned may raise concerns about the protection of human dignity, for instance in the case of ICT implants in the human body, but they certainly do so to different degrees. The EGE "considers that ICT implants are not per se a danger to human freedom or dignity", but in the case of applications which entail, for instance, the possibility of individual and/or group surveillance, the potential restriction of freedom must be carefully evaluated.
Since "affective computing" is closely linked with "persuasive technologies", it tends to undermine the autonomy of the individuals affected.
Affective Computing may give rise to concerns not only with regard to "evil dictatorships" but also in democratic societies, given its potential for manipulation.
Informed consent: Persuasive technologies may become especially problematic if the persuasiveness of a system is used to achieve "informed consent" (Nagenborg 2010).
The use of Affective Computing tools for specific purposes in specific contexts must be taken into account, especially in the case of non-medical applications. Security and surveillance applications, especially if they aim at manipulating persons, might be considered similar to ICT implants.
In the "Description of Technology" it is stated that AmI applications in healthcare might include "computers … in your body [monitoring] your health status at all times".
Privacy: As has been pointed out in the "Ethical Analysis", the issue of "privacy" has received the most attention in the academic literature. … [T]he technology is perceived to have a clear potential to violate the privacy of the user(s). AmI systems may also become part of a larger "surveillant assemblage" (Haggerty and Ericson 2000) if AmI applications become interoperable with other (AmI) systems. For example, the use and exchange of biometric information in such systems is a critical issue, because it may make it possible to track a person across otherwise distinct systems.
Therefore, the widespread use of AmI in society, and particularly the interconnectivity and interoperability of such systems, has to be considered in the ranking. Informed consent: Because AmI systems are designed to become "invisible" and are likely to include machine–user interfaces that are not perceived as such by the users, there is a tendency to undermine the idea of requesting the users' consent, except in a very general form.
Consumer protection: AmI applications might be considered tools for monitoring the environment, including the detection of safety risks or security issues.