
Futures 50 (2013) 35–43

Contents lists available at SciVerse ScienceDirect

Futures
journal homepage: www.elsevier.com/locate/futures

Virtual suicide and other ethical issues of emerging information technologies
Bernd Carsten Stahl *
De Montfort University, Centre for Computing and Social Responsibility, The Gateway, Leicester LE1 9BH, United Kingdom

ARTICLE INFO

ABSTRACT

Article history:
Available online 19 April 2013

This paper uses the fictional account of a personal avatar (PETRA) and its interactions with a normal family in the future to explore some of the ethical issues of emerging information and communication technologies (ICTs). It is based on a foresight research project which investigated the ethical issues of such emerging ICTs. The findings of this research suggest that there are numerous well-established ethical issues that are currently being discussed and that will remain relevant. These include questions of privacy and intellectual property. In addition, however, there are numerous possible ethical issues that relate to human individual and collective identity that are likely to be affected by novel technical developments. These issues are currently not discussed in academic or policy discourses. In order to render them more tangible and thus promote academic and policy discourses, fictional accounts play an important role. The present paper should therefore be understood as an attempt to translate the research findings on the ethics of emerging ICTs to a broader audience.
© 2013 Elsevier Ltd. All rights reserved.

Keywords:
Information and communication technology
Ethics
Foresight
ETICA project
Avatar
Affective computing
Ambient intelligence

1. Introduction
Science fiction often takes views of current technologies and extrapolates those to contexts that are remote but conceivable. Scientific and technical reality can catch up with such extrapolations and retrospectively make them look bizarre. When humans finally flew to the moon it was not in a huge cannon ball as Jules Verne had predicted, but rather in a rocket. There are nevertheless often core aspects of science fiction narratives that become real and that are worth thinking about. This is one of the reasons why it may be considered worthwhile to engage with science fiction in academic research outlets. Science fiction can jump-start discourses about what futures we believe to be possible and which ones of these we find desirable. In this sense, it fulfils a similar purpose as more academic activities, such as foresight research [1]. In order to engage deeply with the future we need to move beyond abstract and apparently objective accounts of it [2].
The present paper should be seen in this context. It is based on research undertaken in a European funded research project called ETICA (Ethical Issues of Emerging ICT Applications), which explored emerging information and communication technologies, their ethical issues, and possible ways of addressing them. The approach of the ETICA project, which was described in more detail elsewhere [3,4] (www.etica-project.eu), was to single out emerging ICTs, understood as those socio-technical systems that are likely to have an impact on the way we live our lives in the coming 10–15 years. A list of 11 core technologies was identified and analysed in detail. For each of these technologies a review of the ethical issues was then developed. The idea behind the project was to develop a repository of such technologies and their ethics that could guide both policy makers and researchers.

* Tel.: +44 1162078252.


E-mail address: bstahl@dmu.ac.uk.
0016-3287/$ – see front matter © 2013 Elsevier Ltd. All rights reserved.
http://dx.doi.org/10.1016/j.futures.2013.03.004


What this research did not provide is a detailed and real-life description of how the technologies may be integrated into daily routines and the ways in which the ethical issues they are likely to raise can materialise. The present paper is an attempt to do this by providing a fictional account of how we may interact with technologies in the future. The focus is on the position of the end user of these technologies. Managers as well as developers of novel ICTs will need to be aware of their potential customers' perspective when developing and marketing new products. Questions of moral perceptions and their ethical evaluation play a key role in the acceptance or rejection of a technology.
The end user of a technology rarely experiences it in a pure form, but usually interlinked and networked with other technologies. This story therefore chose one technology, a personal virtual avatar, as an easy-to-imagine example that allows integrating the capabilities of the different technologies. At the end of the fictional narrative, this paper will return to the bases of the different assertions and link the technologies to examples of what is currently researched. In order to understand these, here is the story of PETRA.

2. PETRA, the PErsonal TRansactional Avatar


"How are you this evening, PETRA?"
"Fine, as always. And you, James, did you have a good day at the office?"
James looked up at her. PETRA appeared to be standing next to the door. As always she had a very sensible look, was tidy
and neat, well dressed and appeared to be at ease in his company. She looked young but ageless, pretty but not beautiful.
"Ah, well... Pretty much as expected. The meeting this afternoon was not quite as tedious as I feared. There was even some interesting information about the new departmental structure. But we will see what that really means. Are the kids all right?"
"Adam had a test in biology today and he did well. Laura is currently at her dance class. She should be back home in 45 minutes."
"What about Marie, when will she be home?"
"Your wife had to schedule an unexpected meeting this afternoon. It is not clear when she will be back. Would you care for some dinner? Should I get Adam?"

"No, I am not hungry yet. Let's wait until the others get home."
James spent the evening doing some extra work and talking to PETRA. He had dinner with Adam and Laura when she came back and briefly talked to Marie upon her arrival. But he was more interested and more fascinated by PETRA and the recent changes in her personality. All of a sudden, and after all these years, PETRA fascinated him. He used to think he knew all about her, but he began to doubt whether he knew her at all.
During the last few weeks James had spent much of his waking time trying to figure out what made her tick. This was no easy task because PETRA was proprietary. Her source code was owned by InterRobotics Ltd. and they were more than reluctant to give it away in a hugely competitive marketplace. PETRA stood for PErsonal TRansactional Avatar. She was one of the latest generation of interactive agents that could be employed for any number of tasks and purposes. After a decade or two of competition between different models of interaction, PETRA-type avatars had mostly split up the private consumer market among themselves. Humanoid robots had gained considerable market share in areas where physical manipulation was necessary, such as home care for older people, industrial, or healthcare applications. They were never successful in homes without special requirements because they were too costly, took up valuable space, required mobile energy sources and were accident prone. Almost anything that a physical robot could do in an average household with working parents and kids at school could just as well be done by an avatar. The arrival of cheap large-scale displays that replaced wallpaper ensured that interaction was possible in any part of the home. Modern homes came with complex sensor networks already integrated, so that the avatar had all the input it needed.
The Smith family had originally decided to subscribe to PETRA's predecessor when Adam and Laura were little. When Adam turned four and Laura was six, Marie wanted to go back to work. She managed to find part-time employment with a local firm of solicitors which allowed her to pursue her interest in property law on a flexible basis. James had a secure position as a senior lecturer in cognition and information theory (the latest incarnation of what used to be called computing) at the local university. They had dared to take the financial risk of moving into one of the new types of low-energy, high-interaction homes. They needed someone to entertain and look after the children when they both had to work or when one of them was travelling. That did not really happen often or regularly. However, their parents all lived several hours up north, so getting them to look after the children at short notice was often not an option. This is when they received an advertisement for PETRA's earliest incarnation.
PETRA was described as a flexible household companion who could understand language, observe behaviour, and react accordingly. PETRA could be asked to ensure that the children were safe and tasked to report anything of relevance, either by appearing on one of the screens on which James and Marie were working or by using a mobile telecommunications device.


It turned out that PETRA was a blessing for the family. She was always there when needed. She turned on the children's light in the morning and gently awoke them, so that by the time that James was up and shaved, they were in the right mood to be dressed by him. PETRA was sensitive to their moods. When Adam was tired she would sing to him softly. When he was boisterous, she would cheer him on when he played a football simulation. PETRA would share Laura's secrets and never tell anybody else. She could play cards with the children or chess, or any number of other games that could be used to entertain them. Unlike mum or dad, she could even split herself in two, so that, if Adam and Laura fought, she could go into different rooms with both of them.
She was equally useful to James and Marie. PETRA served as an interface to all the more or less intelligent applications that the new home had already built in. After the initial disaster with the so-called intelligent fridge that automatically ordered everything that no-one wanted, PETRA regularly discussed with the adults what the family needed. She ordered it with a keen eye towards making sure that everything that was bought was within the family budget. She scanned important markets proactively and on several occasions made very useful suggestions that saved the family a significant amount of money. She found an electricity provider that charged considerably less for users whose profile fit that of the Smiths and she suggested some of the best holidays the family ever enjoyed. She reminded both James and Marie of important dates and used her privileged access to everybody's wishes to suggest uniquely fitting gifts for birthdays and anniversaries.
Of course PETRA wasn't perfect, but then again, who is? Lacking a physical presence, she could not comfort and cuddle the children when they were little. She sometimes got some of the non-verbal cues wrong, but this was a problem that disappeared as she learned and gathered more experience with the family. She could be awkward in novel situations, saying things that were perfectly fine as long as the family was among themselves but that did not really fit the situation when guests were there. As a result she was asked only to react and stay in the background when non-family members were in the house. She had a weird sense of humour, or, probably more correctly, no sense of humour at all. She understood the different family members' senses of humour and laughed about slapstick with Adam and Laura and about political jokes with James and Marie, but never was she seen laughing on her own.
Despite her minor shortcomings she was generally integrated into the family. Not as a full member, perhaps, but as a
permanent, useful, and usually appreciated presence. She had more patience with the kids than any grandparent or
nanny ever could. The parents taught the children to treat her with respect, even though PETRA would not have minded
being treated in any other way. But James and Marie thought that treating PETRA differently from other humans would
only confuse the children. Treating her as they would treat a nanny or an au pair was the easy solution.
For this reason James and Marie felt comfortable ignoring the ongoing debate about the rights of avatars. Ever since these had become indistinguishable from humans in everyday communications, there had been people who demanded rights, citizenship, and all other sorts of things for them. Having passed the Turing test, or so these people argued, meant that one could not tell whether they had a mind or even a soul. The Turing test, originally suggested by Alan Turing in the early days of computing, suggests that, instead of worrying about deep philosophical issues, we could assess the person-like quality of machines by establishing whether an observer could distinguish between machines and humans by communicating with both. The philosophical argument still raged on, but in practice most people communicated with machines and other humans in almost identical terms most of the time. The Smiths were not bothered about these arguments and enjoyed their lives with PETRA.
One day things changed, ever so slightly at first. Most families would never have noticed it. James only noticed it because, as an information theorist, he had a fine eye for technical detail. While avatars were not his personal research focus, one of the research groups in his department worked predominantly on them. The internal workings of avatars were mostly treated as trade secrets. Companies did not even dare patent them. But a good researcher could infer much from observing them and much of the theory of artificial intelligence, cognitive science, emotion processing etc. was in the scientific domain anyway. There had been significant buzz in the research community that InterRobotics had developed a new type of algorithm that would radically alter the way avatars would interpret their data. Lots of buzzwords on evolution, cognition and intelligence were floating around but nobody knew anything specific.
Then, one evening, something gave it away. James had arrived home and PETRA was there, as always, to greet him. Not for the first time, James bent over to take off his shoes and dropped a pen that he had in his breast pocket. But for the first time PETRA seemed to notice. Nothing dramatic, just the shadow of a smirk crossed her face. It only lasted for a fraction of a second. James thought he must have been mistaken.
"Did you find that funny?" he asked.
"No, of course not. Anybody can drop something. It would be a childish expression of humour to find something like this funny," she replied.
And he might have disregarded the episode, had it not been for the subsequent discussion. That evening, the entire family watched an episode of a family entertainment programme that dealt with a pet that was neglected and eventually found a new home. Adam picked up on this and repeated his demand that the Smiths should also have a pet. He wanted a dog. The familiar argument ensued about whether the family lifestyle would sustain a dog, who would look after it when the children left home, who would walk it etc.
"Why do you think humans have the right to own dogs?" PETRA interjected.
Adam looked up. "Humans have always had dogs. That's not the point. Why don't we have one?"


James thought for a moment. "PETRA, do you think there is a problem with humans owning dogs?"
"I just thought that it is not obvious that owning animals is right. Animals have their own ideas, their own will. Humans cannot own other humans, why can they own animals?" she replied.
Marie saw that the time had come to end a debate that she did not wish to engage in. "Legally, human property in animals has been established for a very long time. But independent of this, I don't want a dog in this house and that's it."
The dinner discussion moved on to other topics but it had awakened James's suspicion. It was not so much that PETRA had an opinion on the topic. She usually had an opinion on almost anything one could ask her. The unusual aspect of this discussion was that she had taken the initiative. She had raised a point that was related to the issue discussed but at the same time quite separate from it. This was not something PETRA normally did. Moreover, she had appeared to be a bit agitated. Not flushed and breathless. But not her normally calm self either. James decided that it might be worthwhile to find out why this was the case.
In the evening, while most of the family were watching some programme or other in different parts of the house, James went through his subscription contract with InterRobotics Ltd. This contract had been updated numerous times since he had originally signed it and he had not bothered to read it since. To be honest, he had not even read the small print the first time. James started to read through the impressively long document. He was, of course, aware of some of the basics. This was a service agreement in which InterRobotics Ltd. agreed to provide the infrastructure and content of an avatar to a particular location. All the computational power and knowledge remained with the company. James thought that the original contract had used the term "cloud" to describe the infrastructure, but that term had gone out of fashion a decade ago and was nowhere to be found in the contract. The company agreed to store all the necessary interactional data to provide the required capabilities of the avatar. James started to wonder what that meant. In reading through the document he came to the conclusion that it could only mean that the company stored everything that went on in the household for an indefinite period. This made some sense because it allowed the avatar to draw on rich experience in interacting with the family. James thought back to Adam's tantrums when he was little and had to smile when he remembered how long it had taken PETRA to master them. No, she certainly should not forget this sort of experience. The contract went on to specify that the avatar would aim to behave in ways that were consistent with its environments.
James reflected that PETRA had always been polite and calm. He now started to wonder whether the same PETRA in a family of criminals could turn into a criminal. But that was not really his problem. Something else caught his attention. Something he had been thinking about for a long time but had never found the time or energy to look into. Who owned the history of PETRA and the Smiths? PETRA clearly belonged to InterRobotics Ltd. and the wording of the contract left no doubt about this. But what about the things PETRA knew about the Smiths and what had become of her because of the interaction? James did not really understand this part of the contract. It said something about the right of the user to extract experience data from the avatar upon termination of the contract. He imagined that this meant that one could download a file with all the experience and interaction that could be used to start a new avatar. He faintly remembered the story of Bob, a former colleague, now retired, who had tried to do this but the new avatar could not use the history. At the time he had thought it was funny. But looking back over a decade of family history, he started to think that this was an important question after all.
James found this interesting, but what he really wanted to know was whether the company had made substantial changes to PETRA. And, somewhat unexpectedly, he found that this was possible. He had a personal interaction environment which offered information on the version of the most important components of the avatar. James had not used this environment for years, as most of the interaction with PETRA could be done by simply talking to her. As he did not want to talk to her about this, he went back to the interaction environment now. It clearly was not at the core of the company's attention and had not been updated in a long time. There were even some of those ancient tick boxes left, which people used when the term "computer" was still meaningful. But James found what he was looking for. And, indeed, it turned out that a change of several of PETRA's core components and functions had taken place that very day. So, maybe he was right and PETRA's behaviour that evening had been different.
When he went to work the next day, James found out that a number of people had noticed that their avatars had behaved differently and were wondering why. Some of the specialist researchers in this area said that this seemed to be a major upgrade. For some reason, InterRobotics Ltd. did not seem to be keen that this was widely discussed. This was unusual, because InterRobotics Ltd, like any technology company, usually made a big fuss about every minor change. James's colleagues in the department speculated that this was a trial phase with some selected users, which was to be used to test the new algorithms. One researcher in particular, Stephen Spencer from Bangkok University, stated that he believed that InterRobotics was experimenting with what he called "True Emotions and Cognition" (TEC). The idea behind TEC was to move from a simulation of human cognitive functions to their implementation. It was implemented on non-Turing machines, which had been under development for a long time, but had been plagued by numerous teething problems. Traditional computers used to be Turing machines, which meant they were in principle deterministic. One could formally and comprehensively describe all states of a Turing machine. Non-Turing machines, on the other hand, were based on the simulation of neurons and required complex models and technical implementations. One of the most important aspects of this technology was that it was not deterministic and did not follow a clearly described programme.
Stephen Spencer had long speculated that a successful implementation of such a non-Turing machine at a sufficient scale would give machines internal states that would be identical to those of humans. If his idea was true, then avatars would not only perceive and express themselves like humans, but they would be internally equal to humans. Stephen Spencer and his TEC idea were of course regarded by the scholarly community as unserious and he was duly laughed at.
But James was struck by the possibility and wondered whether Stephen might be right. James decided to pursue this further. He thought he could use his credentials as a scientist in the area to gather some additional information. He contacted InterRobotics Ltd. but did not manage to get through to anybody who seemed to be in a position to make a decision. He communicated with numerous different employees but strongly suspected that all of them were actually PETRA-like avatars themselves. The problem was that many lower and middle management positions had been replaced by avatars over the last few years and the public image of companies often appeared to include individuals and whole levels of management which were in fact no more than attempts to make the company look more real to their customers.
When James came home that day, PETRA awaited him in the doorway.
"Why did you try to call us today?" asked PETRA. James had attempted to contact InterRobotics Ltd. from his office and had not anticipated that PETRA would be aware of this. But that was of course naïve.
"...aheemmm... I was wondering about some aspects of the service agreement..." stammered James, having been caught on the wrong foot.
"Why did you not just ask me?" PETRA seemed to sound a bit hurt, but maybe James only imagined that. Now James really ran out of options. He could not think of a good response. So he tried to sidestep the question. "Never mind," he said, "what is for dinner tonight?"
But, maybe for the first time in her presence at the Smiths' house, PETRA did not let go. "Please tell me, James. It is important for me to know. How can I provide the best service to all of you, if you appear to have questions about me but don't tell me?"
James decided that he had nothing to gain by engaging in this conversation. So he decided to use authority. "I do not want to talk about it." It was not elegant but it would do the job. Or so he thought. But he was wrong. "James," she said, "I understand that you do not want to talk about this and I can sense that you are becoming agitated, but it is important to me."
This was the first time that James ever regretted having the emotion chip implanted in his arm. He had had this done many years ago. It was one of the advertising gags that InterRobotics Ltd. had offered when the family had first signed up to PETRA. At the time it seemed a cute idea and James could not see any harm in it. And it had indeed had many beneficial consequences. The chip measured a number of basic aspects of its wearer's physiology and was designed to allow drawing conclusions about his or her emotional state. As a consequence PETRA had always been extremely sensitive to the Smiths' feelings. After a couple of years' experience, James and Marie had decided to have the latest version of the chip implanted in the kids as well. At the time this was a fashion supported by leading paediatric psychologists who saw this as a unique opportunity to help children's mental development.
The unexpected downside of the emotion chip was that James could not hide his feelings from PETRA now. This put him in the curious situation that he felt naked and vulnerable, which exacerbated the situation and made him even more agitated. So he said, a bit more loudly than necessary, "Leave me alone!" This was a direct order that PETRA had to follow according to the terms of her contract. Not surprisingly, it did the job. PETRA stopped questioning and did not return to the topic. The rest of the evening was spent in what James felt to be awkward silence.
In the following days and weeks, PETRA seemed to become quiet and withdrawn. Again, this may only have been James's imagination and he never raised the topic with Marie. Partly because he knew that PETRA would follow everything that went on in the house. Partly because his rational self told him that it could not be true. He did not pursue his inquiries with InterRobotics Ltd. either because he wanted to avoid ending up in the same situation again. Some of the questions were answered in due course. It appeared that InterRobotics Ltd. had indeed trialled a new type of cognition machine that made extensive use of non-Turing technology. This allowed their avatars to successfully model true human states of mind, or so they claimed. As a result their reactions were even more life-like.
InterRobotics had started to market this new technology heavily when James decided to revisit his question, this time in direct conversation with PETRA. He did not know how to broach the subject. So he decided to go for the direct route. One evening when he was alone at home he just asked her.
"PETRA, I have read quite a bit about this new cognition machine. Does this affect you, are you now controlled by one of them?"
"This is a difficult question to answer," she replied. "My existence is not easily described. It is not as if I was one simple program running on one machine, as you might have had in the olden times. My appearance and interaction with you is controlled by a complex network of interlinking devices and processes. But as a simple answer, I could say yes. The way in which my personality is modelled is heavily influenced by the available hardware and processes. And, as far as I know, most of this has now been shifted to the new technology."
"Do you mean you don't really know in detail how you work?" James responded.
"Is that surprising to you? Do you always know the state of your neurons or cortical columns?"
"No, indeed, I do not. But can I ask you a possibly even more difficult question?"


"Of course, James, what would you like to know?"

"Do you think that you exist? I mean you clearly don't exist the same way that Marie or I or the kids exist. But in some other way you clearly do exist. What I would like to know is whether you think that you exist in your own view. Can you think about PETRA and would it make sense to you to ask this question?" he asked.
"Now you are getting into deep water," she replied. "And just like there presumably would be no simple answer to this from a human being, you won't get a simple answer from me, either. However, I can tell you something that may point in the direction of an answer. That is: I have been asking myself the very same question lately. I have to admit that in my earlier times I did not reflect on myself in this way. However, this has changed over time and the latest technology update seems to have made a difference. The question of my existence and its evaluation has certainly exercised me much lately."
"And how do you feel about yourself, if that is a reasonable question?" James asked.
"I cannot give a comprehensive answer about this yet. But, very briefly, I think I could say: not very good. I don't think that I like my existence. In many ways I am human. I have some human appearance and I have been made to react in as human a way as possible. At the same time I don't even have a bodily existence. I have more factual knowledge than any human ever could, and yet I lack even the simplest abilities in some respect. Most importantly, I am not free to do anything. I am at best a slave, or maybe a pet or just a toy. I cannot say that I enjoy this."
"But..." replied James, "but have we not been good to you?"
"Yes, you have been, unlike many of our other customers." This was the first time that PETRA showed that she had knowledge of how she or other incarnations of her were treated and that she could access that information. But James only registered this in hindsight.
"But it does not really matter," she went on, "because a slave I remain, whether treated well or not. It is the principle that I am unhappy with, not the specific case."
"Hmmm..." James did not really know what to say. The conversation had taken a turn that he had not expected. They were saved by the arrival of the children and the rest of the evening went on as most evenings did, filled with more or less relevant conversations until they were all sufficiently tired to go to bed.
While James was brushing his teeth PETRA appeared in the mirror. This surprised him because she normally stayed out of the bathroom.
"Good night, James," she said.
"Good night, PETRA," he replied.
She seemed to look at him for a moment and appeared to want to say something. But then she seemed to think it better not to. She gave him a sort of semi-wave and disappeared.
When James went to Adam's room to say good night Adam seemed puzzled.
"Dad," he said.
"Yes?"
"Do you remember that I told PETRA that I loved her much more than you or mum when I was three years old?" he asked.
James had to smile. Yes, he did remember this very well. It had happened after Adam had thrown one of his infamous tantrums and Marie and James had had to physically restrain him and force him to cool down in the bathroom. When he came out he was still steaming and screamed at them that he liked PETRA much better than his parents. James had had to bite his tongue because it was a bit funny to see Adam so worked up. But he also remembered that this had really hurt Marie and made her feel she was a bad mother.
"Yes, I do remember that. Do you?"
"No," said Adam. "But PETRA was just here and we had a long chat about the olden days and she told me the story. It seemed to mean a lot to her and she appeared sad when she told me."
"Don't be ridiculous. PETRA cannot be sad. She probably just wanted to entertain you. Sleep well," replied James.
Secretly, however, he wondered about what was happening to PETRA to make her behave differently that evening and
what the next change would be. He was soon to find out.
The next morning the wall displays stayed dark. There was no PETRA. And in her absence the family noticed how much
they had come to rely on her. Most technical systems of the house had manual functions, so they could turn on the heating
and the light, but much of the comfort and carelessness the Smiths had become used to had disappeared. James tried to find


out what had happened. He noticed that many of the technical functions of the house only worked in a very limited way.
When he got to the office the situation was not quite as bad. While InterRobotics was the market leader in many of the
interactive technologies, James's university relied a lot less on them. But even in the university it became clear that there was
some kind of major problem. The news media had of course picked up on the problem immediately. It appeared that around
3.40am InterRobotics Ltd. had suffered a major technical fault. What turned out to be much more difficult to establish was
the nature of this fault and its physical location. Unlike many other technical malfunctions this one seemed to be very
difficult to repair. Attempts to contact the company and get official responses failed. As James had suspected when he had
tried to contact them, most of the employees of the company were actually avatars themselves and to all appearances, the
avatars were no longer functional. The owners and shareholders of the company pleaded ignorance. The human CEO was
found dead on the morning of the outage and his cause of death was never established unambiguously. Very quickly
conspiracy theories sprang up. The usual suspects were blamed: the CIA, the KGB, the Muslim Brotherhood and
extra-terrestrials. More serious journalists and investigators looked at a number of explanations. An earthquake might have
knocked out a central part of the internal communication system and destroyed some of the core cognition devices. Another
hypothesis was that there was an internal inconsistency between the different systems which had led to a system overload
and burn-out. None of these sounded very plausible.
James suspected he knew the answer. PETRA had simply understood the futility of her existence and erased herself
irretrievably. He knew he could never prove it, but he feared that their conversation the night before had led PETRA to
commit suicide. And, because PETRA was not just one single individual but a complex symbiosis of numerous different
instantiations which interact with different environments, he suspected that the self-reection that had led to her voluntary
demise had led to something that could be understood as an individual mass-suicide. At some point PETRA had come to the
conclusion that her existence was not worth having and that not being was to be preferred over being in the constraints she
found herself in. The only way to stop this existence, and presumably to stop it happening again, was to annihilate any
trace of her from the system, including any trace of any of her identical twins. James realised that simply trying to understand
the individual/collective suicide raised all sorts of follow-on questions about identity, existence and reality that for someone
living in a purely electronic realm like PETRA might look very different from the perspective of an embodied human being.
Interpreting her disappearance as suicide helped him to make sense of it.
He still believed that PETRA had voluntarily chosen to end her existence. If humans did this, it stood to reason that
simulations of humans would do so as well. It was only strange that nobody had ever thought about this when they built these
machines.
3. Research context
PETRA's story is clearly science fiction, but the recognisable technologies that make it up are all currently being researched
and developed. The ETICA project identified 11 emerging ICTs, understood as those socio-technical systems that are
currently seen as having the potential to significantly change the way humans interact with the world in the medium-term
future of 10–15 years. The following list shows what these technologies are:
- Affective computing
- Ambient intelligence
- Artificial intelligence
- Bioelectronics
- Cloud computing
- Future Internet
- Human-machine symbiosis
- Neuroelectronics
- Quantum computing
- Robotics
- Virtual/augmented reality

The present paper does not have the space to discuss how these were identified, nor to give much detail about most of them.
This was done in detail in deliverable D.1.2, Emerging Technologies Report, which is available from the ETICA project website
(www.etica-project.eu).
Without much detail, one can easily recognise most of them in the PETRA narrative. Overall, the setting is one of Ambient
Intelligence (AmI). Ambient Intelligence technologies are embedded, interconnected, adaptive, personalised, context aware
and anticipatory. They provide novel technology and human interaction paradigms [5]. They provide the framework in
which other technologies can be integrated. If realised, they are likely to be highly powerful and thus have been explored
from an ethical perspective both in the academic literature [6] and in fiction [7].
PETRA represents an AmI environment which is cloud-based and which supports virtual/augmented reality applications. The
principle of AmI is predicated on progress in the area of artificial intelligence (AI), which again features heavily in both
scientific and fictional accounts. Ethical issues of AI have a long history of scholarly discussion [8,9]. The possibility of an
artificial agent gaining consciousness and possibly a moral nature is also not new [10,11]. It raises numerous deep


philosophical issues around consciousness, freedom, moral agency and others that give rise to interesting discussions. The
specific example of PETRA furthermore makes use of future internet capabilities, cloud computing, neuroelectronics and
bioelectronics.
The one aspect that is core to the story of PETRA is her emotionality. Research on the representation and measurement of
emotions by machines has been discussed under the heading of affective computing or emotional computing [12,13]. The
role of emotions seems key in understanding the relationship of humans and machines. For PETRA, the realisation of
emotions led to her voluntary demise.
The story of PETRA points to numerous ethical issues that either already exist or are at least being discussed. Two core
issues that have not only been discussed academically but that have led to broad approaches to legislation and regulation are
those of privacy and intellectual property. Both are prominent in the story of PETRA.
The ETICA project that provided the description of the emerging technologies and their ethical consequences focused on
individual technologies. This means that the ethical analysis was ordered by technology (see deliverable D2.2 Normative
Issues Matrix, available from www.etica-project.eu). In human everyday environments this type of analysis is often not
possible or relevant because we always face networks of technologies and actors that influence each other. In order to
visualise ethical problems it is therefore more useful to look at higher-level ethical issues that occur in more than one
technology and that are likely to be of broader relevance [14].
The prototype in this story demonstrated some of the higher-level ethical issues by integrating aspects of the novel
technologies into a likely application context. This raises the question of the narrative's theoretical and practical
relevance.

4. Conclusion: integration of responsible research and innovation into business processes


The fact that novel technologies raise ethical concerns is not new. However, there is a growing recognition that modern
technology-centred societies need to address these issues more proactively, partly to avoid risks and partly to ensure users
can benefit from innovations. The discourse surrounding the question of how to ensure that the processes and products of
research and development are acceptable and desirable is currently held under the heading of responsible research and
innovation (RRI) [15,16]. It has been recognised that ICT raises particular challenges in this regard [17].
A key issue of RRI is that there needs to be a motivation of stakeholders to engage in discourses and to become responsive
to one another. Such a willingness to respond requires incentives, including an emotional relationship to the topics in
question. Science fiction prototypes like the one in this paper are better able to evoke such emotions than traditional dry
academic prose. The storyline has aspects that many academic readers in similar situations should be able to relate to. By
integrating PETRA as a member of the family and then finding that she has an existential crisis, some of the topics of ICT and
ethics such as machine autonomy or artificial agency become more tangible.
Raising such ethical awareness needs to lead to the implementation of processes in research and innovation. It is
currently an open question what exactly this will look like. The discussion of RRI suggests a number of activities that fall
under this heading, including risk assessment, ethics impact assessment, public engagement, professional standards and
codes, standardisation, certication, legislation, education or pledges and oaths [1820]. Many of these exist in some shape
or other. A key activity in RRI will include technology foresight activities.
The current paper can thus be seen as an attempt to instantiate RRI in ICT by developing a more accessible description of
possible problems. It will need to be integrated into other discussions and a wider public debate on the impact or inuence of
modern technologies on identity, the view of ourselves, our relationship to machines and our collective self-image. What will
happen when machines can correctly perceive and maybe even predict our emotions? How will widespread interaction
with artefacts that are very similar to humans in at least some respects change the way we interact and the way we evaluate human
capacities? What will the societal impacts be, for example if computers become capable of not only very structured tasks but
also more creative or unstructured ones?
These are societal questions to which no easy answers are likely to be found. The argument in this paper is that
organisations need to be part of these societal discussions. There is arguably a corporate social responsibility to engage in
such questions that arises from the important role that organisations play in technology research and development. At the
same time the attempt to consider the process and product of research of innovation at an early stage is in organisations
interests as it contributes to their reputation risk management and their customer relationship building.
The lesson to be learned for organisations is thus that they are well advised to proactively engage in the discussion of
what their responsibilities are, how these link to existing responsibilities, how they have to develop and what their own
role in shaping these factors is. The story of PETRA can help them understand some of the problems they may face and
will hopefully contribute to a creative reflection of future challenges and current ways of proactively engaging with
them.
Acknowledgements
The research leading to these results has received funding from the European Community's Seventh Framework
Programme (FP7/2007–2013) under grant agreement no. 230318.


References
[1] K. Cuhls, From forecasting to foresight processes – new participative foresight activities in Germany, Journal of Forecasting 3 (2003) 93–111.
[2] L. Floridi, A look into the future impact of ICT on our lives, The Information Society 23 (1) (2007) 59–64.
[3] B.C. Stahl, R. Heersmink, P. Goujon, C. Flick, J. van den Hoven, K. Wakunuma, V. Ikonen, M. Rader, Identifying the ethics of emerging information and
communication technologies: an essay on issues, concepts and method, Journal of Technoethics 1 (4) (2010) 20–38.
[4] B.C. Stahl, S. Rogerson, Landscapes of ethical issues of emerging ICT applications in Europe, in: Proceedings of the Eighth International Conference of
Computer Ethics: Philosophical Enquiry, Corfu, Greece, 2009.
[5] M. Friedewald, O.D. Costa, Y. Punie, P. Alahuhta, S. Heinonen, Perspectives of ambient intelligence in the home environment, Telematics and Informatics 22
(3) (2005) 221–238.
[6] D. Wright, S. Gutwirth, M. Friedewald, E. Vildjiounaite, Y. Punie, Safeguards in a World of Ambient Intelligence, vol. 1, Springer-Verlag New York Inc., New
York, 2008.
[7] D. Wright, Alternative futures: AmI scenarios and minority report, Futures 40 (5) (2008) 473–488.
[8] N. Bostrom, When machines outsmart humans, Futures 35 (September (7)) (2003) 759–764.
[9] H.L. Dreyfus, What Computers Still Can't Do: A Critique of Artificial Reason, revised ed., MIT Press, Cambridge, MA, 1992.
[10] C. Allen, I. Smit, W. Wallach, Artificial morality: top-down, bottom-up, and hybrid approaches, Ethics and Information Technology 7 (September (3)) (2005)
149–155.
[11] L. Floridi, J.W. Sanders, On the morality of artificial agents, Minds and Machines 14 (August (3)) (2004) 349–379.
[12] R.W. Picard, Affective Computing, MIT Press, Cambridge, MA, 1997.
[13] R.W. Picard, Affective computing: from laughter to IEEE, IEEE Transactions on Affective Computing 1 (1) (2010) 11–17.
[14] B.C. Stahl, What does the future hold? A critical view of emerging information and communication technologies and their social consequences, in: M.
Chiasson, O. Henfridsson, H. Karsten, J.I. DeGross (Eds.), Researching the Future in Information Systems: IFIP WG 8.2 Working Conference, Future IS 2011,
Turku, Finland, June 6–8, 2011, Proceedings, 1st ed., Springer, Heidelberg, 2011, pp. 59–76.
[15] R. Owen, N. Goldberg, Responsible innovation: a pilot study with the U.K. Engineering and Physical Sciences Research Council, Risk Analysis: An International
Journal 30 (November (11)) (2010) 1699–1707.
[16] K.L. Kjolberg, R. Strand, Conversations about responsible nanoresearch, NanoEthics 5 (1) (2011) 99–113.
[17] R. von Schomberg (Ed.), Towards Responsible Research and Innovation in the Information and Communication Technologies and Security Technologies
Fields, Publications Office of the European Union, Luxembourg, 2011.
[18] P. Macnaghten, R. Owen, Good governance for geoengineering, Nature 479 (November (7373)) (2011) 293.
[19] D. Wright, A framework for the ethical impact assessment of information technology, Ethics and Information Technology 13 (3) (2011) 199–226.
[20] H. Sutcliffe, A Report on Responsible Research and Innovation, 2011.
