
Question 3:

Discuss why e-marketing can be considered a threat to privacy. Suggest how this threat
can be reduced.

The saying, ‘if you are not paying for the product, you are the product’, is now more relevant
than ever. Today, individuals are surrounded by devices that capture and share digital data on
every aspect of their daily lives. With the advent of big data, behavioural analytics and e-marketing, companies exploit consumer data under the guise of improved customer service and increased personalisation. While the benefit of big data to companies is undeniable, there are legitimate concerns about privacy intrusion and invasive marketing. The existence of
federated support networks, prescriptive analytics and algorithmic decision-making raises
further questions about the use, diffusion and protection of consumer data. This essay draws
on existing research to discuss the privacy concerns associated with e-marketing and how some
of these threats can be mitigated. First, we define e-marketing; we then examine how data is collected from users, how analytic techniques are used to guide decision-making, how these insights are shared across federated support networks, and the privacy implications that follow.
Lastly, we look at ways in which some of these threats to privacy can be alleviated.

E-marketing is firmly rooted in the practice of using big data and analytics to gain insights into consumers’ behaviours and influence their purchase decision journeys on a deeply
personalised level. Technology used in online marketing has advanced such that collection,
enhancement and aggregation of information is instantaneous (Ashworth and Free, 2006). This
proliferation of customer-information-focused technology brings with it several issues surrounding customer privacy.

Data to support e-marketing is collected from a myriad of sources, ranging from a user’s
internet search history, social media presence, online shopping and ATM withdrawals to use
of location-based weather apps (Newell and Marabelli, 2015). Even seemingly inactive devices, such as AI-based virtual assistants like Siri and Alexa, continuously listen in on users’ conversations, endlessly accumulating and communicating data via feedback mechanisms.
This data is then processed by algorithms that support and drive organisational decision making
(Newell and Marabelli, 2015). Termed ‘algorithmic decision-making’, this method of analytics
collects a user’s ‘digital traces’ from the digitised devices they use to build increasingly detailed customer profiles that improve segmentation and targeting. As a result, users have unknowingly
become ‘walking data generators’ (McAfee and Brynjolfsson, 2012). This has enabled companies to
price products and services so as to capitalise on a consumer’s willingness to pay and maximise the supplier’s surplus. It has also enabled them to identify and flag important events and dates in consumers’ lives such that customised offerings can be made to each individual customer, based
on demographic, psychographic and behavioural segmentation. An example of this is the US retail chain Target using Big Data to identify that a teenager was pregnant and market baby products to her, even before her family was aware of it. This has
given rise to issues associated with privacy and control, wherein companies strategically capitalise on and intrude into digitised records of consumers’ everyday activities.

Furthermore, with the introduction of federated support architectures, competitors share information amongst themselves to mutually improve services. This knowledge sharing
enables and enhances optimisation, resulting (among other things) in tailored e-marketing
campaigns. However, this practice also occupies a dangerously grey area between sharing information to optimise customer service and violating privacy as a basic human right. Additionally,
although data may be anonymised, researchers have shown that overlaying multiple sources of data makes it possible to reverse-engineer the identity of individuals and de-anonymise parts of the data set (Zimmer, 2008). This compromises the privacy of individuals, raising ethical questions about third-party collection, analysis and use of proprietary and personal information for financial gain.
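
To make the mechanics of this linkage concrete, the short Python sketch below (using the pandas library and entirely hypothetical data) illustrates how an ‘anonymised’ behavioural dataset can be re-identified by joining it with a second dataset that shares quasi-identifiers such as postcode, birth year and gender.

# Illustrative sketch only: hypothetical data showing how overlaying two
# datasets that share quasi-identifiers can de-anonymise individuals.
import pandas as pd

# An 'anonymised' marketing dataset released without names.
anonymised = pd.DataFrame({
    "postcode":   ["2000", "2000", "3121"],
    "birth_year": [1994, 1987, 1994],
    "gender":     ["F", "M", "F"],
    "purchases":  ["baby products", "electronics", "fitness gear"],
})

# A separate, publicly available record that does contain names.
public_record = pd.DataFrame({
    "name":       ["A. Smith", "B. Jones"],
    "postcode":   ["2000", "3121"],
    "birth_year": [1994, 1994],
    "gender":     ["F", "F"],
})

# Joining on the shared quasi-identifiers links names back to behaviour,
# reversing the supposed anonymisation.
reidentified = public_record.merge(
    anonymised, on=["postcode", "birth_year", "gender"], how="inner"
)
print(reidentified[["name", "purchases"]])

In practice, attackers can overlay many more attributes (location traces, timestamps, social connections), which is why anonymisation on its own is widely regarded as a weak safeguard.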

Since the large-scale use of data analytics is fairly new, there is little
understanding and agreement about the ethical implications that underpin the Big Data
phenomenon (Boyd and Crawford, 2012). There are very fine lines and grey areas surrounding
what is considered public knowledge and what should be regarded as personal and private
information. As a result, unless strong and clear regulations are enforced, companies can get
away with violating privacy under the pretext of market research and personalisation. While governments are taking steps to acknowledge and protect their citizens’ right to privacy, such as the introduction of the General Data Protection Regulation (GDPR) in the EU, more stringent laws need to be enforced, particularly in the context of e-marketing.

In conclusion, like all socio-technical phenomena, the advent of Big Data brings with it both
benefits and costs. The use of Big Data makes e-marketing more profitable for companies and
more relevant and engaging for consumers. However, this also raises ethical questions about
the price at which online advertising is made relevant: what use of personal information is too much, and what constitutes accessible information as opposed to an invasion of privacy? Hence, as with all innovations, new policies will have to be framed and enforced to rein in the exploitation and misuse of Big Data and to protect the privacy of individuals.
References:
Boyd, D. and Crawford, K. (2012) ‘Critical questions for Big Data’, Information, Communication & Society, 15(5), pp. 662-679. DOI: 10.1080/1369118X.2012.678878

Ashworth, L. and Free, C. (2006) ‘Marketing dataveillance and digital privacy: Using theories of justice to understand consumers’ online privacy concerns’, Journal of Business Ethics, 67, pp. 107-123. DOI: 10.1007/s10551-006-9007-7

Newell, S. and Marabelli, M. (2015) ‘Strategic opportunities (and challenges) of algorithmic decision-making: A call for action on the long-term societal effects of “datification”’.
