
Reviewing person's value of privacy of online social networking


Ulrike Hugl
School of Management, University of Innsbruck, Innsbruck, Austria
Abstract
Purpose - The paper aims at a multi-faceted review of scholarly work, analyzing the current state of empirical studies dealing with privacy and online social networking (OSN) as well as the theoretical puzzle of privacy approaches related to OSN usage from the background of diverse disciplines. Drawing on a more pragmatic and practical level, aspects of privacy management are presented as well.
Design/methodology/approach - Based on individual privacy concerns and also publicly communicated threats, information privacy has become an important topic of public and scholarly discussion. Beside diverse positive aspects of OSN sites for users, their information is, for example, also being used for data mining and profiling, pre-recruiting information as well as economic espionage. This review highlights information privacy mainly from an individual point-of-view, focusing on the usage of OSN sites (OSNs).
Findings - This analysis of scholarly work shows the following findings: first, adults seem to be more concerned about potential privacy threats than younger users; second, policy makers should be alarmed by a large part of users who underestimate the risks to their information privacy on OSNs; third, in the case of using OSNs and their services, traditional one-dimensional privacy approaches fall short. Hence, the findings of this paper further highlight the necessity to focus on multidimensional and multidisciplinary frameworks of privacy, for example considering a so-called privacy calculus paradigm and rethinking fair information practices for a more and more ubiquitous environment of OSNs.
Originality/value - The results of the work presented in this paper give new opportunities for research as well as suggestions for privacy management issues for OSN providers and users.
Keywords - Online social networking, Profiling, Data mining, Information privacy approaches, Privacy management, Customer profiling, Bullying, Austria
Paper type - Literature review

Received 16 December 2010 Accepted 30 April 2011

Internet Research, Vol. 21 No. 4, 2011, pp. 384-407. © Emerald Group Publishing Limited, 1066-2243. DOI 10.1108/10662241111158290

1. Introduction

Announcements regarding OSN penetration and its relevance for business are euphoric. Universal McCann, a global marketing communications company, estimated in March 2008: "Social networks are evolving fast [. . .]. They are aiming to be the one-stop shop for all your internet needs"[1]. ThinkEquity[2] predicts that by 2012, social network penetration will have increased further. A current example: Facebook reports 400 million active users, 50 percent of whom log in daily; 25 billion pieces of content are shared each month (including web links, news, blog posts, notes, etc.); more than 100 million users engage with Facebook on external web sites every month; and half of comScore's Global Top 100 web sites have integrated with Facebook[3]. Multiple services related to online social network sites (OSNs) provide space to share professional or personal information. A broad definition of OSNs is offered by Ellison et al. (2007), arguing that such services enable individuals to create a semi-public or public profile within a bounded system, to construct a list of other OSN users with whom they share a connection, and to view their individual list of connections and those made by others within the system. Currently, several OSNs are among the top ten most visited web sites globally. Data of the European Network and Information Security Agency (ENISA, 2010) indicate 283 million European OSN users, 211 million of them aged 15 and older. Facebook, MySpace, StudiVZ (a frequently visited OSN in Germany) and others are a popular opportunity for users to connect, share content, and express themselves. In doing so, users offer profiles and further information, for example consisting of interests, personal values and norms, friends, school and out-of-school information, medical and possibly financial information as well as information about their workplace. Moreover, Albrechtslund (2008) states that OSNs already reveal insights into users' likes, thoughts and preferred music; additionally, he refers to geotagging as a newer trend to add geographical information about users or about persons in their vicinity. Based on many publicly known privacy breaches communicated in the media, information privacy related to OSNs has become a commonly discussed issue. Furthermore, study results show that most OSNs offer little explanation about the choices users have and the impacts of their decisions, so users are left to develop their own strategies to manage their privacy needs (Strater and Lipford, 2008). Additionally, Fuchs (2010) argues that with the standard privacy settings on most OSNs, personal data and usage behavior are stored, analyzed, and transmitted to third parties, so that the tastes of the users become known to advertising firms that are allowed to target users with personalized advertising. Such information is used as a vital part of sales and marketing strategies, for mobile advertising (ENISA, 2010), malicious advertising ("malvertising") (Sophos, 2010), for insurance and media companies, information brokers, economic espionage or cybercrime activities. More specifically, privacy-related threats can be based on digital dossiers of personal information (for example risks of blackmailing or damage to the image of profile holders), face recognition, content-based image retrieval as well as image tagging and cross-profiling. Thereby, identity-related threats may occur in the form of phishing attacks, information leakage, and profile squatting through identity theft, and social threats can be based on stalking and corporate espionage (Al Hasib, 2009). Nevertheless, on the one hand, impressive user data of OSNs demonstrate great opportunities for users. Examples may be getting in touch with peers (Kraut et al., 1998; Kraut et al., 2002; Jeffries and Szarek, 2010), the ability to exercise self-control in conjunction with respect for others' perspectives and tolerance, the expression of sentiments in a healthy and normative manner as well as engagement in critical decision-making and thinking (Hinduja and Patchin, 2008), academic learning in OSN groups (Villiers De, 2010), or OSNs as a development and distribution channel for open source software (Casalo et al., 2009). On the other hand, a voluntary and gradual loss of privacy of the individual may occur. Examples may be identity fraud (Gross et al., 2005; Young and Quan-Haase, 2009; Acquisti and Gross, 2009a; Haddadi and Hui, 2010), the inability to control one's social sphere (Binder et al., 2009), online harassment (Valkenburg and Peter, 2009), physical stalking (based on the availability of personal information on OSNs) (Gross et al., 2005), cyber-mobbing (Rosenblum, 2007) or


cyber-bullying (for example, circulating false rumours about a person or posting derogatory messages on one's user site) (Aricak et al., 2008; Valkenburg and Peter, 2009). Hence, beside the positive aspects of OSNs as seen by individual users, their information is probably being used for the above-mentioned, partially unpredictable and diverse purposes. This paper presents a literature review and seeks to answer to what extent OSN research has explored and embraced privacy-related issues as well as approaches of privacy management. It proceeds as follows. In section two, profiling and data mining issues in connection with the usage of OSNs are highlighted, followed by a review of issues and recent studies dealing with OSN and information privacy (Table I shows a concept matrix of the analyzed papers at the end of this chapter). The next section aims at a critical analysis of related information privacy theories and conceptual frameworks, mainly covering the current scholarly discussion of control theories and multidimensional theories of information privacy. The last main section focuses on recent aspects of privacy management and OSN on a more practical level, notably relating to a user's concrete opportunity to handle his/her individually needed privacy level. The paper closes with a short summary of the main findings and suggestions for further research.

2. Information disclosure or nondisclosure and issues of privacy protection

2.1 Scholarly work highlighting profiling and data mining
Profiling can be defined as "the recording and classification of behaviors"[4], data mining as the process of extracting patterns from data. Shoemaker (2010) focuses on both and concludes:
Some people may not care at all about the management of their identities [. . .]. But despite the lip service paid to the dictum to pay no mind to what others think about you, [. . .] I suspect that it is the rare person who is actually like this. Most of us [. . .] have flaws, and care more than we like to admit about how others see us [. . .], and so [. . .] most of us have reason to object to the sort of profiling produced by data mining. That a stranger may come to know our flaws is mortifying.

Besides Shoemaker (2010), several other authors proceed on the assumption that a threat to a person's personal identity is a threat to informational privacy (Parker, 2002; Michelfelder, 2001) or to privacy in general (Reiman, 1976). Gross et al. (2005) analyzed the online behavior of students at Carnegie Mellon University who used a popular social networking site and highlight diverse potential attacks on various facets of privacy, for example stalking, demographic and face re-identification, social security numbers and identity theft. In more detail: the authors analyzed 4,540 Facebook profiles and draw the conclusion that only a minimal percentage of users changes the permissive default privacy preferences, and personal data therefore is generously provided (Gross et al., 2005). Furthermore, Acquisti and Gross (2006) conducted a study with students, staff and faculty at the same university with about 300 survey respondents (mainly with a profile on Facebook) and analyzed data mining of about 7,000 profiles. Results show on the one hand that the majority of Facebook members claim to know about ways to control the visibility and searchability of their profiles, but on the other hand a significant minority of members is unaware of those tools and options (Acquisti and Gross, 2006).

Table I. Concept matrix of papers highlighting profiling and data mining as well as further issues of information privacy

For each paper the matrix records its (main) background, its underlying (theoretical) orientation, the empirical study or other analysis conducted, and its key findings. It further marks which topics each paper covers: profiling; data mining; identity theft caused by profiling and/or data mining; kind/strategy of users' data disclosure; the dichotomy between actual behavior and privacy concerns; information disclosure responding to privacy concerns; further issues of information privacy; and privacy management. Note: a topic marked in parentheses means that it is not directly covered by a paper's (study) results, but is more indirectly/implicitly a subject of discussion.

Strater and Lipford (2008). Background: information systems. Study: analysis of five interview transcripts (using a grounded theory approach) and comparison with recorded profile data; analysis of videos of users reviewing other profiles; Facebook users. Key findings: the authors focus on privacy management issues in order to reduce the risks of individuals' participation on OSNs. Results show that users are not aware of the accessibility of their data and underestimate the risks and consequences of disclosure; only after a specific event (like a privacy intrusion) are users willing to modify their privacy settings. Furthermore, the authors discuss issues of privacy management interfaces and mechanisms.

Fuchs (2010). Background: media and communication sciences. Orientation: (neo-)Foucauldian surveillance studies; political economy of surveillance approaches. Study: online survey covering 674 valid questionnaires (students; mainly StudiVZ users; Salzburg, Austria). Key findings: an analysis of respondents' knowledge about and information behavior (mainly) on the StudiVZ platform shows that discussion about surveillance and public information is crucial for stimulating critical information behavior in the light of related and identified surveillance parameters.

Shoemaker (2010). Background: philosophy. Orientation: theories of informational privacy (control theory, limitation theory, Restricted Access/Limited Control theory (RALC)); theories of social identity (e.g. self-esteem theory of identity). Study: no empirical study; philosophical review and analysis of privacy theories in connection with issues of self-identity. Key findings: the right to informational privacy corresponds with the right to control or manage the presentation of a person's self-identity; most individuals have reason to object to the sort of profiling produced by data mining.

Gross et al. (2005). Background: computer science; public policy. Orientation: online privacy; information revelation. Study: analysis of 4,520 profile IDs on Facebook (mainly undergraduate students, US university) highlighting potential attacks on various aspects of privacy. Key findings: users generously provide personal information on OSNs; limited privacy preferences regarding the visibility of user profiles are hardly used.

Acquisti and Gross (2006). Background: computer science; public policy. Orientation: analysis of the impact of privacy concerns on OSN members' behavior; comparison of members' stated attitudes with actual behavior. Study: survey questionnaire (in total 294 respondents) and data mining of about 7,000 profile IDs. Key findings: individuals' privacy concerns are only a weak predictor of membership on OSNs; individuals reveal great amounts of personal information and trust in their ability to control the information provided and external access; significant misconceptions exist among some members about the online community's reach and the visibility of members' profiles.

Haddadi and Hui (2010). Background: computer science. Orientation: online privacy; experiment with fake identities. Study: real experiment (five weeks) with OSN users (Facebook) by creating 40 fake identities: 20 used a well-known film star identity (celebrities), the other 20 identities were from ordinary people. Key findings: the results of an experiment to assess online privacy awareness and vulnerability issues highlight that OSN users show mixed behavior towards strangers (facing different types of users and random friendship requests). The authors offer suggestions to improve privacy and security trustworthiness on OSNs (prevention of identity theft).

Young and Quan-Haase (2009). Background: media studies. Orientation: conclusions (based on results) for theories of identity expression. Study: survey questionnaire (in total 77 students); interview sample of 21 undergraduate students in English Canada. Key findings: privacy protection strategies (on Facebook) show members' reaction to privacy; the strategies employed most often are the exclusion of personal information, the use of private e-mail messages and the modification of the default privacy settings. The authors propose a model of information revelation.

Tufekci (2008). Background: (techno-)sociology; social psychology. Orientation: research based on Erving Goffman's presentation of the self (Goffman, 1959) and Irwin Altman's framework of privacy optimization (Altman, 1977). Study: survey questionnaire (sample: 704 students using Facebook and MySpace; US) and interviews conducted in conjunction with the research project. Key findings: three important issues influence information disclosure on OSNs: general privacy concerns, gender, and future audiences (mainly parents, coaches, professors). Results show that younger students are more willing to give up their privacy.

Livingstone (2008). Background: media and communication sciences. Orientation: main foci are the mutuality between social practices and technological shaping and a children-centred, qualitative methodology to research teenagers' practices of OSN. Study: open-ended individual interviews with 16 teenagers in their homes (ages ranged from 13 to 16 years). Key findings: younger and older teenagers differ in their phases of identity development, with potential implications for their experiences of online risks and opportunities. Respondents disclose a great deal of personal information to a wide circle of contacts. The main risks for teenagers are limited internet literacy and unclearness concerning their control over who can see what about them on OSNs. OSN as a communication opportunity builds only one part of teenagers' social relations (respondents use offline as well as online modes of communication).

Houghton and Joinson (2010). Background: information systems. Orientation: main foci are the theory of communication privacy management (CPM) of Sandra Petronio (2002) and the Uncertainty Reduction Theory (URT) of Charles R. Berger and Richard Calabrese (1975). Study: interviews/discussion with eight (mainly) Facebook users, aged between 23 and 32 years old, focusing on the consequences and difficulties of experiencing and managing a privacy violation. Key findings: privacy issues are ubiquitous with OSN use. The management of (ubiquitous Web 2.0) boundaries, diverse aspects of information, social spheres, and relationships is becoming increasingly difficult. Problems are the realization of the nature of boundaries, the importance of ownership of information as well as possible violations. Results show 18 categories covering privacy violations and 16 categories covering general friendship components.

Acquisti (2009). Background: information technology; (behavioral) economics of privacy. Study: no empirical study; behavioral economics focusing on asymmetric or soft paternalism (design of systems that may enhance and/or influence individuals' choices to increase their welfare). Key findings: privacy economics in combination with behaviorally and psychologically oriented research streams may lead to a powerful tool to assist, understand and improve privacy and security decision making (for example on OSNs).

Acquisti and Gross (2009a, b). Background: information technology; (behavioral) economics of privacy. Study: statistical re-identification; (pattern) analysis of publicly available information combined with other sources like profiles from OSNs. Key findings: information about an individual's date and place of birth can be used to predict the person's Social Security number (SSN). Results show the unexpected privacy consequences of the complex interactions among diverse multiple data sources and quantify privacy risks related to information disclosure in public forums.

Fuchs (2009). Background: media and communication sciences. Orientation: social theory; analysis of techno-pessimistic, techno-optimistic as well as critical OSN research. Study: empirical study with 674 analyzed survey datasets of Austrian students (mainly using MySpace, Facebook, studiVZ), focusing on the relationship of surveillance society and the usage of OSNs. Key findings: the extension and type of higher education, gender, class, and usage frequency of OSNs influence the degree of critical consciousness of surveillance. The more critical respondents are, the more knowledge they tend to have about surveillance.

Lawler and Molluzzo (2010). Background: information systems. Orientation: OSNs' privacy policies as social contracts cited in social contract theory (users are dependent on the terms of usage stated in a site's policy). Study: survey with 110 valid responses of graduate and undergraduate students (mainly using Facebook and MySpace; New York City). Key findings: the majority of respondents did not read the privacy policy statement of their OSNs. Many do not know how OSN providers gather, use and share their personal information, and those respondents are not familiar with their rights concerning their own personal data stored on OSNs. The authors call for more information and education of teenagers, students and parents regarding the storage of personal data on OSNs.

Bonneau and Preibusch (2009). Background: computer science. Orientation: comparison of a novel model based on the authors' results (the so-called privacy communication game) with other economic models applied to privacy design choices. Study: evaluation of the privacy policies of 45 OSNs involving 260 criteria like accessibility, retention and collection of user data, length, the role of third-party advertisers, and compliance with privacy laws. Key findings: the evaluation of OSNs leads to several conclusions: generally low quality of privacy policies, poor security practices, and usability problems. There is a high diversity of privacy controls available on OSNs which are not effectively presented to users. Among others, the authors call for privacy labels, communicating privacy practices in a non-textual format, to support users in making more informed privacy choices.

Christofides et al. (2009). Background: psychology. Study: online survey to examine the information control and disclosure behaviors of Facebook users; 343 undergraduate (psychology) students at a Canadian university. Key findings: respondents disclose a great deal of identifying and personal information on Facebook. On the one hand, students are generally concerned about their privacy and report likeliness to use diverse privacy settings; on the other hand, respondents' presence on Facebook requires that they have active discussions with friends, post many pictures, and share information and interests. The need for popularity drives information disclosure, which does not predict control of information.

Hinduja and Patchin (2008). Background: criminal justice; political science. Study: content analysis of a representative sample of diverse elements of publicly viewable MySpace profiles (1,475 individuals, mainly 17 or younger). Key findings: young individuals disclose a variety of types of information on their public MySpace profiles. Nevertheless, the majority of adolescents seem to be using MySpace responsibly; only a minority presents private or personal information. About 40 percent of youth restrict their profiles.

Vom Brocke et al. (2009). Background: information systems. Study: qualitative study (n = 11) and two surveys (n = 140/293) with students (mainly from German-speaking countries), mainly between 20 and 30 years old, focusing on motives of OSN usage and related contextual factors. Key findings: the research identifies two main classes of motives for the usage of OSNs: social motives (contact maintenance/social searching) and interest motives (contact interest/topic interest). Security issues are highlighted as the main reason for not using OSNs. Contextual factors (potentially) shaping motives are geographic distance, the existence of real-world contacts covered by an OSN, the location of users, their special social environments and interests, degree of commitment, security issues, relationship status and gender.

Barnes (2006). Background: communication studies; social computing. Orientation: the work covers diverse aspects of human society and social behavior (on OSNs) with a related discussion of privacy issues. Study: no empirical study. Key findings: review of literature highlighting students and teens using OSNs and posting personal information. The author discusses related privacy issues (especially public versus private boundaries), presents her so-called privacy paradox and highlights steps that can be taken to resolve it.

Krasnova et al. (2009b). Background: business and economics; technology management. Orientation: privacy calculus perspective; conjoint analysis. Study: in-depth semi-structured interviews to determine the main drivers of utility of OSNs, followed by a survey mainly focusing on OSN users' valuation of privacy (168 participants, mainly Facebook and StudiVZ users). Key findings: altogether, privacy is important for OSN users, and the results value individuals' privacy in monetary terms. The authors identify three groups of users with different behavioral preferences (unconcerned socializers, control-conscious socializers and privacy-concerned) and discuss related opportunities for providers of OSNs.

Krasnova and Veltri (2010). Background: business and economics; technology management. Orientation: privacy calculus perspective. Study: online survey; 237 analyzed responses from German users and 254 from US users (mainly between 18 and 29 years old). Key findings: the survey results of Facebook users in Germany and in the US show (culture-based) differences in perceptions of disclosure-relevant determinants: Germans ascribe higher likelihood and more damage to privacy-related threats, while users from the US seem to be more concerned, feel more benefits from OSN usage, perceive more control, and have more trust in legal assurances and in their providers.

Krasnova et al. (2009a). Background: business and economics; technology management. Orientation: structural equation model examining the impact of privacy concerns on various self-presentation strategies; factor analysis. Study: insights into user privacy concerns based on focus groups and an empirical study (210 individuals, mainly using Facebook and StudiVZ; Germany). Key findings: the authors try to validate measures for their so-called User Privacy Concerns on OSNs (PCOSN) construct. The results show two main underlying sources of individuals' privacy concern: concerns about organizational threats (organizational use of their information) and concerns about social threats (stemming from the user environment of an OSN). The study provides insights for providers of OSNs and policy-makers.

In a real experiment with users, Haddadi and Hui (2010) compared individuals' behavior with regard to friendship requests by using 40 fake identities of well-known film stars and ordinary people on Facebook. The authors' results show that usually users do not accept random friendship requests, but some aggressively search for celebrities, making a perfect case for spammers to form honeypots using such fake profiles. Other scholars refer to the kind of user data disclosed on popular OSNs: in their quantitative study (n = 77) and interviews (n = 21), Young and Quan-Haase (2009) draw attention to undergraduate students' Facebook usage: 99.35 percent of respondents use their actual first and last name in their profile. Nearly two-thirds present their sexual orientation and interests (favorite movies, activities and books); 83.1 percent provide their e-mail address, 92.2 percent their date of birth, 80.5 percent the current town in which they live, 97.7 percent present an image of themselves, and 96.1 percent photos of friends. Tufekci (2008) presents results (n = 704) of college students mainly using Facebook and MySpace: 94.9 percent of the Facebook users responded using their real names, 80.3 percent their favorite music, 66.2 percent their favorite book, 77.7 percent their favorite movie, 46.3 percent their political view, 75.6 percent their romantic status, 72.2 percent their sexual orientation, and 44.7 percent their religion. Moreover, a subset of students was interviewed and asked how likely they thought it was that a government agency, a future employer or a company would look at their profiles on OSNs. The results show that "demographic variables, rather than the concerns about future audiences [. . .] seemed to have more of an effect on participants' behavior" on OSNs (Tufekci, 2008). Furthermore, students do take action regarding current concerns: "The more concerned they were about unwanted audiences in general, the more likely they were to take steps to wall off their profiles" (Tufekci, 2008). In contrast, concern about audiences such as the government, companies and employer surveillance was less relevant; the audiences of real concern are (similar to Livingstone, 2008) authority figures like parents, coaches, and professors. Hence, it seems that these young respondents refer more to choices regarding current concerns (for example, as long as parents are not reading their journals) and are less concerned about future threats. In this context, Houghton and Joinson (2010) rightly ask: "For example, photos of drunken excursions may be willingly shared with friends, but are they so eagerly shared with family [. . .] or even potential employers?" Moreover, different lines of thought regarding identity theft caused by profiling and data mining can be pointed out. Acquisti (2009) gives an example concerning the dates of birth of OSN users: based on his soft paternalistic approach, OSNs should provide context to support the user's decision. Such contextual questions can be: how many users might be able to access that data? What can they do with it? His usability approach, in turn, would focus on a system that makes it easy or intuitive for users to change the visibility settings for their birth dates (Acquisti, 2009). Additionally, Acquisti and Gross (2009a, b) demonstrate how information about an individual's date and place of birth can be used to predict his or her Social Security number (SSN). Extrapolating their results to the living US population, this would mean the potential identification of millions of SSNs (Acquisti and Gross, 2009a).
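Acquisti's soft paternalism point above (provide decision context, such as likely audience size and possible downstream uses, at the moment of disclosure rather than restricting the choice) can be illustrated with a minimal sketch. Everything below (the ProfileField record, the AUDIENCE_ESTIMATE figures, the wording of the prompt) is a hypothetical illustration, not part of any cited system.

```python
from dataclasses import dataclass

@dataclass
class ProfileField:
    name: str            # e.g. "date_of_birth"
    visibility: str      # "friends", "friends_of_friends", or "everyone"
    sensitive: bool      # flags items usable for re-identification (e.g. birth data)

# Hypothetical audience sizes per visibility level; a real OSN would compute
# these from the user's actual network.
AUDIENCE_ESTIMATE = {"friends": 150, "friends_of_friends": 15_000, "everyone": 500_000_000}

def disclosure_nudge(field: ProfileField) -> str:
    """Return a context message shown before the field is published
    (a 'soft paternalism' prompt: inform the user, do not forbid the choice)."""
    audience = AUDIENCE_ESTIMATE[field.visibility]
    msg = f"About {audience:,} people may be able to see your {field.name}."
    if field.sensitive:
        msg += " Combined with other public data, it could be used to identify you."
    return msg

print(disclosure_nudge(ProfileField("date_of_birth", "everyone", sensitive=True)))
```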


Another specific aspect is pointed out by Fuchs (2009), who highlights in particular the surveillance on OSNs of usage behavior and profiles for advertising purposes. Based on his results he concludes:

It is not an accident that one has to opt out of such features, and does not have to opt in. Economic surveillance is profitable, therefore media corporations and other firms engage in it. It is unlikely that they will automatically limit these endeavours because their primary interest is and must be the accumulation of money profits.


Therefore, among other things, he suggests advancing critical awareness of the public as well as of corporate and political surveillers as an important political move to guarantee civil rights. Indeed, users still often value OSNs as closed and safe worlds and are mostly not aware of potential consequences (e.g. Tow et al., 2010; Jagatic et al., 2007). Thus, Lawler and Molluzzo (2010) also focus on an enhancement of public awareness, especially calling for education of students and parents. Results of their study show that more than half of respondents (US students) did not read the privacy policy statements of their OSNs and about two-thirds did not know how their personal information might be gathered, used, and shared by their OSN providers. Therefore, the authors call for an enhanced awareness of OSN privacy through proper education at all educational levels, for students as well as for parents. Such education should involve knowledge about what data is stored on OSNs and how, how the data might be used, and who is likely to have access to it. In general, data mining programs compile fragments of information and deduce a specific person's profile for advertising and other purposes. A person's profile might include information about his/her name(s) (pseudonyms), age (birthday), profession, social class or financial status, locations (cell numbers, e-mail accounts), ethnicity, gender, type of car, number of children or childlessness, information regarding pets, music and film preferences, political views or party membership, voluntary activities, orientations (heterosexual or homosexual), habits (smoking and drinking), hobbies and shopping habits. In this connection, Shoemaker (2010) argues that a specific profile might well have been drawn from dollops of information that separately don't implicate anything about my identity, but when patterned in this way actually do: the profiler now purports to know something about me, whereas without the patterning he just knew various unrelated bits about some person's life. According to Shoemaker (2010), different possibilities regarding such a construal based on data mining activities may occur:
- The construal is correct, but aspects are embarrassing or shameful to the exposed party.
- The construal is correct, and the disclosed facets are ones about which the person is proud.
- The construal is incorrect, with implications of embarrassment or shame for the affected person, given that party's emotional investment in his or her identity being construed in a certain way.
Hence, the public construal and presentation of an individual's self-identity might undermine one's autonomy in specific cases of data mining (Shoemaker, 2010).
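Shoemaker's "patterning" argument above rests on a simple mechanism: fragments that reveal little on their own are keyed to one person and merged into a profile. The toy sketch below shows only that aggregation step; the records, keys and attributes are invented for illustration and are not drawn from any study cited here.

```python
from collections import defaultdict

# Each fragment is (pseudo-identifier, attribute, value); individually the
# fragments reveal little, but grouped by identifier they form a profile.
fragments = [
    ("user_42", "music", "jazz"),
    ("user_42", "location", "Innsbruck"),
    ("user_42", "relationship", "single"),
    ("user_42", "habit", "smoking"),
]

def build_profiles(records):
    """Group scattered attribute fragments by identifier (the 'patterning' step)."""
    profiles = defaultdict(dict)
    for uid, attribute, value in records:
        profiles[uid][attribute] = value
    return dict(profiles)

print(build_profiles(fragments))
# {'user_42': {'music': 'jazz', 'location': 'Innsbruck', 'relationship': 'single', 'habit': 'smoking'}}
```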

2.2 Scholarly work highlighting OSN and further issues of information privacy
Publicly announced privacy breaches as well as criticism from privacy fundamentalists put pressure on the operators of OSNs to increase levels of user privacy. Recently, several civil liberties groups such as the Electronic Privacy Information Center (EPIC, Washington, DC) have filed complaints with the US Federal Trade Commission concerning Facebook's data protection policy and its usage of users' data. Supporting such complaints, an analysis of 45 OSNs by Bonneau and Preibusch (2009) shows a wide range and variety of privacy settings and a market still in an early stage of aggressive competition for users. Hence, OSNs rarely publicize their privacy-enhancing tools (even if available) and largely continue unknown data sharing (e.g. Houghton and Joinson, 2010; Christofides et al., 2009). One conceivable explanation for the first-mentioned point may be that efforts of OSN providers in the field of privacy-enhancing tools could raise attention to privacy controls and lead to a decrease in users' willingness to share data (e.g. Bonneau and Preibusch, 2009). Beside the already mentioned studies mainly highlighting specific aspects of profiling and data mining (Gross et al., 2005; Acquisti and Gross, 2006; Acquisti, 2009; Acquisti and Gross, 2009a, b; Young and Quan-Haase, 2009; Tufekci, 2008), further recent scholarly work deals with challenging privacy concerns in connection with OSN. Livingstone (2008) interviewed 16 British teenagers aged from 13 to 16 having their own personal profile on Facebook, MySpace, Piczo, Bebo, or similar. Her research focuses on the subtle coherence between online opportunity and risk. Results show that although teenagers may disclose personal information to up to several hundred people known only casually, they are nonetheless concerned about their privacy: being visible to strangers is not so much a concern, but visibility of their profiles to their parents is stated as problematic (Livingstone, 2008). Based on the study results of Hinduja and Patchin (2008), many adolescents seek to demonstrate familiarity with adult-oriented behaviors: 18 percent report on their publicly available MySpace profile page that they recently consumed alcohol, 8 percent that they had smoked cigarettes, and 2 percent that they had used marijuana. All in all, 40 percent of young users restricted their profile page in such a way that only those they actively accepted as their network friends could view the content (Hinduja and Patchin, 2008). With regard to the latter group, based on their qualitative study (n = 11) as well as two surveys (n = 140/293) with students from German-speaking countries (mainly between 20 and 30 years old), Vom Brocke et al. (2009) report that 48 percent do not want to publish their personal data online and therefore do not want to use OSNs. Of those respondents who use OSNs, 73 percent state that security reasons influence their usage behavior. Bringing a more general aspect into view, Barnes (2006) refers to public versus private boundaries and a so-called paradoxical world of privacy: on the one hand, teenagers make their intimate behaviors and thoughts (more or less) publicly available; on the other hand, marketers, college officials, government agencies and others are collecting data about individuals. While adults are concerned about potential privacy threats, teenagers make personal data public.
In her opinion, many people may not be aware of the fact that their privacy has already been jeopardized and they are not taking steps to protect their personal information from being used by others (Barnes, 2006).


An interesting newer research stream deals with questions of valuing costs and benefits. Krasnova et al. (2009b) conducted a study using conjoint analysis to elicit the weighing of privacy costs and benefits by OSN users. They specifically researched what choices users would make when facing different trade-offs, including privacy-related decisions, and identified three groups of users representing different utility patterns (respondents mainly between 20 and 29 years old): unconcerned socializers, control-conscious socializers, and the privacy-concerned (Krasnova et al., 2009b). Their results show that the share of unconcerned socializers is higher than the share of unconcerned users found in comparable surveys; therefore they argue that this fact should be alarming for diverse stakeholders as well as policy makers. Users of this male-dominated group seem to significantly underestimate the risks of their participation in OSN. The female-dominated group of control-conscious socializers, in contrast, places high value on the ability to control the accessibility of the information they provide. To express this issue in monetary terms, the authors found that a social network provider could additionally earn between €36.8 and €44.7 million by offering more refined privacy settings specifically to this group. The third group, the privacy-concerned, addresses concerns about information used by OSN providers. Members of this group would pay between €23.1 and €28.2 to the provider for a non-usage of their demographic data for personalized advertising purposes. The authors point out: "This way, the interest of both groups, OSN users and OSN provider, would be met" (Krasnova et al., 2009b). Furthermore, from the background of such a privacy calculus perspective, OSN providers with a focus on international markets should also consider cultural differences. For example, a comparison of Facebook users from the USA and Germany provides insights into different OSN socializing behavior (Krasnova and Veltri, 2010). However, it has to be taken into consideration that presently most providers make money by selling users' data to third parties. A relevant issue for research could be to examine whether such receipts exceed the potential receipts from users for a (partial) non-usage of their data. Another aspect is presented by Ameur et al. (2010): online-marketing companies help interested firms to find potential customers. After searching the user profiles available on OSNs, such companies recommend potential customers to firms, which then directly approach those customers. A firm pays $727 for each 5,000 users who agree to be its friend (or 15 cents per friend). The same game is offered for the acquisition of companies' fans. Hence, some "SNS [social network sites] have really become a network of advertising rather than friendship" (Ameur et al., 2010). Besides the presented scholarly work of Acquisti and Gross (2006) and Tufekci (2008), primarily focusing on a dichotomy between actual behavior and privacy concerns, the research of Krasnova et al. (2009b) and Krasnova and Veltri (2010) reports that users tend to reduce the amount of information they disclose in response to their privacy concerns. I agree with the critical view of Krasnova et al. (2009a), who highlight that the divergence among existing research findings may be based on a lack of validated measurement instruments addressing the unique character of OSNs. In contrast to the field of OSN, the multidimensional privacy framework Concern For Information Privacy (CFIP) of Smith et al. (1996) has been applied and measured by several other authors in diverse business-oriented and organizational settings (Stewart and Segars, 2002; Rose, 2006; Milberg et al., 2000).
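The privacy calculus perspective used by Krasnova et al. treats disclosure as a weighing of expected benefits against expected privacy costs. The sketch below shows that trade-off in its most reduced form; the scores and attribute names are illustrative assumptions, not the authors' estimated model.

```python
def disclosure_utility(benefits: dict, costs: dict) -> float:
    """Privacy calculus in its simplest form: net utility of disclosing
    = sum of perceived benefits minus sum of perceived privacy costs."""
    return sum(benefits.values()) - sum(costs.values())

# Illustrative scores on an arbitrary 0-10 scale for one disclosure decision.
benefits = {"staying_in_touch": 7, "self_presentation": 5}
costs = {"organizational_threats": 4, "social_threats": 3}  # cf. the two concern sources in Krasnova et al. (2009a)

net = disclosure_utility(benefits, costs)
print(f"net utility = {net}; disclose = {net > 0}")
```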

Summarizing, the mentioned studies show several interesting aspects. First, users of OSNs provide diverse and partially very detailed personal information, and especially younger students are less concerned about other (future) audiences (see, e.g. the results of Tufekci, 2008; Hinduja and Patchin, 2008; NCC, 2003). Second, adults seem to be more concerned about potential privacy threats than teenagers (e.g. Barnes, 2006; Vom Brocke et al., 2009). Third, policy makers and other stakeholders should be alarmed by the large share of unconcerned socializers who underestimate the risks to their information privacy on OSNs (Krasnova et al., 2009b). A recent study of the Pew Research Center's Internet & American Life Project, covering 800 American adolescents between ages 12 and 17, shows that 82 percent are using OSNs (Lenhart et al., 2010). Referring to the second and third aspects, these results underline the alarming potential for policy makers and other stakeholders, especially towards an avoidance of the misuse of younger OSN users' data. Fourth, users would be willing to pay a specific amount to OSN providers for different privacy levels, for example for a non-usage of their data (see the study results of Krasnova et al., 2009b). In such a way, both providers and users of OSNs may benefit from users' profiles and behavior; such a privacy calculus paradigm (first introduced by Culnan and Armstrong, 1999) for OSN should additionally reflect surrounding business practices. An overview of the analyzed scholarly work is presented in the concept matrix (Table I).

3. Related information privacy definitions and approaches

The scholarly work discussed previously demonstrates the accessibility of the vast quantity of data available on OSNs. In the following, approaches to informational privacy from diverse disciplines with relevance to the usage of OSN by individuals will be analyzed. This chapter focuses on the one hand on definitions and approaches of privacy, and on the other hand on an overview of control theories and multidimensional privacy theories. Furthermore, aspects of privacy theories representing issues of concrete privacy management with more practical relevance will be discussed in the next chapter. First of all we have to ask how to define and value privacy and what aspects seem to be relevant. Daniel Solove (2007), an expert in privacy law, argues:
As people use the freedom-enhancing dimensions of the internet, as they express themselves and engage in self-development, they may be constraining the freedom and self-development of others and even of themselves.


For the philosopher Herman T. Tavani (2007), privacy is something that can be lost, invaded, intruded upon, violated, diminished, or breached, among others; each of these metaphors refers to existing privacy theories and conceptual frameworks. From a philosophical perspective, Kemp and Moore (2007) differentiate several existing and broadly discussed philosophical privacy perspectives: the right to be let alone (referring to the work of Warren and Brandeis, 1890), limited access to the self (Gavison, 1980; Bok, 1982; Allen, 1988), privacy as secrecy (referring to Posner, 1998; Etzioni, 2000), control over personal information (referring to Westin, 1967; Fried, 1968), personality (e.g. Freund, 1971; Benn, 1971; Reiman, 1976), intimacy, and privacy as a cluster concept (DeCew, 1997; Moore, 2003). Privacy involves a person's right to control the dissemination of personal information (Berman and Bruening, 2001). Nevertheless, privacy still seems to be a sweeping concept,


encompassing (among other things) freedom of thought, [. . .] control over personal information, freedom from surveillance, protection of one's reputation, and protection from searches and interrogations (Solove, 2008). One of these aspects, namely control over information about oneself, is one of the main issues in valuing privacy. Different versions of control theories of informational privacy go back to the nineteenth century. Three examples: Alan F. Westin (1967) provides a link between privacy and secrecy and describes privacy as "the claim of individuals [. . .] to determine for themselves when, how, and to what extent information about them is communicated to others". Additionally, in a more recent article he refers to individual privacy balances based on the constant changing of individuals' needs in terms of different situational events and life-cycle progress (Westin, 2003). A similar view is presented by Altman (1975), who as a social psychologist argues for selective control of access to the self, whereby privacy is seen as a dynamic and dialectic boundary control process: the environmental context (for example the different information architectures of OSNs) affects social privacy behavior. In other words, the privacy environment of OSNs also influences users' privacy practices. Moreover, other authors refer to the mentioned control aspect involving diverse issues: Miller (1971) explains privacy as the individual's ability to control the circulation of information relating to him, and Rachels (1975) refers to the connection between our ability to control who has access to information about us and our ability to create and maintain different sorts of relationships. Nevertheless, early control theories have been widely criticized (e.g. Solove, 2002). Kemp and Moore (2007) mention the fact that they cannot account for decisional privacy in the sense of individual choices or an autonomy conception of privacy. Moreover, Tavani (2007) notes a lack of clarity regarding which kinds of personal information one can expect to have control over, and how much control one can expect to have over one's personal information. In general, many of the above-mentioned issues have to be seen as a kind of personal freedom within a (more or less) restricted area. Referring to online social networks, one may ask several questions. First, can users expect to have comprehensive control over personal information provided on OSNs? Or is it control over the accessibility of data? Which data in detail? Do we decide and balance pros and cons in a rational way when we reveal content on OSNs? Second, this also raises the question of whether we can reveal extensive information on OSNs and still enjoy privacy based on extensive control over one's personal information as a potential condition of our privacy. Users' content is stored (and partially used for further purposes) by providers of OSNs; digital records are searchable, cross-indexable with other data, etc.; hence, they are partially not controllable in a total or absolute manner. The problem of vagueness regarding one's zone of privacy represented by a single privacy concept (in this regard, control over information) has led to the development of multidimensional theories. Such theories may account for accessibility, information control, and physical and expressive issues of privacy as well. From the background of communication sciences, Burgoon et al. (1989) define privacy as the ability to control and limit physical, interactional, psychological and informational access to one's group or to the self. Their first dimension refers to how physically accessible an individual is to others; the second relates to an individual's right to decide with whom she or he shares personal information, as well as the control of affective or cognitive inputs or outputs (for example the frequency, length and content of an interaction). The psychological dimension represents the ability to control social interactions, for example with whom and under what circumstances thoughts or values will be shared or personal information revealed. The closely related informational dimension focuses on an individual's right to reveal personal information to others (not always under an individual's control because it is partially governed by law or custom). From a philosophical background, DeCew (1997) presents a similar concept covering only three dimensions: informational privacy, accessibility privacy and expressive privacy. In her context, informational privacy refers to control over information about oneself and covers, for example, medical details or financial data. An individual should have the ability to decide who has access to his/her data and for what purposes, and such data should also be protected by its recipients. Accessibility privacy, in turn, refers to sensory or physical access to an individual, and expressive privacy protects a realm for expressing one's self-identity or personhood through speech or activity (DeCew, 1997). Another multidimensional privacy approach is presented by Solove (2002). For him, privacy has to focus on dimensions of particular practices. Hence, an evaluation of privacy has to be seen in particular contexts, especially regarding legal and policy problems. In this context, practices are activities, customs, norms, and traditions, and a protection of individuals' privacy has to be a protection against disruptions to certain practices (Solove, 2002). Consequently, the value of privacy is determined by a particular context, which depends upon the social importance of the practice of which it is a part (Solove, 2002). Altogether, the presented multidimensional approaches of Burgoon et al. (1989), DeCew (1997) and Solove (2002) conceptualize privacy as a cluster concept (a term used by Kemp and Moore, 2007), covering diverse dimensions ranging from access, information and expression to contexts, instead of considering only one single conception. In online social networks, individuals (normally) can decide what personal information is available to the public. In addition, content may include facts that can lead another person directly to the user. So informational privacy can overlap with accessibility privacy when the acquisition of information additionally involves gaining access to a person (DeCew, 1997). A further aspect can be seen in the field of new possibilities of pervasive combination of OSN data with other data, for example data regarding medical diagnosis, DNA databases, geo-marketing data, or data generated via RFID technology. Such availability of data decreases individuals' ability to control information about themselves. For example, current research focuses on the role of ubiquitous environments as a new privacy-related context and calls for so-called fair information practices aiming at a protection of individuals' private data. Karyda et al. (2009) identified a general list of issues that should be treated in ubiquitous environments. A few facets of such fair information practices can be rethought and adapted for online social networking environments. The first is the need to support users with specific information about their personal data: users should have the possibility to make informed and free choices regarding the collection and the specific usage of their data. Hence, OSN providers should be invited to work, for example, on their privacy policies and standard privacy settings. One opportunity with potential benefits for both providers and users may be the implementation of a clearly communicated and transparently designed privacy calculus model for users (business model-oriented).


The latter suggestion is connected with a converging decrease of the asymmetry of data ownership and power on OSNs. A second facet, the application of control of data quality on OSNs, requires careful handling by both providers and users, for example due to the specific opportunities of users' friends on OSNs. A last facet deals with the problems of applying compliance schemes and enforcement: different countries have different approaches to independent public supervisory authorities, and compliance schemes are regulated by directives, codes of conduct, specific laws, aspects of self-regulation like the use of privacy-enhancing technologies, etc. Such regulations as well as users' felt privacy are culture-dependent. Therefore, providers managing the personal data of OSN users should carefully consider users' individual privacy accounts and related enforcement needs.
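One way to read the "limited controls" that these fair information practice facets ask providers to offer is as an explicit, user-set policy per profile field that the platform then enforces. The sketch below is a hypothetical model of such settings; the field names, audience categories and third-party-use categories are assumptions, not features of any particular OSN.

```python
from enum import Enum

class Audience(Enum):
    ONLY_ME = 0
    FRIENDS = 1
    EVERYONE = 2

class ThirdPartyUse(Enum):
    FORBIDDEN = 0
    ANONYMIZED_ONLY = 1
    ALLOWED = 2

# A user-set policy: who may read each field, and whether it may be passed on.
policy = {
    "date_of_birth": (Audience.ONLY_ME, ThirdPartyUse.FORBIDDEN),
    "interests":     (Audience.FRIENDS, ThirdPartyUse.ANONYMIZED_ONLY),
    "profile_photo": (Audience.EVERYONE, ThirdPartyUse.FORBIDDEN),
}

def may_release(field: str, viewer: Audience, purpose: ThirdPartyUse) -> bool:
    """Enforce the user's choices: release only if both the audience and the
    intended third-party use stay within what the user consented to."""
    audience_limit, use_limit = policy[field]
    return viewer.value <= audience_limit.value and purpose.value <= use_limit.value

print(may_release("interests", Audience.FRIENDS, ThirdPartyUse.FORBIDDEN))      # True
print(may_release("date_of_birth", Audience.FRIENDS, ThirdPartyUse.FORBIDDEN))  # False
```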

In summary, new (multidisciplinary) privacy approaches in increasingly digital and pervasive environments need to concentrate their efforts on privacy safeguarding requirements that the person as user can set by himself or herself, including special functionalities for data protection mechanisms and self-control. All these efforts should lead to a more proper OSN usage, aiming at accountability and perhaps (partially) at a limitation of the data at stake.

4. Towards a more practical view of OSN and individual privacy management

Control over information is important for the management of privacy. In this regard, Tavani (2007) notes that an individual needs to have some degree of control with respect to three elements: choice, consent, and correction. A person needs some control in choosing situations that offer others the level of access the person desires, which can range from total privacy to total publicity. For him, the consent process relates to a person's right to restrict others from access to specific kinds of personal information; correction relates to the ability of an individual to have access to information about oneself and to correct it if necessary. The Restricted Access/Limited Control (RALC) theory of privacy (initially introduced by James Moor, 1990, 1997; expanded by Tavani and Moor, 2001; Tavani, 2007) differentiates the concept of privacy from both the management and the justification of privacy: privacy policies provide users with the limited controls they need to deal with their privacy, whereby practices and activities can be treated as situations. An example can be the monitoring of individuals that exchange information over the internet via file-sharing systems and P2P (peer-to-peer) networks (Tavani, 2007). In the context of a situation, privacy is defined with respect to protection from intrusion and information access by others. A person has normative privacy in a situation where he/she is protected by explicit norms, policies, or laws that have been established to protect individuals in that situation (Tavani, 2007). Hence, privacy focuses on restricted access and protection; the notion of control and adequate privacy policies of OSN providers should provide individuals with the limited controls which are needed to handle their privacy needs. Based on his specific analysis of a data mining example, Tavani (2007) states that RALC can frame a comprehensive online privacy policy that could be applied not only to situations involving data mining but also to a wide range of privacy controversies associated with computer and information technologies. In this respect, RALC builds a context-based privacy theory considering restricted access to certain information in a specific situation (for example, a relationship) and a specific context (for example a P2P network). Advocates of this theory maintain that it is the situation or the zone, not the kind of information itself, that is used in determining whether information should be normatively protected (Tavani, 2007). The philosopher David W. Shoemaker (2010) takes a different point of view and argues for two important claims: first, the demands for recognition of a zone of informational privacy, as well as protection against its breach, are essentially demands for protection against the unauthorized exposure of one's identity; and second, it is not the situation or zone, but rather [. . .] the kind of information itself, that matters for determinations of normative protection. Based on a detailed discussion and reflection of self-identity, he summarizes that a person (in our case a user of OSNs) has a right to informational privacy, to manage and control the presentation of his/her self-identity, as well as, from a specific user's point of view, the right to manage certain public construals of my self-identity, or at least to have some sort of say in determining what others think about the type of person I am (Shoemaker, 2010). Transferring the arguments of Shoemaker (2010) to OSN, the following aspects are important: users are not able to control what others in fact think about them, but they normally can have an effect on the ways others construe them in virtue of what facets of their self-identity they themselves expose or allow to be exposed and how they go about exposing them. However, in the case of data mining and profiling it has to be taken into consideration that a construal is effected without one's desired or expected input. From the background of social psychology, Livingstone and Brake (2010) refer to concrete policy implications and demand a balancing of opportunities and risks in social networking, especially for children and teenagers. They identify five issues for policy makers and researchers:
(1) The more knowledge and skills teenagers have in their internet usage, the more they experience both opportunities and risks (and not, as often supposed, the more able they are to avoid risks). The linkage between such opportunities and risks partially arises from risk-taking practices and youthful exploration (referring to Hope, 2007). Therefore, interface design plays a central role, for example regarding the filtering of pornography and sexual advice in the same online search results and specially designed privacy controls.
(2) Young users should be educated in new practices of embedded marketing, the potential misuse of personal data, data mining, profiling and so forth. Increased knowledge about new practices may influence their behavior.
(3) Addressing risk is not primarily a task for parents and children. Therefore, diverse (European) initiatives are calling for safety by design, for example regarding an improvement of the transparency of OSN providers' data handling practices.
(4) Especially risk assessment regarding the misuse of personal information by fraudsters and spammers, as well as aspects of reputation threat and employment prospects, should be taken into consideration.

(5) Based on the UN Convention on the Rights of the Child, children have the right to freedom of expression and assembly and freedom from harm from commerce, the state and individuals. Since children are concerned to maintain privacy from their parents, this challenges simplistic advice that parents should check up on their children's social networking activities, with or without their permission. The balance between opportunities and risks should, arguably, be struck differently for at-risk children, where greater monitoring or restrictions may be legitimate; moreover, for these children especially, relying on parents to undertake this role may be inappropriate (Livingstone and Brake, 2010).
These suggestions support the above-mentioned study results, namely that especially younger users provide diverse and partly very detailed personal information and are less concerned about other (future) audiences.

Interesting aspects of privacy management are also discussed by Ameur et al. (2010). The authors refer to Privacy-enhanced Social Networking Sites (PSNS) fulfilling several properties: first, privacy awareness and customization: making the user aware of potential risks regarding the sharing of information with others, and presenting a flexible and easy way for users to communicate privacy concerns related to a personal policy and to compare it with the policies of other users; second, data minimization: the user knows which of his/her information is accessed by services of the provider or of third parties and how his/her information is being used; and third, data sovereignty: personal data belongs to the user and not to the provider of an OSN, explicit consent of a user is necessary to sell data to third parties, and a user should be able to control/track how his/her information is disseminated. Only a few systems try to implement such properties in their design. Additionally, Ameur et al. (2010) introduce a Privacy Framework for SNS with four levels of privacy (No Privacy, Soft Privacy, Hard Privacy, Full Privacy) and present a concept of a User Privacy Policy (UPP), which refers to the first above-mentioned point regarding awareness communication, but also to a user's choice of the extent to which he/she trusts an OSN provider. According to the chosen privacy level, the user has the opportunity to determine how much information he/she would like to leave to the provider. Ameur et al. (2010) demonstrate an example: in the situation where the user chooses Full Privacy, the SNS server is only trusted in storing an encrypted version of the personal information of the user so that it can be consulted at any time by one of his friends but not to the point where the SNS itself [the provider] has access to this information (as the SNS server does not know the keys needed to decrypt this information).
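As a rough illustration of this Full Privacy level, the Python sketch below (using the third-party cryptography package) shows a provider that stores only ciphertext and never holds the decryption key, so that only the user and the friends with whom the key is shared out of band can read the profile data. The SNSServer class and the data values are hypothetical and are not taken from Ameur et al.'s (2010) implementation; the sketch only captures the general key-holding arrangement they describe.

    # Minimal sketch of the "Full Privacy" idea: the provider stores only ciphertext
    # and never holds the decryption key. Requires the third-party "cryptography" package.
    from cryptography.fernet import Fernet

    class SNSServer:
        """The provider: it can store and return blobs, but cannot decrypt them."""
        def __init__(self):
            self._store = {}

        def upload(self, user_id: str, ciphertext: bytes) -> None:
            self._store[user_id] = ciphertext

        def fetch(self, user_id: str) -> bytes:
            return self._store[user_id]

    # Client side: the key is generated and kept by the user and shared only with friends.
    key = Fernet.generate_key()
    client = Fernet(key)

    server = SNSServer()
    server.upload("alice", client.encrypt(b"hometown=Innsbruck; employer=ACME"))

    # A friend holding Alice's key can consult her data at any time ...
    friend = Fernet(key)
    print(friend.decrypt(server.fetch("alice")))  # b'hometown=Innsbruck; employer=ACME'

    # ... while the provider only ever sees opaque ciphertext.
    print(server.fetch("alice")[:16], b"...")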
Based on an analysis of 45 OSNs using 260 criteria, Bonneau and Preibusch (2009) present a game-theoretic explanatory approach explaining the observed data and the so-called privacy communication game. The model focuses on an explanation of varying levels of privacy-related advertising within a single site, a consideration of heterogeneous privacy preferences in the user population, and the temporal dynamics of privacy concerns. Based on Westin's work (see, e.g. Westin, 2003), the authors differentiate three groups related to their privacy concerns: the marginally concerned, the pragmatic majority, and the privacy fundamentalists. A successful provider would play a game of minimising the awareness of privacy for the non-fundamentalists while simultaneously minimizing the concerns of the fundamentalists. Hence, based on a detailed analysis of user groups' behavioral options, Bonneau and Preibusch (2009) highlight an OSN provider's optimal strategy, at least for the groups of fundamentalists and non-fundamentalists, and relate it to their empirical evidence. Furthermore, the authors note a lack of accessible information for users and suggest (referring to Kelley et al., 2009) standardized privacy nutrition labels as a means of reducing this information asymmetry; such labels communicate privacy practices in a non-textual format and support users in making more informed privacy decisions and choices (Bonneau and Preibusch, 2009).
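To show what such a standardized label could look like in machine-readable form, the short Python sketch below defines a simple label structure and renders it as a fixed-width grid. The categories, field names and example values are hypothetical and do not reproduce the actual label design of Kelley et al. (2009); the point is merely that a fixed, comparable structure allows the practices of different OSNs to be scanned and compared without reading legal text.

    from dataclasses import dataclass

    @dataclass
    class PrivacyLabel:
        """A fixed-structure summary of one provider's data-handling practices."""
        site: str
        practices: dict  # data type -> dict of boolean practice flags

        def render(self) -> str:
            # Render as a fixed-width grid so labels of different sites line up.
            header = f"{'data type':<15}{'ads':<7}{'shared':<8}{'opt-out':<8}"
            rows = []
            for dtype, p in self.practices.items():
                rows.append(
                    f"{dtype:<15}"
                    f"{str(p['used_for_ads']):<7}"
                    f"{str(p['shared_with_third_parties']):<8}"
                    f"{str(p['opt_out']):<8}"
                )
            return "\n".join([f"Privacy label: {self.site}", header] + rows)

    # Hypothetical example values for an imaginary provider.
    label = PrivacyLabel(
        site="example-osn.com",
        practices={
            "profile data": {"used_for_ads": True, "shared_with_third_parties": True, "opt_out": False},
            "contact list": {"used_for_ads": True, "shared_with_third_parties": False, "opt_out": True},
            "location": {"used_for_ads": False, "shared_with_third_parties": False, "opt_out": True},
        },
    )
    print(label.render())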

Currently, OSN providers increasingly offer new personalized services and other related benefits for users. The presented aspects of individual privacy management on OSNs would support users' opportunities to consider personal information in the sense of a double-edged sword (a term used by Malhotra et al., 2004): offering benefits to those who want to (excessively) use them while simultaneously considering the needs of so-called privacy fundamentalists. One possibility to balance both sides could be the consideration of an individual's privacy calculus approach, for example in terms of a monthly or yearly user payment for an enhancement of individual data protection for a specific user group such as privacy fundamentalists, or other business model conceptions for the management of privacy on OSNs. And as Weiss (2007) notes, the viewpoint of privacy protection needs to consider users' decisions on a case-by-case basis, i.e. whether the person wants to provide a specific set of sensitive data about himself/herself, additionally answering the question of who should be allowed to access this information or not.

5. Conclusion
The contribution of this paper is threefold. First, it has presented a current literature review of OSN research highlighting diverse information privacy related issues. The analysis has suggested several main aspects: identity theft caused by profiling and data mining seems to be a current trend with exponential growth for advertising and other purposes; adults seem to be more concerned about potential privacy threats than younger users; and policy makers should be alarmed by the large share of users who underestimate the risks to their information privacy on OSNs. Second, the paper has presented information privacy definitions and frameworks from the background of diverse disciplines, mainly covering control theories and multidimensional theories of privacy. Especially the latter have shown the necessity to focus on an interrelationship of several approaches when trying to analyze privacy-related aspects for OSN users. In addition, increasingly ubiquitous environments may also require a change of thinking regarding the presented fair information practices in OSN research and on the part of OSN providers. Third, drawing on a more pragmatic level, several currently discussed suggestions and issues of (individual) privacy management in relation to OSNs have been presented, especially trying to consider the level of privacy a specific user desires to receive. Such approaches have potential for further research, perhaps also covering (in a probably more complex model) relevant third parties.

Especially the consideration and implementation of a so-called privacy calculus paradigm (for example expanding the work of Krasnova et al., 2009b) would offer new opportunities for further OSN research. Users would probably be willing to pay a specific amount to OSN providers for different privacy levels, for example for the non-usage of their data for specific purposes. Nevertheless, it has to be taken into consideration that presently most providers make money by selling users' data to third parties. A relevant issue for research could be to examine whether such revenues exceed the potential revenues from users paying for a (partial) non-usage of their data. In general, there is a need for further research toward validated measurement instruments addressing the specific character of OSNs and users' practices. Multidimensional privacy approaches in increasingly digital and pervasive environments have to be considered. Efforts on privacy-safeguarding requirements, including special functionalities for data protection mechanisms and self-control opportunities, should lead to more appropriate OSN usage, aiming at accountability and perhaps (partially) at a limitation of the data at stake. In addition, very little research has been done so far covering organizations' data protection and security vulnerability in connection with employees' OSN behavior. For example, increased activities in the field of economic espionage all over the world and the related monitoring of employees' data on OSNs should enhance efforts for research in the field of privacy between employers and employees in the corporate context.

One final remark: existing studies, mainly focusing on individual privacy-related behavior and privacy concerns, reinforce that privacy is a crucial topic in OSN research. To argue generally with Daniel Solove (2007), new technologies do not just enhance freedom but also alter the matrix of freedom and control in new and challenging ways. As far as we can see, things are in progress in both directions.
Notes
1. Available at: www.scribd.com/doc/3836535/Universal-Mccann-on-Social-Media (accessed 7 October 2010).
2. Available at: www.thinkequity.com/ (accessed 15 September 2010).
3. Available at: www.facebook.com/press/info.php?statistics (accessed 29 August 2010).
4. Electronic Privacy Information Center (EPIC), available at: http://epic.org/ (accessed 2 October 2010).

References
Acquisti, A. (2009), Nudging privacy: the behavioral economics of personal information, IEEE Security and Privacy, Vol. 7 No. 6, pp. 82-5.
Acquisti, A. and Gross, R. (2006), Imagined communities: awareness, information sharing, and privacy on the Facebook, in Golle, P. and Danezis, G. (Eds), Proceedings of the 6th Workshop on Privacy Enhancing Technologies, 28-30 June, Robinson College, Cambridge, UK.
Acquisti, A. and Gross, R. (2009a), Predicting Social Security numbers from public data, Proceedings of the National Academy of Sciences (PNAS), Vol. 106 No. 27, pp. 10975-80.
Acquisti, A. and Gross, R. (2009b), Social insecurity: the unintended consequences of identity fraud prevention policies, paper presented at the Workshop on the Economics of Information Security, 24-25 June, University College London, London.

Ameur, E., Gambs, S. and Ho, A. (2010), Towards a privacy-enhanced social networking site, Proceedings of the ARES International Conference on Availability, Reliability and Security, 15-18 February, Krakow.
Al Hasib, A. (2009), Threats of online social networks, IJCSNS International Journal of Computer Science and Network Security, Vol. 9 No. 11, pp. 288-93.
Albrechtslund, A. (2008), Online social networking as participatory surveillance, First Monday, Vol. 13 No. 3, available at: http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2142/1949 (accessed 5 October 2010).
Allen, A.L. (1988), Uneasy Access: Privacy for Women in a Free Society, Rowman & Littlefield, Totowa, NJ.
Altman, I. (1975), The Environment and Social Behaviour, Brooks/Cole, Monterey, CA.
Altman, I. (1977), Framework of Privacy Optimization, Brooks/Cole Publishing, Pacific Grove, CA (originally published in 1975).
Aricak, T., Siyahhan, S., Uzunhasanoglu, A., Saribeyoglu, S., Ciplak, S., Yismaz, N. and Memmedov, C. (2008), Cyberbullying among Turkish adolescents, CyberPsychology & Behavior, Vol. 11 No. 3, pp. 253-61.
Barnes, S.B. (2006), A privacy paradox: social networking in the United States, First Monday, Vol. 11 No. 9, available at: http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/1394/1312#b3 (accessed 3 November 2010).
Benn, S.I. (1971), Privacy, freedom and respect for persons, in Pennock, J.R. and Chapman, J.W. (Eds), Privacy (Nomos XIII), Atherton Press, New York, NY.
Berger, C.R. and Calabrese, R. (1975), Uncertainty Reduction Theory (URT), Taylor Graham Publishing, London.
Berman, J. and Bruening, P. (2001), Is privacy still possible in the twenty-first century?, Social Research, Vol. 68 No. 1, pp. 306-18.
Binder, J., Howes, A. and Sutcliffe, A. (2009), The problem of conflicting social spheres: effects of network structure on experienced tension in social network sites, in Greenberg, S., Hudson, S.E., Hinckley, K. and Morris, M.R. (Eds), Proceedings of the 27th Annual CHI Conference on Human Factors in Computing Systems, 4-9 April, ACM, Boston, MA.
Bok, S. (1982), Secrets: On the Ethics of Concealment and Revelation, Pantheon, New York, NY.
Bonneau, J. and Preibusch, S. (2009), The privacy jungle: on the market for data protection in social networks, paper presented at the Eighth Workshop on the Economics of Information Security (WEIS), 24-25 June, London.
Burgoon, J.K., Parrot, R., Lepoire, B.A., Kelley, D.L., Walther, J.B. and Perry, D. (1989), Maintaining and restoring privacy through communication in different types of relationship, Journal of Social and Personal Relationships, Vol. 6, pp. 131-58.
Casalo, L.V., Cisneros, J., Flavian, C. and Guinaliu, M. (2009), Determinants of success in open source software networks, Industrial Management & Data Systems, Vol. 109 No. 4, pp. 532-49.
Christofides, E., Muise, A. and Desmarais, S. (2009), Information disclosure and control on Facebook: are they two sides of the same coin or two different processes?, CyberPsychology & Behavior, Vol. 12 No. 3, pp. 341-5.
Culnan, M.J. and Armstrong, P.K. (1999), Information privacy concerns, procedural fairness, and impersonal trust: an empirical investigation, Organization Science, Vol. 10 No. 1, pp. 104-15.

DeCew, J. (1997), In Pursuit of Privacy: Law, Ethics, and the Rise of Technology, Cornell University Press, Ithaca, NY.
Ellison, N.B., Steinfield, C. and Lampe, C. (2007), The benefits of Facebook Friends: social capital and college students' use of online social network sites, Journal of Computer-Mediated Communication, Vol. 12 No. 4, available at: http://jcmc.indiana.edu/vol12/issue4/ellison.html (accessed 7 October 2010).
ENISA (2010), Online as Soon as it Happens, European Network and Information Security Agency (ENISA), Heraklion.
Etzioni, A. (2000), The Limits of Privacy, Basic Books, New York, NY.
Freund, P.A. (1971), Privacy: one concept or many?, in Pennock, J.R. and Chapman, J.W. (Eds), Privacy (Nomos XIII), Atherton Press, New York, NY.
Fried, C. (1968), Privacy, Yale Law Journal, Vol. 77, pp. 475-93.
Fuchs, C. (2009), Social Networking Sites and the Surveillance Society. A Critical Case Study of the Usage of studiVZ, Facebook, and MySpace by Students in Salzburg in the Context of Electronic Surveillance, Research Group Unified Theory of Information, Salzburg/Vienna, available at: http://fuchs.icts.sbg.ac.at/SNS_Surveillance_Fuchs.pdf (accessed 4 August 2010).
Fuchs, C. (2010), studiVZ: social networking in the surveillance society, Ethics and Information Technology, Vol. 12 No. 2, pp. 171-85.
Gavison, R. (1980), Privacy and the limits of law, Yale Law Journal, Vol. 89, pp. 421-71.
Goffman, E. (1959), Presentation of Self, Doubleday Anchor Books, Garden City, NY.
Gross, R., Acquisti, A. and Heinz, H.J. III (2005), Information revelation and privacy in online social networks (the Facebook case), in De Capitani di Vimercati, S. and Dingledine, R. (Eds), Proceedings of the 2005 ACM Workshop on Privacy in the Electronic Society (WPES), 5-7 November, ACM, Alexandria, VA.
Haddadi, H. and Hui, P. (2010), To add or not to add: privacy and social honeypots, Proceedings of ICC 2010: IEEE International Conference on Communications, 23-27 May, IEEE, Cape Town, South Africa.
Hinduja, S. and Patchin, J.W. (2008), Personal information of adolescents on the internet: a quantitative content analysis of MySpace, Journal of Adolescence, Vol. 31 No. 1, pp. 125-46.
Hope, A. (2007), Risk taking, boundary performance and intentional school internet misuse, Discourse: Studies in the Cultural Politics of Education, Vol. 28 No. 1, pp. 87-99.
Houghton, D.J. and Joinson, A.N. (2010), Privacy, social network sites, and social relations, Journal of Technology in Human Services, Vol. 2 No. 1, pp. 74-94.
Jagatic, T.N., Johnson, N.A., Jakobsson, M. and Menczer, F. (2007), Social phishing, Communications of the ACM, Vol. 50 No. 10, pp. 94-100.
Jeffries, W.B. and Szarek, J.L. (2010), Tag this article! Today's learners and the use of Web 2.0 in teaching, Molecular Interventions, Vol. 10 No. 2, pp. 60-4.
Karyda, M., Gritzalis, S., Park, J.H. and Kokolakis, S. (2009), Privacy and fair information practices in ubiquitous environments: research challenges and future directions, Internet Research, Vol. 19 No. 2, pp. 194-208.
Kelley, P.G., Bresee, J., Cranor, L.F. and Reeder, R.W. (2009), A nutrition label for privacy, Proceedings of the 5th Symposium on Usable Privacy and Security (SOUPS), 15-17 July, ACM, Mountain View, CA.

Kemp, R. and Moore, A.D. (2007), Privacy, Library Hi Tech, Vol. 25 No. 1, pp. 58-78.
Krasnova, H. and Veltri, N.F. (2010), Privacy calculus on social networking sites: explorative evidence from Germany and USA, Proceedings of the 43rd Hawaii International Conference on System Sciences (HICSS), 5-8 January, IEEE, Koloa, HI.
Krasnova, H., Hildebrand, T. and Gunther, O. (2009b), Investigating the value of privacy in online social networks: conjoint analysis, Paper 173, paper presented at the International Conference on Information Systems (ICIS), 15-18 December, Phoenix, AZ.
Krasnova, H., Gunther, O., Spiekermann, S. and Koroleva, K. (2009a), Privacy concerns and identity in online social networks, Identity in the Information Society, Vol. 2 No. 1, pp. 39-63.
Kraut, R., Kiesler, S., Boneva, B., Cummings, J., Helgeson, V. and Crawford, A. (2002), Internet paradox revisited, Journal of Social Issues, Vol. 58 No. 1, pp. 49-74.
Kraut, R., Patterson, M., Lundmark, V., Kiesler, S., Mukopadhyay, T. and Scherlis, W. (1998), Internet paradox: a social technology that reduces social involvement and psychological well-being?, American Psychologist, Vol. 53, pp. 1017-31.
Lawler, J.P. and Molluzzo, J.C. (2010), A study of the perceptions of students on privacy and security on social networking sites (SNS) on the internet, Journal of Information Systems Applied Research, Vol. 3 No. 12, available at: http://jisar.org/3/12/ (accessed 30 June 2010).
Lenhart, A., Purcell, K., Smith, A. and Zickuhr, K. (2010), Social Media & Mobile Internet Use Among Teens and Young Adults (Report of the Pew Internet & American Life Project), Pew Internet & American Life Project, available at: www.pewinternet.org/~/media//Files/Reports/2010/PIP_Social_Media_and_Young_Adults_Report.pdf (accessed 10 September 2010).
Livingstone, S. (2008), Taking risky opportunities in youthful content creation: teenagers' use of social networking sites for intimacy, privacy and self-expression, New Media & Society, Vol. 10 No. 3, pp. 393-411.
Livingstone, S. and Brake, D.R. (2010), On the rapid rise of social networking sites: new findings and policy implications, Children & Society, Vol. 24 No. 1, pp. 75-83.
Malhotra, N.K., Kim, S.S. and Agarwal, J. (2004), Internet Users' Information Privacy Concerns (IUIPC): the construct, the scale, and a causal model, Information Systems Research, Vol. 15 No. 4, pp. 336-55.
Michelfelder, D.P. (2001), The moral value of informational privacy in cyberspace, Ethics and Information Technology, Vol. 3 No. 2, pp. 129-35.
Milberg, S.J., Smith, H.J. and Burke, S.J. (2000), Information privacy: corporate management and national regulation, Organization Science, Vol. 11 No. 1, pp. 35-57.
Miller, A. (1971), The Assault on Privacy, Harvard University Press, Cambridge.
Moor, J.H. (1990), The ethics of privacy protection, Library Trends, Vol. 39 No. 12, pp. 69-82.
Moor, J.H. (1997), Towards a theory of privacy in the information age, Computers and Society, Vol. 27 No. 3, pp. 27-32.
Moore, A. (2003), Privacy: its meaning and value, American Philosophical Quarterly, Vol. 40, pp. 215-27.
NCC (2003), Survey of Information Security Policy and Practice 2004, National Computing Center (NCC), Manchester, in partnership with Ernst & Young, Computer Weekly and Information Risk Management (IRM).

Parker, L.S. (2002), Information(al) matters: bioethics and the boundaries of the public and the private, Social Philosophy and Policy, Vol. 19 No. 2, pp. 83-112.
Petronio, S. (2002), Theory of Communication Privacy Management (CPM), SUNY Press, Albany, NY.
Posner, R.A. (1998), Economic Analysis of Law, Little, Brown, Boston, MA.
Rachels, J. (1975), Why privacy is important, Philosophy and Public Affairs, Vol. 4 No. 4, pp. 323-33.
Reiman, J.H. (1976), Privacy, intimacy, and personhood, Philosophy & Public Affairs, Vol. 6, pp. 26-44.
Rose, E.A. (2006), An examination of the concern for information privacy in the New Zealand regulatory context, Information & Management, Vol. 43 No. 3, pp. 322-35.
Rosenblum, D. (2007), What anyone can know: the privacy risks of social networking sites, IEEE Security & Privacy, Vol. 5 No. 3, pp. 40-9.
Shoemaker, D.A. (2010), Self-exposure and exposure of the self: informational privacy and the presentation of identity, Ethics and Information Technology, Vol. 12 No. 1, pp. 3-15.
Smith, J.H., Milberg, S.J. and Burke, S.J. (1996), Information privacy: measuring individuals' concerns about organizational practices, MIS Quarterly, Vol. 20 No. 2, pp. 167-96.
Solove, D.J. (2002), Conceptualizing privacy, California Law Review, Vol. 90, pp. 1087-156.
Solove, D.J. (2007), The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Yale University Press, New Haven, CT.
Solove, D.J. (2008), Understanding Privacy, GWU Legal Studies Research Paper 420, Harvard University Press, pp. 1-24, available at: www.usdrinc.com/downloads/Privacy.pdf (accessed 5 April 2009).
Sophos (2010), Security Threat Report: 2010, Sophos Group, Boston, MA, available at: www.sophos.com/sophos/docs/eng/papers/sophos-security-threat-report-jan-2010-wpna.pdf (accessed 25 August 2010).
Stewart, K.A. and Segars, A.H. (2002), An empirical examination of the concern for information privacy instrument, Information Systems Research, Vol. 13 No. 1, pp. 36-49.
Strater, K. and Lipford, H.R. (2008), Strategies and struggles with privacy in an online social networking community, Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction, 1-5 September, British Computer Society, Liverpool.
Tavani, H. and Moor, J. (2001), Privacy protection, control of information, and privacy-enhancing technologies, ACM SIGCAS Computers and Society, Vol. 31 No. 1, pp. 6-11.
Tavani, H.T. (2007), Philosophical theories of privacy: implications for an adequate online privacy policy, Metaphilosophy, Vol. 38 No. 1, pp. 1-22.
Tow, W.N.-F.H., Dell, P. and Venable, J. (2010), Understanding information disclosure behaviour in Australian Facebook users, Journal of Information Technology Research, Vol. 25, pp. 126-36.
Tufekci, Z. (2008), Can you see me now? Audience and disclosure regulation in online social network sites, Bulletin of Science, Technology & Society, Vol. 28 No. 1, pp. 20-36.
Valkenburg, P.M. and Peter, J. (2009), Social consequences of the internet for adolescents: a decade of research, Current Directions in Psychological Science, Vol. 18 No. 1, pp. 1-5.

Villiers De, M.R. (2010), Academic use of a group on Facebook: initial findings and perceptions, Proceedings of the Informing Science & IT Education Conference (InSITE), 21-24 June, Cassino, Italy.
Vom Brocke, J., Richter, D. and Riemer, K. (2009), Motives for using social network sites (SNSs): an analysis of SNS adoption among students, Proceedings of the 22nd Bled eConference, 14-17 June, Bled, Slovenia.
Warren, S.D. and Brandeis, L.D. (1890), The right to privacy, Harvard Law Review, Vol. 4 No. 5, pp. 193-220.
Weiss, S. (2007), Online social networks and the need for new privacy research in information and communication technology, Proceedings of the Third International Summer School: The Future of Identity in the Information Society (IFIP, in cooperation with FIDIS Network of Excellence and HumanIT), 6-10 August, Karlstad.
Westin, A.F. (1967), Privacy and Freedom, Atheneum, New York, NY.
Westin, A.F. (2003), Social and political dimensions of privacy, Journal of Social Issues, Vol. 59 No. 2, pp. 431-53.
Young, A.L. and Quan-Haase, A. (2009), Information revelation and internet privacy concerns on social network sites: a case study of Facebook, Proceedings of the 4th International Conference on Communities & Technologies (C&T 09), 25-27 June, ACM, Pennsylvania, PA.

Corresponding author
Ulrike Hugl can be contacted at: ulrike.hugl@uibk.ac.at
