{"id":1171,"date":"2013-09-24T07:49:59","date_gmt":"2013-09-24T07:49:59","guid":{"rendered":"http:\/\/reframe.sussex.ac.uk\/activistmedia\/?p=1171"},"modified":"2017-01-24T23:12:47","modified_gmt":"2017-01-24T23:12:47","slug":"small-resistances-to-big-data-protecting-and-politicising-personal-data-trails","status":"publish","type":"post","link":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/2013\/09\/small-resistances-to-big-data-protecting-and-politicising-personal-data-trails\/","title":{"rendered":"Small resistances to Big Data: Protecting &#8211; and politicising &#8211; personal data trails"},"content":{"rendered":"<p><em><a href=\"http:\/\/www.sussex.ac.uk\/mfm\/internal\/people\/mediaandfilm\/person\/174167\">Tanya Kant<\/a> is a doctoral researcher in Media and Cultural Studies in the <a href=\"http:\/\/sussex.ac.uk\/mfm\/\">School of Media, Film and Music<\/a>\u00a0at the University of Sussex. Her current research focuses on <a href=\"http:\/\/problematisingpersonalization.wordpress.com\/\">personalisation<\/a> on popular online platforms. She is currently conducting interviews with users of tracker-blocking tools, especially Ghostery, about their experiences of data tracking, online privacy and tracker blocking. If you are interested in participating, please contact Tanya Kant directly via email (tk44[at]sussex.ac.uk) or <a href=\"https:\/\/twitter.com\/TanyaKant1\">Twitter<\/a>. This week, Tanya asks whether resistance to online data tracking is futile.<br \/>\n<\/em><\/p>\n<p>In his new book <i><a href=\"http:\/\/politybooks.com\/book.asp?ref=9780745662848\">Networks of Outrage and Hope<\/a><\/i>, Manuel Castells celebrates the Internet\u2019s capacity to bring together otherwise disconnected social subjects, creating a de-centralised, many-to-many network of communication that supports and encourages emerging social movements.
Like many other scholars and popular writers, Castells emphasises the communitarian opportunities for social-political action that the web affords. As a \u2018horizontal\u2019, deinstitutionalised form of communication technology, the Internet has been widely championed for its capacity to mobilise and publicise even the most marginal activist collectives, facilitating the uprising of citizens compelled to challenge, resist and even overthrow dominant societal norms and institutions (2012). Less dramatically, the Internet also opens up new spaces for collective activist debates \u2013 as evidenced by the existence of this blog.<\/p>\n<p>However, though the net\u2019s accommodation of socio-political activism is of course important, it is only one component of the multi-faceted, multi-functional plethora of platforms that most users utilise and enjoy. Platforms such as social networks, blogging sites or search engines can \u2013 and are \u2013 used to support activist movements and social change, but they also serve far more banal uses: personal messages between friends, day-to-day news and entertainment consumption, lifestyle applications and status updates that express everyday frustrations rather than stark national tensions.<\/p>\n<p>Yet, as recent revelations concerning both <a href=\"http:\/\/www.theguardian.com\/world\/nsa\">state<\/a> and <a href=\"http:\/\/www.huffingtonpost.com\/bob-cesca\/how-the-guardian-is-quiet_b_3923408.html\">commercial<\/a> surveillance have highlighted, even web users\u2019 most individuated, ephemeral web movements are being monitored and mined; our day-to-day online trajectories are harvested for data that is archived, profiled and used (amongst other things) to generate profit.
Furthermore, as this article will explore, there is a rising proliferation of \u2018self-help tools\u2019 available to users who are interested in protecting their personal data trail: tools that track the first- and third-party companies and organisations interested in following our online movements, and that block the data signals these parties use to track us. This begs the question: if our individual data trails are mined, tracked and used for profit \u2013 but can also be protected and controlled \u2013 should we be considering them as sites of resistance? As well as considering the web within the context of collective activist debates, should we be considering the protection of our everyday online movements as a form of individualised activism?<\/p>\n<p><b>The personal data economy<\/b><\/p>\n<p>At present, the amount of digital data that we collectively generate is staggering; as Viktor Mayer-Schonberger and Kenneth Cukier highlight, the globe\u2019s digital data stores double in size approximately every three years, and currently amount to around 1,200 exabytes (Mayer-Schonberger and Cukier, 2013: 9).\u00a0 Somewhat unsurprisingly then, a user\u2019s individual data trail \u2013 comprised of a number of identifying signals such as browsing histories, cookie aggregates, log-in details, IP address and other data \u2013 can be extensive, and can be tracked by a number of companies and other organisations interested in what that user is doing on the web.<\/p>\n<p>As some users are aware and as scholars such as Helen Nissenbaum (2010) emphasise, our online movements and articulations may be personal, but they are not necessarily private; data collection is an established practice for popular cost-free platforms like Facebook and Google.
After all, personal data is what keeps these services afloat \u2013 the new documentary <a href=\"http:\/\/tacma.net\/\">Terms and Conditions May Apply<\/a> states that every Google user is worth around $500 per year to the company. Google and other platforms persistently assert (in their Terms of Service and in <a href=\"http:\/\/www.independent.co.uk\/life-style\/gadgets-and-tech\/news\/google-gmail-users-cant-expect-privacy-when-sending-emails-8762280.html\">courts of law<\/a>) that though we may generate our data trails, we do not necessarily own them; the data is not just \u2018yours\u2019, but \u2018theirs\u2019 too.<\/p>\n<p>Furthermore, though many users are aware of the often ambiguous data collection strategies executed by these prominent first parties, it is the increasingly opaque data aggregation by third-party trackers \u2013 advertisers, marketing specialists, data analysts and audience research agencies \u2013 that remains relatively under-publicised. As Nick Nikiforakis et al (2013) and Jonathan Mayer (2011) highlight, these third-party trackers use a number of techniques, such as cookie aggregation and browser fingerprinting, to extract data from a user\u2019s web movements, despite the fact that the user may never have actually visited that third party\u2019s site. By using these relatively covert and legally ambiguous data mining techniques, third parties can build robust and coherent user profiles and datasets that can be sold to interested parties or used to generate revenue through targeted advertising and other marketing practices.<\/p>\n<p><b>Tracking the trackers: finding \u2018a window to the invisible web\u2019<\/b><\/p>\n<p>As Andrew McStay points out, the data tracking practices currently in operation are often ambiguous and nonnegotiable (2012).
Though users may be forced to accept the existence of identifying tags like cookies if they want to continue using their favourite sites, the idea that users truly \u2018consent\u2019 to data tracking is problematised by the striking lack of concise information, understanding and transparency needed for users to enter an informed agreement with their platform of choice. As a result, even if a user has supposedly \u2018consented\u2019, many data collection policies actively deny individuals the control or agency to effectively opt out of such data mining. He states that:<\/p>\n<p>\u2018Consent is not passive, but rather requires that people do something. This means that people must be informed and able to conceive an educated opinion so as to express will\u2026 To be devoid of understanding is to be unable to give proper consent\u2019 (2012: 600)<\/p>\n<p>Furthermore, as scholars such as Lev Manovich and Christian Fuchs argue, surplus value extraction from user-generated data, as well as the lack of user access to their own data files, can lead to user exploitation and the creation of a new type of \u2018digital divide\u2019 in the form of \u2018a hierarchy of \u201cdata-classes\u201d\u2019 (Manovich, 2011) in which disempowered users constitute a kind of digital \u2018lower class\u2019. These writers highlight the tensions \u2013 between user control and platform ownership \u2013 that surround our online data trails, revealing that even if users themselves consider their online trajectories to be apolitical, everyday and innocuous, the trail they leave behind has become an active site of contention.<\/p>\n<p>Fortunately for concerned users, there are a number of \u2018self-help tools\u2019 available to those interested in protecting and controlling their data trails.
Browser add-ons and software applications such as <a href=\"https:\/\/www.abine.com\/dntdetail.php\">DoNotTrackMe<\/a>, <a href=\"https:\/\/www.torproject.org\/\">Tor<\/a>, <a href=\"https:\/\/adblockplus.org\/en\/firefox\">AdBlock Plus<\/a> and <a href=\"https:\/\/www.ghostery.com\/\">Ghostery<\/a> provide a number of ways of blocking trackers or preventing the consequences of data mining. For example, privacy specialists Ghostery offer a downloadable browser extension that enables individual web users to detect and block over 1,200 listed trackers. As the Ghostery site states, the add-on is designed to help even the most technophobic user \u2018become a web detective\u2019 by equipping them with a free and easy-to-use tool that will alert them to any ad providers, data analysts, marketing companies and web publishers that are invisibly monitoring their visits to websites. Not only does the add-on detect unwelcome third parties, it can \u2018block\u2019 them too \u2013 preventing the tracker from following your data trail either on just the website you are viewing or across the web in its entirety. Tracker-blocking tools such as Ghostery help to rectify the lack of knowledge and control surrounding ambiguous and opaque data collection strategies, informing users about who is tracking them and empowering them with the tools to protect their data.<\/p>\n<p>However, as Nikiforakis et al (2013) and Mayer recognise, there are a number of problems with the self-help tools currently available that undermine their effectiveness. The most common and widely publicised issue is that no self-help tool can offer 100% protection against tracking. For example, some ad blockers stop advertising from appearing on the sites you visit but don\u2019t stop you from being tracked altogether, whilst other tools protect you from certain types of data mining (cookies) but not others (social plugins).
Another problem is that the use of some blockers may result in a loss of functionality on some sites. Most worrying, however, is Nikiforakis et al.\u2019s finding that some blocking practices actually help third-party trackers to identify users \u2013 they state:<\/p>\n<p>\u2018Over 800,000 users who are currently utilizing user-agent-spoofing extensions are more fingerprintable than users who do not attempt to hide their browser\u2019s identity, and challenge the advice given by prior research on the use of such extensions as a way of increasing one\u2019s privacy (2013).\u2019<\/p>\n<p>In other words, some attempts at resisting personal data tracking are ironically assisting the very companies using fingerprinting. All of which begs the question: what can be done to protect our data trails? Is resistance really futile?<\/p>\n<p><b>Taking back control: data trails as sites of resistance<\/b><\/p>\n<p>As already outlined, there are those who argue that if users enjoy cost-free services like search engines, entertainment sites, social media and video-hosting platforms, then users shouldn\u2019t be resisting tracking at all \u2013 it is user data, rather than user income, that funds these free-to-use services. This may be true, but given the invisible and ethically ambivalent tactics of tracking companies, it is understandable that users not only try to inform themselves about the complex tracking tactics currently in operation but also try to take back control of the data that they generate.
As scholars such as Andrew McStay and Sundar &amp; Marathe argue, ideas such as user consent, user agency and user control can only truly operate within a robust framework of user understanding \u2013 in other words, an individual needs to know what they are consenting to and who has control of their data if they are to be afforded any true sense of agency.<\/p>\n<p>Are self-help tools such as Ghostery enough to give users the power and understanding they need in order to take back control of their data trails? Despite the technological problems that Mayer highlights, such tools at least offer some kind of individualised resistance to data mining practices in which users are often involuntarily implicated. Resistances to the commercial data tracking industry \u2013 those responsible for personalising and shaping our web experience (for better or worse) \u2013 may be small, but they are not futile. Better understanding and control can equate to a greater sense of user agency \u2013 an attribute that, as Sundar &amp; Marathe argue, may not necessarily be detrimental to the cost-free web we all enjoy.<\/p>\n<p><b>References<\/b><\/p>\n<p>Castells, Manuel (2012) <i>Networks of Outrage and Hope: Social Movements in the Internet Age<\/i>, Cambridge: Polity Press<\/p>\n<p>Fuchs, Christian (2012) \u2018The political economy of privacy on Facebook\u2019, <i>Television and New Media<\/i>, 13 (2), 139-159<\/p>\n<p>McStay, Andrew (2012) \u2018I consent: an analysis of the Cookie Directive and its implications for UK behavioural advertising\u2019, <i>New Media &amp; Society<\/i>, 15 (4), 596-611<\/p>\n<p>Manovich, Lev (2011) \u2018<a href=\"http:\/\/www.manovich.net\/DOCS\/Manovich_trending_paper.pdf\">Trending: The Promises and the Challenges of Big Social Data<\/a>\u2019 (accessed 17th April 2013)<\/p>\n<p>Mayer, Jonathan (2011) \u2018<a href=\"http:\/\/cyberlaw.stanford.edu\/blog\/2011\/09\/tracking-trackers-self-help-tools\">Tracking the trackers: self-help tools<\/a>\u2019, <i>The Center for Internet and Society Blog<\/i> (accessed 15th September 2013)<\/p>\n<p>Mayer-Schonberger, Viktor and Cukier, Kenneth (2013) <i>Big Data: A Revolution that Will Transform How we Live, Work and Think<\/i>, London: John Murray<\/p>\n<p>Nikiforakis, Nick, Kapravelos, Alexandros, Joosen, Wouter, Kruegel, Christopher, Piessens, Frank and Vigna, Giovanni (2013) \u2018Cookieless Monster: Exploring the Ecosystem of Web-based Device Fingerprinting\u2019, <i>Proceedings of the 2013 IEEE Symposium on Security and Privacy<\/i>, 541-555<\/p>\n<p>Nissenbaum, Helen Fay (2010) <i>Privacy in context: technology, policy, and the integrity of social life<\/i>, Stanford: Stanford Law Press<\/p>\n<p>Sundar, Shyam S. &amp; Marathe, Sampada S.
(2010) \u2018Personalization versus Customization: The Importance of Agency, Privacy and Power Usage\u2019, <i>Human Communication Research<\/i>, 36, 298-322<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Tanya Kant is a doctoral researcher in Media and Cultural Studies in the School of Media, Film and Music\u00a0at the&#8230; <\/p>\n<div class=\"link-more\"><a href=\"https:\/\/reframe.sussex.ac.uk\/activistmedia\/2013\/09\/small-resistances-to-big-data-protecting-and-politicising-personal-data-trails\/\">Read More<\/a><\/div>\n","protected":false},"author":11,"featured_media":1181,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":true,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[48],"tags":[118,117,90,75],"class_list":["post-1171","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-research","tag-big-data","tag-personal","tag-power","tag-resistance"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/files\/2013\/09\/feat.jpg","jetpack_shortlink":"https:\/\/wp.me\/p2TCNp-iT","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/posts\/1171","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp
-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/users\/11"}],"replies":[{"embeddable":true,"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/comments?post=1171"}],"version-history":[{"count":11,"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/posts\/1171\/revisions"}],"predecessor-version":[{"id":1185,"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/posts\/1171\/revisions\/1185"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/media\/1181"}],"wp:attachment":[{"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/media?parent=1171"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/categories?post=1171"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/reframe.sussex.ac.uk\/activistmedia\/wp-json\/wp\/v2\/tags?post=1171"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}