Facebook Privacy in the Age of Graph Search

Facebook’s Graph Search extends longstanding debates over Facebook privacy.  According to Zoller (2013), “Graph Search allows users to search from the Facebook platform rather than the wider web, which is consistent with Facebook’s drive to make itself the centre of gravity on the web.”  The feature lets users search profiles by the interests users have listed and the pages they have liked.  Advocates of Graph Search argue that it is harmless because it exposes nothing that is not already publicly visible on Facebook.  That stance, however, overlooks cultural nuances, Facebook’s evolving privacy policies, and the company’s prior controversies.  These concerns show how problematic Graph Search can be in the age of big data.

One of the most troubling aspects of Graph Search, and of Facebook in general, is that it is an interface whose design reflects the culture from which it emerged.  It makes transparent topics and interests that are taboo, or even punishable, in other cultures.  Ingram (2013) illustrates this danger with example searches from the Actual Facebook Graph Searches Tumblr page: “‘Islamic men interested in men who live in Tehran, Iran’ (where homosexuality is a crime punishable by death) or ‘family members of people who live in China and like Falun Gong,’ the latter being a religious group whose members are routinely persecuted.”  The potential for governments, law enforcement agencies, or employers to conduct such searches is therefore very dangerous.

It is true that information designated as private will not appear in Graph Search results, and many of those who downplay the feature’s dangers rely on this point: if it is the user’s responsibility to protect private information, then little responsibility falls on Facebook.  This argument, however, disregards the well-publicized pattern of Facebook privacy policies changing in ways that require user action to preserve existing privacy settings.  Such changes are often confusing, and they can harm users who are not up to date on them or who simply do not use Facebook often; their data becomes vulnerable without their knowledge.  Even Mark Zuckerberg’s sister unknowingly posted a Zuckerberg family photo publicly when she believed it was completely private (Ingram, 2012).

Likewise, viewed from a historical perspective, Facebook has earned its users’ distrust.  The Beacon and Sponsored Stories controversies are both reasons to be skeptical of Graph Search.  In the Beacon controversy of 2007, a Facebook feature captured information about users’ transactions on other commercial websites and notified those users’ Facebook friends of the purchases.  In the Sponsored Stories controversy, which remains unresolved, Facebook appropriated users’ photos without their permission.  The ongoing Instagram controversy is also relevant: Facebook’s Instagram partnership allows advertisers to include user images in their advertisements, which many users find unsettling (Zoller, 2013).  All of these episodes support the view that, as Facebook rolls out new features, one should question what Facebook means by user privacy.

In turn, a major component of the argument supporting Graph Search is that users should simply be more careful on Facebook.  Even Tom Scott, the man behind the critical Actual Facebook Graph Searches Tumblr, advises users that for any interests they post, “If it’d be awkward if it was put on a screen in Times Square, don’t put it on Facebook” (Neely, 2013).  This sentiment, however, ignores how much data users have accumulated over their years on Facebook.  Most have been members long enough that they no longer remember the early interests they mentioned or “likes” they made, which could be quite detrimental in the present (Neely, 2013).  Privacy expert Adi Kamdar, affiliated with the Electronic Frontier Foundation (EFF), poses an example:

[S]omeone may not remember that she ‘liked’ the ‘Samsung Mobile’ page back in college, but now people can search for ‘People who work at Apple, Inc. who like Samsung Mobile,’ which could lead to a heavy dose of awkward (Neely, 2013).

Even seemingly innocuous interests, lost amongst users’ accumulated data over the years, could thus have serious repercussions.  This is where big data becomes a big issue with Graph Search, especially once cultural differences, changing privacy policies, and prior privacy controversies are taken into account.  After all, extending the previous example, were Apple superiors to conduct such a search and find similar results, the consequences could be something more than just “a heavy dose of awkward.”


Works Cited

Ingram, M. (2013, Jan. 24).  You can’t hide from Facebook Graph Search.  Retrieved from http://www.businessweek.com/articles/2013-01-24/you-cant-hide-from-facebook-graph-search.

Ingram, M. (2012, Dec. 26).  A valuable lesson from Randi Zuckerberg: Online privacy is complicated.  Retrieved from http://gigaom.com/2012/12/26/a-valuable-lesson-from-randi-zuckerberg-online-privacy-is-complicated/.

Neely, J. (2013, Feb. 21).  Controversy swarms around Facebook Graph Search.  Retrieved from http://www.westhost.com/blog/2013/02/21/controversy-swarms-around-facebook-graph-search/.

Zoller, E. (2013, Jan. 18).  Facebook’s Graph Search puts user privacy back in the spotlight.  Retrieved from http://www.guardian.co.uk/media-network/media-network-blog/2013/jan/18/facebook-graph-web-search-privacy.

Ned Prutzer

Ned Prutzer is a former CCT Graduate Student.