Online privacy and transparency – between desire and need

23 Mar 2016
This is a reblog from a LinkedIn post written by Blanca Mayayo, DTL Program Manager and Policy Team member. Originally published on Feb. 22, 2016

Some weeks ago, the Federal Trade Commission made its position on consumer privacy clear during AdExchanger’s Industry Preview, expressing surprise at the lack of motivation from the ad-tech industry to offer consumers more tools to protect their privacy. As some recent research indicates, the increased usage of ad blockers may be due not only to a deteriorating user experience, but also to a lack of trust driven by the relentless and obscure forms of tracking that can come with advertising.

This is yet another example of the human tendency to ignore potentially dangerous situations until they become an imperative need. The desire is turning into a need: the need to preserve users’ trust and, for entire businesses, to survive.


Beyond Privacy

Indeed, one of the main challenges around big data is privacy, but as automated decision-making processes become ever more embedded in our lives, the span of possible risks grows more complex and difficult to tackle. Even governments have already acknowledged that big data technologies can cause “societal harms beyond damages to privacy”.

Algorithms are being used across a huge array of areas of everyday life, to the extent that they shape what we see or what we are eligible for. Even more concerning than their ubiquity is how invisible their day-to-day operations are. People have very little insight into what data is fed into these coding structures, or how it affects decision-making. Moreover, negative effects, such as discrimination, can be amplified by algorithms even in the absence of explicit discriminatory intent.

Although data protection limits the undefined and unaccountable use of data, transparency as currently set forth is not an enforceable right, or at least one whose violation requires enormous technical investigation to demonstrate and evidence. Since transparency and the right to fair processing of data are already covered as principles in EU data protection regulation, they should make the journey from desire to need in order to achieve sustainability.

The discussion on how this algorithmic transparency for big data can be achieved, at least to the extent that accountability is guaranteed, still lies ahead, and it goes well beyond privacy-related issues alone. In any case, it needs to be acknowledged that transparency will soon no longer be just a desirable way to “improve customer engagement and experience”, but a real need: to regain users’ trust and to strike the right balance of access, content and privacy for each individual.