Prevailing models of information regulation which address data protection and privacy are overwhelmingly informed by individual-centric protections and notions of contractual freedom. This is typified by data-protection frameworks which attempt to give individuals agency over their choices after providing adequate notice of the implications of those choices – known as the ‘notice and consent’ framework. Even as this mode of expressing privacy choices prevails, it has become abundantly clear that while individuals and communities vouch for greater privacy protections in the realms of law and policy, the choices expressed in their interactions with technical design, particularly on the internet, are often contradictory. This post briefly explains this so-called ‘privacy paradox’ in the context of the design of online technologies.
Contractual models of notice and consent centre on the legal fiction of the rational, reasonable man, privileging certain assumptions about how people respond cognitively to available information about privacy choices. Yet this model has been recognised as deeply flawed for multiple reasons, including critiques of its assumptions about cognition and rationality – particularly criticisms of the manner in which our behaviour and choices are constrained by the design of pervasive technologies. The environments of online platforms afford certain kinds of behaviour as possible or desirable; these designs include user interfaces, the most common means by which users interact with online platforms.
Research on the design of social media technologies and their effect on user behaviour has begun to reveal how the notice and consent framework overlooks patterns of design meant to manipulate user behaviour in ways which are neither immediately apparent nor responsive to user ‘choice’. Such designs, termed ‘dark patterns’, are pervasive, and include pop-ups which obscure available choices, disabled back buttons, and screen-blockers intended to get users to agree to hidden terms. Dark patterns have been described as “interface design choices that benefit an online service by coercing, steering, or deceiving users into making decisions that, if fully informed and capable of selecting alternatives, they might not make.”
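To make the abstraction concrete, the steering described above can be sketched as code. This is a purely hypothetical illustration (the dialogs, vendor counts, and the `effortAsymmetry` measure are this sketch's own assumptions, not drawn from any cited study or regulation): each consent dialog is modelled as the sequence of clicks needed to accept or to refuse, so the asymmetry of effort between the two outcomes becomes measurable.

```javascript
// Hypothetical sketch: contrasting a neutral consent dialog with a
// "dark pattern" dialog by modelling each outcome as a list of user actions.

const neutralDialog = {
  accept: ["click 'Accept'"],
  reject: ["click 'Reject'"],
  trackingPreChecked: false,
};

const darkPatternDialog = {
  // Accepting is a single prominent button...
  accept: ["click 'Accept all'"],
  // ...while refusing is buried behind nested screens ("obstruction").
  reject: [
    "click 'Manage options'",
    "click 'See vendor list'",
    "untick each pre-checked vendor",
    "click 'Confirm choices'",
  ],
  trackingPreChecked: true, // consent is assumed by default
};

// One crude way to quantify steering: how many extra steps does
// refusing cost compared with accepting?
function effortAsymmetry(dialog) {
  return dialog.reject.length - dialog.accept.length;
}

console.log(effortAsymmetry(neutralDialog));     // both choices equally easy
console.log(effortAsymmetry(darkPatternDialog)); // refusing costs extra steps
```

On this toy measure the neutral dialog scores 0 while the manipulative one scores 3, which is the intuition behind the critique of ‘notice and consent’: the notice may be formally present, but the interface makes one ‘choice’ materially cheaper than the other.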
The ubiquity and influence of design, and particularly manipulative design like dark patterns, have important implications for privacy regulation and law. First, it requires us to question the assumptions and the limits of the ‘notice and consent’ model: what are the limits of informed consent in the context of online user interfaces and forms of manipulation which are not apparent? While the current draft of India’s Personal Data Protection Bill sets an important standard of meaningful and informed consent, translating that standard into everyday user expressions of privacy requires delving deeper than privacy policies and terms of service, and into user interface (UI) and user experience (UX) design. Second, privacy law and data protection regulation must attempt structural regulation of the design of online technologies to protect individuals and groups from the harms of dark patterns and manipulative design.
Our upcoming workshop and publication on ‘The Philosophy and Law of Information Regulation in India’ will feature research on social media design and the privacy paradox which speaks to data protection regulation and technological design in India.