From Zero-Sum to Win-Win: Commercialization and Protection of Consumer Privacy
By Bonnie Liu
Consumer Privacy: Foremost Asset 4.0
Throughout history, core resources have assumed different forms. In nomadic societies, the foremost asset was movable property. After the agricultural revolution, immovable property became most valuable; after the industrial revolution, financial and human capital rose to prominence. In the current digital revolution, information is power. Under the reign of Web 2.0, one component of the Big Data deluge and of information crowdsourcing is indispensable yet easily overlooked: consumers are the public producers of a most important asset, personal data.
Personal information possesses both final, intrinsic value as privacy and intermediate, instrumental value for commercial use; exploiting its commercial value often diminishes its private utility. To leverage that value, companies gather user attributes, track visitor activity, and record transaction details for targeted advertising and data trade, creating decentralized, multiplayer, and multipurpose market ecosystems. Corporate data intermediaries offer free services to users in exchange for personal data, and furnish advertising space to businesses in return for revenue. Data aggregators procure, process, and sell information. Consumers exchange personal data for services, purchase privacy-protection products like identity theft insurance, or explicitly sell data to “personal data vault” firms.
Privacy: Worthless or Priceless?
Within the neoclassical economic framework, complete information (accessibility of all relevant information to all relevant market participants) yields a perfectly competitive market and maximizes efficiency. Chicago School pioneer Richard Posner argues that protecting privacy, understood as the concealment of information, creates market inefficiencies by restricting access to useful information and impeding fully informed decision-making. In other words, privacy is a zero-sum game in which protecting or exploiting information benefits one party while imposing costs on the other. Specifically, privacy protection merely transfers costs from individuals to corporations, increasing consumers’ private utility while decreasing the profitability of corporate advertising.
Other economists deem the initial allocation of rights irrelevant. The Coase theorem suggests that in a competitive market with low transaction costs, parties will reach a Pareto-efficient outcome concerning an externality. Regardless of initial data ownership, consumers and businesses could negotiate mutually beneficial agreements that yield the optimal level of privacy. The economic agent that values privacy more will rationally expend more to secure ownership: even absent regulation, privacy-concerned consumers would achieve propertization and protection of data through marketplace interactions such as forgoing privacy-invasive services and purchasing privacy-protective ones.
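The bargaining logic above can be made concrete with a toy model; the dollar valuations below are invented purely for illustration, and negotiation is assumed costless, as the theorem requires.

```python
# Toy illustration of the Coase theorem applied to a data right.
# Valuations are hypothetical; bargaining is assumed costless.
def final_holder(consumer_value, firm_value, initial_owner):
    """Whoever values the right less sells it (at any price between
    the two valuations), so the initial owner does not affect the
    final, efficient allocation."""
    return "consumer" if consumer_value > firm_value else "firm"

# A privacy-concerned consumer ($10) outbids the firm ($6)
# under either starting allocation:
print(final_holder(10, 6, initial_owner="firm"))      # consumer
print(final_holder(10, 6, initial_owner="consumer"))  # consumer
```

The sketch also shows the theorem’s limit: once transaction costs, information asymmetry, or consumer irrationality enter, the initial allocation of the right starts to matter, which is the critique developed below.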
However, the Chicago School privacy model, conceived before modern information and communication technologies, is becoming obsolete. It assumes several prerequisites, including information symmetry among transacting parties, a universal ability to establish and maintain property rights, and the absence of market failure, many of which go unfulfilled. Foremost, the model undermines its own plausibility by reducing complex phenomena to economic transactions and dismissing human factors like the irrationality inherent in consumer decision-making. More nuanced and granular theories have since developed.
Meanwhile, firms excuse intrusive data collection and unlicensed data distribution with pretexts like “tailoring customized services” and “catering to personalized preferences.” To the contrary, 66 percent of Americans do not want tailored advertisements, and 86 percent object once informed that the targeting arises from behavior tracking. Disregarding consumer interests, companies often abuse customer confidentiality for objectionable purposes, subjecting individuals to harmful practices like price discrimination (through which corporations capture consumer surplus), identity theft, unsolicited marketing, and fraud. Privacy violations not only trigger psychological discomfort but also infringe on consumers’ right to personal and property safety (when unauthorized release of sensitive, personally identifiable information such as address or income leads to fraud or assault) and the arguable right not to be disturbed.
Optimal privacy requires balancing qualitative justice against quantitative efficiency, safeguarding individual rights against maximizing general utility. Between disclosure and reservation, between exploiting and protecting consumer data, individuals and organizations face myriad trade-offs and opportunity costs, some tangible (like advertised discounts) and others intangible (like missed social connections).
Causes: Uninformed Consent
As digital derivations of physical shrink-wrap agreements, internet platforms have adopted browse-wrap agreements, in which hyperlinks to privacy policies (PPs) and terms of service (TOSs) sit at the bottom of web pages, as well as more conspicuous clickwrap agreements, which require users to click a checkbox affirming their agreement to the accompanying conditions. Although legally binding, neither type of end-user license agreement (EULA) truly elicits informed and free consent.

EULAs’ intolerable length and obscurity have conditioned consumers to disregard their significance and accept them habitually. Some companies’ small print runs to 30,000 words, approximately the length of a short novel. Reading the privacy agreements a typical American encounters in a year would consume 76 work days, a hypothetical national opportunity cost of 781 billion dollars. Additionally, 43 percent of those who skip reading TOSs perceive them as tedious or abstruse. Some EULAs sacrifice legibility for conspicuousness: Apple’s terms and conditions begin each paragraph with all capital letters. Under the Uniform Commercial Code, text that disclaims implied warranties, imposes restrictions on consumers, or increases their responsibilities must be made conspicuous; however, no federal regulation ordains legibility, resulting in difficult-to-decipher fonts. Treating EULAs as a nuisance, consumers instinctively ignore them and proceed to pursue their ends, unencumbered by the means. Information asymmetries arise when consumers fail to understand privacy practices, or overestimate or underestimate companies’ privacy protection.
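The reading-cost figures can be sanity-checked with back-of-envelope arithmetic. Only the 76 work days and the 781 billion dollars come from the text; the 8-hour work day and the roughly 221 million U.S. internet users are illustrative assumptions.

```python
# Back-of-envelope check on the EULA reading-cost estimate.
# Given (from the text): 76 work days/person/year, $781 billion nationally.
# Assumed (not from the text): 8-hour work days, ~221M U.S. internet users.
WORK_DAYS_PER_YEAR = 76
HOURS_PER_WORK_DAY = 8
US_INTERNET_USERS = 221e6
NATIONAL_COST_USD = 781e9

hours_per_user = WORK_DAYS_PER_YEAR * HOURS_PER_WORK_DAY    # 608 hours
cost_per_user = NATIONAL_COST_USD / US_INTERNET_USERS       # ~$3,534/year
implied_value_per_hour = cost_per_user / hours_per_user     # ~$5.81/hour

print(f"{hours_per_user} h/user/year -> ${cost_per_user:,.0f}/user, "
      f"${implied_value_per_hour:.2f}/h implied")
```

The implied hourly valuation is modest only because the national total is spread across every internet user; the point is simply that the headline figures cohere under some such assumptions.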
Consequently, as a 2017 survey discovered, 91 percent of respondents admit to not reading terms and conditions before proceeding. Of those who do browse TOSs, only 17 percent report understanding them. Ninety-eight percent of experimental subjects in one study overlooked “gotcha clauses” in the fictional service NameDrop’s TOS, agreeing to forgo their first-born child as payment and to share data with the National Security Agency. In fact, the ubiquitous prompt “I have read and agree to the terms and conditions” may be the biggest lie on the internet, further calling into question the legality of such agreements. Arguably, rescission should be permitted, given the preponderant absence of informed, free consent.
Ubiquitous EULAs employ visual nudging and train even privacy-conscious users to click “accept” whenever presented with a coercive interception dialog reminiscent of a EULA. A study of 80,000 participants found that users are more likely to consent when presented with imperative demands and highlighted “Agree” buttons than with polite requests suggesting a voluntary decision. This counterproductive conditioning thwarts the intentions of informed consent.
Causes: Lack of Contractual Freedom and Fairness
Legality is necessary but alone insufficient for valid contracts; unfair standard-form policies lack the genuine spirit of both contractual freedom and fairness, rendering them illegitimate. Owing to monopolistic or oligopolistic market structures and a lack of substitute goods, dependent consumers must “take it or leave it”: accept unfair standard-form contracts or forgo the service. Just as neither Hobson’s choice nor Sophie’s choice is free, consumers lack authentic freedom of choice in a dilemma with no desirable alternatives. The current all-or-nothing dichotomy allows no possibility of compromise or partial consent. To accommodate varying attitudes toward privacy erosion, businesses should offer graduated levels of data collection with correspondingly graduated services.
Since unfair standard-form contracts are unilaterally drafted by firms to empower themselves and exculpate themselves from liability, such contracts lose the bi-directional nature of mutuality and fairness. As a result of business-to-consumer power imbalance, information asymmetry, and financial disparity, consumers endure a comparative disadvantage, with little to no bargaining power when confronting corporate giants. The free-rider collective-action problem additionally plagues consumers: coaxed by blind optimism that other victims will litigate, consumers delay or abandon individual litigation against companies.
The ease of capturing, analyzing, replicating, and disseminating personal information across the internet leaves regulatory inadequacies rampant. Rather than a binary contrast, a spectrum of possibilities ranges from market self-regulation to government regulation, each with heterogeneous and context-dependent implications. Bottom-up consumer self-protection, horizontal corporate self-governance, and top-down government regulation must operate in conjunction, producing a dynamic, market-specific, and consumer-friendly approach.
Consumer Self-Protection
The emergence of privacy-protective “hiding” technologies to combat privacy-invasive “mining” technologies ostensibly presents a major opportunity for consumer self-protection. For instance, in response to e-commerce merchants employing cookies to track browsing and purchases, consumers can delete cookies, use private browsing, and pay anonymously. However, the efficacy of self-protection is illusory for multiple reasons. Primarily, while privacy-diminishing services are integral to quotidian communication, privacy-enhancing services require additional user knowledge, commitment, and monetary input, limiting their potency, especially for vulnerable groups. Consumers realistically lack awareness of tracking mechanisms and privacy trade-offs, as well as the technical sophistication necessary for self-protection.
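To make the tracking mechanism concrete: a cookie is a small key-value record a merchant’s server sets via the Set-Cookie response header, which the browser then echoes back on every subsequent visit. Python’s standard library can parse one; the header value below is a made-up example.

```python
# Parsing a hypothetical tracking cookie with the standard library.
from http.cookies import SimpleCookie

jar = SimpleCookie()
jar.load("visitor_id=abc123; Path=/; Max-Age=31536000")  # ~1 year lifetime

print(jar["visitor_id"].value)       # the identifier tied to the browser
print(jar["visitor_id"]["max-age"])  # how long the tracker persists
# "Deleting cookies" simply empties this store, severing the identifier.
```

The Max-Age attribute is why occasional cookie deletion matters: absent it, a single identifier can link a year or more of browsing to one visitor.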
Furthermore, the privacy paradox describes a stark disparity between consumers’ concerned attitudes and insouciant behavior. According to a National Cyber Security Alliance study, more Americans are concerned about their data privacy than about losing their main source of income, the former surpassing the latter by 11 percentage points. Yet this vehement apprehension seldom materializes in action, as demonstrated by the adoption of privacy-enhancing technologies lagging behind that of sharing technologies. For instance, despite unease with email providers’ automated content analysis, 65 percent of users refused to pay for alternatives. Another study revealed, shockingly, that over 30 percent of respondents gave up work-related account passwords for a chocolate bar. Interestingly, while consumers claim that privacy supersedes monetary concerns, the reverse may hold in practice. While this inconsistency underscores the paucity of self-protection, consumers should not be dismissed as myopic: they routinely perform privacy cost-benefit analyses and opt to disclose or publicize data according to differing priorities.
Corporate Self-Governance
Meanwhile, corporations must engage in self-regulation and upgrade business models from capital-centered to human-centered. A paradigm shift from the traditional profit-driven shareholder theory to the sustainable integrity-driven stakeholder theory is indispensable. The current parasitic relationship between data brokers and holders on one side and data subjects on the other must be transformed, ideally into mutualism, or at least commensalism.
Motivated either by self-interest (when adverse consumer responses endanger corporate reputation or advertising revenue) or, ideally, by corporate social responsibility, businesses must shoulder a Kantian duty of beneficence and a duty of care toward consumers, self-impose higher moral standards, implement consumer-friendly PPs, and refrain from unauthorized commercialization of private information. From a technical standpoint, corporations must eliminate visual and verbal coercion from user-interface designs and delineate narrower terms in less obscure language, increasing the accessibility and transparency of EULAs and promoting informed consent.
Government Regulation
Since market-driven solutions alone cannot achieve optimal privacy, the government’s visible hand must intervene where the market’s invisible hand fails. A Pew Research Center survey found that 68 percent of adults believe current privacy protection laws are insufficient. Consumer rights centered on the right to privacy deserve greater emphasis and should be broadened to include the rights to safety, information, freedom of choice, fair transaction, being heard, and redress. Recently, the European Union’s General Data Protection Regulation has codified the “right to be forgotten,” encompassing the right to erase personal information posted by users themselves, reposted by other users, and any information containing references to users. With the right to be forgotten, consumers, reassured and equipped to withdraw information or seek judicial remedy if necessary, will more likely share data with firms, benefiting both parties. This right remains contentious in the diametrically opposed U.S., which considers deletion an encroachment upon freedom of speech and traditionally values the preservation of historical records.
While U.S. legislators perform utilitarian calculations concerning data protection, their European counterparts treat privacy as a fundamental human right. These different attitudes translate into different regulatory policies. With a limited, sectoral approach shaped by pervasive corporate lobbying, the U.S. often advises rather than enforces, as in the Federal Trade Commission’s 2012 Do-Not-Track proposal recommending opt-in and opt-out mechanisms. By contrast, with a stringent and universal stance enabled by greater governmental independence and impartiality, the European Union passes omnibus data protection laws, exemplified by the 1995 Data Protection Directive and the 2018 General Data Protection Regulation. The U.S.’s decentralized, ad-hoc protection relies on state initiatives, while the EU’s centralized, general protection coordinates international data flows, enhancing its effectiveness. Lest it lag behind fellow members of the Organization for Economic Cooperation and Development (OECD), the U.S. must implement the OECD’s Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, including the principles of collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability.
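The Do-Not-Track mechanism is technically trivial, which underlines why an advisory approach has limited force: the browser merely attaches a one-character “DNT: 1” request header, and honoring it is left entirely to the server. A minimal sketch using the standard library (the URL is a placeholder; no request is actually sent):

```python
# Attaching the Do-Not-Track header to an HTTP request.
# Under the advisory U.S. approach, servers may simply ignore it.
from urllib.request import Request

req = Request("https://example.com/", headers={"DNT": "1"})
print(req.get_header("Dnt"))  # urllib capitalizes header names; prints "1"
```

Because nothing in the protocol enforces compliance, the header’s effect depends wholly on server-side policy, which is the gap omnibus regulation like the GDPR is designed to close.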