Unintended Consequences: The High Costs of Data Privacy Laws

July 19, 2022 | Topic: Big Tech | Region: North America | Blog Brand: Techland | Tags: Big Tech, Big Data, Data Privacy, Congress, FTC

The American Data Privacy and Protection Act will pass costs on to consumers and undermine market competition.

Democrats and some Republicans have swung behind bills to curtail “Big Tech” giants like Amazon, Google, and Facebook that depend on sensitive data from millions of Americans to support their businesses and market dominance. The latest example is the bipartisan draft of the American Data Privacy and Protection Act (ADPPA) before the House Committee on Energy and Commerce. It would set new federal data privacy standards enabling users to request correction or deletion of their data and requiring their explicit consent for data use or transfers. It would also require companies to minimize the data they collect and to be more transparent about their algorithms, and it would ban practices like advertising targeted at minors.

These ideas sound appealing, and the bill contains some good policies. However, we can’t have this debate without acknowledging the compliance costs for all digital, data-dependent businesses, including small, innovative startups and firms that present no real policy challenges.

Theoretically, the ADPPA would promote a more competitive marketplace by holding the largest big data companies accountable for perceived evils. These include the sale of personal data to problematic buyers (revealed in scandals like Cambridge Analytica), breaches that have compromised data security, and “predatory” commercial practices like Amazon’s alleged use of sensitive commercial data from sellers on its platform to benefit its own retail business.

In practice, however, compliance costs will be passed on to consumers through higher prices and reduced choices, and those costs will likely exceed the benefits. Furthermore, the international experience with similar regulation—the European Union’s General Data Protection Regulation (GDPR)—shows that the same tech giants the bill is aimed at will be best equipped to weather these costs at the expense of smaller players.

Following the GDPR’s enactment, Google’s European digital advertising market share grew to 95 percent. Other companies struggled to recruit a new bureaucracy of compliance staff and lawyers, incurring substantial additional costs. And while Facebook expanded its compliance staffing rapidly, half of the digital businesses in the European Union and over 70 percent of those in the United Kingdom expected to be non-compliant when the GDPR took effect. Many tech startups either stopped selling to Europeans or shut down permanently.

The ADPPA has additional serious flaws. It does not distinguish between the responsibilities of covered entities that “process” data and those that merely “control” or hold it. Instead, different types of data-holding entities must bear the same regulatory burdens even when the distinction makes no difference to consumers. And while the bill rightly applies less stringent requirements to smaller entities, given the reduced volume and scale of the data they hold and use, this approach can also discourage growth by deterring tech startups from expanding and by lowering a startup’s equity value once it crosses an arbitrary size threshold.

Most concerningly, the bill’s requirement that large entities divulge information about algorithms that could be protected trade secrets—and conduct annual assessments based on the vague and subjective notion of the harms these could cause—is likely to deter the development of technology that transforms human lives and serves us better. 

By failing to specify concrete examples of algorithmic “harm,” the bill leaves businesses at the mercy of unelected bureaucrats at the Federal Trade Commission who could define these harms in terms as broad as exposure to content or views they subjectively deem offensive or objectionable. Given the experience of the past few years, do we really want federal bureaucrats, armed with only the vaguest guidance, to put more pressure on Facebook and Twitter to suppress speech that sometimes turns out to be accurate?

The bill also allows for a “private right of action,” meaning aggrieved users can sue for damages for violations, plus lawyers’ fees. This may be sensible in some cases but could subject compliant businesses to costly and time-consuming class action litigation that they would be forced to settle. Since such suits can only proceed if federal regulators or state attorneys general have declined to take action, it is likely that most of these private suits will be frivolous. Businesses will be forced to divert resources away from their customers and will be put at a disadvantage against foreign competitors, like data-driven firms backed by the Chinese communist regime, which do not face comparable burdens and costs at home.

Rather than permitting a blanket “opt-in” consent covering the use of user data to develop or calibrate new products and services, as the GDPR allows, the ADPPA would require consent to be obtained separately for each specific new product or service. This could make it prohibitively costly for firms to develop novel welfare-enhancing applications using data those companies already hold with permission.

What problem is a barrage of forced online opt-in consent pop-ups meant to solve? Customers are already taking the matter seriously. Consider Google’s Android, which already faces intense competition in the smartphone operating system market from Apple, or DuckDuckGo, Google’s competitor in the search engine market, which grew its user base by nearly 50 percent last year alone. Both Apple and DuckDuckGo cater to customers who value their privacy and want to reduce the data they hand over.

In contrast, consent pop-ups undermine user experiences while discouraging users from parting with data they would otherwise have been happy to make available. This “data minimization” reduces the ability of smaller companies to grow the legitimate databases and network effects they need to challenge Amazon, Google, and other giants that policymakers are so concerned about.

The ADPPA does have a few potentially positive features that target clearly harmful conduct in concrete ways, and which could enhance transparency without compromising intellectual property. For example, the bill’s federal data-broker registry would promote transparency in how third-party data exchange is handled, and its prohibition on the transfer of “non-consensual intimate images” is a welcome development. Care would have to be taken, however, to ensure that the latter prohibition does not force companies to scan the private communications of innocent users.

Overall, the ADPPA would better achieve its stated aims, while limiting the burdens imposed on well-meaning businesses and users, if it singled out specific forms of clear algorithmic harm while allowing companies to maintain trade secrets, broadened the exceptions to its “opt-in” consent requirements to bring them more in line with the GDPR, and abolished the “private right of action” that only exposes Big Tech’s competitors to the risk of crippling lawfare.

Even such a redrafted bill, however, could have unanticipated consequences for economic efficiency and innovation and should be subjected to close scrutiny. The Hippocratic Oath, “first do no harm,” should apply to digital markets that have conferred huge welfare benefits on society. The “cure” should not be worse than the ailment.

Alden Abbott is a senior research fellow with the Mercatus Center at George Mason University and a former general counsel of the Federal Trade Commission. Satya Marar is a Washington, DC-based tech policy professional, a foreign-trained lawyer, and an MA fellow at the Mercatus Center at George Mason University.

Image: Reuters.