A Legal Framework for the "Information Apocalypse"
In 2009, a CNN article noted that the law is "at least five years behind technology as it is developing."
In late 2016, Aviv Ovadya was one of the first people to see that there was something fundamentally wrong with the internet. A few weeks before the 2016 election, he presented his concerns to technologists in San Francisco’s Bay Area and warned of an impending crisis of misinformation in a presentation he titled “Infocalypse.”
Ovadya saw early what many — including lawmakers, journalists, and Big Tech CEOs — wouldn’t grasp until months later: Our platformed and algorithmically optimized world is vulnerable — to propaganda, to misinformation, to dark targeted advertising from foreign governments — so much so that it threatens to undermine a cornerstone of human discourse: the credibility of fact.
Ovadya — now the chief technologist for the University of Michigan’s Center for Social Media Responsibility and a Knight News innovation fellow at the Tow Center for Digital Journalism at Columbia — says the shock and ongoing anxiety over Russian Facebook ads and Twitter bots pales in comparison to the greater threat: Technologies that can be used to enhance and distort what is real are evolving faster than our ability to understand and control or mitigate it.
With the law lagging ever further behind technology, what's the solution?
Omri Ben-Shahar's article, "Data Pollution," is a good place to begin.
From the abstract:
Digital information is the fuel of the new economy. But like the old economy's carbon fuel, it also pollutes. Harmful "data emissions" are leaked into the digital ecosystem, disrupting social institutions and public interests. This article develops a novel framework, "data pollution," to rethink the harms the data economy creates and the way they ought to be regulated. It argues that social intervention should focus on the external harms from the collection and misuse of personal data. The article challenges the hegemony of the prevailing view that the harm from the digital data enterprise is to the privacy of the people whose information is used. It claims that a central problem has been largely ignored: how the information individuals give affects others, and how it undermines and degrades public goods and interests. The data pollution metaphor offers a novel perspective on why existing regulatory tools (torts, contracts, and disclosure law) are ineffective, mirroring their historical futility in curbing the external social harms from environmental pollution. The data pollution framework also opens up a rich roadmap for new regulatory devices, an environmental law for data protection, that focus on controlling these external effects. The article examines whether the general tools society has long used to control industrial pollution (production restrictions, carbon taxes, and emissions liability) could be adapted to govern data pollution.
At this point, data regulation is still in its infancy -- in fact, it's currently almost nonexistent (at least in the U.S.). Ben-Shahar's article starts the discussion about an online environmental law as a regulatory scheme using personal data as its case study.
We must continue developing these regulatory schemes, for personal data and beyond, given how far behind we already are. In addition to this regulatory work, legal educators must incorporate data ethics and algorithmic issues into the law school curriculum so that our future lawyers are thinking about these problems.
While the law certainly has a role to play in data protection, as the 2009 CNN article noted, the law is only good at policing the most extreme invasions and the most outrageous cases. It can't take the place of good manners, social norms, and etiquette -- the kinds of things that have always governed negotiations about face-to-face behavior. We should never expect judges to save us from our own worst impulses.