
Analyzing Sen. Warner’s 20 ideas to regulate the internet giants

Graphic: pxhere

Senator Mark R. Warner’s office wrote a paper with 20 legislative ideas on how to restrain the excesses, abuses, and ill effects of internet giants like Facebook, Twitter, Google, and Apple; Axios got a copy and published it.

With problems ranging from fake news to data sharing and bullying, these platforms certainly deserve scrutiny. Of course, every potential regulation has a downside too: anything that slows the free give and take of social media and messaging will impair not just the platforms’ profits, but also the usefulness and free exchange that make social media powerful. And inevitably, every regulation has unintended consequences.

Senator Warner’s full 23-page leaked paper is here. I’ll list all 20 ideas and describe their pros and cons, with a recommendation for each. The titles of each section below match the sections in the paper.

1 Duty to clearly and conspicuously label bots 

What it is: Platforms must create a visible label to identify participants that are bots and not humans — both those provided by the platform and those running on the platform. This would apply to, for example, Twitter accounts that generate posts automatically.

Pros: Helps prevent nonhuman accounts from tricking people.

Cons: If a participant on a platform is not a human, will the platform be able to tell and to label it? If not, who’s liable, the platform or the bot-creator?

Recommendation: So far, this isn’t the platforms’ biggest problem. But this fix is relatively innocuous, and won’t impair social media significantly. Yes.
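As a minimal sketch of what “clearly and conspicuously” could mean in practice (the field names here are hypothetical, not anything specified in the paper), a platform could prepend a badge wherever a flagged account’s name is rendered:

```python
def render_display_name(account):
    """Render an account name, prepending a conspicuous badge
    when the account is flagged as automated."""
    name = account["display_name"]
    return f"[BOT] {name}" if account.get("is_bot") else name

print(render_display_name({"display_name": "weather_updates", "is_bot": True}))
print(render_display_name({"display_name": "alice"}))
```

The hard part, as noted above, isn’t rendering the label; it’s deciding which accounts deserve the flag in the first place.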

2 Duty to determine origin of posts and/or accounts 

What it is: Require that posters reveal their country of origin. Platforms would show the correct countries on their account pages.

Pros: Allows other participants (and presumably, law enforcement) to identify attempts at foreign influence, like the Russian and Macedonian accounts that spread fake news to manipulate the 2016 US election.

Cons: With VPNs, concealing country of origin is easy.

Recommendation: Hard to enforce, and won’t have much in the way of positive effect. No.
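The enforcement gap is easy to illustrate. At best, a platform can flag traffic coming from ranges it already knows belong to VPN exit nodes; the CIDR blocks below are documentation-only placeholders, not a real VPN list:

```python
import ipaddress

# Hypothetical list of CIDR blocks known to belong to VPN exit nodes.
KNOWN_VPN_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def looks_like_vpn(ip_string):
    """Best-effort check: is this IP inside a known VPN range?

    Only catches *known* ranges. A poster on an unlisted VPN or a
    freshly provisioned server sails through, which is why
    country-of-origin labeling is hard to enforce.
    """
    addr = ipaddress.ip_address(ip_string)
    return any(addr in net for net in KNOWN_VPN_RANGES)

print(looks_like_vpn("203.0.113.57"))  # inside a listed range
print(looks_like_vpn("192.0.2.10"))    # unlisted, so it goes undetected
```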

3 Duty to identify inauthentic accounts 

What it is: Require that posters use their real names. If others identify someone using a fake name, the platforms would kick them off.

Pros: This would cut down on activity like fake accounts that are, basically, trolls and spreaders of fake news.

Cons: A lot of unintended consequences. Vulnerable people like LGBT individuals and people resisting totalitarian governments use false identities to protect themselves from real-world harassment and arrest. Requiring people to verify their name on demand would create a burden on both the platforms and any individual called out as potentially fake. Are corporations allowed to participate — and if so, what is their “real” name? Can individuals participate by pretending to be corporations? Note that the cost of this regulation would probably impose an existential burden on Twitter.

Recommendation: While fake posters are a problem, the solution would cause more harm than good. No.

4 Make platforms liable for state-law torts (defamation, false light, public disclosure of private facts) for failure to take down deep fake or other manipulated audio/video content 

What it is: Undo the “safe harbor” provisions that allow platforms to escape liability for false and defamatory content posted on their platforms, but only in the instance of hard-to-detect fakes like manipulated video and photos.

Pros: Would require platforms to work harder to identify and prevent the spread of faked content.

Cons: Identifying and killing faked content is far easier to describe than to implement.

Recommendation: Defamation is a problem, but the Constitution protects satire. Distinguishing what’s fake is a nasty challenge, and regulation is only going to make it worse. This is a problem that the platforms should solve themselves. No.

5 Public Interest Data Access Bill 

What it is: Require large platforms to make their anonymized data sets available to researchers, who could then point out abuses to regulators like the FCC or FTC.

Pros: Research community could reveal new facts about how platforms spread fake news, enable trolling and abuse, or enable bias.

Cons: Well, there was that little problem that Facebook had with Cambridge Analytica. Is anonymized data safe to share?

Recommendation: I’m all for turning qualified researchers loose on data sets. These platforms are now central to all of our experience, but we don’t really know what’s going on inside them. Beef up the anonymization to prevent abuse, then go for it. Yes.

6 Require Interagency Task Force for Countering Asymmetric Threats to Democratic Institutions 

What it is: Fund a cross-agency effort to identify malicious cyber activity intended to influence elections.

Pros: Could further protect electoral integrity.

Cons: Not clear how effective further surveillance would be, or what the cost would be.

Recommendation: President Trump appears not to take electoral interference seriously. Protecting the integrity of elections is crucial. Yes.

7 Disclosure Requirements for Online Political Advertisements 

What it is: Apply political advertising rules currently in place for traditional media to social media.

Pros: Prevent foreign and unauthorized political advertising.

Cons: Harder to regulate than traditional advertising.

Recommendation: Social media should not be exempt from the rules on political advertising. Yes.

8 Public Initiative for Media Literacy 

What it is: Fund state and local activities to train children and adults how to be skeptical and verify what they see online.

Pros: Smart people are harder to con.

Cons: Would it make a difference, and how much would it cost?

Recommendation: An ignorant and credulous population is an invitation to spread fake news and manipulate people. Smart people are ultimately the only defense against bad actors using technology to fool people. Yes.

9 Increasing Deterrence Against Foreign Manipulation 

What it is: Strengthen deterrence — the ability to respond in kind to cyber attacks.

Pros: Could slow or prevent cyber attacks.

Cons: Is creating a cyber arms race a good idea?

Recommendation: I’m concerned about this. As much as I believe deterrence is necessary, I’m uncomfortable with information warfare. And our own cyber weapons have a way of getting into the hands of adversaries, who use them against us. Defense against black hats is a better investment. No.

10 Information fiduciary 

What it is: Treat online providers as data fiduciaries. Like a financial fiduciary, this would mean they have an affirmative duty to protect and not abuse their users’ data.

Pros: Hold providers like ISPs, cloud providers, and social networks to a higher standard for data stewardship; outlaw inappropriate sale of data.

Cons: Difficult to define what a data fiduciary’s responsibility is; regulation could prove burdensome.

Recommendation: Our online data is subject to far too little oversight. This could help. Worth exploring. Yes.

11 Privacy rulemaking authority at FTC 

What it is: Enable the Federal Trade Commission to hold hearings and make regulatory rules, as the FCC does now.

Pros: Privacy challenges are multiplying faster than legislation can keep up with. Rulemaking allows regulation to come closer to keeping pace with shifting events.

Cons: Another regulatory body means more red tape for businesses.

Recommendation: I don’t trust Congress to keep up with our privacy challenges. Yes.

12 Comprehensive (GDPR-like) data protection legislation 

What it is: European-style data regulation, for example data portability, right to be forgotten, and data breach notification.

Pros: Puts burden of protecting consumer privacy on businesses.

Cons: Creates a new and burdensome set of regulations; slows innovation in social media.

Recommendation: Some elements of GDPR, like the right to be forgotten, would violate the free speech protections in the Constitution. I’d rather see these regulations considered separately (say, by the FTC) than imposed as a single blunt-force set of regulations. No.

13 1st Party Consent for Data Collection 

What it is: One specific element of the GDPR which says that once businesses get permission to use your data, they can’t send it to a third party unless that third party gets permission as well.

Pros: Slows spread and sale of data.

Cons: Would cripple ad network-based advertising and data brokers like Acxiom.

Recommendation: This would completely disrupt the current ecosystem centered around selling consumer data and targeting consumers. I like that idea. Yes.

14 Statutory determination that so-called dark patterns are unfair and deceptive trade practices 

What it is: Outlaw user interfaces intended to fool users with opt-out (rather than opt-in) checkboxes and other manipulative practices.

Pros: Such interfaces are intended to fool users. This is deceptive and should be as illegal as deceptive advertising.

Cons: Hard to regulate. When is an interface deceptive? It’s a judgment call.

Recommendation: I’d like to see this enshrined in law. It won’t stop all unfair practices, but would allow regulators to prosecute the most egregious cases, which would put providers on notice to be more transparent. Yes.

15 Algorithmic auditability/fairness 

What it is: Allow the federal government to set standards for fairness in algorithms, for example in credit or housing decisions.

Pros: Algorithms may include bias. Without access to the algorithms, regulators will have a hard time detecting that bias.

Cons: Allowing the government to poke around in and regulate algorithms would be very hard to get right.

Recommendation: Rather than providing the government access to algorithms, the providers could give the government a “test bed” so it could run Monte Carlo type simulations to determine bias. No.
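A “test bed” audit of this kind could be sketched as follows. The decision function and field names are hypothetical stand-ins for a provider’s black-box algorithm: the auditor generates paired synthetic applicants that are identical except for a protected attribute, then compares outcomes.

```python
import random

def audit_bias(decision_fn, n_trials=10_000):
    """Monte Carlo probe of a black-box decision function: generate
    paired synthetic applicants differing only in a protected
    attribute, then compare approval rates across the two groups."""
    approvals = {"A": 0, "B": 0}
    for _ in range(n_trials):
        profile = {
            "income": random.uniform(20_000, 150_000),
            "credit_score": random.randint(300, 850),
        }
        for group in ("A", "B"):
            if decision_fn({**profile, "group": group}):
                approvals[group] += 1
    return approvals["A"] / n_trials, approvals["B"] / n_trials

# A toy decision function that (improperly) holds group "B"
# to a higher credit-score threshold.
def biased_lender(applicant):
    threshold = 650 if applicant["group"] == "A" else 700
    return applicant["credit_score"] >= threshold

rate_a, rate_b = audit_bias(biased_lender)
print(f"approval rate A: {rate_a:.2%}, B: {rate_b:.2%}")
```

The point of the design is that the government never sees the algorithm itself; it only observes whether statistically identical applicants are treated differently.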

16 Data Transparency Bill 

What it is: Require providers to alert users when their data is being used and shared. Force providers to estimate and publish the value of their user data.

Pros: Gives users more visibility into the value and use of their data.

Cons: Would probably spawn a whole host of annoying alerts without changing anything. (Clicked on any cookie permissions lately?)

Recommendation: I think a single quarterly update on the value and use of data would be sufficient, rather than consumer alerts. No.

17 Data Portability Bill 

What it is: Require providers to export data in machine-readable form so users can take it to another provider and sign up there.

Pros: Acknowledges that users’ data belongs to the users; encourages competition rather than lock-in.

Cons: Would create a burden on providers; might enable large providers to poach users from smaller ones by offering switching bonuses.

Recommendation: Lock-in is pernicious. This would reduce it. And it makes sense, because you own your own data. Yes.
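Machine-readable export is the technically straightforward part. The sketch below (record fields are hypothetical, not any platform’s actual schema) serializes a user’s data to JSON that a competing provider could ingest:

```python
import json

# Hypothetical user record, as a platform might store it internally.
user_record = {
    "user_id": "u-1842",
    "display_name": "Example User",
    "posts": [
        {"created": "2018-07-30T12:00:00Z", "text": "First post"},
    ],
    "connections": ["u-0007", "u-0099"],
}

def export_user_data(record):
    """Serialize a user's data to portable, machine-readable JSON."""
    return json.dumps(record, indent=2, sort_keys=True)

portable = export_user_data(user_record)
print(portable)
```

The real burden on providers is not serialization but agreeing on a common vocabulary, so a rival platform knows what “connections” means when it imports the file.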

18 Interoperability Bill 

What it is: Require dominant providers to open up APIs so other providers can connect (for example, access to Facebook’s social graph).

Pros: Reduces sources of dominance and lock-in; opens up access for competitors.

Cons: Burden on providers; APIs are also subject to potential manipulation by malicious actors.

Recommendation: This is a step too far. I don’t want to see legislation determining and regulating APIs, especially with the potential for abuse. I don’t want to worry about who is using my Facebook data other than Facebook. No.

19 Opening federal datasets to university researchers and qualified small businesses/startups 

What it is: Provide access to appropriately secured datasets owned by the Federal government, e.g. tax data, economic data.

Pros: Enables university and other researchers to find insights.

Cons: Cost to federal government; potential for abuse.

Recommendation: If the feds have data, I want researchers to have access to it. Requires a stringent set of rules around anonymization of data. Yes.

20 Essential Facilities Determinations

What it is: Require owners of essential and difficult to replicate data facilities (for example, Google Maps) to provide access to others at a fair price.

Pros: Prevents providers from locking in a monopoly position based on collecting data.

Cons: Intrusive federal regulation of corporations after they gain benefits from their own significant investments in these facilities.

Recommendation: Monopolies formed through data are a problem. Let’s regulate them through the usual monopoly regulations, rather than forcing rules on providers. No.

Which regulations do you think are necessary?

I am in favor of 11 of these 20 regulations, which makes me somewhat liberal in my approach to regulation. How many do you favor? I’m guessing that if you favor more than me, you’re liberal; if you favor fewer, you’re conservative.

But these regulations challenge your thinking. Most liberals would say that in general, the internet and social media have been forces liberating people everywhere from tyranny, and the people who run these organizations are typically liberal in their own politics. Liberals love tech; tech loves liberals. But tech doesn’t love regulation.

While you can add any civil comments you want on this post, it may be easier for us readers to understand if you cite these potential regulations by name and say on which ones you disagree with me, and why.


4 Comments

  1. Thanks for sharing this, Josh! My thoughts:

    1. I agree identifying bots may be difficult, but verifying human accounts is not.
    2. VPNs make it difficult, but social platforms could identify posts made with known VPNs.
    3. I agree 3 is a non-starter, but I do think labeling (as noted above) could help to better inform users. Social platforms might also give preference to verified humans posting through non-VPNs.
    4. I agree 4 is a non-starter. There is no way for peer-to-peer platforms to police what is or is not satire, parody, critique, legitimate criticism or defamation.
    5. I see a lot of risks here (as you point out). Nothing is ever truly anonymized. I think this is possible but would have to be done with great care.
    6, 7 and 8 are solid ideas and not difficult to implement.
    9. I don’t know enough about what is proposed here to comment. I think deterrence is necessary to protect users, corporate assets, data, and brands. That really is nothing new. A cyber arms race is on anyway, so I am not sure I share your concern, but I also am not sure what practically is being demanded by this of digital platforms.
    10 and 11 are also solid, practical ideas.
    12. I am not sure I agree with you here. I am not a fan of the EU’s “right to be forgotten,” but otherwise, strong data protection will not happen unless regulated. There’s too much profit in selling and sharing our data. And I don’t think GDPR will slow innovation. In fact, it might be argued it could help–if brands develop more trust and users are given more access and control, good things can happen.
    13, 14 are more good ideas.
    15. I agree with you. Algorithms are the secret recipes of 21st-century businesses. They are also EVERYWHERE. The idea of allowing the government to try to know, understand and evaluate all the algorithms is terribly naive.
    16. I mostly disagree with you here. I think consumers need to be given control and transparency. Of course, most will give it away anyway, but I think digital platforms should DEFAULT to alerting users to shared data and then provide methods for users to control it (turn off alerts, quarterly alerts, real-time alerts, etc.)
    17. This is a little dicey for me. I agree it’s very consumer-friendly to allow user data to be completely portable. The problem for me is that that data is also an asset to businesses–one that costs a lot of money to collect, maintain and protect. So, if Facebook abides by all these 20 ideas and invests an enormous amount of money doing so, should users be able to merely rip a decade or more of their data out of the system and put it somewhere else? Maybe–it’s their data, after all–but I also feel this ignores the legitimate business investment made by digital properties to create a system that collects, stores, uses, and protects that data. Think of it this way–if clients of Forrester or Gartner give up their data to be published in case studies and reports, should they be able to rip that data away, ignoring the cost to the research firms to use and develop it, and give it away to anyone else? I think there is room for balance here, but I admit I’m not sure exactly what that would look like. Maybe T&Cs should have some conditions for users to either delete their data for free or collect it for some reasonably nominal fee?
    18. I agree with you for the reasons noted in #17 and also because completely open platforms and APIs create their own risks, as well. This would strip away from Facebook, Google and others their opportunity (one might say obligation) to protect users.
    19. Makes sense, but I also see lots of opening for abuse here. Cambridge Analytica had its root in Cambridge University, after all. The fact someone works for a school hardly ensures they won’t abuse or sell data.
    20. I agree with you here. Private enterprise is private enterprise. If a company creates a data facility of great value and at great expense, they get to set the price!