Please note that you must register through Eventbrite in order to attend RightsCon Silicon Valley 2016!

Find below the Official Schedule v1.0 (as of March 24, 2016). Slight changes may be made over the coming week — including session descriptions, panelist bios and room locations. Be sure to click on "Attendees" to see who’s coming and set up a personal profile. You can then select the sessions you wish to attend and create your own customized RightsCon schedule. Visit our RightsCon site for more details.
Thursday, March 31 • 12:00pm - 1:15pm
Opening the Black Box: Understanding How Companies Enforce their Rules


Who can we trust with our digital rights? The 2015 Ranking Digital Rights Corporate Accountability Index found that no major tech company provides overall data about the enforcement of its terms of service. At a time when online expression is increasingly regulated by contract, this means that content is removed or user behavior is penalized ‘under the radar’: although we know that companies take down or filter content and suspend user accounts, we have only anecdotal evidence about how frequently such measures are applied and what types of content and behavior they affect. Anecdotal reports have on various occasions brought to light inconsistent company practice in how terms are enforced (see e.g. https://advox.globalvoices.org/2015/08/06/we-will-choke-you-how-indian-women-face-fatal-threats-on-facebook-while-trolls-roam-free/), highlighting the need for clear company disclosure on this point.

Meanwhile, governments expressly encourage companies to restrict content on the basis of their terms of service, sometimes in extra-legal ways, thereby obscuring a form of government-triggered censorship. The White House asked YouTube to review whether the ‘Innocence of Muslims’ video violated its Terms, and the UK government’s counter-extremism strategy explicitly identifies online platforms’ Terms & Conditions as an area of interest. The fact that governments view Terms enforcement by companies as a mechanism to restrict content without making formal requests through legal channels, and without due process, underscores why it is vital that companies be more transparent about their enforcement practices.

In this session we want to discuss, with freedom of expression experts, company representatives, and other participants, how companies could improve transparency around their enforcement practices, and why governments should be transparent about the extra-legal requests they make to companies to restrict content as part of private Terms enforcement:

- why should this information be disclosed?
- what aspects would be most relevant to highlight (type of content or activities, frequency, triggers for enforcement, other)?
- how should companies disclose this kind of information?
- what insights can companies offer on challenges regarding disclosure about Terms of Service enforcement?
- to what extent do companies refer governments to community-flagging mechanisms?
- how should companies treat and report on content reported by governments through community-flagging mechanisms?
- how should this disclosure tie into wider company policies & practices re: ToS enforcement and remedies?

Since no major tech company currently provides a clear picture of content removals or other sanctions applied on the basis of its Terms of Service, this session aims to help start a framework for company disclosure around ToS enforcement.

Speakers

Chinmayi Arun

Executive Director, Centre for Communication Governance at National Law University, Delhi

Gabrielle Guillemin

Senior Legal Officer, ARTICLE 19
Gabrielle is Senior Legal Officer at ARTICLE 19, an international free speech organisation based in London. She has been leading the organisation's work on internet policy issues since 2011. She is a member of the UK Multistakeholder Advisory Group on Internet Governance (MAGIG) and an independent expert attached to the Council of Europe committee on Cross-border flow of Internet traffic and Internet Freedoms. Prior to ARTICLE 19, Gabrielle...

Rebecca MacKinnon

Director, Ranking Digital Rights, New America Foundation

Peter Micek

Global Policy & Legal Counsel, Access Now

Jillian C. York

Director for International Freedom of Expression, Electronic Frontier Foundation
Talk to me about TOS enforcement and censorship on social platforms, or your work in the Middle East and North Africa.


Thursday March 31, 2016 12:00pm - 1:15pm
The Nest
