Our target users are individuals around the world who have experienced content removals or account takedowns of protected speech on social media websites.
We know this is a broad definition - that's why we did an initial round of outreach to superusers: human rights defenders, activists, and organizations already aware of this problem.
That said, we were surprised by some of the submissions we received during our alpha round! We found that certain users within our networks were experiencing this phenomenon disproportionately. Users in Hong Kong and Palestine, for example, were frequent reporters in round one. We also found that human rights organizations aren't immune to the whims of corporate Terms of Service (TOS) enforcement - we were able to help a major organization publicize a takedown of their content, resulting in its restoration!
What assumptions are you making in what you propose, and how will you test them?
The major assumption that we are making is that social media companies are over-enforcing their Terms of Service (TOS); that is, they are taking down content that does not actually violate company policy. We suspect that this often happens unintentionally, due to the speed with which these companies must respond to user reports of TOS violations.
We will test these assumptions by comparing data that we receive via incident reports against the TOS of the various companies concerned. We hope to be able to provide a set of concrete examples demonstrating where companies are "overcomplying" with their own policies.
How will you get your project in front of the necessary people or organizations?
Luckily, we're uniquely well positioned to do this. Our team is situated across advocacy and academia, with advisors from the EFF, the Berkman Center, Global Voices, and Penn's Annenberg Center for Communication. These partners have already established channels of communication with our target companies and are able to talk to them about policy.
We plan to expand our advisory group and reach out to human rights bodies and other relevant organizations to promote the project, so as to create the best possible data set. Both EFF and Global Voices (with which we work closely) have connections with dozens of news media and free speech advocacy organizations all over the world. We will reach out to these groups to encourage their community members and audiences to file incident reports through the Onlinecensorship.org platform.
Finally, technology, information design, and data visualization are key strength areas of our team via our connection with Visualizing Impact. This greatly expands our possibilities for raising awareness on our data and insights among key target audiences. We hope that this data set will benefit not only the advocacy community, but also academics conducting research on these topics.
What are the obstacles to implementing your idea, and how will you address them?
We have thought seriously about the risks and challenges we may face in implementing this project. The most notable of these are:
Accuracy and quality of information
We are seeking data on legitimate free speech cases. We may receive reports with false or misleading information, or reports on content that has been removed because the user was engaging in abusive behavior against other users. During our startup phase, we are evaluating reports to better understand the problems associated with quality control, vetting, and verification. We have already installed technical anti-spam and anti-robot measures, and we are now focused on refining our systems and processes to minimize the number of reports we receive that clearly fall outside the parameters of legitimate free speech cases.
Privacy and security
Engagement of target companies
What if the companies we target don't listen? While this is a possibility, our experience with Google, Twitter, Yahoo, and Facebook thus far indicates a willingness to work with us once the project is up and running.
Whatever obstacles we might face, we think this is a challenge worth taking on, and have stacked the deck with a fantastic group of advisors to help us think through potential issues.
How much do you think your project will cost, and what are the major expenses?
To get this project truly up and running, we are seeking $400,000. The budget for the project falls into the following major areas:
Management and Editorial Team
- Improving incident report questionnaires;
- Evaluating incoming incident reports;
- Engaging target companies on individual cases and systemic issues;
- Expanding target platforms - users can currently report incidents experienced on Google+, Facebook, YouTube, Twitter, and Flickr, but we hope to expand to other platforms such as Blogspot, WordPress.com, Tumblr, Pinterest, Dailymotion, Vimeo, and Instagram.
Design and Technical Team
- Implementing design improvements;
- Designing and launching our mobile platform, which is key because users in the developing world increasingly - and in some cases exclusively - access social networks through mobile devices;
- Integrating data visualization tools that will map the locations and dates of reported incidents;
- Making Onlinecensorship.org available in Arabic, Chinese, Farsi, French, Portuguese, Russian, Spanish, and Vietnamese, with more languages to follow;
- Developing and implementing a community engagement strategy;
- Spreading the word about Onlinecensorship.org at major venues over the next year.
How do you see this project scaling and sustaining itself in the long term, if needed?
As noted above, we plan to expand our project to include additional social media platforms and languages, as well as to incorporate innovative data visualization elements. Fortunately, much of the initial budget is to get the project off the ground - web design and translation can be expensive, but most of that work needs to happen only once. After that, we hope to spread the costs across several organizations through a partnership model, similar to Chilling Effects. We imagine that the project will eventually need one full-time person (or a combination of part-timers) to maintain the site and wrangle the data, costing less than $100,000/year.
Need for Onlinecensorship.org
We treat social media platforms as if they are the public sphere, but they are actually commercial entities with their own proprietary terms of service. Facebook and YouTube each have over 1 billion users and Twitter has hundreds of millions, making these platforms the "sovereigns of cyberspace," with the power to police and regulate our online lives, including our speech.
In many countries, particularly where the press is not free, independent media and activists are heavily dependent on the Internet to reach their audiences or convey ideas and messages. They are also increasingly dependent on large social networking platforms run by corporations. A significant number of activists and journalists have encountered problems maintaining their accounts and content on Facebook and other social media. Community policing of content happens quickly - anyone can report anything, and the employees who review those reports have only seconds to make each determination. This results in regular takedowns of legitimate content.
This is an impediment to the work of journalists, bloggers, and activists and an obstacle to their ability to exercise their freedom of expression. Unfortunately, cases in which content or accounts have been censored are rarely resolved unless the user is lucky enough to have the right connections and support. This problem needs to be addressed strategically and systematically.
The actual scope of censorship affecting journalists’ and activists’ ability to do their work is not known – to the public or, we believe, to the companies themselves. Solicitation of "incident reports" and follow-up surveys would enable us to identify, track, analyze, and publicize the facts of these incidents. Only with the proper research and data will it be possible to convince companies that their policies are inflicting unacceptable levels of “collateral damage” on the free expression rights of journalists and activists, which requires greater prioritization of staff and resources to address.
On November 7, 2012, administrators of a Facebook group called “The Uprising of Women in the Arab World” discovered that their accounts had been temporarily disabled due to unspecified violations. Facebook deleted a photo on the page of a woman with cropped hair and a tank top holding a passport photo of herself fully veiled, attached to a sign protesting her lack of freedom. Other postings calling on followers to support the woman on Twitter were also deleted. Efforts by the page’s administrators to contact Facebook met with no response. After the press began running stories about what had happened, Facebook responded to a reporter’s queries with a statement acknowledging that the photo had been taken down in error and that “an item was removed because it was reported to us and found to have violated our community standards.”
Egyptian activist Sally Zohney explained to a reporter that this page had become as important to a growing women’s movement in the Arab world as another Facebook page, “We are all Khaled Said,” had been for Egyptian human rights and democracy activists. The latter page was not only a driving force for the growing anti-torture movement in Egypt in late 2010, but also played a central role in organizing the January 25, 2011 Tahrir Square protest that eventually led to the fall of the Mubarak regime. Interestingly, “We are all Khaled Said” was taken down by Facebook administrators in the fall of 2010 due to terms of service violations (the administrators, fearing arrest and worse, were not using their real names), and the page was only reinstated after international human rights groups intervened with Facebook executives.
Activists in the Middle East and North Africa are not the only Facebook users complaining of account deactivations and content removals of journalistic and political speech that is clearly protected by the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. In Hong Kong last year, several days before planned protests to mark the anniversary of the 1989 Tiananmen Square Massacre, a large number of user accounts were suddenly suspended without explanation. Many of these users reported on other social media and to journalists that the suspensions occurred soon after they posted messages urging their friends to attend the June 4 protest. Suspended accounts included known democracy activists and journalists critical of Beijing. Facebook later responded to journalists’ queries with vague explanations about a technical problem that was later fixed, but users themselves never received an explanation.
- Kristin Deasy, “Activists: Why is Facebook censoring this photo?” Global Post, November 8, 2012, at: http://www.globalpost.com/dispatch/news/regions/middle-east/121108/activists-why-facebook-censoring-photo
- Oiwan Lam, “Hong Kong: A Large Number of Facebook User Accounts Suspended at the Eve of Annual June 4 Vigil,” Global Voices Advocacy, June 2, 2012, at: http://www.globalpost.com/dispatch/news/regions/middle-east/121108/activists-why-facebook-censoring-photo
- Rebecca MacKinnon, “Ruling Facebookistan,” Foreign Policy, June 14, 2012, at: http://www.foreignpolicy.com/articles/2012/06/13/governing_facebookistan
Our Goals
- Learn more about the scope of this problem and better understand its impact on activists, citizen media makers, and journalists around the world;
- Educate affected groups and communities about the nature of the problem and the reasons behind it;
- Educate companies about the real-world human impact of failing to address the problem adequately;
- Establish formal channels of communication with companies to resolve specific cases and to identify solutions/improvements;
- Encourage companies targeted by our project to make concrete improvements to their systems and to the way they communicate with users about the reasons and processes behind content/account censorship;
- Develop a set of recommended best practices that all social media companies should follow if they want to be considered compatible with and supportive of the free speech rights of activists, citizen media makers and journalists.
Our Action Plan
- Build and promote the Onlinecensorship.org platform (desktop and mobile) in order to solicit “incident reports” from journalists, online activists, and citizen media creators all over the world;
- Document cases in which social media accounts have been deactivated, or in which protected journalistic and political content has been deleted (we will base the definition of protected speech on commonly agreed standards of international human rights law);
- When information is available, document whether the content has been deleted or blocked to enforce commercial rules or at the behest of governments;
- Conduct systematic surveys of affected communities and follow-up research to identify the nature and scope of problems first identified through the crowd-sourced reports;
- Educate activists, bloggers, and journalists about the reasons behind these problems as well as possible steps to address or rectify them;
- Engage with companies about how their practices can be improved or changed, and organize grassroots campaigns to convince companies that their global reputations will be affected unless the rights of activists and journalists are better protected and defended on their platforms.