By Alex Warofka, Product Policy Manager
We want Facebook to be a place where people can express themselves freely and safely around the world.
As part of that commitment, we commissioned an independent human rights impact assessment on the role of our services in Myanmar and today we are publishing the findings. The assessment was completed by BSR (Business for Social Responsibility), an independent non-profit organization with expertise in human rights practices and policies, in accordance with the UN Guiding Principles on Business and Human Rights and our pledge as a member of the Global Network Initiative.
The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.
Over the course of this year, we have invested heavily in people, technology and partnerships to examine and address the abuse of Facebook in Myanmar, and BSR’s report acknowledges that we are now taking the right corrective actions.
BSR’s report also examines the complex social and political context in Myanmar, which includes a population that has come online rapidly, a legal framework that does not reflect universal human rights principles, and cultural, religious, and ethnic tensions. In this environment, the BSR report explains, Facebook alone cannot bring about the broad changes needed to address the human rights situation in Myanmar.
BSR provided several recommendations for our continued improvement across five key areas, in order to help mitigate adverse human rights impacts and maximize the opportunities for freedom of expression, digital literacy, and economic development. These areas include building on existing governance and accountability structures, improving enforcement of content policies, increasing engagement with local stakeholders, advocating for regulatory reform, and preparing for the future.
As the BSR report notes, we have made progress towards many of the recommendations put forth in the report, but there is more to do. Here is an update on our work to address each of the five key areas BSR identified:
Governance and Accountability at Facebook
BSR recommends that Facebook adopt a stand-alone human rights policy, establish formalized governance structures to oversee the company’s human rights strategy, and provide regular updates on progress made.
Our policies regarding what is and is not allowed on our platform are developed with an eye towards international human rights principles, including the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. As a member of the Global Network Initiative, we’re committed to upholding the human rights standards set out in the GNI Principles and Implementation Guidelines. We are independently assessed on our implementation of the GNI Principles on a biennial basis, with our latest assessment underway. Related to this work, we engage in conversations around the world focused on existing and proposed laws and regulations to promote human rights and responsible use of technology and online services.
The BSR report recommends that we establish a separate policy that defines our approach to content moderation with respect to human rights, which we are looking into. We’re also working to hire additional human rights specialists to strengthen engagement with and solicit input from NGOs, academia, and international organizations.
Enforcement of Our Content Policies
BSR urges Facebook to improve enforcement of our Community Standards, the policies that outline what is and isn’t allowed on Facebook. Core to this process is continued development of a team that understands the local Myanmar context and includes policy, product, and operations expertise.
Earlier this year, we established a dedicated team across product, engineering, and policy to work on issues specific to Myanmar, and said that we plan to grow our team of native Myanmar language speakers reviewing content to at least 100 by the end of 2018. We have now hired and onboarded 99 of these reviewers. This team is making a difference, improving the development and enforcement of our policies.
We have, for example, updated our credible violence policy such that we now remove misinformation that has the potential to contribute to imminent violence or physical harm. We have also undertaken research to better understand how content that doesn’t ordinarily break our rules (for example, potentially hateful content that doesn’t amount to hate speech under our policies) has the potential to incite offline harm. In this vein, we are working with partners to use CrowdTangle and other tools to analyze potentially harmful content and understand how it spreads in Myanmar.
On the enforcement side, we have improved proactive detection of hate speech in Myanmar, and are taking more aggressive action on networks of accounts that are set up to mislead others about who they are, or what they’re doing.
We have also recently extended the use of artificial intelligence to posts that contain graphic violence and to comments that are violent and dehumanizing, and we will reduce their distribution while they undergo review by our Community Operations team. If this content violates our policies, we will remove it. By limiting visibility in this way, we hope to mitigate the risk of offline harm and violence. We also plan to reduce the distribution of individual posts from people and Pages in Myanmar who post spammy or sensational content, thereby promoting more meaningful and authentic conversations on Facebook.
Engagement, Trust, and Transparency
The report recommends that Facebook preserve and share data where it can be used to evaluate international human rights violations, and that the company publish data specific to Myanmar so that the local and international community can evaluate progress more effectively.
As the report recognizes, we are committed to working with and providing information to the relevant authorities as they investigate international human rights violations in Myanmar, and we are preserving data for this purpose, including content on the accounts and Pages we removed in August and October.
We agree with BSR on the value of publishing more data on our enforcement efforts in Myanmar. That’s why we first published data on the progress we’ve made in proactive detection and removal of hate speech on Facebook in Myanmar earlier this year. In the third quarter of 2018, we saw continued improvement: we took action on approximately 64,000 pieces of content in Myanmar for violating our hate speech policies, of which we proactively identified 63%—up from 13% in the last quarter of 2017 and 52% in the second quarter of this year.
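For context on how to read these figures, the proactive share is the fraction of actioned content that our systems flagged before anyone reported it. Below is a minimal illustrative calculation using the approximate numbers above; it is a sketch, not output from our reporting systems.

```python
# Illustrative only: what the reported proactive share means in absolute terms,
# using the approximate Q3 2018 figures cited above.
actioned_total = 64_000   # pieces of content actioned for hate speech (approx.)
proactive_share = 0.63    # fraction flagged by our systems before any user report

proactively_detected = actioned_total * proactive_share
print(f"~{proactively_detected:,.0f} pieces detected proactively")  # ~40,320
```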
Advocacy Efforts Aimed at Reform in Myanmar
BSR acknowledges that human rights issues in Myanmar cannot be addressed by Facebook alone and instead require broader, more systemic change. As such, BSR recommends that Facebook play an active role in advocacy efforts aimed at policy, legal, and regulatory reform in Myanmar, support the country’s transition to Unicode, and continue to invest in efforts to increase digital literacy and counter hate speech.
Facebook’s mission is to help people build community and bring the world closer together, a mission best served in legislative and national settings where human rights are respected. Our platform is used extensively to advance human rights causes around the world, which is why we believe universal availability of our services and upholding our Community Standards are paramount. We engage in conversations around the world about existing and proposed laws and regulations to promote human rights and the responsible use of technology and online services, both as a company and as part of industry associations.
Myanmar is currently the only country in the world with a significant online presence that has not standardized on Unicode, the international text encoding standard. Instead, the Zawgyi font is widely used to encode Burmese-language characters and remains dominant in Myanmar. This lack of a single standard poses very real technical challenges for us and others: it makes automation and proactive detection of bad content harder, it can weaken account security, it means less support for languages in Myanmar beyond Burmese, and it makes reporting potentially harmful content on Facebook more difficult. Resolving these compatibility issues is also important for Myanmar’s technological development and economic growth.
To support the transition to Unicode, we have removed Zawgyi as an interface language option for new Facebook users and are working on font converters to improve the content experience on Unicode devices. We will continue to make progress in this area, and we fully support the country’s transition to Unicode.
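For readers curious about what this encoding split looks like in practice, below is a minimal, self-contained sketch of one common heuristic for telling Zawgyi-encoded Burmese text from standard Unicode. It is illustrative only: the character list, rules, and sample strings are simplified assumptions, and it is not the detection or conversion technology we use in production, which relies on trained statistical models.

```python
# Illustrative sketch only: a simplified heuristic for guessing whether Burmese
# text is stored in the Zawgyi encoding rather than standard Unicode.

# Zawgyi repurposes code points that standard Unicode assigns to other
# Myanmar-script languages, so their presence in Burmese text is a hint.
# (A rough, incomplete list chosen for illustration.)
ZAWGYI_HINT_CHARS = {
    "\u1060", "\u1061", "\u1062", "\u1063", "\u1064",
    "\u1065", "\u1066", "\u1067", "\u1068", "\u1086",
    "\u1087", "\u1088", "\u108a", "\u108f", "\u1090",
    "\u1094", "\u1095",
}


def looks_like_zawgyi(text: str) -> bool:
    """Return True if the text shows telltale signs of Zawgyi encoding."""
    # Signal 1: code points that well-formed Unicode Burmese rarely uses.
    if any(ch in ZAWGYI_HINT_CHARS for ch in text):
        return True
    # Signal 2: the vowel sign E (U+1031) stored before its consonant.
    # Unicode requires it to follow a consonant; Zawgyi stores text in visual
    # order, so U+1031 often appears at the start of a word or after a space.
    for i, ch in enumerate(text):
        if ch == "\u1031" and (i == 0 or text[i - 1].isspace()):
            return True
    return False


if __name__ == "__main__":
    # Two plausible renderings of the same name, "Maung":
    zawgyi_sample = "\u1031\u1019\u102c\u1004\u1039"   # vowel E stored first, Zawgyi-style asat
    unicode_sample = "\u1019\u1031\u102c\u1004\u103a"  # vowel E after consonant, Unicode asat
    for sample in (zawgyi_sample, unicode_sample):
        label = "likely Zawgyi" if looks_like_zawgyi(sample) else "likely Unicode"
        print(repr(sample), "->", label)
```

The point of the sketch is simply that the same visible word can be stored as two different byte sequences, which is why content analysis, reporting flows, and font rendering all benefit from a single standard.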
We also continue to invest in partnerships aimed at improving digital and media literacy in Myanmar. We are working with the Myanmar Book Aid Preservation Foundation on a pilot to update their current digital literacy curriculum and to support training outreach over the next six months through their national network of libraries.
And we are working closely with independent publishers in Myanmar to help build capacity and resources in their online newsrooms, laying the groundwork for them to build a sustainable business on the platform if they choose to do so. This includes training sessions, and the continued roll-out of programs and tools from the Facebook Journalism Project, including journalist safety training.
Prepare for and Mitigate Risk Related to Future Developments in Myanmar
The report concludes with mention of future developments in Myanmar, among them the 2020 elections, and the growth and development of Facebook products and services in the country.
Our dedicated product, engineering, partnerships, and policy teams will continue to work on issues specific to Myanmar and to address a diverse set of challenges. This includes our work to root out abuse in the run-up to the country’s 2020 elections. During the recent by-elections, we put in place a risk mitigation plan that included additional Community Operations support, proactive monitoring of key events and elections-related content, the takedown of impersonating Pages, the removal of credible threats to politicians, and the removal of political Pages that violated our ads policy on hate speech.
We are also committed to advancing the social and economic benefits of Facebook in Myanmar, and plan to roll out several programs to support local developers and small businesses, including the #SheMeansBusiness initiative which focuses on support for women entrepreneurs.
This is some of the most important work being done at Facebook, and we will continue to keep people updated on our progress. We know we need to do more to ensure we are a force for good in Myanmar, and in other countries facing their own crises.
BSR undertook this Human Rights Impact Assessment (HRIA) between May and September 2018, using a methodology based on the UN Guiding Principles on Business and Human Rights (UNGPs). It involved interviews with 60 rightsholders and stakeholders in Myanmar, as well as interviews with relevant Facebook employees. The HRIA was funded by Facebook, though BSR retained editorial control of its contents.