New Position Papers responding to UNHRC and Facebook Oversight Board consultations

The Alliance for Healthy Infosphere (AHI), of which UMB DSL is a member, has recently released two new position papers.


The first paper was drafted in response to the Facebook Oversight Board’s call for public comments on the case of Facebook’s indefinite suspension of Donald Trump’s account. In the position paper, AHI argues that:

  • Facebook’s decision to suspend Donald Trump’s account was right and proportionate to the situation.
  • The extension of the suspension to an indefinite period, however, had no sufficient grounding in Facebook’s own rules.
  • The assessment of “off-Facebook” context in account suspensions or content removals for violations of community standards consistently fails because of insufficient capacity for proper evaluation.
  • The appeal mechanism available to users against Facebook’s decisions on content removal or account suspension is insufficient, as users often do not receive proper reasoning for those decisions.

FULL PAPER HERE


The second position paper is a contribution to the UN OHCHR Special Rapporteur’s report on disinformation and human rights, which will be presented before the 47th Human Rights Council in June 2021. In the paper, AHI argues for the following:

  1. Media literacy and critical thinking: Education systems need to be reformed to include digital skills and digital responsibility, starting at an early age. Finland could serve as a successful example of how resilience to disinformation can be fostered through an agile and modern education system. Lifelong learning and other special programmes should also be established for older generations and disadvantaged communities.
  2. Digital platform regulation: Digital platforms need to operate within a clearer legal framework. As the voluntary self-regulation measures applied by social platforms have not worked, efficient regulation/co-regulation regimes, such as those currently being developed by the EU, will need to be implemented. Other promising models could also help address the situation, such as treating digital platforms as information fiduciaries.
  3. Application of community standards: Rules of conduct need to be enforced systematically, regardless of the size of a country. Social media platforms need to invest in local capacities for fact-checking and content moderation. Furthermore, functional independent appeal processes need to be set up.
  4. Demonetize disinformation: Tech giants need to do more to demonetize disinformation. One successful example is appealing to advertisers to stop placing ads on disinformation sources.
  5. Transparency: Greater algorithmic transparency and independent audits are needed to assess whether the measures taken by social media platforms are effective.
  6. Invest in independent journalism and develop the strategic communication of public institutions.
  7. More data: More detailed country-level information on coordinated malign bot activity, trolls, and fake accounts, and on how much malign content social media platforms have taken down, is required to grasp the full scope of the problem. Similarly, more detailed information on advertisements on social media platforms is needed to identify who is being targeted and why, as well as to identify sources of funding.
  8. What is illegal offline should be illegal online: Often, existing legislation on illegal content is not systematically enforced. Capacities to prosecute such digital offences should be further developed and funded, for example through taxation of digital platforms.

FULL PAPER HERE