The eleventh report in the Digital Rights Series
  • 26/03/2023
  •  https://dg.samrl.org/l?e4757 

    The SAM Digital Rights Project issued its eleventh report on the restrictions imposed by social media platforms on freedom of expression, as part of the initiative implemented by the organization in cooperation with Internews.
    The report, titled "Social Platforms' Restrictions on Freedom of Expression in the Yemeni Digital Space," said that social media companies impose restrictions on publishing and interaction which, in many cases, have led to account closures and bans even though the affected users had not violated the platforms' stated policies.
    Mark Zuckerberg, Facebook's founder, said the Community Standards enforcement team consists of about 30,000 employees, including content reviewers who speak almost every widely spoken language in the world, working from offices across several time zones to ensure a quick response to reports. He added that the team's mistakes are mostly errors in applying policy details rather than disagreements about what those policies should be, noting that for some types of content, wrong decisions are made in more than one in ten of the cases the team reviews.
    As for the legal aspect of keeping legitimate content online, the report notes that companies are under no legal obligation to do so. They are private companies free to create and enforce their own terms of service and guidelines, including rules governing speech that human rights law would protect. Nor is there any legal obligation on private companies to protect, or even respect, any citizen's right to freedom of expression; when they do so, it is often simply a consequence of their business model, since users are less likely to use a platform that does not allow them to speak their minds openly.
    The report quotes academic Waleed Al-Saqqaf as saying that social media platforms may impose restrictions on political accounts, since the platforms can be swayed by external pressures and political trends. He adds that the parties to the conflict also use the platforms against their opponents, exploiting mass reporting of content to get certain accounts suspended or restricted; because the platforms may respond simply to the number of reports, this reflects the collective influence of those parties in manipulating the platforms, which in turn harms the freedom of expression of activists, journalists, and others.
    Data journalist Farouk Al-Kamali, for his part, believes the issue lies with social media algorithms that place specific terms and phrases on a list of prohibitions, so that whenever such a term or phrase appears in a post it is treated as a violation of the rules governing publication in that community. Facebook currently works with many fact-checking platforms to combat disinformation, hate speech, and incitement to violence; once such a platform classifies a post as misinformation, falsification, or hate speech, similar posts are removed, even retroactively, so the list of prohibited words grows and restrictions multiply. He adds: we do not deny that social media has become an effective weapon and a tremendous force that various powers and parties seek to control, harnessing it to advance their agendas to the point that little remains of the "social" in social media except the name.
    On whether differences in cultural context can cause confusion and misunderstanding among those running social media platforms, Al-Saqqaf says this happens frequently, especially online, because expressions, terms, and symbols differ across cultures. He continues: platform moderators may not always grasp the cultural connotations of published content, and posts may be misinterpreted, leading to unjustified blocking or restriction of content due to the cultural gap and lack of understanding. He cites the recent debate within Meta over the word "martyr," which Muslims consider an ordinary term while some parties have objected to it.
    Fact-checker Al-Kamali believes that cultural context may matter more when comparing Arabs and the West; the Arab cultural context, in its general form across all Arab countries, is in fact close, overlapping, and interconnected. What social media has created in the Arab world, and within each country, is rather a conflict between generations: between the old and the new, between those who want to discard everything old and forget that most of what exists, in reality, is old.
    Dr. Al-Saqqaf points out that algorithms have improved remarkably in recent years but remain deficient and cannot replace human reviewers. He adds: algorithms still interpret context and intent poorly, especially across different cultures and dialects, and their decisions can be biased toward the culture of programmers in the West, treating users from other cultures unfairly. The human role remains essential to validate results and ensure a more accurate response, under the supervision of a broad group of people from diverse backgrounds and cultures.
    For his part, journalist Al-Kamali explains that algorithms are software written by people whose cognitive abilities, however capable, are limited; the code therefore reflects its authors' own perceptions, the perceptions presented to them by others, and the legislation and laws of whichever group of countries defines which acts, words, or images count as violations. An image of blood, for example, offends the sensibilities of Western societies, whereas a Yemeni citizen sees no problem in viewing, publishing, and discussing it; so he publishes a picture that prompts Facebook first to block it and then to ban the account, in his words.
    The report stresses that the growing use of AI algorithms to automatically screen disinformation and other types of content may lead to excessive censorship of legitimate content, violating the author's freedom of expression and right to access information. These algorithms may also carry inherent biases and be prone to manipulation; they are therefore still not accurate enough to be used in a fully automated way to moderate content.
    It notes that algorithms suffer from procedural weaknesses, including a lack of oversight and transparency and potential implicit and explicit bias in their design and in the training data used to develop them. These problems grow more serious when companies also limit users' ability to resort to a human-run appeals process.
    Prof. Al-Saqqaf recommends that platform management establish clear, transparent standards for acceptable behavior, written in plain language not open to divergent interpretations; rely on human reviewers to assess context and intent, particularly people competent in and familiar with the cultures users come from; improve algorithms to identify offensive content more accurately; and collaborate with civil society organizations and independent local activists to understand the cultural and social context. In doing so, platforms can strike a balance between freedom of expression and tackling offensive content.
    He called on platforms to keep improving the tools and resources users have to report and deal with abusive content, enabling a more efficient, secure, and respectful environment for all users, and to ensure fast, clear, and fair appeal procedures for users who believe their accounts or content have been handled incorrectly.
    The report issued by the Digital Rights Project stresses the importance of companies establishing procedures for appealing decisions to ban, demote, or remove content and to suspend or terminate accounts. This requires detailed notification of the measure applied, the ability to appeal it directly through the company's service, and notification of the company's decision on the appeal.
    The report points out that platform decisions should meet the same standards of legality, legitimacy, and necessity that bind states when they limit freedom of expression. Accordingly, a company's rules should be clear enough for users to predict, with reasonable confidence, which content will be blocked (the principle of legality); restrictions should serve a legitimate purpose (the principle of legitimacy); and restrictions should be applied narrowly, without resort to intrusive measures (the principle of necessity).
    Holding social media platforms accountable for removing legal content would motivate them to build review systems that properly weigh users' freedom of expression. To ensure this, the technology industry must be regulated appropriately, so that it continues to grow and thrive without restricting user rights.
    The report concluded that the restrictions social platforms impose on users pose a major challenge and raise many questions and concerns about freedom of expression. Addressing them requires taking into account differing cultural contexts and any other factors that may cause confusion and misunderstanding among those running the platforms. Social platforms should take responsibility for protecting users' rights online, provide transparent and clear criteria for enforcing restrictions, and ensure respect for freedom of expression and cultural diversity.

     It is noteworthy that the report, "Social Platforms' Restrictions on Freedom of Expression in the Yemeni Digital Space," is the eleventh within the digital rights project implemented by SAM with the support of Internews, which aims to advocate for Yemenis' digital rights issues and to achieve a free and safe digital space.



