YouTube account suspensions for persecuted minorities in China are a human rights issue
Serikzhan Bilash founded the Atajurt Kazakh Human Rights organization in 2017 to be a voice for the voiceless.
He launched the YouTube channel for people to share their experiences of persecution and draw attention to family members caught in the vast network of internment camps in Xinjiang, China.
Now the very platform that once gave them a voice and an audience has attempted to silence them.
On June 15, Bilash's access to his YouTube account was temporarily suspended, initially without explanation. The account holds more than 11,000 videos, mainly testimonials from family members of people detained in more than 260 known political re-education camps that currently house between 1.8 million and 3 million Kazakhs, Kyrgyz and predominantly Uyghurs, a Muslim minority in China.
YouTube has since clarified that it revoked access to the account over the use of personally identifiable information, which violates its community standards. While YouTube's prohibition on personally identifiable information may be justified as a safeguard against harassment in one circumstance, it may prevent proper record keeping and awareness of human rights violations in another.
Bilash includes personally identifiable information in the videos to verify the identities of those giving testimony and of their detained family members.
Those who testified voluntarily provided their personally identifiable information for the benefit of their families. By providing this information, Bilash believes it reinforces the veracity of their claims and makes it difficult for the Chinese authorities to label the videos as propaganda or false information.
YouTube also serves as an important archive of the videos (according to Bilash, it is the only complete archive), since the channel's operators have been targeted by the Kazakh government and other authorities over the sensitive nature of their advocacy.
Bilash's first instinct upon learning of his YouTube account suspension was fear that the majority of his work had been lost to a misapplied policy.
While YouTube has since restored Bilash’s access to his account and made an effort to remove personally identifiable information from previously posted videos, Bilash fears this will not be the last time he will be targeted.
And he’s probably not wrong.
The mass gaming of community guidelines to target political enemies on social media is here to stay. Indeed, it is already commonplace.
Bad actors working together to wage war on a third party using a tech company's own policies add a new element to the cat-and-mouse game of content moderation.
Unlike simpler cases, such as that of Zoom executive Xinjiang Jin, who worked directly with Chinese Communist Party officials to shut down and thwart Zoom calls about Tiananmen Square and other matters "unacceptable" to the party, newer efforts rely on obfuscation.
Earlier attempts looked like an army of Twitter accounts directly controlled by the Saudi regime trying to drown out information about Jamal Khashoggi's death by amplifying pro-Crown Prince Mohammed bin Salman content.
But more recent cases, like Bilash's, even when the targeting is not directly sponsored by an authoritarian government, highlight how the strategic intentions of state actors mix with those of activists, engaged citizens and agents of chaos.
For example, during the resumption of violence in Gaza in May, a pro-Israel Facebook group with 77 million followers found itself targeted by more than 800,000 "hateful comments and messages [that] came in an hour," including numerous invocations of, and quotes from, Hitler.
Facebook shut down the page, Jerusalem Prayer Team, saying it had violated rules against "inauthentic behavior," citing the 2 million "examples of hate speech" that accumulated in the comments.
The "success" of this pressure campaign, said to be led by radical Islamist groups and anti-Israel activists, will encourage similar internet mobs to attempt to take down the pages, organizations and individuals with which they disagree.
While some protests may be organic, platforms will increasingly face bad actors who manipulate their vague and inconsistently enforced set of rules to restrict free speech.
Businesses need to be nimble enough to deal with this type of participatory manipulation without sacrificing the consistent and transparent application of community guidelines.
Platforms should acknowledge that bad actors actively abuse their rules to silence perceived enemies, and should craft rule sets to counter those bad actors as aggressively as they scrutinize mainstream conservative discourse.
Due to their size, scale and reach, the Big Five tech companies will never be able to completely eradicate all the wrongdoing committed on their platforms and products. As such, these companies, the new custodians of information, must also articulate clear and consistently applied remedies for all users and organizations.
Additionally, the stories of Atajurt Kazakh Human Rights and the Jerusalem Prayer Team demonstrate the importance of efforts to empower users through privacy-preserving technical solutions and alternative platforms.
Bilash transferred his videos to YouTube competitor Odysee, a website built on a blockchain protocol that is quickly becoming a new hotbed for perspectives outside today's mainstream cultural narratives.
Diversification is the immediate way forward. Unless the major platforms can be convinced that content moderation is a human rights issue, the best option for avoiding permanent deletion is an alternative service provider that endorses and defends free speech.
This piece originally appeared in The Daily Signal.