Content Warning

🗣️🗣️🗣️ Announcing new work from DAIR which is very close to my heart, 3 years in the making.

When #TigrayGenocide, the deadliest genocide of the 21st century thus far, started in November 2020, it was 1 month before I got fired from Google. Unlike Tigrayans whose sisters were being raped & parents murdered, I didn’t know exactly what was happening on the ground & who to believe. But I saw the genocidal speech targeting Tigrayans on social media, particularly from Eritreans, in Tigrinya. 🧵


Effective moderation of social media to curb genocidal content

"The 2020-2022 Tigray war is reported to be the deadliest armed conflict of the 21st century, with an estimated 600,000 to 800,000 documented deaths and more than 100,000 victims of rape as a weapon of war. Social media platforms were instrumental in spreading genocidal content during the conflict, and failure to effectively moderate hateful content resulted in the murder of civilians. This work investigates the expertise and processes required to effectively moderate such genocidal content, and compares these findings to the expertise and processes prioritized by social media platforms."

@DAIR's page has a 10-minute video, a link to the CHI paper, and a statement, which I'll include excerpts from in a reply.

#moderation #tigraygenocide #tigray #genocide

🔇🔇🔇 Will you be at RightsCon? If so, say hi. I'll be at the conference in person for the first time & will be in these 2 sessions, both on Wednesday, February 26.

Both have to do with tech-facilitated genocide. Such is the time we're in.

➡️ At 10:15am in Room 102, find me at @7amleh's session, Tech Giants and Genocide: Indigenous Struggles for Digital Justice.

➡️ At 3:15pm in Room 101C, find me at the session: Hate speech & information warfare in times of war: the case of the Tigray War.


Social media & #TigrayGenocide is something @DAIR has been doing a ton of work on. Stay tuned for our CHI paper & short video on the topic.

Also on Wednesday, catch other DAIR people at these sessions.

➡️ At 9am in Room 201D, Nyalleng is co-organizing Size really does matter: rethinking AI development to avoid marginalizing the marginalized.

➡️ At 11:30am in Room 102, Asmelash is participating in Putting people first: community-based approaches to accessible language technologies.