Content moderation on social media has come under public scrutiny recently. Takedowns often have unintended consequences, which archival projects like Tattle can address. But archival projects are also susceptible to the kind of abuse that necessitates takedowns in the first place.
First, some background: one reason for the popularity of chat apps is the perception of privacy. Last year Facebook announced that "the future is private" and laid out plans to unify its messaging products around ephemerality and encryption. On mobile apps, posting from social media to chat apps has always been easy. In some cases, platforms, recognizing the popularity of WhatsApp, have built dedicated features to make content easier to share to it.
Once content has been shared on a WhatsApp-like app, it is difficult to track the number of times a post was shared, the way one can on platforms like Facebook or Twitter. Even if one of those platforms takes the content down, it has often already found its way onto a chat app.
This is where the unintended consequences kick in. When someone wants to fact-check a post circulating on WhatsApp, the original source (say, a Facebook post or a tweet) is no longer available. The other unintended consequence is that user-generated content taken down by platforms (misinformation and hate speech) that is important in investigations of human rights violations becomes unavailable, and retrieving it requires negotiations with the platforms.
There is an inherent tension here: misinformation and hate speech should not be easily accessible, yet they should remain available to people who want to investigate or understand them. Judging intention, however, is tricky.
One workaround is to save this content on sites NOT designed for virality. Motivated individuals who care to investigate content should have a resource to go to, but casual internet and social media browsers should not chance upon it. This is precisely the motivation for us at Tattle to archive content from chat apps and regional social media platforms. We are, of course, not alone in this endeavor: groups such as The Syrian Archive and The Internet Archive have been leading the way for many years.
An important caveat: even if platforms are not designed for virality, there will always be motivated misinformation peddlers who will use the resource to their advantage. Earlier this year, after Facebook deleted Covid misinformation, archived copies of those posts were recirculated instead. The Wayback Machine responded by adding a yellow label to archived pages of content that had been taken down.
Abusability is something we at Tattle worry about all the time. Platform UI and content policy are both part of the response. This is not a "done once, done right" problem; it is an area where we will have to stay responsive to new risks. A dedicated working group at Tattle, 'The strategic content of viral content', works on this area. As always, we look forward to any inputs and ideas.