Washington, Aug 3: Facebook has announced that it will open-source two algorithms it uses to identify abusive or harmful content on its platform.
PDQ, which matches photos, and TMK+PDQF, which matches videos, are the two technologies Facebook uses to identify child sexual exploitation, terrorist propaganda, and graphic violence. Both convert files into compact digital hashes and compare them against hashes of known harmful content, The Verge reported.
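The matching step behind such systems is conceptually simple: a new file's hash is compared against a database of hashes of previously identified harmful material, and a match is declared when the two hashes are sufficiently close. The sketch below illustrates that comparison in Python using Hamming distance over 256-bit hashes encoded as hex strings; the example hash, the threshold, and the function names are illustrative placeholders, not Facebook's released code.

```python
# Illustrative sketch of hash-based matching; not Facebook's released implementation.
# Assumes each file has already been reduced to a 256-bit perceptual hash,
# represented here as a 64-character hex string.

KNOWN_HARMFUL_HASHES = [
    # Hashes of previously identified harmful content would be stored here.
    "f8f8f0cee0f4a84f06370a22038f63f0b36e2ed596623e1d33e6339c4e9c9b22",
]

MATCH_THRESHOLD = 31  # Illustrative: maximum number of differing bits to count as a match.


def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count the differing bits between two equal-length hex-encoded hashes."""
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")


def is_known_harmful(candidate_hash: str) -> bool:
    """Return True if the candidate hash is close to any known harmful hash."""
    return any(
        hamming_distance(candidate_hash, known) <= MATCH_THRESHOLD
        for known in KNOWN_HARMFUL_HASHES
    )
```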
The algorithms have been released on GitHub, and Facebook hopes that developers and other companies will use them to identify harmful content and add its hashes to a shared database, helping curb online violence.