Facebook promises new AI tool will proactively detect revenge porn
Friday, 15 March 2019

Facebook is launching a new AI tool today that it says can proactively detect and flag intimate images and videos of someone posted without their consent. The system will be active on Facebook and Instagram, and, unlike current filters, it can detect “near-nude” content. This content is then flagged and sent to a human moderator for review.
Currently, users on Facebook and Instagram must report revenge porn themselves. Facebook says it hopes the new system will better support victims by flagging images and videos on their behalf.
"The new tool could help victims get ahead of unwanted content"
“Often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared,”...