As part of its larger commitment to combat “cyberflashing,” the dating app Bumble is open sourcing its AI tool that detects unsolicited lewd images. Debuted in 2019, Private Detector (let’s take a moment to let that name sink in) blurs nudes sent through the Bumble app, giving the recipient the choice of whether to open the image.
“Even though the number of users sending lewd images on our apps is luckily a negligible minority — just 0.1% — our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tailored to achieve the best possible performances on the task,” the company wrote in a press release.
Now on GitHub, a refined version of the AI is available for commercial use, distribution and modification. Though it’s not exactly cutting-edge technology to develop a model that detects nude images, it’s something that smaller companies probably don’t have the time to build themselves. So, other dating apps (or any product where people might send dick pics, AKA the entire internet?) could feasibly integrate this technology into their own products, helping shield users from undesired lewd content.
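For a team weighing that kind of integration, the flow could look roughly like the sketch below: run each incoming image through the classifier and blur it before display when the score crosses a threshold, leaving the recipient the option to reveal it. Everything here — the model path, input size, pixel scaling, output layout and threshold — is an assumption for illustration, not Bumble’s documented interface; the actual details live in the Private Detector repository.

```python
# Hypothetical sketch of gating an incoming image behind a lewd-content check.
# Paths, preprocessing and threshold are assumptions, not Bumble's actual API.
import tensorflow as tf
from PIL import Image, ImageFilter

MODEL_DIR = "saved_model/private_detector"  # assumed path to the exported SavedModel
INPUT_SIZE = 480                            # assumed input resolution
LEWD_THRESHOLD = 0.5                        # assumed decision threshold


model = tf.saved_model.load(MODEL_DIR)


def lewd_probability(image_path: str) -> float:
    """Return the model's estimated probability that the image is lewd."""
    data = tf.io.read_file(image_path)
    img = tf.image.decode_image(data, channels=3, expand_animations=False)
    img = tf.image.resize(img, [INPUT_SIZE, INPUT_SIZE])
    img = tf.cast(img, tf.float32) / 255.0  # assumed [0, 1] scaling
    batch = tf.expand_dims(img, axis=0)
    # Assumes the exported model object is callable on a batched tensor
    # and returns a single probability per image.
    preds = model(batch)
    return float(tf.reshape(preds, [-1])[0])


def prepare_for_display(image_path: str) -> Image.Image:
    """Blur the image if it is likely lewd; otherwise return it untouched."""
    img = Image.open(image_path).convert("RGB")
    if lewd_probability(image_path) >= LEWD_THRESHOLD:
        # Heavy blur; the recipient can still choose to view the original.
        img = img.filter(ImageFilter.GaussianBlur(radius=30))
    return img


if __name__ == "__main__":
    prepare_for_display("incoming_photo.jpg").save("safe_to_show.jpg")
```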
Since releasing Private Detector, Bumble has also worked with U.S. legislators to enforce legal consequences for sending unsolicited nudes.
“There’s a need to address this issue beyond Bumble’s product ecosystem and engage in a larger conversation about how to address the issue of unsolicited lewd photos — also known as cyberflashing — to make the internet a safer and kinder place for everyone,” Bumble added.
When Bumble first introduced this AI, the company claimed it had 98% accuracy.