Dating app Bumble launches system to block unwanted nude photos from reaching you

Bumble has introduced an artificial intelligence (AI)-based tool capable of detecting nude images and inappropriate content in the platform's chat section.

The company announced in late April that it was developing a filter, Private Detector, which would be integrated into Bumble and would notify users when an image is suspected of being obscene.

It explained that the feature relies on AI-trained algorithms that capture images in real time and determine, with 98% accuracy, whether they contain nudity or explicit sexual content.
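As a rough illustration of how a classifier's output can be turned into a flag/allow decision: the function name, score convention and threshold below are hypothetical, since Bumble has only disclosed the 98% accuracy figure, not its decision logic.

```python
def is_explicit(nudity_score: float, threshold: float = 0.5) -> bool:
    """Return True when a model's predicted nudity probability crosses
    the decision threshold.

    The score convention and default threshold are illustrative only;
    they are not published values from Bumble's system.
    """
    if not 0.0 <= nudity_score <= 1.0:
        raise ValueError("score must be a probability in [0, 1]")
    return nudity_score >= threshold
```

In practice the threshold would be tuned on a validation set to trade off false positives (harmless photos blurred) against false negatives (explicit photos delivered unblurred).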

Now the company has announced the release of an open-source version of this feature, with which it hopes to "help the technology community fight cyberflashing", as it explained in a statement published on its blog.

Cyberflashing is the practice of sending pornographic images or sexual videos without consent through these platforms' messaging services, and it is considered a form of digital violence.


Bumble hopes to curb this practice with Private Detector, a feature intended to add a new layer of security to this online dating app for smartphones, according to the statement.

The company has stressed that the tool was trained on high-volume data sets "to better reflect edge cases and other parts of the human body", and that samples of pornographic material were kept separate so that its machine learning system would not mark harmless images as abusive or likely to be banned.


Whitney Wolfe Herd rings the Nasdaq opening bell with her baby in her arms.

Thus, the system is able to figure out which images should be removed from the platform, such as socks, pictures of intimate organs, shirtless people and even weapons, and which ones should be kept.

Photographs that Private Detector determines to be inappropriate will be blurred in the application, so that the recipients of these images can choose whether to view them (by long-pressing the image), block them, or report the person who sent them.
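The recipient-side flow (deliver blurred first, then let the user view, block or report) can be sketched as a small state model. Every name below is hypothetical; Bumble's client code is not part of the open-source release.

```python
from dataclasses import dataclass, field


@dataclass
class ScreenedPhoto:
    """A received photo that a Private-Detector-style screen has classified.

    All identifiers here are illustrative assumptions, not Bumble's API.
    """
    sender: str
    flagged: bool               # did the classifier mark it inappropriate?
    revealed: bool = False
    blocked: bool = False
    reports: list = field(default_factory=list)

    @property
    def blurred(self) -> bool:
        # Flagged photos stay blurred until the recipient opts in.
        return self.flagged and not self.revealed

    def long_press(self) -> None:
        # The recipient explicitly chooses to view the image.
        self.revealed = True

    def block_sender(self) -> None:
        self.blocked = True

    def report_sender(self, reason: str) -> None:
        self.reports.append((self.sender, reason))


photo = ScreenedPhoto(sender="user123", flagged=True)
assert photo.blurred        # delivered blurred
photo.long_press()
assert not photo.blurred    # revealed only after an explicit choice
```

The key design point the article describes is that the decision to view stays with the recipient: the classifier only gates the default presentation, it never deletes the message outright.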

The company noted that this AI-based system is open source and available to anyone interested: it can be downloaded from GitHub and implemented in other websites or services.