Yesterday, we told you about how millions of pictures from specialized dating apps had been stored online without any kind of password protection.
Now it’s the turn of an AI “nudify” service.
A researcher known for finding unprotected cloud storage buckets has uncovered an exposed AWS bucket belonging to the nudify service.
The rising popularity of these nudify services has apparently prompted a number of companies with little or no security awareness to hop on the money train. Millions of people use these services to turn ordinary pictures into nude images, and it only takes a few minutes.
South Korean AI company GenNomis by AI-NOMIS, or somebody acting on their behalf, stored 93,485 images and JSON files, 47.8 GB in total, in a publicly exposed database that was neither password-protected nor encrypted.
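To illustrate what “unprotected” means in practice: a cloud storage bucket that is misconfigured for public access can be listed, and its files downloaded, by anyone on the internet, no credentials required. The minimal Python sketch below shows how little effort that takes. The bucket name is hypothetical, and we’re assuming a publicly listable AWS S3 bucket; the researcher hasn’t published his exact method.

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# "example-exposed-bucket" is a hypothetical name for illustration;
# the actual bucket behind this incident has not been disclosed.
BUCKET = "example-exposed-bucket"

# An anonymous (unsigned) client: no AWS credentials are attached to requests.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# If the bucket policy allows public listing, anyone on the internet
# can enumerate its contents exactly like this.
response = s3.list_objects_v2(Bucket=BUCKET, MaxKeys=10)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```

If a bucket answers a request like this without an access-denied error, its contents are effectively public.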
Looking at the service, GenNomis is an AI-powered image generation platform that allows users to transform text descriptions into images, create AI personas, turn images into videos, face-swap images, remove backgrounds, and more, all without restrictions. It also provides a marketplace where users can buy and sell these images as “artwork.”
The researcher saw numerous pornographic images, including what appeared to be disturbing AI-generated portrayals of very young people. Even though the GenNomis guidelines prohibit explicit images of children and any other illegal activities, the researcher found many such images. That doesn’t mean they were available to buy on the platform, but they were at least created.
Some of the deepfakes are hard to distinguish from real images, and as such pose serious privacy, ethical, and legal risks. Not to mention the humiliation for the people whose pictures, or parts of them, were used without consent. Sadly, there are many examples of young people who have taken their own lives over sextortion attempts.
The researcher contacted the company about what he had found. He told The Register:
“They took it down immediately with no reply.”
We’ve seen many cases where social media and other platforms have used the content of their users to train their AI. Some people tend to shrug this off because they don’t see the dangers, so let us spell out the possible problems. This case sits at the extreme end of what uploaded content can be used for: pictures people shared innocently can end up as raw material for non-consensual deepfakes.
If you want to continue using social media platforms, that is obviously your choice, but consider the above when uploading pictures of yourself, your loved ones, or even complete strangers.
We don’t just report on threats—we remove them
Cybersecurity risks should never spread beyond a headline. Keep threats off your devices by downloading Malwarebytes today.