AI on new flagship phone removes subjects’ clothes making them appear naked

AI features are becoming more prevalent in smartphones. A few years ago, AI algorithms were limited to automatically optimizing the phone's camera settings depending on the scene being photographed. Now, AI is used to edit photographs and videos, answer queries, enable real-time phone conversations between people who speak different languages, and more.

However, not all AI features are welcomed by smartphone users. According to Guang Ming Daily (via Gizmochina), the just-announced Huawei Pura 70 Ultra top-of-the-line flagship has an AI-based object removal feature that works on photographs. Several posts on the social media platform Weibo include examples showing how the object removal feature on the Pura 70 Ultra accidentally removes parts of the clothing worn by subjects in photographs, revealing what appear to be the parts of their bodies covered by the clothing.
Huawei admits that there is a problem and blames it on issues with the AI algorithm behind the "smart AI retouching" feature. The company says it will make the appropriate changes in future system updates. Until that happens, however, the feature can be used to create pornographic images. Keep in mind that what appears when the clothes are removed is not the subject's real body: the AI uses the skin color of the person being photographed to make the areas it recreates look more realistic once the clothing is erased.

Many talk about the dark side of AI, and this would seem to be one of those situations. Hopefully, Huawei eliminates this bug quickly enough to prevent those with malicious intent from using these AI-generated photos to humiliate or blackmail the subjects of the photographs.
