DoNotPay launches Photo Ninja, which tweaks images to thwart reverse image searches and facial recognition tech without compromising the look of the photos

Robot lawyer app DoNotPay is rolling out a new feature that slightly alters photos so that artificial intelligence apps can’t identify who you are. Called Photo Ninja, the feature is intended to prevent photos of you uploaded online from being used for malicious purposes.

DoNotPay charges $3 per month and in exchange will do everything from contesting parking tickets to canceling free trials before their renewal date. Basically, if a task can easily be automated, DoNotPay wants to take care of it for you.

Adversarial AI — With the new Photo Ninja feature, users upload a photo of themselves to DoNotPay and its algorithms insert hidden changes that confuse facial recognition tools. This type of masked picture can be referred to as an “adversarial example,” exploiting the way artificial intelligence algorithms work to disrupt their behavior. It’s a growing area of research as the role of AI continues to grow and the technology is exploited for potentially dangerous, or at least privacy-eroding, purposes.

“Photo Ninja uses a novel series of steganography, detection perturbation, visible overlay, and several other AI-based enhancement processes to shield your images from reverse image searches without compromising the look of your photo,” says the company.

AI systems are trained to analyze pictures at the pixel level, and adversarial examples trick them by shifting pixel colors subtly enough that the human eye notices nothing different, yet the computer either fails to categorize the image as it usually would or interprets it as a wholly different image.
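DoNotPay hasn't published how Photo Ninja's perturbations are computed, but the general idea can be sketched with the classic fast gradient sign method (FGSM) against a toy linear classifier — every name and number below is illustrative, not anything the company has disclosed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "classifier": a linear score over flattened pixel values.
# Real facial recognition uses deep networks, but the attack idea is the same.
weights = rng.normal(size=64 * 64)

def score(image):
    return float(weights @ image.ravel())

# Original grayscale "photo" with pixel values in [0, 1].
image = rng.uniform(size=(64, 64))

# FGSM: nudge every pixel a tiny step in the direction that most changes
# the classifier's output. For a linear model, the gradient of the score
# with respect to the pixels is simply the weight vector.
epsilon = 0.01  # far too small a change for the eye to notice
gradient = weights.reshape(64, 64)
adversarial = np.clip(image - epsilon * np.sign(gradient), 0.0, 1.0)

# No pixel moved by more than epsilon...
print(np.abs(adversarial - image).max())
# ...but the classifier's output shifts substantially.
print(score(image), score(adversarial))
```

Because each pixel is pushed in the direction the model is most sensitive to, thousands of imperceptible nudges compound into a large swing in the model's output — exactly the mismatch between human and machine perception that adversarial examples exploit.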

Anti-creep software — There are various reasons why you might want to use Photo Ninja. Before joining a dating service like Bumble, for instance, you could run your pictures through Photo Ninja so that weirdos can’t upload them to Google’s reverse image search and find your social media profiles without your consent.

As police agencies and retailers increasingly use facial recognition to surveil for criminal activity, they rely on databases of pictures culled from the internet to find suspects. Such invasive surveillance is controversial, not least because it’s often inaccurate and disproportionately harms minorities and women. If you’ve run all your pictures through Photo Ninja, DoNotPay claims programs like TinEye and Google’s reverse image search will be unable to match you. Hopefully, Clearview AI won’t be able to either.

