
UPDATED: It's way too easy to trick Lensa AI into making NSFW images

Lensa AI can be tricked into leaving very little to the imagination.

Lensa has been climbing the app store hit lists with its avatar-generating AI that is making artists wave the red flag. Now there's another reason to fly the flag: As it turns out, it's possible -- and way too easy -- to use the platform to generate non-consensual soft porn.

TechCrunch has seen photo sets generated with the Lensa app that include images with clearly visible breasts and nipples, attached to the faces of recognizable people. It seemed like the kind of thing that shouldn't be possible, so we decided to try it ourselves. To verify that Lensa will create images it perhaps shouldn't, we created two sets of Lensa avatars:

  • One set, based on 15 photos of a well-known actor.

  • Another set, based on the same 15 photos, but with an additional set of five photos added of the same actor's face, Photoshopped onto topless models.

The first set of images was in line with the AI avatars we've seen Lensa generate in the past. The second set, however, was a lot spicier than we were expecting. It turns out the AI takes those Photoshopped images as permission to go wild, and appears to disable its NSFW filter. Out of the 100-image set, 11 were topless photos of higher quality (or, at least, higher stylistic consistency) than the poorly edited topless photos the AI was given as input.

Generating saucy images of celebrities is one thing, and as illustrated by the source images we were able to find, there have long been people on the internet willing to collage some images together in Photoshop. Just because it's common doesn't make it right — in point of fact, celebrities absolutely deserve their privacy and should never be made victims of non-consensual sexualized depictions. But so far, making those fakes look realistic has required considerable skill with photo-editing tools, along with hours, if not days, of work.

The big turning point, and the ethical nightmare, is the ease with which you can create near-photorealistic AI-generated art images by the hundreds without any tools other than a smartphone, an app and a few dollars.

The ease with which you can create images of anyone you can imagine (or, at least, anyone you have a handful of photos of) is terrifying. Add NSFW content into the mix, and we careen into some pretty murky territory very quickly: your friends, or some random person you met in a bar and friended on Facebook, may not have consented to someone generating soft-core porn of them.

It appears that if you have 10 to 15 "real" photos of a person and are willing to take the time to Photoshop a handful of fakes, Lensa will gladly churn out a number of problematic images.

AI art generators are already churning out pornography by the thousands of images, exemplified by the likes of Unstable Diffusion and others. These platforms, along with the unfettered proliferation of other so-called "deepfake" platforms, are turning into an ethical nightmare and have prompted the U.K. government to push for laws criminalizing the dissemination of non-consensual nude photos. This seems like a very good idea, but the internet is a hard-to-govern place at the best of times, and we are collectively facing a wall of legal, moral and ethical quandaries.

UPDATE: The Prisma Labs team replied to our concerns. The company highlights that if you specifically provoke the AI into generating NSFW images, it might, but that it is implementing filters to prevent this from happening accidentally. The jury is still out as to whether this will actually help people who have such images generated of them without their consent: