• will_a113@lemmy.ml · 10 days ago

    AI Forensics, a European non-profit that investigates influential and opaque algorithms, found that nudity uploaded to Instagram and Facebook as a normal user was promptly removed for violating Meta’s Community Standards. The exact same visuals were not removed when they were uploaded as ads, showing that Meta applies a different enforcement standard when it’s being paid to push images in front of users.

    Oh damn. This sounds like willful intent to me.