Network of ‘Moderators’ Decides What Photos Can Be Posted on Facebook, Google, and Other Social Media

      New Yorker Banned!

      Photo moderators based everywhere from California to India decide what pictures are suitable for social-media posting. Eliza Shapiro on how their biases can affect what makes it online.

      Eliza Shapiro

      Updated Jul. 14, 2017 12:11AM ET / Published Sep. 12, 2012 4:45AM ET 

      David Paul Morris / Bloomberg via Getty Images

      When considering the next photo you’ll share with your Facebook friends, keep in mind: pictures of ear wax and visual or textual attacks on Turkey’s first president, Kemal Ataturk, are not okay; deep flesh wounds and excessive blood are.

      Facebook and other social media sites’ guidelines for inappropriate photo content are back in the spotlight after Facebook briefly banned The New Yorker’s official page. The magazine posted a cartoon of a naked man and woman that included two black dots: female nipples.

      According to documents obtained by Gawker in February, in Facebook’s extensive guidelines for photo moderation, male nipples are allowed, but female nipples aren’t. Digital and cartoon nudity is banned, but art nudity is accepted. A representative for Facebook declined to comment on whether those guidelines had changed since February.

      For one anonymous Facebook photo moderator, the two black dots seemed to cross the line.

      A representative for Facebook told The Daily Beast, “It was a mistake! Facebook is a place where almost a billion people share and click more than a trillion links a day.” Facebook declined to comment on the precise workings of their photo moderation system, and which internal and external employees are responsible for moderation.

The incident shed light on a still-murky facet of the vast social media universe: a geographically dispersed group of photo moderators who work 24/7. They aren’t robots but human beings, and their cultural differences may mean that a photo banned in Hyderabad is accepted in Manila, or flagged for moderation in San Francisco but approved in Paris. A set of corporate guidelines can prove insufficient for a network of photo moderators with their own preferences and biases.

      Industry insiders and experts say the room for interpretation across time zones and cultures can be a problem for a system that requires a surplus of employees at any hour of the day or night.

Vaughn Hester, who works at CrowdFlower, a San Francisco–based company that helps employers crowd-source projects and tasks to an eager virtual workforce, says the global nature of the business means that “asking moderators to flag photos that are ‘offensive’ can result in very different attitudes in terms of what constitutes offensive content versus permissible content.” Interpretations vary, Hester says, “among a group of 10 random contributors from 10 random countries.” Photo moderation, she says, is a “task that runs globally.”

Photo moderation on sites from Facebook to Pinterest to Google is a complex system involving third parties and contractors of contractors.

When a Facebook user posts a photo, for example, that picture is routed through a complex hybrid of Facebook’s official User Operations Team and a vast network of independent photo moderators around the world. These freelance moderators are recruited through a third-party job-contracting site. Until this May, Facebook used oDesk, a Silicon Valley–based startup that lets employers hire freelance or contract workers and lets job-seekers sign up for virtual tasks. oDesk had not commented, by the time of publication, on why the working relationship ended.

      Hester says photo moderation is a “favorite task” of CrowdFlower job-seekers, who tend to be clustered in the United States and India. CrowdFlower works with a variety of online dating sites, such as Skout and the gay dating site Scruff, to approve photos before they even go live. Photo moderators generally make between $3 and $4 per hour of work, she says. There are no professional prerequisites for a photo moderator, just an official agreement that the moderator is comfortable dealing with “mature content.”

That complex pipeline of photo moderation means certain photos can slip through the cracks, while others, like The New Yorker’s two black dots, are mistakenly removed, perhaps because of a misinterpretation by a far-flung contracted photo moderator.

Panos Ipeirotis, a professor of information, operations, and management sciences at NYU’s Stern School of Business, says cultural divides have caused more than public-relations glitches for big-name companies, and a significant amount of research is being done to determine how to limit biases and increase objectivity in photo moderation. Ipeirotis says “biases that are systematic can be fixed”: that means recognizing that “young men tend to be less strict” with photo moderation and that moderators in India “flag images too often as being PG-13,” then working to minimize cultural, racial, and gender biases.

      Facebook is hardly the only site that relies on a global web of photo moderation.

Google’s social networking arm, Google+, uses a “combination of user reports and automated scanning to detect content that violates our policies,” according to a spokesperson for Google. What’s okay and not okay on Google+ can read a bit vague: promotion of “illegal activities” and “malicious products,” for example, is banned, along with “sexually explicit material” and “personal and confidential information.”

      Pinterest, a site built almost exclusively on photo sharing, has adopted a set of guidelines based on what it calls “Pin Etiquette”: nudity in photographs is forbidden, as is “hateful content” or “content that actively promotes self-harm.”

Facebook’s “Community Standards” use similar language, prohibiting “violence and threats,” “bullying and harassment,” and “graphic content.” All pornography is banned, as is most nudity, with a caveat: “We aspire to respect people’s right to share content of personal importance, whether those are photos of a sculpture like Michelangelo’s David or family photos of a child breastfeeding.”

      With streamlined rhetoric but few specifics, it’s easy to see how a contracted photo moderator might make an error of interpretation when it’s 4 a.m. at Facebook HQ in Menlo Park, Calif.

      A spokesperson for Facebook further explained The New Yorker snafu by pointing to the extensive network of photo moderators across the world:

      “Our dedicated User Operations Team reviews millions of pieces of this content a day to help keep Facebook safe for all. Our policies are enforced by a team of reviewers in several offices across the globe. This team looks at hundreds of thousands of reports every week, and as you might expect, occasionally, we make a mistake and block a piece of content we shouldn’t have.”

      Eliza Shapiro

      @elizashapiro
