Facebook has at times struggled to balance its desire to appear a champion of free speech with keeping its social networking service relatively safe for families and a positive environment for a diverse set of people.
The result is that Facebook has been the subject of protests for things like removing photos of women breastfeeding or censoring content from extreme political groups.
To try to clear things up, Facebook today issued new community standards, including an update to a category it calls “respectful behavior,” which covers nudity, hate speech, and graphic violence.
“What exactly do we mean by nudity, or what do we mean by hate speech?” the company wrote in a blog post. “While our policies and standards themselves are not changing, we have heard from people that it would be helpful to provide more clarity and examples, so we are doing so with today’s update.”
Here’s what Facebook says now about posting nudity on the site:
We remove photographs of people displaying genitals or focusing in on fully exposed buttocks. We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures, and other art that depicts nude figures. Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous, or satirical purposes. Explicit images of sexual intercourse are prohibited. Descriptions of sexual acts that go into vivid detail may also be removed.
In the new guidelines, Facebook said it is attempting to enforce the policy uniformly around the world. But it also acknowledges that the issue remains challenging.
“As a result, our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes,” Facebook says. “We are always working to get better at evaluating this content and enforcing our standards.”
Regarding hate speech, Facebook now defines it as content that “directly attacks people based on their: Race, Ethnicity, National origin, Religious affiliation, Sexual orientation, Sex, gender, or gender identity, or Serious disabilities or diseases.”
Organizations and people dedicated to promoting hatred against these protected groups are not allowed a presence on Facebook. As with all of our standards, we rely on our community to report this content to us.
Finally, for graphic violence, Facebook says:
Facebook has long been a place where people share their experiences and raise awareness about important issues. Sometimes, those experiences and issues involve violence and graphic images of public interest or concern, such as human rights abuses or acts of terrorism. In many instances, when people share this type of content, they are condemning it or raising awareness about it. We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence.
Of course, it will be interesting to see whether any of this clarifies anything for anyone. All of these definitions remain highly subjective, and Facebook is unlikely to resolve on its own debates that have raged for ages.