News aggregation service Digg has unveiled a set of community guidelines it hopes will encourage “very open debates, very free discussions, and very searching dialogue” while also setting clear boundaries about what will and will not be tolerated by the community.
The policy appears to have been established in response to issues that have plagued other, similar platforms, such as Reddit. Digg chief executive Andrew McLaughlin explained in a post today that these guidelines have been prepared to pave the way for new products and features, specifically commenting. It’s likely that McLaughlin wanted to give the community time to become acquainted with the new rules in order to head off any problems before they start.
Naturally, content that will not be allowed on Digg includes slurs, epithets, and hateful speech. In addition, abusive names will not be permitted (usernames must be rated G or PG, according to the company). The service also won’t tolerate gratuitous or explicitly sexual remarks, trademark or copyright infringement, spamming, harassment, privacy violations, illegal speech (such as fraud or phishing), or abusive mis-flagging of comments or users by people who claim that guidelines are being violated.
So basically, Digg has enumerated all the main things you would expect to see on a list such as this.
McLaughlin says that consequences for violating these guidelines include removal of comments or posts and disabling the username of the perpetrator(s). Affected users will be notified by email or through the web interface in order to give them a chance to address the issue. If there are repeat violations, Digg says it might “restrict, suspend, or terminate the account.”