Content moderation on PSN
Our global team of human moderators reviews reports of misconduct on PlayStation Network.
To help us remove inappropriate content from the network, SIE encourages players to report any content that breaches our Code of Conduct. Trained moderators evaluate each report, remove offensive content, and take action on the offender's account if we determine that the reported player violated our Terms of Service User Agreement.
Alongside player-reported content, we also use automated content-detection tools to identify certain harmful content shared on PlayStation Network. These tools can detect profane language and harmful links that breach our Code of Conduct, and can use hash-matching to flag potential matches of harmful images. Hash-matching tools assign a unique digital signature to content, called a “hash”, and then compare that hash against a database of known signatures. Potential matches are either censored or sent to our human moderation team for further review.
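The hash-matching idea described above can be sketched in a few lines. This is a minimal illustration only, not PSN's implementation: the database contents and function names are hypothetical, and production systems typically use perceptual hashes (which tolerate resizing and re-encoding) rather than the plain SHA-256 used here as a stand-in.

```python
import hashlib

# Hypothetical database of signatures of known harmful images.
# Real systems use perceptual hashing; SHA-256 is a stand-in here,
# so only byte-for-byte identical files will match.
KNOWN_HASHES = {
    hashlib.sha256(b"known-harmful-image-bytes").hexdigest(),
}

def check_upload(content: bytes) -> str:
    """Return a moderation decision for an uploaded file's bytes."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in KNOWN_HASHES:
        # A potential match is blocked or queued for human review.
        return "flag_for_review"
    return "allow"

print(check_upload(b"known-harmful-image-bytes"))  # flag_for_review
print(check_upload(b"harmless-photo-bytes"))       # allow
```

Because only the hashes are stored and compared, the matching service never needs to retain copies of the harmful images themselves.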
In addition to taking action on an offender’s account, moderators may escalate reports for further review, which may include notifying law enforcement (or another appropriate government agency or authority) if the breach involves a threat to your life or safety or that of others, or any other activity that we believe to be unlawful.