Content is moderated by a team of Moderators recruited through our offered positions. Moderators receive user and automated reports and review each one manually to ensure every report is processed fairly and with human judgement. We minimize automated moderation as much as possible to prevent wrongful automated terminations.
Once a report is made, a Moderator will be assigned to investigate it and determine its legitimacy. If the reported content violates our Community Guidelines, the Moderator may delete the content and issue a Community Strike against the violator. Both parties will be notified of the outcome of the report for transparency and accountability; however, the violating party will not be told who reported the content. If the report is determined not to be legitimate, the reporting party will be notified of the denial and may provide further explanation if they believe the denial to be incorrect.
Our Moderation team is made up of hand-picked users recruited through our offered positions. You can identify a Moderator by the Green Star that appears after their name. Moderators have special permissions within our community to help moderate content so that we do not have to rely on automated moderation (which can at times be highly inaccurate). If you are interested in becoming a Nerfite Moderator, you may apply on our Positions Page.
We are committed to keeping you safe. If you ever feel otherwise, please contact a Moderator or report the content/profile making you feel unsafe.
We want to recognize all of our current Moderators, so this section may be a bit long; it is, however, the last section of this document. You may also use this list to find and contact a Moderator. They are listed alphabetically by their display name (not their username).