Content Monitoring
The platform monitors content and interactions through a combination of automated tools and human moderators.
This proactive detection helps identify harmful or policy-violating content promptly.
Handling Reports and Complaints
Users can report inappropriate content or behavior via designated reporting tools.
All reports are reviewed confidentially by the moderation team.
False reporting or abuse of the reporting system is prohibited and may result in sanctions.
Moderation and Support Team Roles
Moderators enforce policies impartially and fairly.
They investigate reports, communicate with involved users, and decide on necessary actions.
The support team is also available to assist with user inquiries and appeals.
Investigation Process
Upon receiving a report, moderators gather the relevant evidence and evaluate the reported content or behavior against platform policies.
If a violation is confirmed, moderators apply corrective action, such as content removal, a warning, a suspension, or a ban.
Users affected by an enforcement action are notified of the decision and the reasons for it.
Right to Appeal
Users may appeal a moderation decision within 7 calendar days of being notified.
Each appeal is reviewed for fairness and accuracy by a senior member of the moderation team.
The decision reached on appeal is final and binding.