Content Moderation Policy
Effective Date: June 18, 2025
At Gradly, we believe in fostering a safe, inclusive, and intellectually engaging environment for students, educators, and professionals around the world. This Content Moderation Policy explains how we handle user-generated content, the standards we apply, and what users can expect from moderation and enforcement.
1. Moderation Objectives
Our goals in content moderation are to:
- Protect users from harm, harassment, and abuse
- Promote respectful and constructive discussions
- Uphold community standards and academic integrity
- Comply with legal and institutional regulations
2. Types of Content We Moderate
We moderate all user-generated content on Gradly, including:
- Posts and comments
- Messages and replies
- Media uploads (images, documents, videos)
- Usernames, bios, and profile details
- Community and space names/descriptions
3. Content That Violates Our Standards
We reserve the right to remove content that includes or promotes:
Hate Speech or Harassment
- Attacks based on race, religion, gender, nationality, sexual orientation, or disability
- Personal insults, bullying, or threats of violence
Explicit or Inappropriate Material
- Pornographic, sexually explicit, or violent imagery
- Profanity or vulgar content not suitable for an academic space
Misinformation or Academic Dishonesty
- Spreading false or misleading academic claims
- Posting or requesting exam answers, assignments, or plagiarized content
Illegal or Harmful Activity
- Promotion of drug use, self-harm, or suicide
- Fraud, hacking, or any other unlawful behavior
Spam and Promotion
- Repetitive posting, clickbait, or misleading links
- Unapproved product or service promotions
4. Moderation Methods
We use a combination of:
Automated Detection
AI-assisted systems flag content for review based on keywords, patterns, and other signals of potential rule violations.
Human Review
A trained moderation team manually reviews flagged content and user reports to ensure accuracy and fairness.
Community Reporting
Users can report content that violates platform rules. Reports are reviewed promptly, and action is taken where necessary.
5. Enforcement Actions
Depending on the severity and nature of the violation, actions may include:
- Content removal or editing
- Warnings or temporary suspension
- Permanent account ban
- Restriction from certain spaces or communities
- Referral to law enforcement (for serious offenses)
6. Appeals Process
If your content is removed or your account is restricted, you will receive a notice explaining the reason. You may request a review by contacting moderation@gradly.com. Appeals are reviewed on a case-by-case basis, and we respond within 5 business days.
7. Transparency and Updates
Gradly may periodically publish moderation statistics (e.g., number of posts removed, reasons, actions taken) to our Transparency Center to uphold platform accountability.
This policy may be updated from time to time. Continued use of the platform constitutes acceptance of the latest version.
8. Contact Us
If you have questions or concerns about moderation on Gradly, reach out to:
support@gradly.org