Understanding Moderation
Campus provides a structured moderation system with:
- User-generated reports
- Automated flagging
- Moderation queue
- Action logging
- Real-time updates
Moderation Permissions
Who can moderate content:

| Role | Can Moderate |
|---|---|
| Content Author | Own posts and comments |
| Group Moderator | All content in their groups |
| Space Moderator | All content in their spaces |
| Space Owner | All content in their spaces |
The Moderation System
Moderation Cases
When content is flagged, a moderation case is created.

Case States
Open
New case, awaiting review
- Appears in moderation queue
- SLA timer starts (15 minutes)
- Automatically assigned or available for claim
In Review
Moderator is actively investigating
- Assigned to specific moderator
- Evidence is being evaluated
- Actions are being considered
Escalated
Case requires higher-level attention
- Complex or sensitive situation
- Potential policy violation
- Requires administrator decision
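The lifecycle above can be modeled as a small state machine. A minimal TypeScript sketch — the state names follow the descriptions above, but `canTransition` and the transition table are illustrative, not Campus's actual API:

```typescript
// Hypothetical model of the moderation case lifecycle described above.
type CaseState = "open" | "in_review" | "escalated" | "resolved";

// Which transitions the workflow allows (assumed from the state descriptions).
const allowedTransitions: Record<CaseState, CaseState[]> = {
  open: ["in_review", "escalated", "resolved"],
  in_review: ["escalated", "resolved"],
  escalated: ["resolved"],
  resolved: [], // resolved is terminal
};

function canTransition(from: CaseState, to: CaseState): boolean {
  return allowedTransitions[from].includes(to);
}
```

Keeping the allowed transitions in one table makes it easy to reject invalid moves (for example, reopening a resolved case) in a single place.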
Accessing the Moderation Dashboard
Only users with moderator or owner roles can access the moderation dashboard.
- Navigate to /admin/moderation
- View the moderation queue with active cases
- See case statistics in the overview cards
Dashboard Overview
The dashboard shows:

Statistics Cards
- Open cases count
- In review cases count
- Escalated cases count (highlighted in red)
- Resolved cases count (shown in green)

Filter Tabs
- All cases
- Open only
- In review only
- Escalated only
- Resolved only
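Each filter tab is just a filter over the case list by state. A minimal sketch, assuming a hypothetical `ModerationCase` shape (Campus's real fields may differ):

```typescript
// Hypothetical shape of a queue entry; Campus's real fields may differ.
interface ModerationCase {
  id: string;
  state: "open" | "in_review" | "escalated" | "resolved";
}

// "all" returns the queue unfiltered; any other tab keeps matching states only.
function filterQueue(
  cases: ModerationCase[],
  tab: "all" | "open" | "in_review" | "escalated" | "resolved",
): ModerationCase[] {
  return tab === "all" ? cases : cases.filter((c) => c.state === tab);
}
```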
Handling Reports
Viewing a Case
Review Evidence
Examine all evidence entries:
- Type of report (user report or auto-flag)
- Reason provided
- Content snapshot
- Number of flags from different users
View Source Content
Click “View Source” to see the content in context:
- For posts: Opens the post in the feed
- For comments: Navigates to the comment location
- For messages: Opens the message thread
Taking Action
Depending on your assessment:
- No Action Needed
- Delete Content
- Start Review
- Escalate
If content doesn’t violate rules:
- Click “Resolve Case”
- Case is marked as resolved
- Reporter receives notification (optional)
SLA Management
Best practices for meeting the SLA:
- Check the queue regularly (multiple times per day)
- Assign multiple moderators for coverage
- Set up notifications for new cases
- Have clear guidelines for quick decisions
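The 15-minute SLA mentioned under "Case States" can be checked with a simple timestamp comparison. A sketch (the function name is illustrative):

```typescript
// The queue gives each open case a 15-minute SLA (see "Case States").
const SLA_MINUTES = 15;

// Returns true when a case has been waiting longer than the SLA window.
// openedAt and now are millisecond timestamps (e.g. Date.now()).
function slaBreached(openedAt: number, now: number): boolean {
  return now - openedAt > SLA_MINUTES * 60 * 1000;
}
```

A dashboard can run this check on every open case to surface breaches for prioritization.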
Automated Moderation
Auto-flagging
Campus automatically flags certain content.

Video Post Flags
Video posts are automatically flagged and sent for review with metadata:
- Start in “in_review” state
- Require moderator approval
- Help catch policy violations early
Configuring Auto-flags
Administrators can configure automated flagging rules for:
- Specific content types
- Keyword patterns
- Spam detection
- Link analysis
- User behavior patterns
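A keyword-pattern rule, for example, can be expressed as a list of patterns with an attached reason. This is a hypothetical sketch — Campus's actual rule engine and configuration format are not documented here:

```typescript
// Hypothetical keyword-pattern rule; the shape is illustrative only.
interface AutoFlagRule {
  reason: string;
  patterns: RegExp[];
}

// Returns the reason of every rule with at least one matching pattern.
function evaluateRules(content: string, rules: AutoFlagRule[]): string[] {
  return rules
    .filter((rule) => rule.patterns.some((p) => p.test(content)))
    .map((rule) => rule.reason);
}
```

Any matching reasons would then be attached as evidence on the resulting moderation case.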
User Reporting
How Users Report Content
Users can report problematic content:
- Click “Report” on a post or comment
- Select a reason:
  - Spam or misleading
  - Harassment or hate speech
  - Violence or dangerous content
  - Inappropriate or offensive
  - Other (with explanation)
- Optionally provide additional context
- Submit report
Handling Multiple Reports
When multiple users report the same content:
- Evidence is appended to the existing case
- The flag count is tracked
- Duplicate reports are deduplicated
- The case receives higher priority in the queue
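The append-and-deduplicate behavior can be sketched as follows, assuming deduplication is per reporting user (an assumption — the exact dedup key is not specified here) and a hypothetical case shape:

```typescript
// Hypothetical case shape for merging reports; real fields may differ.
interface ReportCase {
  contentId: string;
  reporterIds: Set<string>; // tracks who has already reported
  flagCount: number;
}

// Appends a report to an existing case. Assumes duplicates are
// deduplicated per reporting user; the flag count only grows for
// distinct reporters.
function addReport(c: ReportCase, reporterId: string): ReportCase {
  if (c.reporterIds.has(reporterId)) return c; // duplicate: ignored
  c.reporterIds.add(reporterId);
  c.flagCount += 1;
  return c;
}
```

The growing flag count is what lets the queue rank heavily reported content higher.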
Moderation Actions
Deleting Posts
To delete a post with proper logging, the system:
- Verifies you have permission
- Deletes the post
- Creates a moderation log entry
- Returns success status
Deleting Comments
To delete a comment, the system:
- Checks moderation permission
- Deletes the comment
- Logs the action with post context
- Maintains thread integrity
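The delete-with-logging flow above can be sketched like this. All names (`LogEntry`, `deletePost`, the permission callback) are illustrative, not Campus's actual API:

```typescript
// Hypothetical log entry; Campus's real log schema may differ.
interface LogEntry {
  action: "delete_post" | "delete_comment";
  targetId: string;
  moderatorId: string;
  at: number; // millisecond timestamp
}

// Permission check first, then the deletion, then the log entry,
// then a success status — mirroring the steps listed above.
function deletePost(
  moderatorId: string,
  postId: string,
  canModerate: (userId: string, postId: string) => boolean,
  log: LogEntry[],
): boolean {
  if (!canModerate(moderatorId, postId)) return false; // permission denied
  // ...the actual post deletion would happen here...
  log.push({ action: "delete_post", targetId: postId, moderatorId, at: Date.now() });
  return true; // success status
}
```

Writing the log entry in the same code path as the deletion ensures no moderation action goes unrecorded.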
Moderation Logs
All moderation actions are logged, which supports:
- Accountability
- Audit trails
- Pattern analysis
- Training and improvement
Moderation Best Practices
Be Consistent
Apply rules uniformly:
- Document your decisions
- Create moderation guidelines
- Have regular team meetings
- Review cases together to calibrate
Act Quickly
Respond to reports promptly:
- Check queue at least 3x daily
- Prioritize SLA breaches
- Handle clear violations immediately
- Escalate uncertain cases quickly
Communicate Clearly
When taking action:
- Explain the violation (when appropriate)
- Reference specific rules
- Provide appeals process
- Be respectful but firm
Use Progressive Discipline
For repeat offenders:
- First violation: Warning + delete content
- Second violation: Temporary restriction
- Third violation: Extended restriction
- Persistent violations: Permanent ban
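The four-step ladder above is easy to encode as a lookup keyed on a user's violation count. A sketch (the function and labels are illustrative):

```typescript
// The progressive-discipline ladder described above, in order.
const ladder = [
  "warning + delete content",
  "temporary restriction",
  "extended restriction",
  "permanent ban",
] as const;

// Maps a 1-based violation count to the appropriate step;
// counts beyond the ladder stay at the final step.
function disciplineFor(violationCount: number): string {
  const idx = Math.min(violationCount, ladder.length) - 1;
  return ladder[Math.max(idx, 0)];
}
```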
Protect Reporter Privacy
Never reveal who reported content:
- Reports are anonymous
- Don’t mention “someone reported”
- Focus on the violation, not the report
Take Care of Yourself
Moderation can be emotionally taxing:
- Rotate difficult content among team
- Take breaks when needed
- Debrief with fellow moderators
- Know when to escalate
Notifications and Alerts
Moderator Notifications
Moderators receive notifications for:

New Cases
- When content is reported
- When auto-flags are triggered
- Real-time in the dashboard

Escalations
- When a case is escalated
- Email notification to administrators
- Uses template: moderation-escalated.html

Daily Summary
- Daily digest of moderation activity
- Statistics and trends
- Uses template: moderation-daily-summary.html
Configuring Notifications
Administrators can configure:
- Who receives notifications
- Notification frequency
- Escalation thresholds
- Email templates
Moderation Reports and Analytics
Key Metrics
Track moderation effectiveness:
- Average resolution time: How long cases stay open
- SLA compliance rate: Percentage resolved within 15 minutes
- Case volume trends: Increasing or decreasing reports
- Action breakdown: Deletions vs. no action vs. escalations
- Top reporters: Users who report most frequently
- Top reported users: Users most frequently reported
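Two of these metrics can be computed directly from case timestamps. A sketch, assuming each resolved case records `openedAt` and `resolvedAt` millisecond timestamps (hypothetical field names):

```typescript
// Hypothetical resolved-case record; field names are illustrative.
interface ResolvedCase {
  openedAt: number;
  resolvedAt: number;
}

const SLA_MS = 15 * 60 * 1000; // the 15-minute SLA from "Case States"

// SLA compliance rate: percentage of cases resolved within the SLA window.
function slaComplianceRate(cases: ResolvedCase[]): number {
  if (cases.length === 0) return 100;
  const within = cases.filter((c) => c.resolvedAt - c.openedAt <= SLA_MS).length;
  return (within / cases.length) * 100;
}

// Average resolution time, in minutes.
function avgResolutionMinutes(cases: ResolvedCase[]): number {
  if (cases.length === 0) return 0;
  const total = cases.reduce((sum, c) => sum + (c.resolvedAt - c.openedAt), 0);
  return total / cases.length / 60000;
}
```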
Using Analytics
Insights help you:
- Identify problematic users or patterns
- Adjust moderation resources
- Update community guidelines
- Train new moderators
- Demonstrate moderation effectiveness
Common Scenarios
Spam Posts
Identification:
- Repeated content
- Promotional links
- Irrelevant to community

Action:
- Delete immediately
- Check for other spam from user
- Consider account restriction
Heated Arguments
Identification:
- Multiple reports
- Personal attacks
- Derailed conversation

Action:
- Review full thread
- Delete inflammatory comments
- Consider locking thread
- Message users privately
Inappropriate Content
Identification:
- NSFW material
- Offensive language
- Violates ToS

Action:
- Delete immediately
- Warn or suspend user
- Document for repeat offenders
Harassment
Identification:
- Targeted negativity
- Repeated unwanted contact
- Threatening language

Action:
- Escalate immediately
- Document all evidence
- Consider user suspension
- Notify administrators
Appeals Process
Users should have a way to appeal moderation decisions.

Decision Communicated
Clearly explain the decision:
- If upheld: Why the original decision stands
- If reversed: Apologize and restore content if appropriate
Next Steps
Manage Spaces
Learn how to configure and manage your spaces
Manage Groups
Organize your community with groups and manage members