More and more charities are setting up online support communities: online spaces where people who use their services can come together to share information, offer support and help each other feel less alone. It’s almost expected now; if you don’t have a space like this, you may find people use your Facebook page or other social media channels to ask questions and support each other.
Communities are an excellent way to increase reach, help people connect and improve outcomes. Online community members (especially those who are established and ready to ‘give back’) are often more engaged with the charity and more likely to take part in focus groups, respond to surveys and even fundraise.
The need for moderation
But online communities need to be monitored and moderated. As an example, I recently ran an online consultation for a charity that was setting up a new community. Participants identified a number of issues they had encountered in badly moderated or unmoderated communities:
- Posts going unanswered – or responses being unevenly distributed, with some people getting many replies and others very few.
- People feeling unwelcome or overwhelmed.
- Posts being misinterpreted or misunderstood.
- Spam and trolls.
- Personal attacks.
- Judgmental or critical posts.
- Incorrect or misleading information and advice.
- Competition and comparison.
- Arguments that become too heated.
- Detailed discussion of suicide or graphic posts.
- Inappropriate sexual content.