More and more companies are incorporating users into content moderation in ways that go beyond simply allowing them to report objectionable content, and the space of possibilities is expanding rapidly. On many modern services, content creators can moderate the comments on their own posts, or users can volunteer as community moderators; on others, users can build their own moderation tools to customize their experiences. Some companies have even formed long-term relationships with groups of experienced power-moderators, integrating them directly into the design and testing of new moderation tools. While these approaches have tremendous potential, they also carry significant risks. Drawing on real examples of collaborations between companies and users, this presentation will show how to structure such partnerships to maximize value while avoiding common pitfalls.