A better future for Story Moderation and the first steps toward democracy for publishing on the internet.
The following is purely an experiment that won’t launch until 2025 at the earliest (over a year from now), if at all. We are sharing it publicly to gather your feedback and see whether this is a system that authors in the community actually want. If it proves unpopular based on your feedback, we will not launch it at all (we may even host an open author referendum to vote on whether to implement it). With that said, let’s share our vision for a more open and transparent way to run a publishing platform.
Author’s Note: With the rise in hateful content on platforms such as X, we feel more emboldened than ever to create a space where authors and readers don’t experience hateful content, yet are still empowered to share dark and taboo stories that may be excluded elsewhere, in keeping with our commitment to never censor romance authors. Our goal with the Ream Republic is to build an open system that makes us, as a platform, accountable to our community of readers and authors.
This is just a first draft. We aren’t launching the Ream Republic until 2025, over a year from now, and in that time we will work with authors and readers in the community to ensure the integrity of this system and fairness for all. We know there are plenty of unanswered questions in this document. The goal is to host open calls with authors in the community over the next year and work through these answers together so we can build a better moderation system that supports us all. We won’t pretend to have all the answers, but we do believe that by working together and being open with the community early in this process, we can create something really special.
99.9%+ of authors and stories on Ream will not be subject to any form of review or moderation. This memo only covers how we will approach the 0.1% of content that is reported on the platform by readers and is therefore subject to review under our content guidelines. In short, this system is designed to protect the Ream community from bad actors. The guidelines in question are the Content Guidelines set forth by Ream here.
At Ream, we hold three values near and dear to our heart: Storytellers Rule the World, Readers First, and Community is King.
As we come within about a week of launching Discovery on Ream, we want to share something we think will have a huge impact on the future of publishing and how stories are moderated on the internet.
We don’t normally share future plans like this. However, in our mission to build the best publishing platform in the world for fiction authors, we understand that the early decisions we make and the systems we build will have long-standing impacts on millions of readers and stories around the globe.
With that said, here’s the problem.
Just as with our Payments system (Ream is the only platform in the world to offer both Managed and Direct payout options for authors), we have sought to build the most robust Discovery system for authors and readers in the world. It’s so big that it’s basically an entire relaunch of the Ream platform, combining novel forms of search and discovery never before seen by readers or authors.
With this system in place, we believe we can make it easier than ever for readers to find stories they love and for authors to reach the readers who will love their stories.
But herein lies the question.
What stories and storytellers do we empower on the Ream platform?
On the surface, this may seem like a trivial question. After all, Ream is only one year old; we don’t need to worry about things like this yet. We could kick that can down the road. Isn’t that what every other social platform has done, whether it was built as a home for video or for meeting new friends?
The problem is content moderation is broken on the internet.
For over a decade, the pipe dream has been an automated system that can approve or reject content based on whether it adheres to specific content guidelines.
Although this has worked somewhat decently against the most egregious offenders (think people using platforms to run blatant scams or illegal activity), it works horribly when it comes to making and enforcing decisions in the “gray areas”.
In short, any boundary we make between what IS and what IS NOT allowed is a cultural decision with political ramifications.
It’s one thing to say that no hateful content is allowed on a platform (and indeed, Ream does not allow hate speech or hateful content). But who gets to decide what is hateful and what is not? And even if a computer makes the decision, who programmed that computer, and what data was it trained on to make these decisions?
This is why, even amid the mad race to automate content moderation, humans are still an essential part of the process.
There are two prevailing models for how we moderate content on the internet, and they both suck.
Model #1: Big Corporation hires in-house content moderation to make decisions about what content does and doesn’t belong on the platform.
This model has two striking downsides.
- A group of moderators is underpaid and overworked in positions that regularly subject them to content that is often traumatizing and at the very least disturbing. In the process, we create a new class of people akin to “sin-eaters” or trash collectors of the internet who scrub away the worst of humanity. What if content moderation were about more than scrubbing away the bad, and instead also about uplifting and empowering a community you care about?
- People with little cultural context for specific content niches make decisions that either penalize voices that need to be heard (such as excluding romance authors from even sharing their stories online) or allow hateful content to proliferate and harm readers and authors on the platform.
Model #2: Libertarian moderation with individual creators and users in charge of blocking people they don’t want to hear from and tightly controlling who they follow.
On the surface, this may seem like a better solution. It’s certainly the one favored by corporations that don’t want to pay moderation costs. But the trouble comes when algorithms perpetuate hateful content and platforms allow that content to empower hateful communities online.
This is really scary.
If Storytellers Rule the World, then WHICH Storytellers Rule the World has serious implications for the future of our world.
The question ultimately comes down to this: whose decision is it to allow or disallow certain stories on a platform?
The prevailing models have been either to let massive, distant corporations make that call or to let any and all content proliferate.
We have another plan.
It’s called the Ream Republic.
It’s the foundation of our Story Moderation system on Ream, and is a large-scale experiment in enacting a democratic system on the internet.
In short, the Ream Republic empowers authors and readers on the Ream platform to elect fellow community members to serve in roles that make key decisions about which stories do or do not violate the Content and Genre Guidelines on Ream. To be extra clear, the Content Guidelines of Ream will not be changed through an electoral process; instead, the people who moderate the platform based on those Guidelines will be elected (here are Ream’s Content Guidelines).
These community members (who will be paid for their work) will stand for election every year, allowing Ream to respond dynamically to the needs of the community and a changing culture by giving YOU ALL a voice in how we enforce our Content Guidelines.
We plan to launch the first election and the beginning of the Ream Republic in 2025.
Yes, that’s right, 2025, a little over a year from now.
Why are we announcing this so soon? Because over the next year, we will be announcing updates and taking your feedback on how this system can better work for all of you, while also building out systems on the back-end that allow for this kind of Republic to come to life.
In the meantime, the Ream Team is small but mighty, and all moderation decisions will go directly through the executive team.
The first iteration of the Ream Republic in 2025 will be a first draft and just the beginning of our mission to create a future where Storytellers Rule the World.
Here are the principles that will guide all story moderation on Ream:
- Community is King. Ream shall not impose a singular enforcement of our content guidelines that can be exclusionary to important voices and groups on the platform. Similarly, individuals making decisions alone are never as strong as a community that empowers trusted leaders. That’s why we empower elected Ream community members to make these important decisions on behalf of the community.
- Storytellers Rule the World. Within the authors’ communities on the platform, Storytellers will always have the final say on what conversations their readers can engage in or not. They will have the tools to moderate these interactions and can create a safe, inclusive environment for their readers.
- Ream is an interconnected system of endless reader communities. In this way, no community interacts or operates in isolation. This is why discovery is so essential on a platform like Ream. We can help connect readers to different stories and communities and empower all of us to be stronger together. But with discovery at the core of everything on the Ream platform, it is even more important to properly review how communities interact with one another and which communities and stories we allow to flourish on the platform.
- Ream is not a home for everyone. It’s a home for fiction authors and readers. We won’t ever be apologetic about this. We love all creators, but we are creating Ream FOR fiction authors and readers. With a clear focus on the group of people we are serving, we can give you all a voice in shaping what kinds of stories we uplift, empower, and allow on Ream.
With these principles in place, we have designed a new system to change how Story Moderation works on the internet.
It’s radical. It’s exciting. And we think a massive step forward for the industry.
But how will it work?
In short: in 2025, a public election will be held, open to all members of the Ream community. There will be two kinds of paid roles that readers and authors can run for:
- Storylators (story + legislator): people who make Story Moderation decisions with responsibility for specific Shelves. Shelves are collections of related genres and subgenres. Ideally, Storylators have tons of knowledge and insight into the genres and subgenres they plan to be a Storylator for.
- Story Magistrates: a group of three authors or readers who review all Story Moderation Appeals from all Shelves and make a final decision on whether the Story violates Content or Genre Guidelines.
Story Magistrates and Storylators are both paid positions and will last for a term of one year, at which time another public election will be held.
There will be Guidelines that Story Magistrates and Storylators must follow both during the election process and while serving in their elected roles; if these are violated, the Storylators or Story Magistrates in question will be removed from their roles.
The responsibilities of Story Magistrates and Storylators are as follows (a rough sketch of the resulting review-and-appeal flow appears after this list). Note: only Stories reported by readers will be subject to review by Storylators and Story Magistrates, meaning that 99.9% of content on the platform will never be subject to any form of review and will be freely published to all readers.
- For stories reported by readers: enforcing the existing Content Guidelines and making moderation decisions about the specific stories that have been flagged for review.
- For stories reported by readers as miscategorized: enforcing the existing Genre Guidelines to ensure that stories flagged for review are correctly categorized. This is designed to prevent blatant abuses of the Genre system on Ream that create a disruptive experience for readers. Incorrectly categorized stories will be delisted from Discovery until the requisite changes are made, but will remain available to existing Followers and Paid Members of their Reams.
- And very importantly: this is not a way for Ream to outsource responsibility for moderation decisions. The Ream executive team will be able to review and veto any moderation decision to protect the best interests of the community. In addition, Ream will pay every Storylator and Story Magistrate the same hourly rate (which will be announced before any public election) for their work.
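To make the flow above concrete, here is a minimal sketch of how the review-and-appeal pipeline described in this memo could be modeled. It is purely illustrative: every type, status, and function name (ReportedStory, storylatorDecision, magistrateRuling, and so on) is an assumption made for the sake of the example, not Ream’s actual implementation.

```ts
// Illustrative sketch only: all names and statuses are assumptions, not Ream's actual system.

type ReportReason = "content" | "miscategorized";

type StoryStatus =
  | "published"                // the default: never reported, never reviewed
  | "under_review"             // reported by a reader, awaiting the Shelf's Storylator
  | "delisted_from_discovery"  // miscategorized: hidden from Discovery, still visible to existing Followers and Paid Members
  | "removed";                 // found to violate the Content Guidelines

interface ReportedStory {
  storyId: string;
  shelf: string;        // Shelves are collections of related genres and subgenres
  reason: ReportReason;
  status: StoryStatus;
}

// Only a reader report puts a story into the pipeline at all.
function fileReaderReport(story: ReportedStory): ReportedStory {
  return { ...story, status: "under_review" };
}

// The elected Storylator for the story's Shelf makes the first decision.
function storylatorDecision(story: ReportedStory, violates: boolean): ReportedStory {
  if (!violates) return { ...story, status: "published" };
  return story.reason === "miscategorized"
    ? { ...story, status: "delisted_from_discovery" }
    : { ...story, status: "removed" };
}

// On appeal, the panel of three Story Magistrates issues the final ruling.
// The Ream executive team can still veto any decision to protect the community.
function magistrateRuling(
  story: ReportedStory,
  violationUpheld: boolean,
  executiveVeto: boolean
): ReportedStory {
  const finalViolation = executiveVeto ? !violationUpheld : violationUpheld;
  return finalViolation ? story : { ...story, status: "published" };
}
```

The one property the sketch tries to capture is that nothing enters the pipeline without a reader report, and that the three Story Magistrates, subject to an executive veto, always have the final word on appeals.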
The key requirements of Story Magistrates and Storylators are as follows:
- They must follow the Content Guidelines of Ream when engaging in moderation activity. This means upholding our commitments to not censor steamy romance authors and to create a safe, inclusive space for readers, free from hateful content. For reference, these are Ream’s Content Guidelines: https://reamstories.com/policy/content-guidelines
- They must follow a set Code of Conduct for Storylators and Story Magistrates which will be released before the first Ream Republic Election. This will help ensure elections proceed smoothly and fairly and that set systems and processes are followed when moderating stories on the platform.
- They must have a few hours available per week to engage in the responsibilities of Story Magistrates and Storylators.
To be extra clear, the purpose of Story Magistrates and Storylators is not to gatekeep content on the platform. It’s to empower elected leaders in the community to enforce the Content Guidelines of Ream and to ensure that stories are not egregiously miscategorized in ways that would harm the reader experience.
Instead of Ream as a platform making all of these decisions behind closed doors and imposing them on our ecosystem of authors and readers, we want to do this together with you. Together, we can create an open, democratic system that holds us to the standards and values of the community through annual elections.
These elections will allow authors and readers to vote for Storylators for specific genres and subgenres AND for Story Magistrates, giving members of the Ream community a direct hand in guiding content moderation on the platform.
A note on elections: to ensure the highest integrity, we will have a number of rules and mechanisms in place for accountability and transparency. Here are six core principles foundational to how elections will run:
1. Members of specific Shelves will only be able to elect leaders of said Shelves. This ensures that people invested in the specific community are the ones who elect the Storylators for their Shelves.
2. Fraud, bribery, and other ways of gaming the election will be banned and meticulously eliminated as they are found.
3. If Ream determines that an elected Storylator or Story Magistrate is making decisions inconsistent with the company’s Content Guidelines or the best interests of the community, they will be replaced at the sole discretion of Ream.
4. Only members of the Ream community can vote in elections. To qualify as an author, you must have had at least 10 Followers or 1 Paid Member for at least 6 months; to qualify as a reader, you must have held a paid subscription on the platform for at least 6 months (a rough sketch of this eligibility check follows this list).
5. There will be no public campaigning for elections, and all candidate names will remain anonymous so that the candidates who answer the questions for their position most effectively are the ones elected by members of the Ream community. On the ballot, candidates will be listed in random order, with their names kept anonymous and only their answers to the questions shown (think of this almost as a public job interview). Members will then select the candidate they think is the best fit for the specific role.
6. Ream members will only be able to vote for Shelves that they are a member of. Thus, authors can vote for Storylators in genres they write in, and readers can vote for Storylators in genres they have paid subscriptions in. This ensures that people with knowledge of and insight into their community are the ones electing the people who do moderation work.
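As a rough illustration of rules 1, 4, 5, and 6, here is a minimal sketch of how voter eligibility and a Shelf-scoped, anonymized ballot could be checked and assembled. All names, fields, and structures beyond the thresholds stated in the rules above are assumptions made for the example, not a description of how Ream will actually build its election system.

```ts
// Illustrative sketch only: thresholds come from the rules above; everything else is an assumption.

interface AuthorAccount {
  kind: "author";
  followers: number;
  paidMembers: number;
  monthsAtThreshold: number;   // how long the Follower / Paid Member threshold has been met
}

interface ReaderAccount {
  kind: "reader";
  paidSubscriptionMonths: number;
}

type Account = AuthorAccount | ReaderAccount;

// Rule 4: authors need 10+ Followers or 1+ Paid Member for at least 6 months;
// readers need a paid subscription on the platform for at least 6 months.
function isEligibleVoter(account: Account): boolean {
  if (account.kind === "author") {
    return (account.followers >= 10 || account.paidMembers >= 1) && account.monthsAtThreshold >= 6;
  }
  return account.paidSubscriptionMonths >= 6;
}

interface Candidate {
  id: string;          // internal only, never shown on the ballot
  shelf: string;       // the Shelf this Storylator candidate is running for
  answers: string[];   // responses to the position's questions (the "public job interview")
}

// Rules 1, 5, and 6: a voter only sees candidates for Shelves they belong to,
// listed in random order and identified only by their answers (names stay anonymous).
function buildBallot(candidates: Candidate[], voterShelves: string[]): Candidate[] {
  return candidates
    .filter(c => voterShelves.includes(c.shelf))
    .map(c => ({ c, sortKey: Math.random() }))
    .sort((a, b) => a.sortKey - b.sortKey)
    .map(({ c }) => c);
}
```

The point of the sketch is simply that a voter never sees candidate names, only randomized, anonymous answers, and only for Shelves they actually belong to.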
We will release more detailed guidelines for the conduct of Storylators and Story Magistrates as we approach the election and official launch of the Ream Republic in 2025.
For now, you can see here an overview of what the process will look like:
And here’s an overview of the Story Moderation and Appeal Process on Ream:
Right when Discovery launches, we will share with you tips for maximizing your Discovery on Ream as an author.
In the meantime, this memo serves as a grand experiment. We know there are many unanswered questions, and we hope to answer them together in conversations over the next year. We welcome your feedback at any and all times.
It’s going to be a ton of fun! And we can’t wait.
This is by FAR our biggest launch yet. It’s a whole new Ream. And a whole new world for authors.
One where we put Readers First.
One where Community is King.
One where Storytellers Rule the World.
We will see you all soon for the big update! And in the meantime, I hope you are having an amazing end to the holidays!
With 💜,
Michael and the Ream Team