Incident Reporting System
As Product Safety and Integrity, we want to provide safer and healthier environments for communities. We are building the Incident Reporting System (IRS) to make it easy for users to find the right path to get help when experiencing a harmful incident.
| Incident Reporting System | |
| --- | --- |
| Helping to get help in harmful incidents | |
| Group: | Product Safety and Integrity |
| Backlog: | Backlog |
| Lead: | MAna (WMF) |
| Updates: | Updates |
Background
Reporting and processing harmful incidents has been a topic of interest for Wikimedia communities for many years. With the Universal Code of Conduct, it became crucial to have a discussion about user reporting systems.
The way misconduct and policy violations are dealt with across Wikimedia spaces, wikis, and platforms has developed organically. Each community and group has its own way of reporting and processing incidents. It happens via wiki talk pages, noticeboards, email, or private discussions on off-wiki communication channels (Discord, Telegram, and more).
For many users, it's been unclear what to do when an incident happens: where to go, who to talk to, how to report, what information to include in the report, how the report is processed, what happens afterwards, etc. Users need to know how to flag an issue and where to go to get help. Some do not feel safe reporting incidents because of the complexity of the reporting process or due to privacy concerns. There is also very little information on what will happen once a report is made and what expectations the user should have.
Updates
August 2025: Conclusions from the conversations and next steps
As we previously posted, we planned conversations with a few groups (Stewards, Arbitration Committees, and U4C members) to hear their comments on the development of the Incident Reporting System. Based on what we heard, we are designing a flow for reporting incidents that are not emergencies.
The new flow needs to be legally compliant, acceptable to communities, and designed to work across all wikis. It will reflect and complement, rather than duplicate or disrupt, the existing practices of different communities. Consequently, we have decided to make it configurable via Community Configuration.
We are also exploring an improved list of violation types and customizable "Get Support" pages based on the type of incident. In parallel, we are researching different wiki communities to better understand existing practices, so that the default configuration reflects typical needs and addresses compliance requirements.
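To make the idea of an improved violation-type list more concrete, below is a minimal illustrative sketch in TypeScript. The type names and default pages are assumptions for illustration only; they do not reflect a finalized taxonomy or the actual Community Configuration schema.

```typescript
// Illustrative sketch of a violation-type list that customizable
// "Get Support" pages could key off. All names below are placeholders.
type ViolationType =
  | 'harassment'
  | 'threat-of-harm'
  | 'doxxing'
  | 'hate-speech'
  | 'other';

// Hypothetical default mapping from violation type to a generic support page.
// Individual wikis could override these defaults via Community Configuration.
const defaultSupportPages: Record<ViolationType, string> = {
  'harassment': 'Default Get Support: harassment',
  'threat-of-harm': 'Default Get Support: emergencies',
  'doxxing': 'Default Get Support: privacy violations',
  'hate-speech': 'Default Get Support: hate speech',
  'other': 'Default Get Support: other concerns',
};
```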
Lastly, we wanted to note here that the former Trust and Safety Product team has been merged with the former Product Security team to form Product Safety and Integrity. People previously involved in the IRS project have not changed their roles, including the owner (Madalina). However, due to this merger, Madalina now has support from Eric (the owner of the broader Safety and Security area) and Szymon (the Movement Communications person for the Safety and Security area). We encourage you to contact any of us if you'd like to talk about the IRS. 🖖
Product & design specifications
Use cases
As a user looking to get help with a harmful incident,
- I want to be able to indicate the type of incident I’m experiencing so that I can get help appropriate to my issue.
- I want to get information about the help resources available that are relevant to the issue I’m experiencing.
- I want to be clearly informed when the help is provided by the community and not the Foundation so I can adjust my expectations accordingly.
As an administrator/functionary,
- I want to be able to customize the help pages that are presented to reporters according to the incident they’re experiencing and the policies of my wiki.
- I want to be able to customize different formats (links, text) from Community Configuration so that I can present the right information to the user seeking help.
- As an administrator/functionary of a smaller wiki, I want to have a standard/default template that I can use in case my wiki doesn’t have established policies or processes (see the configuration sketch after this list).
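As a rough illustration of the administrator use cases above, here is a hypothetical sketch of what a per-wiki "Get Support" configuration might look like, including customizable text and links and a default-template fallback. The field names and values are assumptions, not the actual Community Configuration schema.

```typescript
// Hypothetical per-wiki "Get Support" configuration; not the actual
// Community Configuration schema used by the extension.
interface GetSupportEntry {
  incidentType: string;   // e.g. "harassment" (illustrative label)
  text: string;           // free-form guidance shown to the reporter
  links: string[];        // links to local policies, noticeboards, etc.
}

interface IrsWikiConfig {
  useDefaultTemplate: boolean;   // true on wikis without local policies/processes
  getSupport: GetSupportEntry[]; // one entry per incident type
}

// Example of what a larger wiki with established processes might configure.
const exampleConfig: IrsWikiConfig = {
  useDefaultTemplate: false,
  getSupport: [
    {
      incidentType: 'harassment',
      text: 'Describe what happened and where; an administrator will follow up.',
      links: [
        'Project:Harassment',
        'https://meta.wikimedia.org/wiki/Universal_Code_of_Conduct',
      ],
    },
  ],
};

console.log(exampleConfig.getSupport[0].incidentType); // "harassment"
```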
Emergency incidents
Immediate threats of harm are reported via the emergency reporting flow. The user can submit a report when they become aware of a situation involving serious risk to their (or someone else’s) safety. Once submitted, these reports are sent to the Trust and Safety team at the Wikimedia Foundation, who are responsible for reviewing and responding to such issues.
Non-emergency incidents
Situations that do not involve an immediate threat of harm but require attention, such as harassment, are reported via the non-emergency flow. In this case, users are guided to a help page that provides clear information on how to get support for their specific concern. The goal is to ensure users receive the right support in a way that reflects the needs and practices of the local community.
This help page can be customized by each community using Community Configuration, allowing them to include guidance that aligns with their local policies and processes.
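For illustration, the sketch below shows how a report might be routed between the two flows described above: emergencies go to the Wikimedia Foundation's Trust and Safety team, while non-emergencies are guided to a community-configured help page with a default fallback. Function and type names are illustrative assumptions, not the real IRS implementation.

```typescript
// Minimal sketch of routing between the emergency and non-emergency flows.
// All names are illustrative assumptions, not the actual IRS implementation.
type Flow = 'emergency' | 'non-emergency';

interface Report {
  flow: Flow;
  incidentType: string; // e.g. "harassment"
}

// Per-wiki mapping from incident type to a community-maintained help page,
// as in the hypothetical configuration sketch earlier on this page.
type GetSupportPages = Record<string, string>;

function routeReport(report: Report, supportPages: GetSupportPages): string {
  if (report.flow === 'emergency') {
    // Emergency reports are sent to the Wikimedia Foundation Trust and Safety team.
    return 'Wikimedia Foundation Trust and Safety';
  }
  // Non-emergency reports are guided to the help page configured for this
  // incident type, falling back to the standard/default template.
  return supportPages[report.incidentType] ?? 'Default Get Support template';
}

// Example: a harassment report on a wiki that has configured a local help page.
console.log(routeReport(
  { flow: 'non-emergency', incidentType: 'harassment' },
  { harassment: 'Project:Harassment/Get support' },
));
```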
Deployment timeline
- First and limited version on Portuguese Wikipedia – December 2024
- Next pilot wikis – TBD
- Final deployment (estimated deadline) – TBD
Contact
Frequently asked questions
Is there data available about how many incidents are reported per year?
Right now, there is not a lot of clear data we can use. There are a couple of reasons for this. First, issues are reported in various ways, and those ways differ from community to community; capturing that data completely and cleanly is highly complicated and would be very time-consuming. Second, the interpretation of issues also differs. Some things that are interpreted as harassment are just wiki business (e.g. deleting a promotional article), and reviewing harassment may also need cultural or community context. We cannot automate, visualize, or objectively count this data. The incident reporting system is an opportunity to address some of these data needs.
How is harassment being defined?
Please see the definitions in the Universal Code of Conduct.
How many staff and volunteers will be needed to support the IRS?
Currently, the magnitude of the problem is not known, so the number of people needed to support the IRS is also unknown. Experimenting with the minimum viable product will provide some insight into how many people are needed.
What questions are you trying to answer with the first version of the product?
Here are the questions we need to answer:
- What kind of reports will people file?
- How many people will file reports?
- How many people would we need in order to process them?
- How big is this problem?
- Can we get a clearer picture of the magnitude of harassment issues? Can we get some data around the number of reports? Is harassment underreported or overreported?
- Are people currently not reporting harassment because it doesn’t happen or because they don’t know how?
- Will this be a lot to handle with our current setup, or not?
- How many complaints are valid, compared to reports from people who don't understand wiki processes? Can we distinguish valid complaints from invalid reports and filter out the latter to save volunteer and staff time?
- Will we receive lots of reports filed by people who are upset that their edits were reverted or their page was deleted? What will we do with them?
How does the Wikimedia movement compare to how other big platforms like Facebook/Reddit handle harassment?
While we do not have any identical online affinity groups, the Wikimedia movement is most often compared with Facebook and Reddit in regard to how we handle harassment. It is important to consider that nobody has solved harassment. Other platforms struggle with content moderation, and they often have paid staff who try to deal with it. Two huge differences between us and Reddit or Facebook are the globally collaborative nature of our projects and how communities work to resolve harassment at the community level.
Is the Foundation trying to change existing community processes?
Our plan for the IRS is not to change any community process. The goal is to connect to existing processes. The ultimate goals are to:
- Make it easier for people who experience harassment to get help.
- Eliminate situations in which people do not report because they don’t know how to report harassment.
- Ensure harassment reports reach the right entities that handle them per local community processes.
- Ensure responders receive good reports and redirect unfounded complaints and issues to be handled elsewhere.
Pre-project research
The following document is a review of research the Wikimedia Foundation conducted from 2015 to 2022 on online harassment on Wikimedia projects. In this review, we’ve identified major themes, insights, and areas of concern, and provided direct links to the literature.
Our team has been studying previous research and community consultations to inform our work. We revisited the Community health initiative User reporting system proposal and the User reporting system consultation of 2019. We have also been trying to map out some of the conflict resolution flows across wikis to understand how communities are currently managing conflicts. Below is a map of the Italian Wikipedia conflict resolution flow. It has notes on opportunities for automation.
- A synthesis of research from 2015–2022 that identifies major themes in the problem space as well as user needs, challenges, considerations, and previous work.
- On Italian Wikipedia, there's a 3-step policy in place for conflict resolution. This map visualizes this process and tries to identify opportunities for automation for both editors and admins.