Highest Rated Comments


Ladsgroup_Wiki · 11 karma

Hey, I'm Amir and I work with Aaron on the scoring platform team. One thing about most AI models here is that they return a number between zero and one as the probability of an edit being vandalism. When the probability is higher than, let's say, 90%, the edit can be reverted automatically. But in some cases, like sneaky vandalism, the model gives the edit a higher probability for other reasons (for example, because the user's account age or edit count is low) without crossing that 90% mark. Those edits all get highlighted for human review, which still takes some human time, but it reduces the backlog for reviewers by filtering out the obviously good edits.
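To make the triage idea concrete, here is a minimal sketch of that threshold logic. It is not ORES's actual code: `score_edit` is a hypothetical stand-in for the model's probability output, and the 0.50 review cutoff is an assumption (only the roughly 90% auto-revert level comes from the comment above).

```python
# Sketch of threshold-based triage for edit scores (illustrative only).

AUTO_REVERT_THRESHOLD = 0.90  # "let's say 90%": confident enough to revert automatically
REVIEW_THRESHOLD = 0.50       # assumed lower bound for routing an edit to human review


def score_edit(edit_id: int) -> float:
    """Hypothetical stand-in: return the model's vandalism probability (0 to 1)."""
    raise NotImplementedError


def triage(edit_id: int) -> str:
    """Route an edit based on its vandalism probability."""
    probability = score_edit(edit_id)
    if probability >= AUTO_REVERT_THRESHOLD:
        return "auto-revert"   # obvious vandalism
    if probability >= REVIEW_THRESHOLD:
        return "human-review"  # e.g. sneaky vandalism that scores high, but below 90%
    return "pass"              # obviously good edit, filtered out of the reviewers' backlog
```

The point of the middle band is exactly what the comment describes: edits that look suspicious but not certain enough to revert automatically get queued for humans, while clearly good edits never reach the queue at all.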

Ladsgroup_Wiki · 7 karma

Hey, I'm Amir and I work with Aaron on the scoring platform team. This mostly depends on each wiki's policies and on the topic of the article. For example, English Wikipedia has a page explaining when a page needs to be protected by volunteers (to clarify, protection is done by volunteers; staff won't do this directly except in extreme cases). As a rule of thumb, when I have my volunteer hat on, I usually protect a page after three vandalizing edits.