
Apply editing rate limits for all users
Closed, Resolved · Public

Description

While investigating a site issue, Tim found that a single editor was making 4 edits/second, which was degrading site performance.

We rate limit edits for anons and new users (8 edits/min), but established users are not throttled. Should users have a limit (something pretty high, like 60 edits/min)? This is significantly more than the guideline for bots (https://meta.wikimedia.org/wiki/Bot_policy#Edit_throttle_and_peak_hours), but would ensure that a legitimate user isn't blocked when editing quickly.
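
For reference, MediaWiki expresses these throttles through the $wgRateLimits setting as [count, seconds] pairs; the sketch below shows roughly how the existing limits and the proposed one would look (key names and production values may differ):

```php
// Rough sketch of the limits discussed above in $wgRateLimits form;
// the actual production configuration may differ.
$wgRateLimits['edit'] = [
    // Existing throttles: anons and new accounts, 8 edits per 60 s.
    'ip'     => [ 8, 60 ],
    'newbie' => [ 8, 60 ],
    // The proposal: a high ceiling for established users, e.g. 60/min.
    'user'   => [ 60, 60 ],
];
```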

Event Timeline

bzimport raised the priority of this task to Medium. Nov 22 2014, 2:14 AM
bzimport set Reference to bz54515.
bzimport added a subscriber: Unknown Object (MLST).

Not sure who would "decide" on such a request (which probably translates to "might need to be brought up in more places than Bugzilla for consensus")...

There is also the noratelimit user right, held by sysops and some other user groups.
Setting a limit will not help while a user is in one of those groups. Bots have always had that right.
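
For reference, in a stock MediaWiki install that right is granted through $wgGroupPermissions, along these lines (Wikimedia's production group assignments may differ):

```php
// Sketch of how noratelimit is granted by default; groups holding it
// bypass $wgRateLimits entirely.
$wgGroupPermissions['sysop']['noratelimit'] = true;
$wgGroupPermissions['bot']['noratelimit']   = true;
// Setting one of these to false would make $wgRateLimits apply to
// that group again.
```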

Maybe we should reconsider who we hand out noratelimit to?
Do we want to announce this change?

Do we want to announce this change?

Think so. But is this going to happen anytime soon?

This isn't the first time we've had an editor affect the site by editing too fast. I think we should apply some limit, just to make it less likely that someone accidentally harms the site. 60/min seems high to me, since the average time to save a page is >1s, so to hit that you likely have to have multiple processes editing in parallel.

60/min seems high to me, since the average time to save a page is >1s, so to hit that you likely have to have multiple processes editing in parallel.

But a regular user can hit this rate with normal, "manual" editing: opening many pages in separate browser tabs, editing each one, then going through them repeatedly, saving and switching to the next tab. This can easily reach 60/min (usually not for a full minute, but for shorter stretches of 20-30 seconds). I do this kind of thing a lot. Assuming this "throttling" will merely delay page saves, rather than sending the user to a "try again later" sort of error page, it shouldn't be a problem. If the user will get some sort of error, perhaps raise the limit a bit, just to be safe. And if the user is going to be blocked, then the limit should be much, much higher.

So, to move on this task, we need to identify usage patterns and gather edit-rate statistics.

There is already a patch for this, but it received a lot of criticism: https://gerrit.wikimedia.org/r/#/c/280002

Change 280002 had a related patch set uploaded (by Dereckson):
Apply rate limit to edits for normal users

https://gerrit.wikimedia.org/r/280002

There is already a patch for this, but it received a lot of criticism: https://gerrit.wikimedia.org/r/#/c/280002

The limit of 50 every 10 minutes is probably too low.

An intelligent limit should be designed.

As a first conservative measure, I would recommend 1000 edits per minute.

That would allow heavy operations like mass deletion requests (DR) on Wikimedia Commons.
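
For concreteness, the two figures under discussion map onto the $wgRateLimits format roughly as follows (a sketch; the actual patches may have expressed this differently):

```php
// Change 280002, per the comment above: 50 edits every 10 minutes.
$wgRateLimits['edit']['user'] = [ 50, 600 ];
// The conservative recommendation, later uploaded as change 316980:
// 1000 edits per minute.
$wgRateLimits['edit']['user'] = [ 1000, 60 ];
```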

Added the SRE tag under the assumption that whether such a change is needed at all is an operations question. In any case, we'd need some careful analysis of how fast editors have to work.

Change 316980 had a related patch set uploaded (by Dereckson):
Throttle user edits to 1000 per minute

https://gerrit.wikimedia.org/r/316980

"Should users have a limit? " → What problem would it solve, exactly?

Wouldn't a limit cause problems for tools like Cat-A-Lot or QuickStatements?

I've asked for feedback on this issue on IRC and on the village pumps of the French Wikipedia and the French-speaking Commons community.

First, users raised the issue of automated tools like the ones quoted by Ash_Crow, but also of general maintenance operations performed by non-admins.

The concept of a limit is considered too harmful to contributors, and counterproductive.

Then, they quite reasonably asked WHY we need such a limit. Primarily, the value of individualizing the edit limit per contributor is perhaps debatable.

My personal opinion is that we should investigate whether a very high limit could be helpful against DDoS attacks and spambots.

Change 316980 abandoned by Dereckson:
Throttle user edits to 1000 per minute

Reason:
No established need for that.

https://gerrit.wikimedia.org/r/316980

Change 280002 abandoned by Chad:
Apply rate limit to edits for normal users

Reason:
Abandoning all config changes > 1y old

https://gerrit.wikimedia.org/r/280002

Tagging for the interest of stewards, especially as they often have to do all the work of cleaning up after high-rate spam/abuse edit attacks.

Erm, so I guess goodbye Cat-a-lot and VFC? Why was this decision made despite what @Dereckson wrote above?

They should be rewritten to add a limit of no more than 90 edits per minute.

This would kill their purpose; some deal with thousands of pages, and rewriting them with such a throttle would make them unusable for any practical purpose.

I see that this task was created in 2013 because there were some hardware issues or something. I do not think it would be wrong to assume that our infrastructure has improved enough over the past half-decade to dissolve that concern.

rewriting them with such a throttle would make them unusable for any practical purpose.

If they became "unusable for any practical purposes" then I'd be interested to hear why.

I see that this task was created in 2013 because there were some hardware issues or something.

I do not see "some hardware issues or something" mentioned anywhere.

Why was this decision made

Probably because we have seen crazy editing rates and DoS attacks in the past from attackers using established accounts.

I thought the security team and autopatrolled users on Commons had reached a compromise: T194864: Raise the rate limit for autopatrollers on Commons
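
For context, $wgRateLimits also accepts user-group names as keys, which is how such a compromise can be expressed; the value below is hypothetical, see T194864 for the actual figures:

```php
// Hypothetical sketch: trusted groups can get their own, higher
// ceiling than the generic 'user' limit. See T194864 for the real
// values chosen for Commons autopatrollers.
$wgRateLimits['edit']['autopatrolled'] = [ 600, 60 ];
```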

I think we are mixing up some things here:
1: @Johan please note that this issue was already resolved quite a while ago; this ticket was closed because of that fix in May. However, this ticket was added to User-notice in 2015, and I'm not sure whether @Bawolff thinks it is a good idea to announce this so long after the fact, also per WP:BEANS.
2: @Base and @revi: This limit applies to all 'base'-level users, but certain groups have a higher rate limit, like the autopatrollers on Commons, and bots. If there are other groups that should be exempt from this, please identify them.
3: @Bugreporter: "They should be rewritten to add a limit of no more than 90 edits per minute." Well, you should never hard-code a specific rate limit into such a high-volume script unless you have read that rate limit from the user's user info. Ideally, however, you read the API's error messages (https://www.mediawiki.org/wiki/API:Errors_and_warnings#Standard_error_messages) and, when you encounter rate limiting, apply an exponential backoff algorithm (a minimal sketch follows after this list). More ideally still, we implement that in the JS api modules (T200411: Implement backoff algo in mediawiki.Api module) so no one has to do this on their own.
4: @Base @revi Please understand that we are under constant attack from various people, and that platform safety trumps the occasional inconvenience of a user. If a script breaks or errors out due to rate limiting, the script needs to be fixed to take rate limiting into account. If a script 'takes too long', it should show the user how long and give them an option to break off the action.
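
To make point 3 concrete, here is a minimal backoff sketch in PHP; it is not the mediawiki.Api implementation tracked in T200411, and performEdit is a hypothetical helper standing in for the actual action=edit request. A client can also discover its own limits up front via action=query&meta=userinfo&uiprop=ratelimits.

```php
<?php
// Minimal exponential-backoff sketch around a rate-limited API call.
// performEdit() is a hypothetical callable that POSTs action=edit and
// returns the decoded JSON response as an array.
function editWithBackoff(callable $performEdit, int $maxRetries = 5): array
{
    $delaySeconds = 1;
    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        $response = $performEdit();

        // 'ratelimited' is the standard API error code for tripping a
        // rate limit (see the API:Errors_and_warnings page above).
        if (($response['error']['code'] ?? null) !== 'ratelimited') {
            return $response; // success, or an unrelated error
        }

        sleep($delaySeconds);
        $delaySeconds *= 2; // back off: 1 s, 2 s, 4 s, 8 s, ...
    }

    throw new RuntimeException("Still rate limited after $maxRetries retries");
}
```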

I was merely informing @Base that his concern was (mostly) resolved. I already know the behind-the-scenes events that expedited this task (i.e. T192668 also happened elsewhere).