Heroku deployment - RuntimeError - ERR max number of clients reached #117
You've hit the max number of Redis connections allowed by your plan: https://addons.heroku.com/redistogo
@joneslee85 @mperham is right here. I solved it by having just one worker and using a sidekiq.yml where the concurrency is about 25. The connection pool on the Redis To Go Mini plan is about 50 connections, I think. https://github.com/mperham/sidekiq/blob/master/examples/config.yml
The Sidekiq server defaults to (concurrency + 2) Redis connections per process. The Sidekiq client running in your Rails process defaults to 5 connections per process.
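As a rough worked example of those defaults (hypothetical numbers, only to show why a small plan runs out of connections): one Sidekiq process at the default concurrency of 25 plus one Rails process already needs far more than a 10-connection plan allows.
# Sketch of the default connection math described above (assumed process counts)
server_connections = 25 + 2   # Sidekiq server: concurrency + 2
client_connections = 5        # Rails/client process: default pool of 5
server_connections + client_connections   # => 32, well over Redistogo Nano's 10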
@mperham Thanks, I got it resolved.
I'm using Redistogo Nano on Heroku. After clicking around the web client it takes a few clicks, but eventually I get the max clients reached error.
Here's the log output: https://gist.github.com/3843341 Is there anything else I should post?
@kbighorse Did you specify the config file path for the worker in your Procfile?
worker: bundle exec sidekiq -C config/sidekiq.yml
Either that, or specify the concurrency directly:
worker: bundle exec sidekiq -c 3
I also have this problem on Heroku with Redistogo Nano, even with concurrency set to 1.
Me too :(
web: bundle exec thin start -p $PORT
This is the Procfile I have while running sidekiq and Redistogo Nano. I had to specify the environment ('-e production') to make it work on Heroku.
Hi, I'm getting the same problem here... I copied the Redis To Go conf file and ran Redis locally with the same restrictions, 10 max connections. Is that right?
@uchoaaa Yes, that's the client connection. Set the client pool size to 1, as explained in the wiki:
Sidekiq.configure_client do |config|
  config.redis = { :size => 1 }
end
If you have a max of 10 connections and 3 unicorn processes, you can do this:
# three unicorns = 3 client connections
Sidekiq.configure_client do |config|
  config.redis = { :size => 1 }
end
# so the one sidekiq process can have 7 connections
Sidekiq.configure_server do |config|
  config.redis = { :size => 7 }
end
You will want to use a concurrency of about 10 or so, since you only have 5 connections for the workers (remember, 2 connections are required for the server internals). NOTE: This assumes that your application is using Redis for Sidekiq only. If you are using Redis for other things, you'll need to ensure those connections are accounted for as well.
Adding the pool size in an initializer seemed to fix this. Thanks! I'll post back if it in fact doesn't...
Thanks @mperham, it seems OK now. One more question: if I set the pool size to 1 and concurrency to 5, the Sidekiq client will create 5 threads and all 5 threads will share only one Redis connection, right? If so, could this scenario generate issues such as race conditions?
The client is your Rails app process. It is single-threaded, so you only need one connection per process. Concurrency has nothing to do with the client. The client does default to 5 connections per process, only because ActiveRecord's database connection pool also defaults to 5. In practice, :size => 1 should be OK.
The Sidekiq server is multi-threaded and runs many worker threads. The concurrency option adjusts the number of worker threads in the server process. The server will never allow fewer than 3 connections because you need 2 for sidekiq internals and at least one to share among the workers. If you set concurrency to 5, the server will default to creating 7 Redis connections, but you can set the connection pool size as low as 3 if necessary.
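For illustration only (a hypothetical sketch of the trade-off described above, not a setting recommended in this thread): with a concurrency of 5, the server pool can be shrunk below the default of 7, down to the minimum of 3, and the worker threads simply share those connections.
# Hypothetical: 5 worker threads sharing the minimum server pool of 3 connections
Sidekiq.configure_server do |config|
  config.redis = { :size => 3 }
end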
@mperham man, you are awesome! Thanks, it's so clear now!
Sorry, I'm still a bit confused, lots of "assumings", "ifs" and "at leasts" in this thread. Here is what is currently producing the max clients error for me: Procfile:
config/initializers/sidekiq.yml:
Ran 2 perform_async calls with 2 hits to /sidekiq, then reloaded /sidekiq in the web UI 2 more times, got "Internal Server Error", and the log shows the max clients error. Also got the same result with concurrency set to 3. Maybe someone could distinguish the following nouns for me in the case of a Unicorn/Redistogo setup with 10 max connections: connection, thread (concurrency seems to be somewhat the same thing here?), worker, process. Which are static, and what causes which to increase or decrease as workers are run and the web UI is hit? And what happens when you say …
Ah, just checked: my (Unicorn worker processes) * (client size) + (server size) > 10 # error!
Change :size => 7 to something lower, like 4 or 5. Does that fix it? RedisToGo limits your connections. All that matters is the number of processes and the size of the connection pool within those processes. Workers = threads, and they share the connections within the pool, so you don't want a pool size of 2 when using a concurrency of 25, as those 25 threads/workers will all fight over 2 connections. worker=2 means you are increasing your server process count, so you need to multiply your server connection pool size by that amount. A pool size of 7 means 14 total connections used by 2 sidekiq processes.
@kbighorse You're basically right: (Unicorn process count * client pool size) + (sidekiq process count * server pool size) needs to stay at or under your plan's max connections.
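Plugging hypothetical numbers into that formula for a 10-connection Redistogo Nano plan (the process counts below are assumptions for illustration, not settings from this thread):
# (Unicorn process count * client pool size) + (sidekiq process count * server pool size)
unicorn_processes = 3
client_pool_size  = 1
sidekiq_processes = 1
server_pool_size  = 7
(unicorn_processes * client_pool_size) + (sidekiq_processes * server_pool_size)
# => (3 * 1) + (1 * 7) = 10, exactly at the Nano limit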
Awesome, so everything looks good with these settings (Redistogo Nano, 10 connections):
config/unicorn.rb:
config/initializers/sidekiq.yml:
Procfile:
Summary:
If I'm not maximizing my resources, let me know; right now I'm satisfied that it works. I'm guessing I could do a server pool size of 6 and unicorn worker_processes of 4, which would give me 4 client connections + 6 server connections = 10. I do forget what increasing the concurrency would do, though.
Concurrency gives you more workers, i.e. doing more things at the same time. They fight over the Redis connections though, so you generally want concurrency no more than (server pool size - 2) * 2 or so, which works out to 6 in your case. So a concurrency of 5 or 6 sounds right for you.
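As a quick check of that rule of thumb (assuming the server pool size of 5 implied by the settings above):
# (server pool size - 2) * 2
server_pool_size = 5                                      # assumed from the settings being discussed
suggested_max_concurrency = (server_pool_size - 2) * 2    # => 6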
Just wanted to mention I've made a small tool to do the above calculations for you. Maybe it's more something for the wiki, but here it is http://manuel.manuelles.nl/sidekiq-heroku-redis-calc/ |
Just a heads up, there is also a limit on AR connections to pg as well.
Thanks @seivan, I'll add a link to the tool on how to fix this.
Thanks guys. Setting the initializer with a pool size of 5 worked for me; 7 seemed to max it out.
Sidekiq.configure_server do |config|
  config.redis = { :size => 5 }
end
Is there a good document that explains all this? I can't seem to get a combination of settings that works for me. I'm running thin on Heroku with just one worker process. Whether it works or not seems random.
Procfile: worker: bundle exec sidekiq -e production -c 3
sidekiq.yml:
Sidekiq.configure_client do |config|
Sidekiq.configure_server do |config|
I have the exact same issue as @FlopTheNuts. I don't fully understand how all the parts combine, but based on what I've read the settings @FlopTheNuts has should work, and they don't. I get
I have Redis Mini with 50 connections and the ClearDB Drift plan with 30 connections to the MySQL database. Redis server = 4. MySQL connections should be OK (3 * 5 = 15), but I don't know about the rest. I am running Rails 3.2 with Thin, so no multithreading. I've been trying all sorts of combinations that should work according to what I've read, but none of them do. Even setting the concurrency value to 6, which @mperham says follows this formula: (server pool size - 2) * 2.
Constantly gives me the following error and lots of failed jobs that never get run again:
Any help would be greatly appreciated; I'm going crazy with this. Thanks.
I'm not sure if you got to the bottom of this, but for me, I had to …
…th client and server. The server needs (concurrency + 2) connections; currently concurrency is 3 threads, so the size is set to 5. The client needs just 1 connection (by default Rails gives a size of 5, which is overkill). This is per process; currently Puma uses 2 workers (= 2 processes), which means a total of 2 connections from the web server. The biggest win here is limiting the size of the client pool, which was too big with the default settings. This may help resolve recent issues with sidekiq not being able to get a redis connection. For more about this see: sidekiq/sidekiq#117
See sidekiq/sidekiq#117 Also sneak in adding `quiet_assets` globally
Hello, I've read a few issues and wiki pages and I still don't get it. We had the error yesterday. What we have:
We have 5 dynos
We use Hirefire to scale, and yesterday it failed with
What is the way to calculate how to stay under 80 connections? Thanks in advance
Heroku support also told us to check the timeout:
I don't know if that helps with my issue. Any advice?
Sidekiq 4 now takes up to concurrency + 5 connections. Web = 4 * 3 = 12. Sidekiq never closes connections; it pools them so they can be reused when needed. It only opens a connection if needed, so a worker process will only use 15 connections if it is really busy.
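To make that budget concrete against the 80-connection plan mentioned above (the worker dyno count and concurrency below are assumptions for illustration, not figures from this thread):
# Hypothetical budget for an 80-connection Redis plan
web_dynos    = 4
client_pool  = 3        # Web = 4 * 3 = 12, as above
worker_dynos = 4        # assumed Hirefire ceiling
concurrency  = 10       # assumed Sidekiq concurrency, so up to 10 + 5 = 15 per worker process
(web_dynos * client_pool) + (worker_dynos * (concurrency + 5))
# => 12 + 60 = 72, which stays under 80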
Thanks a lot @mperham for the help. When you write
So it means I have to add these 40 to my web count? Web = 4 * 3 + 40 = 52? Sorry, I'm getting a bit lost in this thread.
No, the 20 puma threads share the 3 client connections per process. That's the point of a connection pool. |
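A small sketch of what that sharing looks like with the connection_pool gem (which Sidekiq uses internally); the pool size and thread count below simply mirror the numbers in this comment and are illustrative only:
require 'connection_pool'
require 'redis'

# 3 Redis connections shared by 20 threads: each thread borrows a connection, then returns it
pool = ConnectionPool.new(size: 3, timeout: 5) { Redis.new }
threads = 20.times.map do
  Thread.new { pool.with { |redis| redis.ping } }
end
threads.each(&:join)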
Great. Thanks @mperham |
* Use right Sendgrid format * Use Redis in Sidekiq * Fix Sidekiq config for Heroku in Procfile - sidekiq/sidekiq#117 * Bugfix email address generation * Skip auth token for webhook
* Trying solution as discussed at sidekiq/sidekiq#117
Someone, please add this math to the docs on concurrency so it's easier to calculate the concurrency required.
@adiakritos Why don't you add it? |
I added this info here: https://github.com/mperham/sidekiq/wiki/FAQ#how-to-calcultate-the-number-or-redis-connection-used-by-sidekiq. Feel free to remove the link to the blog post if you think it should not be present on the wiki.
@mperham I would, but I'm not entirely sure of what I'm talking about yet. Here's a possible draft to add to the wiki; otherwise feel free to leave it here for discussion. Basically, this is my understanding of the whole setup and how to calculate concurrency in the simplest possible case. Use this calculator to figure out how to tune concurrency for your specific configuration. The different components to understand when setting up your config are the following:
The first thing to get is that your web process pushes jobs to Redis, which is where your queue physically lives. Once that's clear, it's easier to see that your Sidekiq process also connects to Redis, to pull things off the queue. So it's: [Web Process] ---> [REDIS] <---- [Sidekiq]
Now let's assume you're running one dyno on Heroku for your web process, with 5 threads. Each of those threads counts as a single connection to Redis, so you'd have 1 connection per thread to the Redis server just from that 1 dyno / web process. If you have 2 dynos running your code, each process with 5 threads, you now have 10 threads, which is 10 connections to Redis. So: web process connections to Redis = web dynos x threads x 1 connection per thread.
So now we're telling Redis: we're going to need you to handle at least 5 connections from me at any given time. Next, let's imagine the Redis service we're using only allows 20 simultaneous connections. At this point we know we're using up 5 of that plan's 20 possible connections, so we only have 15 left to connect to Redis with.
OK, so now what about the Sidekiq process? Well, by default, the Sidekiq process itself commands 2 connections to Redis. That means our 20-connection Redis service is giving 5 connections to the web process and 2 to the Sidekiq process. That's 7 connections used up on our 20-connection plan, leaving 13 available. This means you can set your Sidekiq concurrency to 13 in this case. A formula for this situation is as follows:
Sidekiq concurrency = (Redis plan max connections) - (web dynos x threads per dyno) - 2
@adiakritos That's all correct. Sidekiq requires concurrency + 2 connections per process. Client processes can use fewer connections, as low as 1. If your web process has 10-20 threads in it, I would recommend leaving the connection pool unsized, since the default size is 5; the 10-20 client web threads will share 5 Redis connections. That should be fine for 99% of apps. Since @benoittgt already added content to the FAQ, I'll leave it as is.
Hi there,
Firstly, please excuse me for posting on GH Issues, because I am not so sure if this is an issue with Sidekiq or just something I did wrong in my setup.
I've deployed a simple Rails 3.2.2 app with the Sidekiq monitor app mounted at /sidekiq. I did not have any custom configuration; everything is default Sidekiq. My Heroku app has 2 worker processes:
Initially both sidekiq processes are up and running normally. When I browse /sidekiq, I get an error in the heroku logs, my worker process status turns to crashed, and my web process complains about the max number of clients reached:
I am new to Sidekiq and Heroku, so please let me know if I am doing something stupid here.