[Mongrel] Memory leaks in my site

Piet Hadermann piet.hadermann at seagha.com
Thu Mar 8 04:24:40 EST 2007

A quick fix could be to use HAProxy for load balancing and to set the
maximum number of connections per mongrel to 1.
An added bonus is that all requests get queued up at HAProxy (which is
very conservative in its memory use) and routed to the first available
mongrel process instead of queueing up at the mongrel level.
I read that nginx will in the near future (or maybe already does) have
an option to limit the number of simultaneous proxied connections, but
HAProxy is the only tool I have experience with that can do this (I'm
still wondering why the Rails community seems to favor Pound so much).
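
Something like this, roughly (the listen address, ports and server
names are made up, and the per-server maxconn syntax is worth
double-checking against your HAProxy version):

    # route each request to the first mongrel with a free slot
    listen mongrels 0.0.0.0:8080
        mode http
        balance roundrobin
        # maxconn 1: each mongrel handles one request at a time, and
        # anything beyond that waits in HAProxy's (cheap) queue
        server mongrel1 127.0.0.1:8000 maxconn 1 check
        server mongrel2 127.0.0.1:8001 maxconn 1 check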


	From: mongrel-users-bounces at rubyforge.org
[mailto:mongrel-users-bounces at rubyforge.org] On Behalf Of Alexey
	Sent: Thursday, March 8, 2007 7:36
	To: mongrel-users at rubyforge.org
	Subject: Re: [Mongrel] Memory leaks in my site

	It also doesn't leak any memory at all *when it is not
overloaded*. E.g., under maximum non-concurrent load (a
single-threaded test client that fires the next request immediately
upon receiving the response to the previous one; sketched below the
quote), it stays up forever.

	When Mongrel + "Hello, World" is overloaded, there is a memory
leak to the tune of 6 MB per hour. I have yet to figure out where it
is coming from.

	Yes. In the meantime, the recipe apparently is "serve static
stuff through an upstream web server, and use smaller values of
--num-procs" (example below). A Mongrel that only receives dynamic
requests is, essentially, a single-threaded process anyway. The only
reason to have more than one thread is so that other requests can
queue up while it's doing something that takes time. Cool.
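
For what it's worth, the sequential test client Alexey describes is
just a loop along these lines (host, port and path are placeholders):

    require 'net/http'

    # Single thread, single connection: the next GET goes out only
    # after the previous response has been read in full, so at most
    # one request is ever in flight.
    Net::HTTP.start('127.0.0.1', 8000) do |http|
      loop { http.get('/') }
    end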
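
And --num-procs is just an option to mongrel_rails, e.g. (port and
value picked arbitrarily):

    mongrel_rails start -e production -p 8000 --num-procs 30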
